Project acronym 2F4BIODYN
Project Two-Field Nuclear Magnetic Resonance Spectroscopy for the Exploration of Biomolecular Dynamics
Researcher (PI) Fabien Ferrage
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE4, ERC-2011-StG_20101014
Summary The paradigm of the structure-function relationship in proteins is outdated. Biological macromolecules and supramolecular assemblies are highly dynamic objects. Evidence that their motions are of utmost importance to their functions is regularly identified. The understanding of the physical chemistry of biological processes at an atomic level has to rely not only on the description of structure but also on the characterization of molecular motions.
The investigation of protein motions will be undertaken with a very innovative methodological approach in nuclear magnetic resonance relaxation. In order to widen the range of frequencies at which local motions in proteins are probed, we will first use and develop new techniques for a prototype shuttle system for the measurement of relaxation at low fields on a high-field NMR spectrometer. Second, we will develop a novel system: a set of low-field NMR spectrometers designed as accessories for high-field spectrometers. Used in conjunction with the shuttle, this system will offer (i) the sensitivity and resolution (i.e. atomic-level information) of a high-field spectrometer, (ii) access to the low fields of a relaxometer, and (iii) the ability to measure a wide variety of relaxation rates with high accuracy. This system will benefit from the latest technology in homogeneous permanent magnet development to allow control of spin systems identical to that achieved in a high-resolution probe. This new apparatus will open the way to the use of NMR relaxation at low fields for the refinement of protein motions at an atomic scale.
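For orientation, the link between relaxation rates and motional frequencies can be made explicit with the textbook expression for the longitudinal relaxation rate of a backbone 15N nucleus (standard relaxation theory, not a formula specific to this project):
\[
R_1 = \frac{d^2}{4}\Big[\,J(\omega_H-\omega_N) + 3J(\omega_N) + 6J(\omega_H+\omega_N)\,\Big] + c^2 J(\omega_N),
\qquad
d = \frac{\mu_0 \hbar \gamma_H \gamma_N}{4\pi r_{NH}^3},
\quad
c = \frac{\Delta\sigma\,\omega_N}{\sqrt{3}},
\]
where \(J(\omega)\) is the spectral density of the N-H bond orientation fluctuations. Since the Larmor frequencies \(\omega_N\) and \(\omega_H\) scale with the static field \(B_0\), recording relaxation rates at several fields, and in particular at low fields, samples \(J(\omega)\), and hence the distribution of motional time scales, over a much broader frequency range than measurements at a single high field.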
Applications of this novel approach will focus on the bright side of protein dynamics: (i) the largely unexplored dynamics of intrinsically disordered proteins, and (ii) domain motions in large proteins. In both cases, we will investigate a series of diverse protein systems with implications in development, cancer and immunity.
Max ERC Funding
1 462 080 €
Duration
Start date: 2012-01-01, End date: 2017-12-31
Project acronym ANAMORPHISM
Project Asymptotic and Numerical Analysis of MOdels of Resonant Physics Involving Structured Materials
Researcher (PI) Sebastien Roger Louis Guenneau
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE8, ERC-2011-StG_20101014
Summary One already available method to expand the range of material properties is to adjust the composition of materials at the molecular level using chemistry. We would like to develop the alternative approach of homogenization which broadens the definition of a material to include artificially structured media (fluids and solids) in which the effective electromagnetic, hydrodynamic or elastic responses result from a macroscopic patterning or arrangement of two or more distinct materials. This project will explore the latter avenue in order to markedly enhance control of surface water waves and elastodynamic waves propagating within artificially structured fluids and solid materials, thereafter called acoustic metamaterials.
Pendry's perfect lens, the paradigm of electromagnetic metamaterials, is a slab of negative refractive index material that takes rays of light and causes them to converge with unprecedented resolution. This flat lens is a combination of periodically arranged resonant electric and magnetic elements. We will draw systematic analogies with resonant mechanical systems in order to achieve similar control of hydrodynamic and elastic waves. This will allow us to extend the design of metamaterials to acoustics and to go beyond the scope of the Snell-Descartes laws of optics and Newton's laws of mechanics.
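As a reminder of the optical benchmark invoked here (standard textbook material, not a result of this project), refraction at an interface between media of refractive indices \(n_1\) and \(n_2\) obeys the Snell-Descartes law
\[
n_1 \sin\theta_1 = n_2 \sin\theta_2 ,
\]
and when \(n_2 < 0\) the refracted ray emerges on the same side of the normal as the incident ray; this anomalous refraction is what lets a flat slab with \(n = -1\) refocus the rays emanating from a point source. The analogies developed in this project aim at achieving comparable anomalous refraction for water and elastic waves.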
Acoustic metamaterials allow the construction of invisibility cloaks for non-linear surface water waves (e.g. tsunamis) propagating in structured fluids, as well as for seismic waves propagating in thin structured elastic plates.
Maritime and civil engineering applications include the protection of harbours, off-shore platforms and anti-earthquake passive systems. Acoustic cloaks for an enhanced control of pressure waves in fluids will also be designed for underwater camouflaging.
The interplay of light and sound will finally be analysed in order to design controllable metamaterials, with a special emphasis on undetectable microstructured fibres (acoustic wormholes).
Max ERC Funding
1 280 391 €
Duration
Start date: 2011-10-01, End date: 2016-09-30
Project acronym ANTICS
Project Algorithmic Number Theory in Computer Science
Researcher (PI) Andreas Enge
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE EN INFORMATIQUE ET AUTOMATIQUE
Call Details Starting Grant (StG), PE6, ERC-2011-StG_20101014
Summary "During the past twenty years, we have witnessed profound technological changes, summarised under the terms of digital revolution or entering the information age. It is evident that these technological changes will have a deep societal impact, and questions of privacy and security are primordial to ensure the survival of a free and open society.
Cryptology is a main building block of any security solution, and at the heart of projects such as electronic identity and health cards, access control, digital content distribution or electronic voting, to mention only a few important applications. During the past decades, public-key cryptology has established itself as a research topic in computer science; tools of theoretical computer science are employed to “prove” the security of cryptographic primitives such as encryption or digital signatures and of more complex protocols. It is often forgotten, however, that all practically relevant public-key cryptosystems are rooted in pure mathematics, in particular number theory and arithmetic geometry. In fact, the so-called security “proofs” are all conditional on the algorithmic intractability of certain number theoretic problems, such as the factorisation of large integers or discrete logarithms in algebraic curves. Unfortunately, there is a large cultural gap between computer scientists, who use a black-box security reduction to a supposedly hard problem in algorithmic number theory, and number theorists, who are often interested in solving small and easy instances of the same problem. The theoretical grounds on which current algorithmic number theory operates are actually rather shaky, and cryptologists are generally unaware of this fact.
The central goal of ANTICS is to rebuild algorithmic number theory on the firm grounds of theoretical computer science."
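As a toy illustration of the asymmetry on which such conditional security rests (an illustration only, with didactic parameters; it is not part of the project's methodology): modular exponentiation is cheap, whereas recovering the exponent, the discrete logarithm, is only known to be possible by far more expensive computations, of which the brute-force search below is the crudest example.

```python
# Toy discrete-logarithm illustration: the "easy" direction is modular
# exponentiation; the "hard" direction is recovering the exponent.
# Parameters are tiny and purely didactic.

p = 101     # small prime modulus (toy value)
g = 2       # a primitive root modulo 101
x = 57      # secret exponent

h = pow(g, x, p)            # easy: square-and-multiply, O(log x) multiplications
print("public value h =", h)

def brute_force_dlog(g, h, p):
    """Find k with g**k == h (mod p) by exhaustive search: O(order of g) steps."""
    y = 1
    for k in range(p):
        if y == h:
            return k
        y = (y * g) % p
    return None

print("recovered exponent =", brute_force_dlog(g, h, p))   # -> 57
```

For cryptographic group sizes (hundreds of bits), the first operation remains instantaneous while exhaustive search, and even the best known algorithms for well-chosen groups, become infeasible; the security "proofs" mentioned above assume exactly this kind of gap.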
Max ERC Funding
1 453 507 €
Duration
Start date: 2012-01-01, End date: 2016-12-31
Project acronym BetaRegeneration
Project Induction of Insulin-producing beta-cells Regeneration in vivo
Researcher (PI) Patrick Collombat
Host Institution (HI) INSTITUT NATIONAL DE LA SANTE ET DE LA RECHERCHE MEDICALE
Call Details Starting Grant (StG), LS4, ERC-2011-StG_20101109
Summary Diabetes has become one of the most widespread metabolic disorders with epidemic dimensions affecting almost 6% of the world’s population. Despite modern treatments, the life expectancy of patients with Type 1 diabetes remains reduced as compared to healthy subjects. There is therefore a need for alternative therapies. Towards this aim, using the mouse, we recently demonstrated that the in vivo forced expression of a single factor in pancreatic alpha-cells is sufficient to induce a continuous regeneration of alpha-cells and their subsequent conversion into beta-like cells, such converted cells being capable of reversing the consequences of chemically-induced diabetes in vivo (Collombat et al. Cell, 2009).
The PI and his team therefore propose to further decipher the mechanisms involved in this alpha-cell-mediated beta-cell regeneration process and to determine whether this approach may be applied to adult animals and whether it would efficiently reverse Type 1 diabetes. Furthermore, a major effort will be made to verify whether our findings can be translated to humans. Specifically, we will use a tripartite approach to address the following issues: (1) Can in vivo alpha-cell-mediated beta-cell regeneration be induced in adult mice? What would be the genetic determinants involved? (2) Can alpha-cell-mediated beta-cell regeneration reverse diabetes in the NOD Type 1 diabetes mouse model? (3) Can adult human alpha-cells be converted into beta-like cells?
Together, these ambitious objectives will most certainly allow us to gain new insight into the mechanisms defining the identity and the reprogramming capabilities of mouse and human endocrine cells and may thereby open new avenues for the treatment of diabetes. Similarly, the determination of the molecular triggers implicated in the beta-cell regeneration observed in our diabetic mice may lead to exciting new findings, including the identification of “druggable” targets of importance for human diabetic patients.
Max ERC Funding
1 500 000 €
Duration
Start date: 2012-01-01, End date: 2016-12-31
Project acronym BRiCPT
Project Basic Research in Cryptographic Protocol Theory
Researcher (PI) Jesper Buus Nielsen
Host Institution (HI) AARHUS UNIVERSITET
Call Details Starting Grant (StG), PE6, ERC-2011-StG_20101014
Summary In cryptographic protocol theory, we consider a situation where a number of entities want to solve some problem over a computer network. Each entity has some secret data it does not want the other entities to learn, yet, they all want to learn something about the common set of data. In an electronic election, they want to know the number of yes-votes without revealing who voted what. For instance, in an electronic auction, they want to find the winner without leaking the bids of the losers.
A main focus of the project is to develop new techniques for solving such protocol problems. We are particularly interested in techniques which can automatically construct a protocol solving a problem given only a description of what the problem is. My focus will be theoretical basic research, but I believe that advancing the theory of secure protocol compilers will have an immense impact on how secure protocols are developed in practice.
When one develops complex protocols, it is important to be able to verify their correctness before they are deployed, particularly so when the purpose of the protocols is to protect information: by the time an error is found and corrected, the sensitive data may already have been compromised. Therefore, cryptographic protocol theory develops models of what it means for a protocol to be secure, and techniques for analyzing whether a given protocol is secure or not.
A second main focus of the project is to develop better security models, as existing security models either suffer from the problem that it is possible to prove some protocols secure which are not secure in practice, or from the problem that it is impossible to prove the security of some protocols which are believed to be secure in practice. My focus will again be on theoretical basic research, but I believe that better security models are important for advancing a practice where protocols are verified as secure before they are deployed.
Max ERC Funding
1 171 019 €
Duration
Start date: 2011-12-01, End date: 2016-11-30
Project acronym CHILDGROWTH2CANCER
Project Childhood body size, growth and pubertal timing and the risk of cancer in adulthood
Researcher (PI) Jennifer Lyn Baker
Host Institution (HI) REGION HOVEDSTADEN
Call Details Starting Grant (StG), LS7, ERC-2011-StG_20101109
Summary The goal of the proposed research is to examine how the independent and combined effects of childhood adiposity (assessed by body mass index [BMI]; kg/m2) height, change in BMI and height, and pubertal timing from the ages of 7 to 13 years are associated with the risk of cancer incidence in adulthood. Greater body size (adipose tissue and different types of lean tissue) reflecting past or ongoing growth may increase the risk of cancer in individuals as greater numbers of proliferating cells increase the risk that mutations leading to the subsequent development of cancer occur. As childhood is a period of growth, it is plausible that it is of particular relevance for the early establishment of the risk of cancer.
Data from the Copenhagen School Health Records Register, which is based on a population of schoolchildren born between 1930 and 1983 and contains computerised weight and height measurements on >350,000 boys and girls in the capital city of Denmark, as well as data from other cohorts, will be used. Survival analysis techniques and the newly developed Dynamic Path Analysis model will be used to examine how body size (BMI and height) at each age from 7 to 13 years, as well as change in body size during this period, is associated with the risk of multiple forms of cancer in adulthood, with a simultaneous exploration of the effects of birth weight and pubertal timing. Additionally, potential effects of childhood and adult health and social circumstances will be investigated in sub-cohorts with this information available.
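A minimal sketch of the kind of survival analysis referred to above, fitting a Cox proportional-hazards model of adult cancer incidence on childhood body size; the synthetic data, the column names and the use of the lifelines package are assumptions made for illustration, not the project's actual data or pipeline.

```python
# Minimal Cox proportional-hazards sketch on synthetic data (illustration only;
# column names and the lifelines package are assumptions, not the project's pipeline).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 500
bmi_z = rng.normal(size=n)        # hypothetical childhood BMI z-score at age 7
height_z = rng.normal(size=n)     # hypothetical childhood height z-score at age 7

# Simulate time to cancer with a hazard that rises with childhood BMI, then censor.
baseline_hazard = 0.01
time_to_event = rng.exponential(1.0 / (baseline_hazard * np.exp(0.4 * bmi_z)))
censor_time = rng.uniform(20, 45, size=n)

df = pd.DataFrame({
    "bmi_z_age7": bmi_z,
    "height_z_age7": height_z,
    "followup_years": np.minimum(time_to_event, censor_time),
    "cancer_event": (time_to_event <= censor_time).astype(int),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_years", event_col="cancer_event")
cph.print_summary()   # hazard ratios per unit z-score, with confidence intervals
```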
Results from this research will demonstrate whether childhood is a critical period for the establishment of the risk of cancer in adulthood, will lead to mechanistic explorations of the associations at the biological level and to investigations into associations between childhood body size and mortality, and will contribute to developing improved definitions of childhood overweight and obesity based upon long-term health outcomes.
Max ERC Funding
1 199 998 €
Duration
Start date: 2012-02-01, End date: 2017-01-31
Project acronym CHROMATINREPLICATION
Project How to Replicate Chromatin - Maturation, Timing Control and Stress-Induced Aberrations
Researcher (PI) Anja Groth
Host Institution (HI) KOBENHAVNS UNIVERSITET
Call Details Starting Grant (StG), LS1, ERC-2011-StG_20101109
Summary Inheritance of DNA sequence and its proper organization into chromatin is fundamental for eukaryotic life. The challenge of propagating genetic and epigenetic information is met in S phase and entails genome-wide disruption and restoration of chromatin coupled to faithful copying of DNA. How specific chromatin structures are restored on new DNA and transmitted through mitotic cell division remains a fundamental question in biology, central to understanding cell fate and identity.
Chromatin restoration on new DNA involves a complex set of events including nucleosome assembly and remodelling, restoration of marks on DNA and histones, deposition of histone variants and establishment of higher order chromosomal structures including sister-chromatid cohesion. To dissect these fundamental processes and their coordination in time and space with DNA replication, we have developed a novel technology termed nascent chromatin capture (NCC) that provides a unique opportunity for biochemical and proteomic analysis of chromatin replication in human cells. I propose to apply this innovative cutting-edge technique for a comprehensive characterization of chromatin restoration during DNA replication and to reveal how replication timing and genotoxic stress impact on the final chromatin state. This highly topical project brings together the fields of chromatin biology, DNA replication, epigenetics and genome stability and we expect to make groundbreaking discoveries that will improve our understanding of human development, somatic cell reprogramming and complex diseases like cancer.
The proposed research will 1) identify and characterize novel mechanisms in chromatin restoration and 2) address molecularly how replication timing and genotoxic insults influence chromatin maturation and final chromatin state.
Max ERC Funding
1 692 737 €
Duration
Start date: 2011-11-01, End date: 2017-04-30
Project acronym COLDNANO
Project UltraCOLD ion and electron beams for NANOscience
Researcher (PI) Daniel Comparat
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE7, ERC-2011-StG_20101014
Summary COLDNANO (UltraCOLD ion and electron beams for NANOscience) aspires to build novel ion and electron sources with superior performance in terms of brightness, energy spread and minimum achievable spot size. Such monochromatic, spatially focused and well-controlled electron and ion beams are expected to open many research possibilities in materials science, in surface investigations (imaging, lithography) and in semiconductor diagnostics. The proposed project intends to develop sources with the best beam quality ever produced and to assess them in some advanced surface science research domains. In parallel, I will develop an expertise exchange with a small and medium-sized enterprise (SME) that will exploit industrial prototypes.
The novel concept is to create ion and electron sources using advanced laser-cooling techniques combined with the particular ionization properties of cold atoms. This would be the first time that “laser cooling” leads to a real industrial development.
A cesium magneto-optical trap will first be used. The atoms will then be excited by lasers and ionized in order to provide the electron source. The specific extraction optics for the electrons will be developed. This source will be compact and portable, to be used for several applications such as Low Energy Electron Microscopy, functionalization of semiconducting surfaces or high-resolution Electron Energy Loss Spectrometry by coupling to a Scanning Transmission Electron Microscope.
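To indicate why laser-cooled atoms yield such a small energy spread (a textbook estimate given for context, not a performance figure for the proposed source): the Doppler cooling limit of a magneto-optical trap is
\[
k_B T_D = \frac{\hbar \Gamma}{2},
\]
which for the cesium D2 transition (natural linewidth \(\Gamma/2\pi \approx 5.2\ \mathrm{MHz}\)) gives \(T_D \approx 125\ \mu\mathrm{K}\), i.e. a thermal energy spread of order 10 neV, and sub-Doppler mechanisms cool the cloud further still. This is many orders of magnitude below the eV-scale energy spread of conventional thermal or field-emission sources, although the spread of the extracted beam is ultimately set by the ionization and extraction process as well.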
Based on the knowledge developed with the first experiment, a second, ambitious xenon dual ion and electron beam machine will then be realized and used to study the scattering of ions and electrons at low energy.
Finally, I present a very innovative scheme to control the time, position and velocity of individual particles in the beams. Such a machine providing ions or electrons on demand would open the way for the “ultimate” resolution in time and space for surface analysis, lithography, microscopy or implantation.
Max ERC Funding
1 944 000 €
Duration
Start date: 2012-02-01, End date: 2017-01-31
Project acronym COMEDIA
Project Complex Media Investigation with Adaptive Optics
Researcher (PI) Sylvain Hervé Gigan
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE2, ERC-2011-StG_20101014
Summary "Wave propagation in complex (disordered) media stretches our knowledge to the limit in many different fields of physics. It has important applications in seismology, acoustics, radar, and condensed matter. It is a problem of large fundamental interest, notably for the study of Anderson localization.
In optics, it is of great importance in photonic devices, such as photonic crystals, plasmonic structures or random lasers. It is also at the heart of many biomedical-imaging issues: scattering ultimately limits the depth and resolution of all imaging techniques.
We have recently demonstrated that wavefront shaping, i.e. adaptive optics applied to complex media, is the tool of choice to match and address the huge complexity of this problem in optics. The COMEDIA project aims at developing a novel wavefront shaping toolbox, addressing both spatial and spectral degrees of freedom of light. Thanks to this toolbox, we plan to fulfill the following objectives:
1) A full spatiotemporal control of the optical field in a complex environment,
2) Breakthrough results in imaging and nano-optics,
3) Original answers to some of the most intriguing fundamental questions in mesoscopic physics."
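As a toy numerical illustration of the wavefront-shaping principle invoked above (an idealized sketch under simple assumptions: monochromatic light, a fully random transmission matrix, phase-only control; it is not the project's methodology): phase-conjugating the transmission coefficients of one output speckle grain enhances its intensity by roughly pi/4 times the number N of controlled input modes.

```python
# Toy wavefront-shaping simulation: focusing light "through" a scattering medium
# by shaping the input phases (idealized sketch; one output mode, phase-only control).
import numpy as np

rng = np.random.default_rng(1)
N = 1024                                       # number of controlled input segments

# One row of a random transmission matrix (field reaching one output speckle grain).
t = (rng.normal(size=N) + 1j * rng.normal(size=N)) / np.sqrt(2 * N)

def output_intensity(phases):
    """Intensity at the target output mode for a unit-power, phase-only input."""
    field_in = np.exp(1j * phases) / np.sqrt(N)
    return abs(t @ field_in) ** 2

# Average intensity for random (unshaped) wavefronts.
I_unshaped = np.mean([output_intensity(rng.uniform(0, 2 * np.pi, N))
                      for _ in range(200)])

# Optimal phase-only input: conjugate the phase of each transmission coefficient.
I_focused = output_intensity(-np.angle(t))

print("enhancement:", I_focused / I_unshaped)   # ~ pi/4 * N for large N
print("theory     :", np.pi / 4 * (N - 1) + 1)  # both close to ~800 for N = 1024
```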
Max ERC Funding
1 497 000 €
Duration
Start date: 2011-11-01, End date: 2016-10-31
Project acronym DIBOSON
Project Direct and Indirect Searches for New Physics with Diboson Final States at ATLAS
Researcher (PI) Samira Hassani
Host Institution (HI) COMMISSARIAT A L ENERGIE ATOMIQUE ET AUX ENERGIES ALTERNATIVES
Call Details Starting Grant (StG), PE2, ERC-2011-StG_20101014
Summary The Large Hadron Collider (LHC) at the European Organisation for Nuclear Research (CERN) promises a major step forward in the understanding of the fundamental nature of matter. Four large experiments at the LHC address, in complementary ways, the question of the origin of our Universe by searching for so-called New Physics.
The “Standard Model” (SM), the theory that reflects our understanding of elementary particles and their fundamental interactions, has been extensively studied and experimentally verified to an unprecedented precision over the past decades. Despite its impressive success, there are many unanswered questions, which suggests that there is a more fundamental theory incorporating New Physics. It is expected that at the LHC New Physics beyond the SM will either be discovered or excluded up to very high energies; thus our view of the fundamental structure of the Universe will be challenged and probably revolutionized in the coming years.
The ATLAS experiment is dedicated to addressing the key issue of ElectroWeak Symmetry Breaking (EWSB) and, linked to this, the search for the Higgs boson as well as the search for physics beyond the Standard Model. The analysis proposed here comprises measurements of diboson processes and searches for New Physics in them. New Physics effects in the diboson sector will be observed either directly, as in the case of new particles decaying to diboson final states, e.g. new vector bosons and extra dimensions, or indirectly through deviations from the SM predictions for observables such as cross sections and asymmetries. Triple gauge boson self-couplings (TGCs) are extremely sensitive to New Physics and are thus a very powerful tool for indirect searches for New Physics contributions through loop corrections.
At the LHC, the unprecedented center-of-mass energy and luminosity will allow the TGCs to be measured with high accuracy and regions that were inaccessible to previous experiments to be probed, even with modest amounts of data.
Max ERC Funding
904 190 €
Duration
Start date: 2011-12-01, End date: 2016-11-30
Project acronym DIRONAKI
Project Differentiation and role of Natural Killer cell subsets
Researcher (PI) Thierry Walzer
Host Institution (HI) INSTITUT NATIONAL DE LA SANTE ET DE LA RECHERCHE MEDICALE
Call Details Starting Grant (StG), LS6, ERC-2011-StG_20101109
Summary "NK cells are innate lymphocytes that play a role in the early response against intracellular pathogens and against tumors. Several NK cell subsets have been described in peripheral organs that correspond to discrete stages of in vivo maturation. How NK cells differentiate from early precursors and what are the specific functions of each NK cell subset are unresolved issues. Here, we propose a three-aim program to address these questions. First, we want to revisit the partition of the NK cell population that is currently based on surface markers of undefined function by looking at the expression of transcription factors (TF) essential for NK cell development and maturation, such as T-bet and Eomes, using novel TF reporter mice. This strategy should also allow us to identify very early steps of NK cell development (NK cell progenitors) that remain ill defined. Second, we will try and identify molecular mechanisms that induce transition between NK cell maturation stages. For this we will take advantage of a previous gene profiling analysis that pointed at several pathways and TF that were highly regulated during NK cell maturation. The role of these pathways and TF in the differentiation of NK cells will be measured using a novel Cre/lox system allowing NK-specific gene deletion. Detailed analysis of mouse mutants will be used to delineate the role of selected genes and pathways in NK cell differentiation. Third, we will compare patterns of migration, cytokine secretion, in vivo cytotoxicity and global gene expression by individual NK cell subsets during an airway infection by Influenza to get insight on the specific functions of NK cell subsets during immune responses. Altogether, the results of this study should provide developmental, molecular and functional evidences to support the physiological relevance of NK cell subsets. This may improve strategies that aim at manipulating NK cell function for the benefit of patients with cancer or chronic infectious diseases"
Max ERC Funding
1 340 757 €
Duration
Start date: 2012-01-01, End date: 2017-12-31
Project acronym DNATRAFFIC
Project DNA traffic during bacterial cell division
Researcher (PI) François-Xavier Andre Fernand Barre
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), LS1, ERC-2011-StG_20101109
Summary The molecular mechanisms that serve to couple DNA replication, chromosome segregation and cell division are largely unknown in bacteria. This has led to considerable interest in the study of Escherichia coli FtsK, an essential cell division protein that assembles into DNA-pumps to transfer chromosomal DNA between the two daughter cell compartments during septation. Indeed, our recent work suggests that FtsK might regulate the late stages of septation to ensure DNA is fully cleared from the septum before it is allowed to close. This would be the first example of a cell cycle checkpoint in bacteria.
FtsK-mediated DNA transfer is required in 15% of the cells at each generation in E. coli, in which it serves to promote the resolution of topological problems arising from the circularity of the chromosome by Xer recombination. However, the FtsK checkpoint could be a more general feature of the bacterial cell cycle since FtsK is highly conserved among eubacteria, including species that do not possess a Xer system. Indeed, preliminary results from the lab indicate that DNA transfer by FtsK is required independently of Xer recombination in Vibrio cholerae.
To confirm the existence and the generality of the FtsK checkpoint in bacteria, we will determine the different situations that lead to a requirement for FtsK-mediated DNA transfer by studying chromosome segregation and cell division in V. cholerae. In parallel, we will apply new fluorescent microscopy tools to follow the progression of cell division and chromosome segregation in single live bacterial cells. PALM will notably serve to probe the structure of the FtsK DNA-pumps at a high spatial resolution, FRET will be used to determine their timing of assembly and their interactions with the other cell division proteins, and TIRF will serve to follow in real time their activity with respect to the progression of chromosome dimer resolution, chromosome segregation, and septum closure.
Max ERC Funding
1 565 938 €
Duration
Start date: 2012-02-01, End date: 2017-01-31
Project acronym DU
Project Demographic Uncertainty
Researcher (PI) Hippolyte Charles Guillaume D'albis
Host Institution (HI) ECOLE D'ECONOMIE DE PARIS
Call Details Starting Grant (StG), SH3, ERC-2011-StG_20101124
Summary "The aim of my research project is to build a mathematical model for the quantitative assessment of the effects of demographic changes on economic activity. It is an ambitious project as it involves the integration of the latest developments in demographic and economic models. It is also highly innovative as it proposes an original treatment of demographic uncertainty. Most existing models consider demographics as a deterministic variable and foresee a set of scenarios. At best, the models incorporate demographics as a risk variable and assume that agents know the stochastic process underlying the demographic dynamics. In the present research project, I wish to build a demographic-economic model in which the future demographics are uncertain. This will have three consequences. First, individual decisions are different and depend on the individuals' attitudes towards uncertainty. Second, the aggregation of individual decisions is more complex, especially because of the fact that the latter are not necessarily temporally consistent. Third, the approach to economic policy is renewed. The government is not necessarily perceived as an omniscient being who corrects market dysfunctions, but rather, it is itself under uncertainty and must compromise with the choices made by agents."
Max ERC Funding
1 000 000 €
Duration
Start date: 2012-03-01, End date: 2017-02-28
Project acronym E-MARS
Project Evolution of Mars
Researcher (PI) Cathy Monique Quantin
Host Institution (HI) UNIVERSITE LYON 1 CLAUDE BERNARD
Call Details Starting Grant (StG), PE9, ERC-2011-StG_20101014
Summary The primary questions that drive the Mars exploration program focus on life. Has the Martian climate ever been favorable for life development? Such a scenario would imply a planetary system distinct from today's, with a magnetic field able to retain the atmosphere. Where is the evidence of such past climatic and internal conditions? The clues for answering these questions are locked up in the geologic record of the planet. The volume of data acquired in the past 15 years by the 4 Martian orbiters (ESA and NASA) reaches the petabyte scale, which is disproportionate to the size of the Martian community. e-Mars proposes to build a science team composed of the PI, two post-doctoral researchers, one PhD student and one engineer to exploit the data characterizing the surface of Mars. e-Mars proposes the unprecedented approach of combining topographic data, imagery data in diverse spectral domains and hyperspectral data from multiple orbiter sensors to study the evolution of Mars and to propose pertinent landing sites for next missions. e-Mars will focus on three scientific themes: the composition of the Martian crust, to constrain the early evolution of the planet; the search for possibly habitable places, based on evidence of past liquid water activity from both the morphological record and the locations of hydrated minerals; and the study of current climatic and geological processes driven by the CO2 cycle. These scientific themes will be supported by three axes of methodological development: geodatabase management via Geographic Information Systems (GIS), automatic hyperspectral data analysis, and age estimation of planetary surfaces based on counts of small craters.
Max ERC Funding
1 392 000 €
Duration
Start date: 2011-11-01, End date: 2017-10-31
Project acronym ECOGENOMICINBREEDING
Project Comparative studies of inbreeding effects on evolutionary processes in non-model animal populations
Researcher (PI) Trine Bilde
Host Institution (HI) AARHUS UNIVERSITET
Call Details Starting Grant (StG), LS8, ERC-2011-StG_20101109
Summary Comparative studies of inbreeding and evolution in non-model animal populations: a research proposal directed towards integrating ecological and evolutionary research on inbreeding. Specifically, my aim is to apply novel ecogenomics tools in the study of evolutionary consequences of inbreeding in non-model animal populations. At present, our understanding of inbreeding is dominated by studies of a small number of model organisms. I will undertake comparative studies on inbreeding effects in a genus of spiders containing independently evolved naturally inbreeding species as well as outcrossing sister species. The study of a naturally inbreeding animal species will provide unique insights into the consequences of inbreeding for population genetic structure, genome-wide genetic diversity, and evolution of life history traits. Social spiders are not only unique because they naturally inbreed, but also by being cooperative and showing allomaternal brood care including self-sacrifice, and they evolve highly female-biased sex-ratios, a trait that is not well understood in diploid species. My research objectives are 1) to establish a robust phylogeny for comparative studies; 2) to quantify the effects of inbreeding on the genetic diversity within and between populations; 3) to estimate gene flow among inbred lineages to determine whether inbred lineages diversify but retain the potential for gene exchange, or undergo cryptic speciation; 4) to determine effects of inbreeding on gene expression; 5) to investigate the mechanism underlying the genetic sex determination system that causes female-biased sex-ratios; and finally 6) to determine whether sex-ratio is under adaptive parental control in response to genetic relatedness and ecological constraints. Addressing these objectives will generate novel insights and expand current knowledge on the evolutionary ecology of inbreeding in wild animal populations.
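For orientation on objective 2, the textbook relation that quantifies how inbreeding erodes genetic diversity (standard population genetics, not a result of this project): at a locus with allele frequencies \(p\) and \(q\) and inbreeding coefficient \(F\), the expected genotype frequencies are
\[
f(AA) = p^{2} + pqF, \qquad f(Aa) = 2pq\,(1-F), \qquad f(aa) = q^{2} + pqF,
\]
so heterozygosity is reduced by the factor \((1-F)\) relative to the Hardy-Weinberg expectation, and \(F\) can be estimated from genome-wide data as \(F = 1 - H_O/H_E\); quantities of this kind can then be contrasted between the naturally inbreeding social species and their outcrossing sister species.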
Max ERC Funding
1 497 248 €
Duration
Start date: 2012-01-01, End date: 2017-09-30
Project acronym EDECS
Project Exploring Dark Energy through Cosmic Structures: Observational Consequences of Dark Energy Clustering
Researcher (PI) Pier Stefano Corasaniti
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE9, ERC-2011-StG_20101014
Summary Understanding the nature of Dark Energy (DE) in the Universe is the central challenge of modern cosmology. Einstein’s Cosmological Constant (Λ) provides the simplest explanation fitting the available cosmological data thus far. However, its unnaturally tuned value indicates that other hypotheses must be explored. Furthermore, current observations do not by any means rule out alternative models in favor of the simplest “concordance” ΛCDM. In the absence of theoretical prejudice, observational tests have mainly focused on the DE equation of state. However, the detection of the inhomogeneous nature of DE would provide smoking-gun evidence that DE is dynamical, ruling out Λ. This key aspect has been mostly overlooked so far, particularly in the design optimization of the next generation of surveys dedicated to DE searches, which will map the distribution of matter in the Universe with unprecedented accuracy. The success of these observations relies upon the ability to model the non-linear gravitational processes which affect the collapse of Dark Matter (DM) at small and intermediate scales. Therefore, it is of the highest importance to investigate the role of DE inhomogeneities throughout the non-linear evolution of cosmic structure formation. To achieve this, we will use specifically designed high-resolution numerical simulations and analytical methods to study the non-linear regime in different DE models. The hypothesis to be tested is whether the intrinsic clustering of DE can alter the predictions of the standard ΛCDM model. We will investigate the observational consequences on the DM density field and the properties of DM halos. The results will have a profound impact on the quest for DE and reveal new observable imprints on the distribution of cosmic structures, whose detection may disclose the ultimate origin of the DE phenomenon.
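For context, the standard parametrisation referred to above (textbook definitions, not new results of the project):

\[
w \;\equiv\; \frac{p_{\rm DE}}{\rho_{\rm DE}\,c^{2}},
\]

where $w = -1$ corresponds to the cosmological constant, which is perfectly smooth; dynamical DE has $w \neq -1$ in general, and the growth of its perturbations, i.e. its clustering, is controlled by its effective sound speed $c_s^{2}$.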
Max ERC Funding
1 468 800 €
Duration
Start date: 2012-04-01, End date: 2017-08-31
Project acronym EGGS
Project The first Galaxies
Researcher (PI) Johan Peter Uldall Fynbo
Host Institution (HI) KOBENHAVNS UNIVERSITET
Call Details Starting Grant (StG), PE9, ERC-2011-StG_20101014
Summary The goal of this project is to discover the first galaxies that formed after the Big Bang. The astrophysics of galaxy formation is deeply fascinating. From tiny density fluctuations of quantum mechanical nature, believed to have formed during an inflationary period a tiny fraction of a second after the Big Bang, structure slowly formed through gravitational collapse. This process is strongly dependent on the nature of the dominant, but unknown form of matter - the dark matter. In the project proposed here I will study the epoch of first galaxy formation and the subsequent few billion years of cosmic evolution using gamma-ray bursts and Lyman-α (Lyα) emitting galaxies as probes. I am the principal investigator on two observational projects utilizing these probes. In the first project, I will, over three years starting October 2009, use the new X-shooter spectrograph on the European Southern Observatory Very Large Telescope to build a sample of ~100 gamma-ray bursts with UV/optical/near-IR spectroscopic follow-up. The objective of this project is primarily to measure metallicities, molecular content, and dust content of the gamma-ray burst host galaxies. I am primarily interested in the redshift range from 9 to 2, corresponding to about 500 million years to 3 billion years after the Big Bang. In the second project we will use the new European Southern Observatory survey telescope VISTA. I am co-PI of the Ultra-VISTA project that, over the next 5 years starting December 2009, will create an ultradeep image (about 2000 hr of total integration time) of a piece of sky known as the COSMOS field. I am responsible for the part of the project that will use a narrow-band filter to search for Lyα emitting galaxies at a redshift of 8.8 (corresponding to about 500 million years after the Big Bang) - believed to correspond to the epoch of formation of some of the very first galaxies.
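As a quick consistency check of the epochs quoted above, the minimal sketch below uses the astropy library with a Planck cosmology (an assumption; the proposal does not specify a cosmology) to convert redshift to cosmic age and to locate the observed Lyman-α wavelength at z = 8.8:

from astropy.cosmology import Planck18

# Age of the Universe at the redshifts mentioned in the abstract.
for z in (8.8, 9.0, 2.0):
    age_myr = Planck18.age(z).to("Myr").value
    print(f"z = {z}: age ~ {age_myr:.0f} Myr")

# Rest-frame Lyman-alpha (121.567 nm) redshifted to z = 8.8 falls near 1.19 micron,
# i.e. in the near-infrared, which is the region the narrow-band filter must target.
lya_rest_nm = 121.567
print(f"Observed Ly-alpha at z = 8.8: {lya_rest_nm * (1 + 8.8):.0f} nm")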
Max ERC Funding
1 002 000 €
Duration
Start date: 2011-11-01, End date: 2016-10-31
Project acronym ELECTROLITH
Project Electrical Petrology: tracking mantle melting and volatiles cycling using electrical conductivity
Researcher (PI) Fabrice Olivier Gaillard
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE10, ERC-2011-StG_20101014
Summary Melting in the Earth’s mantle rules the deep volatile cycles because it produces liquids that concentrate and redistribute volatile species. Such redistributions trigger volcanic degassing, magma emplacement in the crust and hydrothermal circulation, and other sorts of chemical redistribution within the mantle (metasomatism). Melting also affects mantle viscosities and therefore impacts on global geodynamics. So far, experimental petrology has been the main approach to construct a picture of the mantle structure and identify regions of partial melting.
Magnetotelluric (MT) surveys reveal the electrical properties of the deep Earth and show highly conductive regions within the mantle, most likely related to volatiles and melts. However, melting zones disclosed by electrical conductivity do not always corroborate the usual picture deduced from experimental petrology. In 2008, I proposed that small amounts of melt, very rich in volatile species and with unusual physical properties, could reconcile petrological and geophysical observations. The broader application of this idea is, however, limited by (i) the incomplete knowledge of both the petrological and electrical properties of those melts and (ii) the lack of petrologically based models to fit MT data. ELECTROLITH will fill this gap by treating the following points:
- How do volatiles in the H-C-S-Cl-F system trigger the onset of melting, and how does this affect mantle conductivity?
- What are the atomic structures and the physical properties of such volatile-rich melts?
- How can such melts migrate in the mantle and what are the relationships with deformation?
- What are the scaling procedures to integrate lab-scale observations into a petrological scheme that could decipher MT data in terms of melt percolation models, strain distributions and chemical redistributions in the mantle?
The ELECTROLITH milestone is therefore a reconciled perspective of geophysics and petrology that will profoundly enrich our vision of mantle geodynamics.
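For orientation, one commonly used empirical relation linking bulk mantle conductivity to an interconnected melt phase (a generic Archie-type law, not the project's own model) is:

\[
\sigma_{\rm bulk} \;\simeq\; \sigma_{\rm melt}\,\phi^{\,m},
\]

where $\phi$ is the melt fraction, $\sigma_{\rm melt}$ the conductivity of the melt and $m$ an empirical exponent of order 1-2; volatile-rich melts with unusually high $\sigma_{\rm melt}$ can therefore produce strong conductivity anomalies even at very small $\phi$.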
Max ERC Funding
1 051 236 €
Duration
Start date: 2011-11-01, End date: 2017-10-31
Project acronym ENVNANO
Project Environmental Effects and Risk Evaluation of Engineered Nanoparticles
Researcher (PI) Anders Baun
Host Institution (HI) DANMARKS TEKNISKE UNIVERSITET
Call Details Starting Grant (StG), LS9, ERC-2011-StG_20101109
Summary The objective of the project Environmental Effects and Risk Evaluation of Engineered Nanoparticles (EnvNano) is to elucidate the particle specific properties that govern the ecotoxicological effects of engineered nanoparticles and in this way shift the paradigm for environmental risk assessment of nanomaterials.
While current activities in the emerging field of nano-ecotoxicology and environmental risk assessment of nanomaterials are based on the assumption that the methodologies developed for chemicals can be adapted to be applicable to nanomaterials, EnvNano has a completely different starting point: the behaviour of nanoparticles in suspension is fundamentally different from that of chemicals in solution.
Therefore, all modifications of existing techniques that do not take this fact into account are bound to have a limited sphere of application or in the worst case to be invalid. By replacing the assumption of dissolved chemicals with a particle behaviour assumption, the traditional risk assessment paradigm will be so seriously impaired that a shift of paradigm will be needed.
EnvNano is based on the following hypotheses: 1. The ecotoxicity and bioaccumulation of engineered nanoparticles will be a function of specific physical and chemical characteristics of the nanoparticles; 2. The environmental hazards of engineered nanoparticles cannot be derived from hazard identifications of the material in other forms; 3. Existing regulatory risk assessment procedures for chemicals will not be appropriate to assess the behaviour and potential harmful effects of engineered nanoparticles on the environment.
These research hypotheses will be addressed in the four interacting research topics of EnvNano: Particle Characterization, Ecotoxicity, Bioaccumulation, and Framework for Risk Evaluation of Nanoparticles, which together aim to form the foundation for a move from coefficient-based to kinetics-based environmental nanotoxicology and risk assessment.
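To make the contrast between coefficient-based and kinetics-based assessment concrete, a generic first-order uptake-elimination model (a standard textbook form, not EnvNano's own framework) reads:

\[
\frac{dC_{\rm org}}{dt} \;=\; k_u\,C_{\rm w} \;-\; k_e\,C_{\rm org}, \qquad \mathrm{BCF}_{\rm kin} \;=\; \frac{k_u}{k_e},
\]

where $C_{\rm org}$ is the internal concentration, $C_{\rm w}$ the exposure concentration and $k_u$, $k_e$ the uptake and elimination rate constants; because nanoparticle exposures often never reach steady state, the rate constants, rather than a single equilibrium coefficient, carry the relevant information.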
Max ERC Funding
1 196 260 €
Duration
Start date: 2011-12-01, End date: 2016-03-31
Project acronym EOS
Project Enzyme catalysis in organic solvents
Researcher (PI) Damien Laage
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE4, ERC-2011-StG_20101014
Summary Enzymes are remarkably efficient catalysts and their recent use in non-aqueous organic solvents is opening a tremendous range of applications in synthetic chemistry: since, surprisingly, most enzymes do not denature in these non-natural environments, new reactions involving e.g. water-insoluble reagents can be catalyzed, while unwanted degradation side reactions are suppressed.
However, a key challenge for these applications is to overcome the greatly reduced catalytic activity compared to aqueous conditions. Empirically, adding activators such as salts or small amounts of water dramatically enhances the activity, but the underlying mechanisms have remained elusive, thus preventing a rational optimization.
Through analytic modeling and numerical simulations, our project will provide the first atomic-scale detailed description of enzyme catalysis in organic solvents, including the key role of the environment. We will then use this unprecedented molecular insight to design rigorous new procedures for the rational engineering of systems with dramatically enhanced activities, both through optimized choices of solvents and additives, and through targeted protein mutations.
Specifically, we will first rigorously establish the influence of enzyme flexibility on catalytic activity through an original model accounting for the dynamic disorder arising from conformational fluctuations. Second, we will provide the first molecular explanation of the commonly invoked “lubricating” action of added water. Third, the underlying mechanism of the widely employed salt-induced activation will be determined, probably calling for a radical change from the currently used picture of a water-mediated action.
Far-reaching practical impacts are expected for the numerous industrial syntheses already employing biocatalysis in non-aqueous media.
Max ERC Funding
1 390 800 €
Duration
Start date: 2012-01-01, End date: 2017-12-31
Project acronym EVOIMMUNOPOP
Project Human Evolutionary Immunogenomics: population genetic variation in immune responses
Researcher (PI) Lluis Quintana-Murci
Host Institution (HI) INSTITUT PASTEUR
Call Details Starting Grant (StG), LS2, ERC-2011-StG_20101109
Summary Recent genome-wide association studies have successfully identified rare and common variants that correlate with complex traits. However, they have provided us with little insight into the nature of the genetic, biological and evolutionary relationships underlying such complex phenotypes. There is thus a growing need for approaches that provide a mechanistic understanding of how genetic variants function to impact phenotypic variation and why they have been substrates of natural selection. One set of traits that displays considerable heterogeneity and that has undoubtedly been shaped by natural selection is the host response to microorganisms. By integrating cutting-edge knowledge and technology in the fields of genomics, population genetics, immunology and bioinformatics, our aim is to establish a thorough understanding of how variable the human immune response is in the natural setting and how this phenotypic variation is under genetic control. Specifically, we aim (i) to characterise the genetic architecture of two populations differing in their ethnic background; (ii) to define individual and population-level variation in immune responses, in the same individuals, by establishing an ex vivo cell-based model to study levels of transcript abundance of both mRNA and miRNA, before and after activation with various immune stimuli; (iii) to map expression quantitative trait loci associated with variation in immune responses; and (iv) to identify adaptive immunological phenotypes. This study will increase our understanding of how genotypes influence the heterogeneity of immune response phenotypes at the level of the human population, and reveal immunological mechanisms under genetic control that have been crucial for our past and present survival against infection. In doing so, we will provide the foundations to define perturbations in these responses that correlate with the occurrence of various infectious and non-infectious diseases as well as with vaccine success.
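As a minimal illustration of the eQTL mapping step in objective (iii), the sketch below tests one hypothetical variant against the expression of one gene with a simple additive linear model; the data are simulated, and a real analysis would add covariates, per-stimulus responses and multiple-testing correction:

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated data for 100 individuals: genotype dosage (0, 1 or 2 minor alleles)
# and log-expression of one gene measured after an immune stimulus.
genotype = rng.integers(0, 3, size=100).astype(float)
expression = 0.4 * genotype + rng.normal(0.0, 1.0, size=100)

# Additive eQTL test: linear regression of expression on genotype dosage.
fit = stats.linregress(genotype, expression)
print(f"effect size (beta) = {fit.slope:.3f}, p-value = {fit.pvalue:.2e}")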
Max ERC Funding
1 494 756 €
Duration
Start date: 2012-01-01, End date: 2017-08-31
Project acronym EVOMOBILOME
Project Evolution of gene mobility: how mobile elements shape the function and sociality of microbial communities
Researcher (PI) Eduardo Pimentel Cachapuz Rocha
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), LS8, ERC-2011-StG_20101109
Summary Prokaryotes rapidly evolve new functionalities through horizontal gene transfer delivered by mobile genetic elements (MGE). MGE increase relatedness between individuals and thereby might also promote the establishment of microbial social networks. Many studies have detailed the dynamics of specific systems in specific MGE. Yet, how MGE key and accessory functions evolve as a whole in the face of social dilemmas arising in microbial communities is largely ignored. Here, I aim at an integrative identification and analysis of self-mobilizable elements to unravel an evolutionary framework of MGE contributions to prokaryotic evolution.
We will use sequence similarity, phylogeny and population genetics techniques to detail how elements propagate and are maintained in populations. We will investigate how accessory functions work together in relation to interactions between MGE and of MGE with the host. We will then quantify the long-term impact of MGE on the gene repertoires of prokaryotes by analysing the patterns of their degradation and/or domestication using regulatory networks and population genetics. The analysis of secretion systems and effectors in mobile elements will illuminate the role of gene mobility in promoting social behaviours through the production of public goods. The previous results will then be used to query metagenomics datasets about the roles of gene mobility and secretion in the social evolution of natural microbial populations.
This work will pioneer the application of theoretical work in population genetics and social evolution to the study of natural microbial communities by way of evolutionary genomics. Its integrative outlook will also provide essential breakthroughs in the understanding of the evolutionary history of mechanisms of gene mobility, e.g. conjugation. Finally, this project will pinpoint how manipulation of MGE might allow the control of virulence, antibiotic resistance and other phenomena related to microbial social interactions.
Max ERC Funding
1 298 925 €
Duration
Start date: 2012-07-01, End date: 2017-12-31
Project acronym EXTENDFRET
Project Extended fluorescence resonance energy transfer with plasmonic nanocircuits
Researcher (PI) Jerome Wenger
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE4, ERC-2011-StG_20101014
Summary Förster fluorescence resonance energy transfer (FRET) is one of the most popular methods to measure distance, structure, association, and dynamics at the single molecule level. However, major challenges are limiting FRET in several fields of physical and analytical sciences: (i) a short distance range below 8 nm, (ii) a concentration range in the nanomolar regime, and (iii) generally weak detected signals.
At the interface between physical chemistry and nano-optics, the proposal objective is to extend the effectiveness of single molecule FRET using plasmonic nanocircuits to: (i) perform FRET on a range up to 20 nm, (ii) detect a single FRET pair in a solution of micromolar concentration, and (iii) improve the statistical distribution in FRET measurements.
To meet its ambitious goals, the proposal introduces plasmonic nanocircuits to tailor the light-molecule interaction at the nanoscale. Energy transfer between donor and acceptor fluorophores is efficiently mediated through intense surface plasmon modes to extend the FRET distance range and improve the fluorescence signal. Moreover, the nanocircuits will be combined with recent innovations in biophotonics: stimulated emission of acceptor fluorescence, full dynamic analysis, and fluidic nanochannels.
The scientific breakthroughs and project impacts will open new horizons for proteomics, enzymology, genomics and photonics. For elucidating molecular structure, the long range FRET will enable understanding the folding structure of large DNA or protein molecules. For assessing chemical reactions, achieving single molecule analysis at micromolar concentration is essential to monitor relevant kinetics, reveal sample heterogeneity, and detect rare and/or transient species. For analytical chemistry, nanocircuits are ideal for sensitive biosensing on a chip. For photonics, nanocircuits can realize key components for optical information processing at the nanoscale.
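For reference, the textbook Förster relation behind the ~8 nm limit mentioned above is:

\[
E(r) \;=\; \frac{1}{1 + \left(r/R_0\right)^{6}},
\]

where $R_0$, the Förster radius of the donor-acceptor pair, is typically 2-6 nm; the steep $r^{-6}$ decay is what restricts conventional FRET to distances below roughly 8-10 nm, and plasmon-mediated transfer aims to relax precisely this constraint.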
Max ERC Funding
1 477 942 €
Duration
Start date: 2012-01-01, End date: 2016-12-31
Project acronym FPTOPT
Project First-passage times and optimization of target search strategies
Researcher (PI) Olivier, Jacques Benichou
Host Institution (HI) UNIVERSITE PIERRE ET MARIE CURIE - PARIS 6
Call Details Starting Grant (StG), PE3, ERC-2011-StG_20101014
Summary How long does it take a random walker to reach a given target? This quantity, known as a first-passage time (FPT), has been the subject of a growing number of theoretical studies over the past decade. The importance of FPTs originates from the crucial role played by properties related to first encounters in various real situations, including transport in disordered media, diffusion limited reactions, or more generally target search processes. First-passage times in confinement, their optimization and their relationship to biophysical experiments are at the heart of this project. The following two issues will be investigated.
1) We will determine key first-passage observables of general scale-invariant random walks in confinement, which up to now have remained inaccessible: FPT distribution in the presence of several targets and/or several searchers, statistical properties of the explored territory, FPT distribution of a non-Markovian random walker. Beyond their theoretical interest, these developments will allow us to address in close connection with single-molecule experiments the importance of transport and spatial organization for gene transcription kinetics and stochastic gene expression.
2) We will address the question of the optimization of the search time. We have recently introduced a new type of search strategy, the intermittent strategies, which minimize the search time under general conditions. Here, the objectives are: (i) to determine new first-passage observables of these intermittent processes (e.g. the full FPT distribution) to allow the comparison of optimal strategies to experimental situations; (ii) to understand the physical mechanisms underlying real intermittent pathways and assess their optimality at the molecular (homologous recombination kinetics), cellular (search for infection markers by dendritic cells) and macroscopic scales (individual search behavior of ants); (iii) to use intermittent strategies to design efficient searches.
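As a minimal numerical illustration of a first-passage time in confinement, the sketch below estimates the mean FPT of a symmetric nearest-neighbour random walk on a ring with a single target site; the geometry and parameters are illustrative and unrelated to the project's specific observables:

import numpy as np

rng = np.random.default_rng(1)

def first_passage_time(n_sites=100, start=50, target=0, max_steps=10**6):
    """Number of steps before a symmetric random walk on a ring of n_sites
    sites, started at `start`, first reaches `target`."""
    position = start
    for step in range(1, max_steps + 1):
        position = (position + rng.choice((-1, 1))) % n_sites
        if position == target:
            return step
    return np.nan  # target not reached within max_steps

# Monte Carlo estimate of the mean FPT and its spread.
samples = np.array([first_passage_time() for _ in range(500)])
print(f"mean FPT ~ {np.nanmean(samples):.0f} steps, std ~ {np.nanstd(samples):.0f} steps")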
Max ERC Funding
1 242 800 €
Duration
Start date: 2011-10-01, End date: 2017-09-30
Project acronym FREECO
Project Freezing Colloids
Researcher (PI) Sylvain Stephane Francois Deville
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE8, ERC-2011-StG_20101014
Summary The freezing of colloids is an amazingly common phenomenon encountered in many natural and engineering processes such as the freezing of soils, food engineering or cryobiology. It can also be used as a bioinspired, versatile and environmentally-friendly processing route for bioinspired porous materials and composites exhibiting breakthroughs in functional properties. Yet, it is still a puzzling phenomenon with many unexplained features, due to the complexity of the system, the space and time scales at which the process should be investigated and the multidisciplinary approach required to completely apprehend it.
The objective is to progress towards a deep understanding of the freezing of colloids through novel in situ observation approaches and mathematical modelling, to exert better control over the processing route and achieve the full potential of this novel class of bioinspired materials. Materials will be processed and their structure/property relationships investigated and optimized.
This project offers a unique integration of approaches, competences and resources in materials science, chemistry, physics, mathematics and technological developments of observation techniques. For materials science alone, the versatility of the process and its control could yield potential breakthroughs in numerous key applications of tremendous human, technological, environmental and economic importance, such as catalysis, biomaterials or energy production, and open a whole new field of research. Far-reaching implications beyond materials science are expected, both from the developments in mathematics and physics, and from the implications of colloid freezing in many situations and fields of research.
Max ERC Funding
1 469 034 €
Duration
Start date: 2012-01-01, End date: 2017-12-31
Project acronym FUTUREPOL
Project A Political History of the Future: Knowledge Production and Future Governance 1945-2010
Researcher (PI) Jenny Andersson
Host Institution (HI) FONDATION NATIONALE DES SCIENCES POLITIQUES
Call Details Starting Grant (StG), SH2, ERC-2011-StG_20101124
Summary FUTUREPOL seeks to open up a new field of historical and political enquiry around the history of future governance. As an object of governance, the future is notoriously rebellious: difficult to define, defying notions of objectivity and truth. Nevertheless, a crucial feature of modern societies is their belief in the knowability and governability of the future, the belief that through the means of scientific rationality and political power, the future can be controlled. FUTUREPOL aims to study shifting ideas of the knowability and governability of the future, in order to illuminate the process in which the future is transformed from its nebulous and uncertain state into an object of governance. Moreover, it undertakes a historical analysis of how this process varies over time in the post-war period. The project thus asks two central research questions: How does the future become an object of governance? And how is this process different today than earlier in the post-war period? FUTUREPOL will address four problems: First, it will study the origins of futurology and its birth in transnational networks of futurists in the immediate post-war period. Second, it intends to study the way that futurists’ ideas were translated into policy and gave rise to public institutions devoted to the future in many countries in Europe and beyond. Third, it will situate these problems in a global field where concerns with national futures are confronted with concerns with the survival of the world system as a whole, and fourth, it aims to study the evolution of the means of future governance over time, and proposes that such a historical analysis of future governance can permit us to historicize central forms of modern governance such as the governance of risk, foresight or scenarios, and thus help us understand the way that contemporary societies engage with the future.
Max ERC Funding
1 302 949 €
Duration
Start date: 2012-01-01, End date: 2017-09-30
Project acronym GENOCIDE
Project Corpses of Genocide and Mass Violence: Interdisciplinary and Comparative Approaches of Dead Bodies Treatment in the 20th Century (Destruction, Identification, Reconciliation)
Researcher (PI) Elisabeth Gessat Anstett
Host Institution (HI) ECOLE DES HAUTES ETUDES EN SCIENCES SOCIALES
Call Details Starting Grant (StG), SH2, ERC-2011-StG_20101124
Summary In Europe and all over the world, genocide and mass violence have been a structural feature of the 20th century. This project aims at questioning the social legacy of mass violence by studying how different societies have coped with the first consequence of mass destruction: the mass production of cadavers. What status and what value have indeed been given to corpses? What political, social or religious uses have been made of dead bodies in occupied Europe, the Soviet Union, Serbia and Spain, but also Rwanda, Argentina or Cambodia, both during and after the massacres? Bringing together perspectives from social anthropology, history and law, and raising the three main issues of destruction, identification and reconciliation, our project will illuminate how various social and cultural treatments of dead bodies simultaneously challenge common representations, legal practices and morals. Project outputs will therefore open and strengthen the field of genocide studies by providing proper intellectual and theoretical tools for a better understanding of the aftermath of mass violence in today's societies.
Max ERC Funding
1 197 367 €
Duration
Start date: 2012-02-01, End date: 2016-01-31
Project acronym GEODYCON
Project Geometry and dynamics via contact topology
Researcher (PI) Vincent Maurice Colin
Host Institution (HI) UNIVERSITE DE NANTES
Call Details Starting Grant (StG), PE1, ERC-2011-StG_20101014
Summary I intend to combine resources from holomorphic curve techniques and traditional topological methods to study some fundamental questions in symplectic and contact geometry, such as:
- The Weinstein conjecture in dimension greater than 3.
- The construction of new invariants for both smooth manifolds and Legendrian/contact manifolds, in particular, try to define an analogue of Heegaard Floer homology in dimension larger than 3.
- The link, in dimension 3, between the geometry of the ambient manifold (especially hyperbolicity) and the dynamical/topological properties of its Reeb vector fields and contact structures.
- The topological characterization of odd-dimensional manifolds admitting a contact structure.
A crucial ingredient of my program is to understand the key role played by open book decompositions in dimensions larger than three.
This program requires a huge amount of mathematical knowledge. My idea is to organize a team around Ghiggini, Laudenbach, Rollin, Sandon and myself, augmented by two post-docs and one PhD student funded by the project. This will give us the critical size to organize a very active working seminar and to gain worldwide attractiveness and recognition.
I also plan to invite one established researcher every year (for 1-2 months), to organize one conference and one summer school, as well as several focused weeks.
Max ERC Funding
887 600 €
Duration
Start date: 2012-01-01, End date: 2016-12-31
Project acronym GEOPARDI
Project Numerical integration of Geometric Partial Differential Equations
Researcher (PI) Erwan Faou
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE EN INFORMATIQUE ET AUTOMATIQUE
Call Details Starting Grant (StG), PE1, ERC-2011-StG_20101014
"The goal of this project is to develop new numerical methods for the approximation of evolution equations possessing strong geometric properties, such as Hamiltonian systems or stochastic differential equations. In such situations the exact solutions are endowed with many physical properties that are consequences of the geometric structure: preservation of the total energy, conservation of momentum, or existence of ergodic invariant measures. However, the preservation of such qualitative properties of the original system by numerical methods at a reasonable cost is not guaranteed at all, even for very precise (high-order) methods.
The principal aim of geometric numerical integration is the understanding and analysis of such problems: how (and to what extent) can numerical methods reproduce the qualitative behavior of differential equations over long times? The extension of this theory to partial differential equations is a fundamental ongoing challenge, which requires the invention of a new mathematical framework bridging the most recent techniques used in the theory of nonlinear PDEs and stochastic ordinary and partial differential equations. The development of new efficient numerical schemes for geometric PDEs has to go together with the most recent progress in analysis (stability phenomena, energy transfers, multiscale problems, etc.).
The major challenges of the project are to derive new schemes by bridging the world of numerical simulation and the analysis community, and to consider deterministic and stochastic equations, with a general aim at deriving hybrid methods. We also aim to create a research platform devoted to extensive numerical simulations of difficult academic PDEs in order to highlight new nonlinear phenomena and test numerical methods."
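A minimal illustration of the qualitative property at stake, energy behavior over long times, comparing an explicit Euler step with a symplectic Euler step on the harmonic oscillator H(q, p) = (q^2 + p^2)/2; this is a generic textbook example rather than one of the schemes to be developed in the project:

def explicit_euler(q, p, dt):
    # Non-geometric method: the numerical energy grows without bound.
    return q + dt * p, p - dt * q

def symplectic_euler(q, p, dt):
    # Geometric method: update p first, then q with the new p; energy stays bounded.
    p_new = p - dt * q
    return q + dt * p_new, p_new

def energy(q, p):
    return 0.5 * (q * q + p * p)

dt, n_steps = 0.05, 20000
qe, pe = 1.0, 0.0   # explicit Euler trajectory
qs, ps = 1.0, 0.0   # symplectic Euler trajectory
for _ in range(n_steps):
    qe, pe = explicit_euler(qe, pe, dt)
    qs, ps = symplectic_euler(qs, ps, dt)

print(f"initial energy          : {energy(1.0, 0.0):.4f}")
print(f"explicit Euler, t = 1000: {energy(qe, pe):.4e}")   # has drifted by orders of magnitude
print(f"symplectic Euler        : {energy(qs, ps):.4f}")   # still close to 0.5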
Max ERC Funding
971 772 €
Duration
Start date: 2011-09-01, End date: 2016-08-31
Project acronym GTMT
Project Group Theory and Model Theory
Researcher (PI) Eric Herve Jaligot
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE1, ERC-2011-StG_20101014
Summary The project is located at the interface between logic and mathematics, more precisely between model theory and group theory. Extremely difficult questions arise in the model theory of groups, notably the construction of new groups with prescribed algebraic properties that at the same time have good model-theoretic properties. In particular, it is an important question, both in model theory and in group theory, to build new stable groups and, ultimately, new nonalgebraic groups with a good notion of dimension.
The present project aims to fill these gaps. It is divided into three main directions. Firstly, the continuation of the classification of groups with a good notion of dimension, notably groups of finite Morley rank and related notions. Secondly, a systematic inspection of the combinatorial and geometric group theory that can be applied to build new groups while keeping control of their first-order theory. Thirdly, and in connection with the previous difficult problem, a very systematic and general study of infinite permutation groups.
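For orientation, the "good dimension notion" mentioned above can be illustrated by the standard inductive definition of Morley rank, recalled here as textbook background rather than material from the proposal:

```latex
% Standard inductive definition of Morley rank, recalled as textbook background
% (not taken from the proposal); \varphi ranges over formulas/definable sets in a
% sufficiently saturated model.
\begin{align*}
\mathrm{RM}(\varphi) \ge 0 &\iff \varphi \text{ is consistent},\\
\mathrm{RM}(\varphi) \ge \alpha + 1 &\iff \text{there are infinitely many pairwise inconsistent formulas } \psi_i
  \text{ implying } \varphi \text{ with } \mathrm{RM}(\psi_i) \ge \alpha,\\
\mathrm{RM}(\varphi) \ge \lambda &\iff \mathrm{RM}(\varphi) \ge \alpha \text{ for all } \alpha < \lambda \qquad (\lambda \text{ a limit ordinal}).
\end{align*}
```

A group has finite Morley rank when the formula defining it has finite rank; the rank then behaves like a dimension for its definable subsets.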
Max ERC Funding
366 598 €
Duration
Start date: 2011-10-01, End date: 2013-12-31
Project acronym HIPODEMA
Project FROM DECISIONISM TO RATIONAL CHOICE: A History of Political Decision-Making in the 20th Century
Researcher (PI) Nicolas Michel Boian Guilhot
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), SH6, ERC-2011-StG_20101124
Summary Historians have good reasons to be highly suspicious of the “rational choice” methodologies that hold sway in economics, political science or sociology and that find a new lease on life today with the rise of the cognitive sciences. On the other hand, researchers using these methodologies usually show very little interest in history. The result is that we know very little about the historical development of “rational choice” as a way to define rationality in action, even though this intellectual paradigm has become pervasive and has reshaped the way we do science and the way we think about politics.
This project will follow the problem of decision-making through the 20th century and weave into a single historical narrative its different disciplinary formulations. It starts with a puzzle: while the “decisionist” critiques of legality of the 1920s associated the decision with an anti-rationalist vision of politics, this notion gradually morphed into the epitome of “rational choice” after 1945. How did this transformation occur?
The project will reconstruct this shift from a paradigm in which Law was the instrument that would make political decisions rational, to another in which the power of rationalization was vested in Science. It asks how the post-1945 efforts at specifying conditions of rationality for political decisions changed the meaning of “rationality.” It connects these developments to the interdisciplinary set of “decision sciences” that emerged in the 1950s around issues of strategic and political behavior and spawned our contemporary instruments of “conflict-resolution” or automated models of decision-making.
The project suggests that “rationality” in political decision-making is not a transcendental norm, but a historically contingent benchmark dependent on its technical instrumentation. Democratizing political decision-making, then, means opening these models and instruments of rationalization to scholarly debate and public scrutiny.
Max ERC Funding
628 004 €
Duration
Start date: 2011-12-01, End date: 2016-11-30
Project acronym HYBRIDNANO
Project Engineering electronic quantum coherence
and correlations in hybrid nanostructures
Researcher (PI) Silvano De Franceschi
Host Institution (HI) COMMISSARIAT A L ENERGIE ATOMIQUE ET AUX ENERGIES ALTERNATIVES
Call Details Starting Grant (StG), PE3, ERC-2011-StG_20101014
Summary Nanoelectronic devices can provide versatile and relatively simple systems to study complex quantum phenomena under well-controlled, adjustable conditions. Existing technologies enable the fabrication of low-dimensional nanostructures, such as quantum dots (QDs), in which it is possible to add or remove individual electrons, turn on and off interactions, and tune the properties of the confined electronic states, simply by acting on a gate voltage or by applying a magnetic field. The hybrid combination of such nanostructures, having microscopic (atomic-like) quantum properties, with metallic elements embodying different types of macroscopic electronic properties (due, e.g., to ferromagnetism or superconductivity), can open the door to unprecedented research opportunities. Hybrid nanostructures can serve to explore new device concepts with so far unexploited functionalities and, simultaneously, provide powerful tools to study fundamental aspects of general relevance to condensed-matter physics. Only recently, following progress in nanotechnology, have hybrid nanostructures become accessible to experiments.
Here we propose an original approach that takes advantage of recently developed self-assembled QDs grown on Si-based substrates. These QDs have many attractive properties (well-established growth, ease of contacting, etc.). We will integrate single and multiple QDs with normal-metal, superconducting, and ferromagnetic electrodes and explore device concepts such as spin valves, spin pumps, and spin transistors (a long-standing challenge). Using these hybrid devices we will study spin-related phenomena such as the dynamics of confined and propagating spin states in different solid-state environments (including superconducting boxes), long-distance spin correlations and entanglement. The new knowledge expected from these experiments is likely to have a broad impact extending from quantum spintronics to other areas of nanoelectronics (e.g. superconducting electronics).
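As background for the statement that electrons can be added one by one with a gate voltage, a minimal sketch using the textbook constant-interaction model (not taken from the proposal; the symbols below are standard textbook notation, not the project's):

```latex
% Textbook constant-interaction estimate (background only, not from the proposal):
% E_add is the energy needed to add one electron to the dot, C_Sigma the total dot
% capacitance, C_g the gate capacitance, Delta-epsilon the single-particle level
% spacing; the lever arm alpha converts E_add into a gate-voltage spacing Delta V_g.
E_{\mathrm{add}} = \frac{e^{2}}{C_{\Sigma}} + \Delta\varepsilon,
\qquad
\Delta V_{g} = \frac{E_{\mathrm{add}}}{e\,\alpha},
\qquad
\alpha = \frac{C_{g}}{C_{\Sigma}}.
```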
Max ERC Funding
1 780 442 €
Duration
Start date: 2012-01-01, End date: 2016-12-31
Project acronym INNODYN
Project Integrated Analysis & Design in Nonlinear Dynamics
Researcher (PI) Jakob Søndergaard Jensen
Host Institution (HI) DANMARKS TEKNISKE UNIVERSITET
Call Details Starting Grant (StG), PE8, ERC-2011-StG_20101014
Summary Imagine lighter and more fuel-efficient cars with improved crashworthiness that help save lives, aircraft and wind-turbine blades with significant weight reductions that lead to large savings in material costs and environmental impact, and light but efficient armour that helps to protect against potentially deadly blasts. These are the future prospects offered by a new generation of advanced structures and micro-structured materials.
The goal of INNODYN is to bring current design procedures for structures and materials a significant step forward by developing new, efficient procedures for integrated analysis and design that take nonlinear dynamic performance into account. The assessment of nonlinear dynamic effects is essential for fully exploiting the vast potential of structural and material capabilities, but a focused endeavour is needed to develop the methodology required to reach these ambitious goals.
INNODYN will, in two interacting work packages, develop the necessary computational analysis and design tools using
1) reduced-order models (WP1) that enable optimization of the overall topology of structures, which is today hindered by excessive computational costs when dealing with nonlinear dynamic systems (a minimal reduced-basis sketch follows this summary), and
2) multi-scale models (WP2) that facilitate topological design of the material microstructure, including essential nonlinear geometrical effects currently not included in state-of-the-art methods.
The work will be carried out by a research group with two PhD students and a postdoc, led by a PI with a track record of original, ground-breaking research in the analysis and optimization of linear and nonlinear dynamics, and hosted by one of the world's leading research groups on topology optimization, the TOPOPT group at the Technical University of Denmark.
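The reduced-basis idea behind WP1 can be illustrated by a minimal sketch (an assumed illustration with synthetic data, not the project's actual models): snapshots of a large linear system are compressed by a singular value decomposition, and the operator is projected onto a few dominant modes.

```python
# Minimal sketch (synthetic data, not the project's models): a reduced-order model
# obtained by projecting a large linear dynamic system onto a few dominant modes
# extracted from snapshot data (proper orthogonal decomposition / Galerkin projection).
import numpy as np

rng = np.random.default_rng(0)
n, r = 200, 5                              # full and reduced dimensions

# Stable full-order system x' = A x, integrated with explicit Euler to collect snapshots.
A = -np.eye(n) + 0.01 * rng.standard_normal((n, n))
x = rng.standard_normal(n)
dt, snapshots = 0.01, []
for _ in range(500):
    x = x + dt * (A @ x)
    snapshots.append(x.copy())
X = np.array(snapshots).T                  # n x n_snapshots snapshot matrix

# Dominant modes from the singular value decomposition of the snapshot matrix.
U, _, _ = np.linalg.svd(X, full_matrices=False)
Phi = U[:, :r]                             # reduced basis (n x r)

# Galerkin projection: the reduced operator acts on r coordinates instead of n.
A_reduced = Phi.T @ A @ Phi
print("full operator:", A.shape, "-> reduced operator:", A_reduced.shape)
```

The payoff of such a reduction is a small system that is cheap enough to be evaluated repeatedly inside an optimization loop; extending it to nonlinear dynamics is precisely where the methodological challenge lies.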
Max ERC Funding
823 992 €
Duration
Start date: 2012-02-01, End date: 2016-01-31
Project acronym IT-DC
Project Large-scale integrative biology of human dendritic cells
Researcher (PI) Vassili Soumelis
Host Institution (HI) INSTITUT CURIE
Call Details Starting Grant (StG), LS6, ERC-2011-StG_20101109
Summary The immune system is composed of different cell types that work in a coordinated manner to maintain the integrity of the organism in situations of threat or danger. Dendritic cells (DC) act as sentinels in peripheral tissues and at mucosal interfaces, where they integrate a diversity of danger- and inflammation-associated stimuli, then migrate to secondary lymphoid organs and instruct naïve CD4 T cells to differentiate into the appropriate effector T cells. Hence, they play a critical role in linking innate to adaptive immunity. Three important levels of complexity characterize the DC system: 1) DC integrate multiple stimuli within complex inflammatory microenvironments, 2) these signals induce a complex output response and modify the global state of DC (environmental plasticity), 3) DC exist in different subsets generated by distinct differentiation pathways (evolutionary selection). In this proposal, we use cellular and molecular immunology combined with computational biology and modeling to study these properties at the large-scale level and to understand their interdependence in controlling DC biology. We ask the following specific questions: WP1: how DC subsets integrate combinations of stimuli at the large-scale level; WP2: how single and multiple stimuli modify the DC state over time (dynamic modeling); WP3: how the DC global state influences the response to a given stimulus. These questions will be addressed using a data-driven strategy combining global unsupervised exploratory analysis, gene-by-gene analysis and modeling, and experimental validation of testable hypotheses. Through this systems-level integrative approach, we will dissect the complexity of DC reciprocal interactions with their complex microenvironment, and hope to unravel novel mechanisms and concepts determining DC function.
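As an illustration of what "global unsupervised exploratory analysis" can mean in practice, here is a minimal sketch on synthetic data (an assumed example, not the project's pipeline): a stimuli-by-genes expression matrix is summarised by principal component analysis before any gene-by-gene modeling.

```python
# Minimal sketch (synthetic data, not the project's pipeline): a global unsupervised
# view of a stimuli x genes expression matrix via principal component analysis,
# the kind of exploratory step that precedes gene-by-gene analysis and modeling.
import numpy as np

rng = np.random.default_rng(1)
n_conditions, n_genes = 30, 500            # e.g. DC exposed to 30 stimulus combinations

# Synthetic expression matrix driven by two hidden response programmes plus noise.
programmes = rng.standard_normal((2, n_genes))
weights = rng.standard_normal((n_conditions, 2))
expression = weights @ programmes + 0.1 * rng.standard_normal((n_conditions, n_genes))

# PCA via SVD of the centred matrix: each condition is summarised by a few coordinates.
centred = expression - expression.mean(axis=0)
U, S, Vt = np.linalg.svd(centred, full_matrices=False)
scores = U[:, :2] * S[:2]                  # low-dimensional coordinates of each condition
explained = S**2 / np.sum(S**2)
print("variance explained by the first two components:", round(explained[:2].sum(), 3))
```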
Max ERC Funding
1 498 997 €
Duration
Start date: 2012-03-01, End date: 2017-02-28
Project acronym KHAM
Project Territories, Communities and Exchanges in the Sino-Tibetan Kham Borderlands (China)
Researcher (PI) Stéphane Gros
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), SH2, ERC-2011-StG_20101124
Summary This research project will focus on the area of the Sino-Tibetan borderlands situated within the People’s Republic of China and referred to as Kham by Tibetans, who make up most of the population of this region divided between the provinces of Sichuan to the east, Yunnan to the south and the Tibet Autonomous Region to the west. This research project intends to explore from a comparative perspective the possible definitions of this entity called Kham, which in the course of history has never strictly corresponded to any administrative unit or coherent whole, and which ultimately should be considered as a land of encounters, a place of métissage (cultural mixing).
By addressing a regional area virtually overlooked by Western research in the social sciences, this project aims to strengthen international academic exchanges and to produce a strong network of collaboration on Kham studies. The multidisciplinary team will undertake ethnographic field studies and documentary research, including archival research, and will contribute fresh, first-hand material on the socio-cultural diversity of Kham.
In-depth investigation of the internal diversity of Tibet and of its connections with the outside world remains scarce, and thus a particular focus of this project is to delve into the complexities of Tibetan society in China. This pioneering work will provide new materials on four complementary cross-disciplinary themes: 1) trade and commerce, 2) ethnicity, religion and local identities, 3) political entities and social organization, and 4) representations and cultural politics, each of which in its own way will improve our understanding of the particular historical, social and political context of the Kham region. Finally, this multi-tiered and multi-scalar approach, with an emphasis on networks, will enhance work on historical mapping, which is still practically non-existent in this region.
Max ERC Funding
651 722 €
Duration
Start date: 2012-03-01, End date: 2016-02-29
Project acronym LIC
Project Loop models, integrability and combinatorics
Researcher (PI) Paul Georges Zinn-Justin
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE1, ERC-2011-StG_20101014
Summary The purpose of this proposal is to investigate new connections which have emerged in recent years between problems from statistical mechanics, namely two-dimensional exactly solvable models, and a variety of combinatorial problems, among which: the enumeration of plane partitions, alternating sign matrices and related objects; combinatorial properties of certain algebro-geometric objects such as orbital varieties or the Brauer loop scheme; and finally certain problems in free probability. One of the key methods that emerged in recent years is the use of quantum integrability and more precisely the quantum Knizhnik--Zamolodchikov equation, which is itself related to many deep results in representation theory. The fruitful interaction between all these ideas has led to many advances in the last few years, including proofs of some old conjectures but also completely new results. More specifically, loop models are a class of statistical models where the PI has made significant progress, in particular in relation to the so-called Razumov--Stroganov conjecture (now the Cantini--Sportiello theorem). New directions that should be pursued include: further applications to enumerative combinatorics, such as proofs of various open conjectures relating Alternating Sign Matrices, Plane Partitions and their symmetry classes; a full understanding of the quantum integrability of the Fully Packed Loop model, a specific loop model at the heart of the Razumov--Stroganov correspondence; a complete description of the Brauer loop scheme, including its defining equations, and of the underlying poset; the extension of the work of Di Francesco and Zinn-Justin on the loop model/6-vertex model relation to the case of the 8-vertex model (corresponding to elliptic solutions of the Yang--Baxter equation); and the study of solvable tiling models, in relation to generalizations of the Littlewood--Richardson rule, and the determination of their limiting shapes.
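As a concrete instance of the enumerative questions mentioned above, the number of n x n alternating sign matrices is given by the product formula conjectured by Mills, Robbins and Rumsey and proved by Zeilberger and, via the six-vertex model, by Kuperberg:

```latex
% Number of n x n alternating sign matrices: conjectured by Mills, Robbins and
% Rumsey, proved by Zeilberger and (via the six-vertex model) by Kuperberg.
A(n) \;=\; \prod_{k=0}^{n-1} \frac{(3k+1)!}{(n+k)!} \;=\; 1,\ 2,\ 7,\ 42,\ 429,\ 7436,\ \dots
```

The same numbers reappear in the ground state of the loop models discussed above, which is one facet of the Razumov--Stroganov correspondence.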
Max ERC Funding
840 120 €
Duration
Start date: 2011-11-01, End date: 2016-10-31
Project acronym MARCHES
Project Modelling of Architectures Ruled by Coupled or Heightened Excited States
Researcher (PI) Denis, Marc, Hugues, Marie Jacquemin
Host Institution (HI) UNIVERSITE DE NANTES
Call Details Starting Grant (StG), PE4, ERC-2011-StG_20101014
Summary The goal of the MARCHES project is to rationalise and optimise the interplay between electronically excited states in complex molecular architectures. The simulation of the properties of large conjugated architectures is to be performed with ab initio tools explicitly taking into account environmental effects. Though efficient methods able to tackle such a task are to be conceived during this project, we aim to shed light on coupled excited states so as to pave the way towards chemically intuitive designs of new molecules. Indeed, the rationalisation and optimisation of the excited-state properties of large compounds is not only one of the major challenges of computational chemistry and physics, it also opens new horizons for emergent properties. In that framework, this project will allow the design of molecular switches usable as building blocks for complex logic gates, subsequently unlocking crucial steps towards more efficient storage materials. To this end, compounds containing several photochromic switches coupled at the excited state have to be designed: this is an important challenge. Indeed, photochromes are currently limited to uncoupled or simply additive systems: emergent multi-addressable features are impossible to achieve.
Max ERC Funding
1 500 000 €
Duration
Start date: 2012-01-01, End date: 2016-12-31
Project acronym MEIOSIGHT
Project MEIOtic inSIGHT: Deciphering the engine of heredity
Researcher (PI) Raphael Mercier
Host Institution (HI) INSTITUT NATIONAL DE LA RECHERCHE AGRONOMIQUE
Call Details Starting Grant (StG), LS2, ERC-2011-StG_20101109
Summary Meiosis is an essential stage in the life cycle of sexually reproducing organisms. Indeed, meiosis is the specialized cell division that reduces the number of chromosomes from two sets in the parent to one set in gametes, while fertilization restores the original chromosome number. Meiosis is also the stage of development when genetic recombination occurs, making it the heart of Mendelian heredity. Increasing our knowledge of meiotic mechanisms, in addition to its intrinsic interest, may also have important implications for agriculture and medicine.
In the last decade Arabidopsis emerged as one of the prominent models in the field of meiosis. Indeed, the meiotic field benefits greatly from a multi-model approach with several kingdoms represented, highlighting both conserved mechanisms and variation around the theme. Arabidopsis did not emerge only as a representative of its phylum, but is also a very good model to study meiosis in general, notably because of the possibility of large-scale genetic studies and the availability of large mutant collections and a wide range of molecular and cytological tools. In this project we aim to use original approaches to decipher meiotic mechanisms much further, by isolating a large number of novel genes and characterizing their functions in an integrated manner. To identify new meiotic functions, we will use innovative genetic approaches. The first work package is based on a new suppressor screen strategy, taking advantage of a unique and favourable situation in Arabidopsis. The second is an unprecedented screen that exploits the fact that we can now synthesize haploids in a higher eukaryote. The third work package aims to fully exploit the available transcriptome data. In the fourth work package we will use these new genes to decipher the meiotic mechanisms deeply and in an integrated manner.
Max ERC Funding
1 492 663 €
Duration
Start date: 2012-01-01, End date: 2016-12-31
Project acronym MEMCAD
Project Memory Compositional Abstract Domains:
Certification of Memory Intensive Critical Softwares
Researcher (PI) Xavier Philippe Rival
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE EN INFORMATIQUE ET AUTOMATIQUE
Call Details Starting Grant (StG), PE6, ERC-2011-StG_20101014
Summary Every year, software bugs cost hundreds of millions of euros to companies and administrations. A number of disasters, such as the Ariane 5 first flight failure, are due to faulty software. Static analysis aims at automatically computing properties of software, so as to prove that programs are exempt from some class of bugs. In the last ten years, static analysis of numeric-intensive applications improved dramatically, so that the certification of safety properties like the absence of runtime errors in industrial-size control-command, numeric-intensive applications, such as Airbus fly-by-wire software, is now feasible.
By contrast, the situation is much worse for memory-intensive software. Existing static analyzers for such software do not scale to large code bases, and fail to prove strong invariants on large classes of programs. These limitations stem from the fact that they use a monolithic algebra of logical formulas (or abstract domain).
Our proposal is based on the observation that the complex memory properties that need to be reasoned about should be decomposed into combinations of simpler properties. Therefore, in static analysis, a powerful memory abstract domain could be designed by combining several simpler domains, each specific to a common memory usage pattern. The benefit of this novel vision is twofold: first, it would make it possible to drastically simplify the design of the complex abstract domains required to reason about complex software, thereby allowing the certification of complex memory-intensive software by automatic static analysis; second, it would make it possible to break down and better control the cost of the analyses, thus significantly helping scalability.
This shift of focus will bring both theoretical and practical improvements to the program certification field. We propose to build a static analysis framework for reasoning about memory properties, and to put it to work on important classes of applications, including large safety-critical, memory-intensive software.
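A minimal sketch of the combination idea (an assumed toy example, not MEMCAD's actual domains): two very simple abstract domains, intervals and parity, are composed into a product domain whose elements track both kinds of information at once.

```python
# Toy sketch (assumed example, not MEMCAD's actual domains): two simple abstract
# domains -- intervals and parity -- combined into a product domain, illustrating
# how a more precise abstraction is built from simpler, specialised components.
from dataclasses import dataclass

@dataclass(frozen=True)
class Interval:
    lo: int
    hi: int
    def join(self, other):                 # least upper bound of two intervals
        return Interval(min(self.lo, other.lo), max(self.hi, other.hi))

@dataclass(frozen=True)
class Parity:
    value: str                             # "even", "odd" or "top" (unknown)
    def join(self, other):
        return self if self == other else Parity("top")

@dataclass(frozen=True)
class Product:
    interval: Interval
    parity: Parity
    def join(self, other):                 # component-wise join
        return Product(self.interval.join(other.interval), self.parity.join(other.parity))

def alpha(n):                              # abstraction of a single concrete value
    return Product(Interval(n, n), Parity("even" if n % 2 == 0 else "odd"))

# Abstracting the set {2, 4, 8}: the product keeps both "in [2, 8]" and "even".
state = alpha(2).join(alpha(4)).join(alpha(8))
print(state)
```

The memory abstractions targeted by the project are of course far richer, but the compositional principle sketched here is the same: each component domain stays simple, and precision comes from their combination.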
Max ERC Funding
1 489 663 €
Duration
Start date: 2011-10-01, End date: 2017-09-30
Project acronym Micromecca
Project Molecular mechanisms underlying plant miRNA action
Researcher (PI) Anders Peter Brodersen
Host Institution (HI) KOBENHAVNS UNIVERSITET
Call Details Starting Grant (StG), LS2, ERC-2011-StG_20101109
Summary MicroRNAs (miRNAs) are 20-22 nt non-coding RNAs that regulate gene expression post-transcriptionally via base pairing to complementary target mRNAs. They have fundamental importance for development and stress adaptation in plants and animals. Although a molecular framework for miRNA biogenesis, degradation and action has been established, many aspects of this important gene regulatory pathway remain unknown. This project explores four main points. First, we propose to use genetic approaches to identify factors required for translational repression by miRNAs in plants. This mode of action was until recently thought to occur only exceptionally in plants. My postdoctoral work showed that it occurs in many miRNA-target interactions. The mechanism remains unknown, however, leaving open a fertile area of investigation. Second, we wish to test specific hypotheses regarding the in vivo role of miRNA-mediated endonucleolysis of mRNA targets. Although this process has long been believed to serve exclusively as a degradation mechanism, we propose to test whether it could have important functions in the biogenesis of long non-coding RNAs derived from mRNAs.
Third, my postdoctoral work has provided unique material for using molecular genetics to explore the pathways responsible for miRNA degradation, an aspect of miRNA biology that is only now emerging as being of major importance. Finally, our unpublished results show that plant miRNAs and their associated effector protein Argonaute (AGO) are associated with membranes and that membrane association is crucial for function. This is in line with similar data recently obtained from different animal systems. We propose to use genetic, biochemical and cell biological approaches to clarify with which membrane compartment AGO and miRNAs are associated, how they are recruited to this compartment, and what the precise function of membrane association is.
These innovative approaches promise to give fundamental new insights into the inner workings of the pathway.
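As a small illustration of the base-pairing rule mentioned in the opening sentence, here is a sketch (an assumed example with a hypothetical sequence, not the project's analysis) that scores a candidate target site by counting mismatches against the reverse complement of a miRNA.

```python
# Minimal sketch (hypothetical sequence, not the project's analysis): scoring the base
# pairing between a miRNA and a candidate target site by counting mismatches against
# the reverse complement, reflecting the near-perfect complementarity typical of plants.
COMPLEMENT = {"A": "U", "U": "A", "G": "C", "C": "G"}

def reverse_complement(rna):
    return "".join(COMPLEMENT[base] for base in reversed(rna))

def mismatches(mirna, target_site):
    """Count positions where the target site deviates from perfect pairing with the miRNA."""
    expected = reverse_complement(mirna)
    return sum(1 for a, b in zip(expected, target_site) if a != b)

# Hypothetical 21-nt miRNA, used only for illustration.
mirna = "UGACAGAAGAGAGUGAGCACA"
perfect_site = reverse_complement(mirna)
mutated_site = perfect_site[:10] + ("G" if perfect_site[10] != "G" else "C") + perfect_site[11:]

print("perfect site mismatches:", mismatches(mirna, perfect_site))   # 0
print("mutated site mismatches:", mismatches(mirna, mutated_site))   # 1
```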
Max ERC Funding
1 459 011 €
Duration
Start date: 2012-01-01, End date: 2016-12-31
Project acronym MOLSTRUCTTRANSFO
Project Molecular and Structural Biology of Bacterial Transformation
Researcher (PI) Rémi Fronzes
Host Institution (HI) INSTITUT PASTEUR
Call Details Starting Grant (StG), LS1, ERC-2011-StG_20101109
Summary A common form of gene transfer is the vertical transfer of genes between an organism and its offspring during sexual reproduction. However, some organisms, such as bacteria, are able to acquire genetic material independently of sexual reproduction by horizontal gene transfer (HGT). Three mechanisms mediate HGT in bacteria: conjugation, transduction and natural transformation. HGT and the selective pressure exerted by the widespread use of antibiotics (in medicine, veterinary medicine, agriculture, animal feeding, etc.) are responsible for the rapid spread of antibiotic resistance genes among pathogenic bacteria.
In this proposal, we focus on bacterial transformation systems, also named competence systems. Natural transformation is the acquisition of naked DNA from the extracellular milieu. It is the only programmed process for generalized genetic exchange found in bacteria. This highly efficient and regulated process promotes bacterial genome plasticity and the adaptive response of bacteria to changes in their environment. It is essential for bacterial survival and/or virulence and greatly limits the efficiency of treatments or vaccines against some pathogenic bacteria.
The architecture and functioning of the membrane protein complexes mediating DNA transfer through the cell envelope during bacterial transformation remain elusive. We want to decipher the molecular mechanism of this transfer. To attain this goal, we will carry out structural biology studies (X-ray crystallography and high resolution electron microscopy) as well as functional and structure-function in vivo studies. We have the ambition to make major contributions to the understanding of bacterial transformation. Ultimately, we hope that our results will also help to find compounds that could block natural transformation in bacterial pathogens.
Max ERC Funding
1 405 149 €
Duration
Start date: 2012-01-01, End date: 2016-12-31
Project acronym MottMetals
Project Quantitative approaches for strongly correlated quantum systems in equilibrium and far from equilibrium
Researcher (PI) Olivier Paul Emile Parcollet
Host Institution (HI) COMMISSARIAT A L ENERGIE ATOMIQUE ET AUX ENERGIES ALTERNATIVES
Call Details Starting Grant (StG), PE3, ERC-2011-StG_20101014
Summary Understanding electronic correlations remains one of the most important challenges in theoretical condensed matter physics. The interaction-induced metal-to-insulator Mott transition plays a major role in many transition metal oxides, f-electron materials and, now, in quantum optics. Upon doping or application of a strong electric field, strongly correlated Mott metals emerge from the Mott insulators, with fascinating properties. Moreover, the out-of-equilibrium behaviour of these systems is only beginning to be systematically explored experimentally. While these systems strongly challenge the standard concepts and methods of quantum many-body theory, a new era is progressively unfolding, in which quantitative and detailed comparisons between theory and experiments are becoming possible in strong correlation regimes, even out of equilibrium.
The goal of this proposal is to construct, in close contact with experiments and phenomenology, a new generation of theoretical methods and algorithms in order to i) study the new states of matter induced by non-equilibrium phenomena in strongly correlated quantum systems, first in simple models and then in realistic computations for real materials, and ii) elucidate the mystery of high-temperature superconductivity. Open-source implementations of the methods and algorithms developed during this project will also be provided for better knowledge diffusion.
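For readers unfamiliar with the Mott physics invoked above, a minimal sketch (a textbook toy model, not one of the project's codes): exact diagonalisation of the two-site Hubbard model at half filling, the smallest problem in which the competition between hopping t and on-site repulsion U can be solved exactly.

```python
# Minimal sketch (textbook toy model, not one of the project's codes): exact
# diagonalisation of the two-site Hubbard model at half filling in the Sz = 0
# sector, the smallest problem where hopping t competes with on-site repulsion U.
import numpy as np

t, U = 1.0, 8.0
# Basis: |ud, 0>, |0, ud>, |u, d>, |d, u> (signs fixed by a convenient phase convention).
H = np.array([
    [U,   0.0, -t,   -t  ],
    [0.0, U,   -t,   -t  ],
    [-t,  -t,   0.0,  0.0],
    [-t,  -t,   0.0,  0.0],
])

energies = np.linalg.eigvalsh(H)
analytic_ground_state = 0.5 * (U - np.sqrt(U**2 + 16 * t**2))
print("spectrum              :", np.round(energies, 4))
print("analytic ground state :", round(analytic_ground_state, 4))
```

Already in this toy problem the large-U limit suppresses double occupancy and leaves an effective superexchange scale of order 4t^2/U, the kind of correlation effect that the project's methods aim to capture quantitatively in realistic, and eventually non-equilibrium, settings.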
Max ERC Funding
1 130 800 €
Duration
Start date: 2012-01-01, End date: 2017-12-31
Project acronym MULTICELL
Project Microfluidic multiplexed cell chips
Researcher (PI) Charles Baroud
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE8, ERC-2011-StG_20101014
Summary There exist very few techniques for studying groups of cells that are large compared to a single cell but small compared to a whole tissue. This implies that statistics are exceedingly difficult to obtain from measurements of individual cells. Microfluidics provides a way to remedy this by allowing individual cells to be observed and such measurements to be automated. The aim of this project is to develop cell manipulation platforms based on microfluidic techniques developed in our lab, while answering relevant biological questions.
The first question concerns sickle cell anemia, a genetic disease for which no treatment exists. We will study the polymerization of hemoglobin within red blood cells as they are subjected to cycles of oxygenation and deoxygenation. Quantitative measurements of the response of the cells to oxygen variations will allow physiological conditions to be simulated, including in the presence of therapeutic candidates or other biological agents.
The second question concerns the motility of adherent cells in a three-dimensional environment. The aim here will be to understand the migration of cells in a 3D gradient of chemo-attractant, as well as in gradients of rigidity of the environment. This part will require the development of new technological tools, which can later be applied to a wide range of biological problems. The long-term aim is to replace the current tools of biology labs with miniaturized and integrated lab-on-a-chip devices.
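To indicate the kind of quantitative oxygen-response modelling involved in the first question, a minimal sketch (assumed parameters and protocol, not the project's experiment): hemoglobin oxygen saturation along imposed oxygenation/deoxygenation cycles, computed with the empirical Hill equation.

```python
# Minimal sketch (assumed parameters, not the project's protocol): haemoglobin oxygen
# saturation along imposed oxygenation/deoxygenation cycles, using the empirical Hill
# equation with typical adult values (P50 ~ 26 mmHg, Hill coefficient ~ 2.7).
import numpy as np

P50, N_HILL = 26.0, 2.7

def saturation(p_o2):
    """Fraction of oxygenated haemoglobin at oxygen partial pressure p_o2 (mmHg)."""
    return p_o2**N_HILL / (p_o2**N_HILL + P50**N_HILL)

# Square-wave cycling between oxygenated (100 mmHg) and deoxygenated (10 mmHg) plateaus.
time_s = np.arange(0.0, 600.0, 1.0)                       # ten minutes, 1 s resolution
p_o2 = np.where((time_s // 60) % 2 == 0, 100.0, 10.0)     # 60 s half-periods
sat = saturation(p_o2)
print("saturation range over the cycles:", round(sat.min(), 3), "-", round(sat.max(), 3))
```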
Max ERC Funding
1 494 744 €
Duration
Start date: 2012-02-01, End date: 2017-01-31
Project acronym NDOGS
Project Nuclear Dynamic, Organization and Genome Stability
Researcher (PI) Karine Marie Renée Dubrana
Host Institution (HI) COMMISSARIAT A L ENERGIE ATOMIQUE ET AUX ENERGIES ALTERNATIVES
Call Details Starting Grant (StG), LS1, ERC-2011-StG_20101109
Summary The eukaryotic genome is packaged into large-scale chromatin structures occupying distinct domains in the cell nucleus. Nuclear compartmentalization has recently been proposed to play an important role in genome stability, but the molecular steps regulated remain to be defined. Focusing on double-strand breaks (DSBs), in response to which cells activate checkpoint and DNA repair pathways, we propose to characterize the spatial and temporal behavior of damaged chromatin and to determine how this affects the maintenance of genome integrity. Currently, most studies concerning DSB signaling and repair have been carried out on asynchronous cell populations, which makes it difficult to precisely define the kinetics of events that occur at the cellular level. We thus propose to follow the nuclear localization and dynamics of an inducible DSB concomitantly with the kinetics of checkpoint activation and DNA repair, at the single-cell level and along the cell cycle. This will be performed using budding yeast as a model system, enabling the combination of genetics, molecular biology and advanced live microscopy. We recently demonstrated that DSBs relocate to the nuclear periphery, where they contact nuclear pores. This change in localization possibly regulates the choice of the repair pathway through steps that are controlled by post-translational modifications. This proposal aims at dissecting the molecular pathways defining the position of DSBs in the nucleus by performing genetic and proteomic screens, testing the functional consequences of nuclear position for checkpoint activation and DNA repair by driving the DSB to specific nuclear landmarks, and defining the dynamics of DNA damage in different repair contexts. Our project will identify new players in the DNA repair and checkpoint pathways and will further our understanding of how the compartmentalization of damaged chromatin in the nucleus regulates these processes to ensure the transmission of a stable genome.
Max ERC Funding
1 499 863 €
Duration
Start date: 2012-02-01, End date: 2018-01-31
Project acronym NEUROFEAR
Project Neuronal circuits controlling fear behavior
Researcher (PI) Cyril Herry
Host Institution (HI) INSTITUT NATIONAL DE LA SANTE ET DE LA RECHERCHE MEDICALE
Call Details Starting Grant (StG), LS5, ERC-2011-StG_20101109
Summary Accurate adaptation to stimuli predicting a threatening outcome is critical to survival. An insufficient fear reaction may lead the animal to overlook future signs of danger, whereas overreacting may lead it to fail to explore and to miss opportunities for feeding or mating. Numerous data indicate that the medial prefrontal cortex (mPFC) plays a key role in the control of fear behavior and that distinct prefrontal areas differentially regulate the expression/inhibition of fear responses. Whereas lesions/inactivations of the mPFC infralimbic (IL) area promote fear expression, lesions/inactivations of the mPFC prelimbic (PL) area promote fear inhibition. Moreover, PL and IL receive segregated inputs from functionally distinct amygdala circuits activated during high and low fear states. These data suggest that a key function of mPFC circuits might be to integrate inputs from the amygdala to ultimately gate fear expression via projections to specific neuronal circuits. However, little is known about the underlying neuronal circuits. Is the rapid switch between expression and suppression of fear behaviors mediated by the same circuits, or does the mPFC contain distinct circuits dedicated to the control of opposite behaviors? Is there an organization in terms of afferents and efferents at the level of mPFC neuronal circuits? To address these questions we will use a cross-level approach combining in vivo electrophysiological, optogenetic and behavioral approaches to elucidate the anatomical and physiological properties of mPFC circuits and to address their functional role in the control of fear behavior. We will first examine the activation and connectivity of mPFC circuits using in vivo extracellular recordings and extracellular stimulations. We will next selectively manipulate these circuits during behavior using light-activated proteins to establish causal relationships. Finally, we will study their plasticity and anatomical properties using in vivo intracellular recordings.
Max ERC Funding
1 496 300 €
Duration
Start date: 2011-11-01, End date: 2016-10-31
Project acronym NEWDARK
Project New Directions in Dark Matter Phenomenology at the TeV scale
Researcher (PI) Marco Cirelli
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE2, ERC-2011-StG_20101014
Summary Dark Matter constitutes about 80% of the total matter of the Universe, yet almost nothing is known of its nature: despite the huge experimental and theoretical efforts of the last decades, its true identity remains to be determined. In recent years, and in the few years to come, however, several experimental techniques are approaching the TeV scale for the first time, in a multi-faceted attack on the problem: the Large Hadron Collider at CERN in particle physics, the PAMELA and AMS-02 satellites in charged cosmic-ray astronomy and the FERMI telescope in gamma-ray astronomy. Since general theoretical arguments suggest that Dark Matter is a particle inherently related to the TeV scale, we may finally be close to the physics that holds the key to the puzzle.
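One standard argument behind statements of this kind is the thermal-relic ("WIMP miracle") estimate. The rough order-of-magnitude sketch below, with assumed round numbers, reproduces it; it is illustrative background only, not part of the proposal:

```python
# Order-of-magnitude "WIMP miracle" estimate (assumed round numbers).
# A thermal relic matches the observed dark-matter abundance for
# <sigma v> ~ 3e-26 cm^3/s; a weak-scale cross section pi*alpha_w^2/M^2
# then points to a mass M near the TeV scale.
import math

SIGMA_V_TARGET = 3e-26          # cm^3/s, canonical thermal-relic annihilation rate
ALPHA_W = 0.03                  # weak coupling strength (assumed)
GEV2_TO_CM3_PER_S = 1.17e-17    # 1 GeV^-2, times v ~ c, expressed in cm^3/s

# Solve pi * alpha_w^2 / M^2 * conversion = <sigma v>_target for M (in GeV)
M_GeV = math.sqrt(math.pi * ALPHA_W**2 * GEV2_TO_CM3_PER_S / SIGMA_V_TARGET)
print(f"Estimated WIMP mass scale: ~{M_GeV:.0f} GeV")   # roughly 1 TeV
```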
The NewDark project aims at exploring selected new directions in Dark Matter phenomenology, in a multi-disciplinary approach that has its roots in theoretical particle physics and cosmology but constantly looks at astrophysical observations and experimental particle physics results, making the most of the bi-directional interactions. The ultimate goal of the project, as part of the effort at the global scale, is the identification of the nature of the Dark Matter and the exploration of its full phenomenology.
The project is organized around five main themes of Dark Matter research: theory model building, collider signatures, direct detection, indirect detection and astrophysical/cosmological implications. For each one of these, some selected groundbreaking objectives are identified. The emphasis is on new, non-traditional directions, building on the experience gained by the community in studying more traditional avenues and applying it to the new scenarios.
The project requires funds to build up a small but structured multi-disciplinary research team (hiring 4 young post-docs with diverse expertise) and allow it to work on this frontier of astroparticle physics.
Max ERC Funding
1 462 200 €
Duration
Start date: 2012-10-01, End date: 2018-09-30
Project acronym NEXTPHASE
Project NEXT generation of microwave PHotonic systems for AeroSpace Engineering
Researcher (PI) Yanne Chembo Kouomou
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE7, ERC-2011-StG_20101014
Summary Aerospace and communication engineering technologies are in constant need of microwaves with extremely high spectral purity and stability. Unfortunately, the generation of such ultra-pure microwaves with compact, versatile and transportable sources remains a very complex challenge. In aerospace engineering, ultra-stable quartz oscillators are overwhelmingly dominant as key components for both navigation and detection systems. However, it is unanimously recognized today that their frequency stability has reached its floor and will not improve significantly anymore. In the search for an alternative standard for the next generation of ultra-pure microwave sources in aerospace technology, we propose to explore an elegant and promising solution relying on optical resonators with ultra-high Q factors (Q ~ 1E10). In these quasi-perfectly shaped cavities, nonlinear effects are significantly enhanced and microwave generation is performed through the extraction of the intermodal frequency. This approach has several advantages over existing or other prospective methods: conceptual simplicity, higher robustness, smaller power consumption, longer lifetime, immunity to interference, very compact volume, frequency versatility, easy chip integration, as well as a strong potential for integration with mainstream photonic components for both microwave and lightwave technologies. Our ambition in the NextPhase project is to significantly outperform quartz oscillators and demonstrate performances comparable to cryogenic sapphire oscillators, with a compact (< 100 cm3), versatile (up to at least 200 GHz) and ultra-stable (Allan variance ~ 1E-15 at 1 s; phase noise floor < -160 dBc/Hz) microwave photonic generator. We also expect our work to open new opportunities for research in optical communications (photonic components for all-optical processing, carrier synthesis), as well as in fundamental aspects of condensed matter and quantum physics.
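For orientation, the intermodal (beat) frequency and the photon lifetime of such a resonator follow from textbook relations. The sketch below evaluates them for assumed, illustrative parameters (a millimetre-scale resonator at a telecom wavelength), not for figures taken from the proposal:

```python
# Illustrative relations for an ultra-high-Q optical resonator (assumed values).
# Free spectral range (intermodal spacing): FSR = c / (n * L), with L = 2*pi*R
# Photon lifetime: tau = Q / (2*pi*nu), with nu the optical carrier frequency
import math

c = 299_792_458.0        # speed of light, m/s
n = 1.45                 # refractive index (assumed, silica-like)
R = 2.0e-3               # resonator radius, m (assumed)
Q = 1e10                 # quality factor, as targeted in the text
wavelength = 1.55e-6     # optical wavelength, m (assumed telecom band)

FSR = c / (n * 2 * math.pi * R)     # intermodal frequency, Hz
nu = c / wavelength                 # optical carrier frequency, Hz
tau = Q / (2 * math.pi * nu)        # cavity photon lifetime, s

print(f"FSR ~ {FSR / 1e9:.1f} GHz")            # ~16 GHz for these assumed values
print(f"Photon lifetime ~ {tau * 1e6:.1f} us") # ~8 us for Q = 1e10
```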
Max ERC Funding
1 384 628 €
Duration
Start date: 2011-11-01, End date: 2016-10-31
Project acronym NPRGGLASS
Project Non Perturbative Renormalization Group Theory of Glassy Systems
Researcher (PI) Giulio Biroli
Host Institution (HI) COMMISSARIAT A L ENERGIE ATOMIQUE ET AUX ENERGIES ALTERNATIVES
Call Details Starting Grant (StG), PE2, ERC-2011-StG_20101014
Summary "Glassy systems are central in several fields from statistical mechanics and soft matter to material sciences and biophysics and they appear even in completely different areas of science such as information theory, computer science, agent-based models and game theory.
The aim of this project is to develop a new, possibly groundbreaking, approach to glassy systems based on the non-perturbative renormalization group (NPRG) formalism. Modern theoretical approaches to glassy systems suffer from severe limitations; it is not clear whether and how they can be improved, and their current status is far from providing a coherent and satisfactory theory. For reasons detailed below, I believe that the NPRG approach is the long-sought theoretical framework to tackle the glass problem and that it will eventually lead to its solution. I will focus on the problem of the glass transition and the physics of glass-forming liquids. I expect that the progress we will make in this direction will also be instrumental for other glassy systems such as spin glasses, quantum glasses and jamming systems."
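For readers unfamiliar with the formalism, the central object of the NPRG is the exact flow equation for the scale-dependent effective action Γ_k (the Wetterich equation), quoted here only as standard background and not as a claim specific to this proposal:

\[
\partial_k \Gamma_k[\phi] \;=\; \tfrac{1}{2}\,\mathrm{Tr}\!\left[\bigl(\Gamma_k^{(2)}[\phi] + R_k\bigr)^{-1}\,\partial_k R_k\right],
\]

where R_k is an infrared regulator suppressing fluctuations below the scale k and \(\Gamma_k^{(2)}\) is the second functional derivative of \(\Gamma_k\) with respect to the field.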
Max ERC Funding
1 010 800 €
Duration
Start date: 2011-11-01, End date: 2017-10-31
Project acronym NUCHLEAR
Project Disarming bacterial weapons in the nucleus: functional study of Chlamydia nuclear effectors
Researcher (PI) Agathe Subtil
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), LS6, ERC-2011-StG_20101109
Summary The high prevalence of Chlamydia infections and the heavy burden they inflict on public health justify the search for novel therapeutic approaches directed against this pathogen. Research on this obligate intracellular bacterium is very difficult, due to substantial technical impediments. One current challenge is to identify bacterial proteins that are required for Chlamydia to survive and proliferate in the host and that could serve as targets in novel therapeutic strategies. In particular, the identification of bacterial proteins implicated in the microbe’s ability to persist in host cells is highly desirable, since current treatments fail to eradicate the resulting chronic infections.
Our project is focused on the bacterial proteins that are secreted into the host cell during infection and translocate into the nucleus. Our preliminary work has already identified some of these “nuclear effectors” of Chlamydia. Targeting the “central system” of the host, these proteins are likely essential for infection. We will (i) identify all Chlamydia trachomatis nuclear effectors and define their frame of action (in time and space) during infection, (ii) identify the targets of the nuclear effectors and their roles in infection and (iii) test the hypothesis that nuclear effectors are necessary for the entry into and/or maintenance of the persistent state of infection.
Addressing for the first time the full repertoire of “nuclear weapons” of an intracellular bacterium, we will uncover new interactions between the pathogen and the host. This work will lead to the development of rationally-designed drugs that inhibit the activity of the nuclear effectors, thereby disrupting the microbe’s ability to survive in the host. Beyond this medical aim, our study, which lies at the interface between microbiology, cell biology and genome biology, will provide new angles of study to each of these three disciplines and improve our understanding of fundamental cellular processes.
Max ERC Funding
1 491 660 €
Duration
Start date: 2012-07-01, End date: 2018-06-30
Project acronym PLEASE
Project "PLEASE: Projections, Learning, and Sparsity for Efficient data-processing"
Researcher (PI) Remi Gribonval
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE ENINFORMATIQUE ET AUTOMATIQUE
Call Details Starting Grant (StG), PE6, ERC-2011-StG_20101014
Summary "Sparse models are at the core of many research domains where the large amount and high-dimensionality of digital data requires concise data descriptions for efficient information processing. A flagship application of sparsity is compressed sensing, which exploits sparsity for data acquisition using limited resources. Besides sparsity, a key pillar of compressed sensing is the use of random low-dimensional projections.
The standard principle of general sparse and redundant representations is to rely on overcomplete dictionaries of prototype signals called atoms. The foundational vision of this proposal is that the efficient deployment of sparse models for large-scale data is only possible if supported by a new generation of efficient sparse models, beyond dictionaries, which must encompass computational efficiency as well as the ability to provide sparse and structured data representations.
Further, I believe that the true impact of compressed sensing has been to unearth an extremely powerful yet counter-intuitive tool: random projections, which open new avenues in machine learning. I envision applications to data sizes and volumes of collections that cannot be handled by today’s technologies.
A particular challenge is to adapt the models to the data by learning from a training corpus. In line with the frontier research on sparse decomposition algorithms, I will focus on obtaining provably good, yet computationally efficient algorithms for learning sparse models from collections of training data, with a geometric insight on the reasons for their success.
My research program is expected to impact the whole data processing chain, from the analog level (data acquisition) to high level processing (mining, searching), where sparsity has been identified as a key factor to address the “curse of dimensionality”. Moreover, the theoretical and algorithmic framework I will develop will be directly applied to targeted audiovisual and biomedical applications."
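To illustrate the random projections evoked above, the following minimal sketch (toy dimensions chosen arbitrarily, not taken from the proposal) checks that a Gaussian random projection approximately preserves pairwise distances, the property that compressed sensing and sketching-based learning exploit:

```python
# Minimal sketch (toy dimensions): a Gaussian random projection approximately
# preserves pairwise distances between points (Johnson-Lindenstrauss flavour).
import numpy as np

rng = np.random.default_rng(0)
n, d, k = 40, 5_000, 300            # n points in dimension d, projected to k << d

X = rng.standard_normal((n, d))     # toy data standing in for signals/features
P = rng.standard_normal((k, d)) / np.sqrt(k)   # random projection matrix
Y = X @ P.T                         # k-dimensional sketches of the n points

def pairwise_dists(Z):
    diff = Z[:, None, :] - Z[None, :, :]
    return np.sqrt((diff ** 2).sum(axis=-1))

mask = ~np.eye(n, dtype=bool)
ratio = pairwise_dists(Y)[mask] / pairwise_dists(X)[mask]
print(f"distance distortion range: {ratio.min():.2f} .. {ratio.max():.2f}")
# Ratios concentrate near 1, i.e. distances are approximately preserved.
```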
Max ERC Funding
1 493 537 €
Duration
Start date: 2012-01-01, End date: 2016-12-31
Project acronym QD-CQED
Project A quantum dot in a cavity: A solid state platform for quantum operations
Researcher (PI) Pascale Francoise Senellart
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE3, ERC-2011-StG_20101014
Summary "A quantum dot (QD) in a microcavity is an ideal single spin-single photon interface: the spin of a carrier trapped inside a QD can be used as a quantum bit and the coupling to photons can allow remote spin entanglement. A QD in a cavity can also generate single photons or entangled photon pairs, often referred to as flying quantum bit. Controlling the QD spontaneous emission is crucial to ensure optimal coupling of the photon and spin states. The present project relies on a unique and original technology we have developed which allows us to deterministically control the QD-cavity system. With this technique, we can fabricate a large number of identical coupled QD-cavity devices operating either in the weak or strong coupling regime. The potential of the technique has been proven by the fabrication of the brightest source of entangled photon pairs to date (Nature 2010).
The objective of the present project is to build up a platform for basic quantum operations using QDs in cavities. The first aim is to develop highly efficient light-emitting devices emitting indistinguishable single photons and entangled photon pairs. The mechanisms leading to quantum decoherence in QD-based sources will be investigated. We will also explore a new generation of devices where QDs are coupled to plasmonic nano-antennas. The second objective is to implement basic quantum operations ranging from entanglement purification to quantum teleportation using QD-based sources. The third objective of the project is to control the spin-photon interface. We first aim at demonstrating quantum non-demolition spin measurement through highly sensitive off-resonant Faraday rotation. We then aim at entangling two spins separated by macroscopic distances, using their controlled interaction with photons. This will be obtained either by making a single photon interact with two spins in cavities or by interfering indistinguishable photons emitted by two independent charged QDs."
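As background for the cavity-controlled spontaneous emission mentioned above, the sketch below evaluates the ideal Purcell factor for assumed, illustrative cavity parameters; none of these numbers are taken from the proposal:

```python
# Minimal sketch (assumed, illustrative numbers): the ideal Purcell factor,
# which quantifies cavity-enhanced spontaneous emission in the weak-coupling
# regime, F_P = (3 / (4*pi^2)) * (lambda/n)^3 * Q / V.
import math

wavelength = 0.93e-6            # QD emission wavelength, m (assumed, near-infrared)
n = 3.5                         # semiconductor refractive index (assumed, GaAs-like)
Q = 3000                        # cavity quality factor (assumed)
V = 5 * (wavelength / n) ** 3   # mode volume, here 5 cubic wavelengths (assumed)

F_p = (3 / (4 * math.pi ** 2)) * (wavelength / n) ** 3 * Q / V
print(f"Ideal Purcell factor F_P ~ {F_p:.0f}")   # ~45 for these assumed values
```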
Max ERC Funding
1 482 000 €
Duration
Start date: 2011-11-01, End date: 2016-10-31
Project acronym QUIET
Project Health consequences of noise exposure from road traffic
Researcher (PI) Mette Sørensen
Host Institution (HI) KRAEFTENS BEKAEMPELSE
Call Details Starting Grant (StG), LS7, ERC-2011-StG_20101109
Summary There is growing public concern about adverse effects of traffic noise on health, as research has found that traffic noise increases the risk of cardiovascular diseases. Noise is thought to act as a stressor and to disturb sleep. Though this could potentially increase the risk of other major diseases, noise effects on diseases other than cardiovascular disease are virtually unexplored.
The main objective of this project is to investigate whether long-term exposure to road traffic noise is detrimental to various health outcomes in susceptible groups, i.e. children and the elderly. Outcomes in children include low birth weight, infections and cognitive performance; in the elderly they include diabetes, cancer, cancer survival, health-related quality of life and health behaviour.
The basis of this proposal is two unique Danish cohorts of, respectively, 57,053 elderly and 101,042 children (a national birth cohort). Historic and present residential addresses for all cohort members will be obtained through linkage with the nationwide Central Population Registry, and exposure to road traffic noise and air pollution will be calculated by validated models at all addresses.
The health outcomes will be obtained from cohort interviews/questionnaires or found through linkage with unique, nationwide, population-based health registers, such as the Danish National Hospital Registry, the Diabetes Registry and the Cancer Registry.
Data will be analysed using a number of statistical analyses depending on design and the character of the endpoint variable. All analyses will be adjusted for potential confounders such as air pollution, smoking and education.
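As a concrete illustration of the kind of confounder-adjusted analysis described above, the minimal sketch below fits a Cox proportional hazards model to simulated data; the variable names, effect sizes and the use of the lifelines package are assumptions for illustration, not the project's actual analysis plan:

```python
# Minimal sketch (simulated data; variable names, effect sizes and the use of
# the `lifelines` package are illustrative assumptions): a Cox proportional
# hazards model for a time-to-event endpoint, adjusted for confounders.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 5_000
df = pd.DataFrame({
    "noise_db":     rng.normal(58, 6, n),      # road traffic noise at residence
    "no2":          rng.normal(20, 5, n),      # air pollution (confounder)
    "smoking":      rng.binomial(1, 0.3, n),   # confounder
    "education_yr": rng.normal(12, 3, n),      # confounder
})
# Simulated follow-up times with a mild effect of noise and smoking on the hazard.
lin_pred = 0.03 * (df["noise_db"] - 58) + 0.5 * df["smoking"]
df["time"] = rng.exponential(scale=10.0 / np.exp(lin_pred))
df["event"] = rng.binomial(1, 0.7, n)          # 1 = endpoint observed during follow-up

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")   # adjusts for all other columns
cph.print_summary()                                   # hazard ratios per covariate
```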
Within the EU, 30% of the population lives at locations where the 55dB WHO noise limit is exceeded. Knowledge of harmful effects of noise is, however, limited. The results of the proposed research have a high potential to influence the content and time schedule of noise action plans in the EU member states.
Max ERC Funding
1 334 890 €
Duration
Start date: 2012-03-01, End date: 2017-02-28
Project acronym REPODDID
Project Regulation of Polycomb Complex (PRC2) during development and in diseases
Researcher (PI) Raphaël, Florent Chaffrey Margueron
Host Institution (HI) INSTITUT CURIE
Call Details Starting Grant (StG), LS2, ERC-2011-StG_20101109
Summary Polycomb Group (PcG) proteins are pivotal for the specification and maintenance of cell identity by preventing inappropriate gene activation. They function mostly through the regulation of chromatin structure and, in particular, the post-translational modification of histones. Although the enzymatic activities of the main PcG complexes have been described, much remains to be discovered about how these chromatin-modifying activities are regulated. Genome-wide analyses have uncovered genes that are targeted by PcG proteins in various model cell lines; however, it is still very unclear how PcG proteins are targeted to a specific set of genes depending on the cell type. Finally, PcG proteins are frequently found deregulated in diseases, among which cancer, but whether the alteration of their expression is a causative event in these pathologies requires further investigation.
This proposal is focused on the Polycomb Repressive Complex 2 (PRC2), whose function is pivotal to the Polycomb machinery. In the first two aims of this proposal, we will investigate mechanistically how transcription factors, non-coding RNAs and chromatin structure might, independently or in conjunction, establish the conditions conducive to gene targeting by PRC2 and regulate its activity. In the third and fourth aims, we will investigate the function of PRC2 during tumorigenesis and cell reprogramming and how this function is regulated during these processes.
Max ERC Funding
1 499 815 €
Duration
Start date: 2012-02-01, End date: 2017-01-31
Project acronym RESPONSIVEGOV
Project Democratic Responsiveness in Comparative Perspective: How Do Democratic Governments Respond to Different Expressions of Public Opinion?
Researcher (PI) Laura Morales Diez De Ulzurrun
Host Institution (HI) FONDATION NATIONALE DES SCIENCES POLITIQUES
Call Details Starting Grant (StG), SH2, ERC-2011-StG_20101124
Summary To what extent are democratic governments responsive to citizens’ demands and preferences between elections? Are governments more likely to be responsive to the interpretation of public opinion through surveys or to collective and publicly expressed opinion, generally in the form of protests? When does one or the other type of expression prevail as a mechanism to foster governmental responsiveness? What happens when both forms of expression of the public mood are in clear contradiction? Are certain institutional and political configurations more likely to make governments more responsive to citizens’ views between elections? Are certain political configurations more conducive to governments paying attention to opinion polls, while others make them more receptive to collective action claims-making? This project will answer these questions by developing a comparative study of governmental responsiveness in established democracies between 1980 and 2010. To this purpose, we will discuss the relevant definitions of ‘governmental responsiveness’ and ‘public opinion’, and analyse data from various sources: (i) public opinion surveys, (ii) datasets with information on protest events, (iii) news reports on public moods, collective action, and governmental activity and decision-making, and (iv) comparative indicators on institutional attributes of democratic systems. In terms of the research strategy, the project will combine the analysis of a large number of cases (20 established democracies) with a more detailed study of a set of up to 7 cases. This study will provide a highly innovative approach to the representative link between citizens and governments by comparing the dynamics of democratic representation in decision-making junctures in the periods between elections, for which governments cannot invoke an electoral mandate, with the dynamics that emerge in ‘normal’ policy-making situations. The project lies at the intersection of political science and sociology.
Max ERC Funding
1 440 622 €
Duration
Start date: 2011-12-01, End date: 2018-02-28
Project acronym SIGHT
Project Systems Genetics of Heritable variaTions
Researcher (PI) Gael Yvert
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), LS2, ERC-2011-StG_20101109
Summary The complexity by which genotypes modulate phenotypic variation has been a major obstacle to understanding the basis of inter-individual differences. In the particular case of disease susceptibility, enormous efforts have been undertaken by large consortia of quantitative geneticists, and a recent wealth of results has shown both victories and frustrations. Victories because many genetic factors could successfully be linked to diabetes, heart failure, cancer, infectivity and many other common diseases. Frustrations because it is becoming more and more apparent that genetic dissections are far from complete, with many unsolved questions, especially regarding gene x environment interactions and incomplete penetrance. In this context, I propose to revisit the molecular basis of phenotypic diversity by addressing fundamental questions in a simple and powerful model organism: the yeast S. cerevisiae.
Combining experimental biology and bioinformatics into a ‘systems’ approach, I propose 1) To reconsider our current view of genetic determinism. By examining the effect of genetic variation on single cells, we will visualise how these variations shape the probability laws underlying phenotypic outcomes. This will prepare us for the upcoming era of generalized single-cell analysis. 2) To investigate how chromatin epigenotypes affect phenotypic variations. We will characterize nucleosomal epi-polymorphisms and study their impact on transcriptional and phenotypic responses to environmental changes. This will establish whether and how individual epigenomes should be considered when planning trait dissections.
This ambitious project is grounded on solid preliminary results and can be achieved thanks to my dual expertise in numerical science and experimental genetics. The questions addressed are fundamental for our understanding of living systems and the innovative methodology will help us integrate upcoming technologies into the construction of personalized medicine.
Max ERC Funding
1 499 660 €
Duration
Start date: 2012-02-01, End date: 2017-01-31
Project acronym SIGMA-VISION
Project Sparsity, Image and Geometry to Model Adaptively Visual Processings
Researcher (PI) Gabriel Louis-Jean Peyré
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE6, ERC-2011-StG_20101014
Summary SIGMA-Vision will develop the next generation of algorithms and methodologies for image processing. These algorithms will rely on several mathematical breakthroughs in image modeling: structured sparsity, geometric representations and adaptivity. They will be implemented using fast optimization codes that can handle massive datasets, including gigapixel images and videos. These algorithms will have far-reaching applications in computer vision, graphics and neuroscience. These cutting-edge mathematical approaches will go beyond traditional image processing scenarios and significantly impact object recognition, dynamical special effects and the exploration of the visual cortex.
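As a toy illustration of the sparsity principle underlying such image models, the sketch below keeps only the largest transform coefficients of a noisy 1-D signal; the DCT stands in for the geometric/wavelet representations mentioned above, and all sizes and values are assumptions for illustration:

```python
# Minimal sketch (assumed 1-D toy signal): sparse approximation by keeping the
# largest transform coefficients, the basic operation behind sparsity-based
# image models (a DCT stands in here for a wavelet/geometric representation).
import numpy as np
from scipy.fft import dct, idct

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 512)
signal = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 12 * t)
noisy = signal + 0.2 * rng.standard_normal(t.size)

coeffs = dct(noisy, norm="ortho")          # analysis in an orthonormal basis
k = 20                                     # keep only the k largest coefficients
thresh = np.sort(np.abs(coeffs))[-k]
sparse_coeffs = np.where(np.abs(coeffs) >= thresh, coeffs, 0.0)
approx = idct(sparse_coeffs, norm="ortho") # synthesis from the sparse coefficients

err = np.linalg.norm(approx - signal) / np.linalg.norm(signal)
print(f"relative error of the {k}-term approximation: {err:.3f}")
```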
Max ERC Funding
1 414 960 €
Duration
Start date: 2011-10-01, End date: 2016-09-30
Project acronym SILENCING & IMMUNITY
Project Small RNA-directed control of the plant and animal innate immune responses
Researcher (PI) Lionel Francois Navarro
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), LS6, ERC-2011-StG_20101109
Summary The innate immune response is the first line of defence against pathogens, which is initiated by the detection of pathogen-derived signatures referred to as Pathogen-Associated Molecular Patterns (PAMPs). Plants and animals sense PAMPs and in turn differentially regulate a large set of immune response genes, among which microRNAs (miRNAs) were recently identified. In the model plant Arabidopsis thaliana, dozens of miRNAs are PAMP-responsive; among them, we found that a conserved miRNA contributes to antibacterial resistance. More recently, we have reported a major role of the Arabidopsis miRNA pathway in antibacterial defence and, as a corollary, have identified a series of bacterial-derived suppressors of the miRNA pathway. This pioneering work represents an important contribution to the understanding of virulence strategies employed by pathogenic bacteria and suggests that analogous strategies may also be used by human pathogenic bacteria.
In the proposed project, we will first aim to investigate the extent to which the RNA silencing suppression strategy employed by a phytopathogenic bacterial effector is also used by effectors from human pathogenic bacteria. We will additionally aim to generate a comprehensive view of the small RNA repertoires produced in the course of bacterial infection, using both a human and a plant pathogenic bacterium. The second aspect of our proposal is directed toward a better understanding of the influence of PAMPs and bacterial effectors on Arabidopsis transcriptional gene silencing (TGS), a pathway that silences transposable elements and repeats through the establishment and maintenance of cytosine DNA methylation. Finally, we will aim to identify and characterize bacterial effectors that interfere with TGS. Overall, these studies should reveal completely novel bacterial virulence strategies and contribute to a better understanding of the mechanisms underlying post-transcriptional and transcriptional gene silencing in different organisms.
Max ERC Funding
1 499 955 €
Duration
Start date: 2011-12-01, End date: 2017-11-30
Project acronym SMAC
Project Statistical machine learning for complex biological data
Researcher (PI) Jean-Philippe Vert
Host Institution (HI) ASSOCIATION POUR LA RECHERCHE ET LE DEVELOPPEMENT DES METHODES ET PROCESSUS INDUSTRIELS
Call Details Starting Grant (StG), PE6, ERC-2011-StG_20101014
Summary This interdisciplinary project aims to develop new statistical and machine learning approaches to analyze high-dimensional, structured and heterogeneous biological data. We focus on cases where a relatively small number of samples are characterized by huge quantities of quantitative features, a common situation in large-scale genomic projects but a particularly challenging one for statistical inference. In order to overcome the curse of dimensionality, we propose to exploit the particular structures of the data and to encode prior biological knowledge in a unified, mathematically sound and computationally efficient framework. These methodological developments, both theoretical and practical, will be guided by and applied to the inference of predictive models and the detection of predictive factors for prognosis and drug response prediction in cancer.
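For illustration, the following minimal sketch (synthetic data, assumed sizes and regularization strength) fits an l1-penalized linear model in the n << p regime described above, where the sparsity-inducing penalty plays the role of prior structural knowledge:

```python
# Minimal sketch (synthetic data, assumed sizes and penalty): sparse linear
# model estimation when samples are few and features are many (n << p).
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(2)
n, p, k = 60, 2_000, 10                 # 60 samples, 2000 features, 10 truly active
X = rng.standard_normal((n, p))
true_coef = np.zeros(p)
true_coef[:k] = rng.uniform(1.0, 2.0, size=k)
y = X @ true_coef + 0.5 * rng.standard_normal(n)

model = Lasso(alpha=0.2, max_iter=10_000).fit(X, y)   # l1 penalty enforces sparsity
selected = np.flatnonzero(model.coef_)
recovered = np.intersect1d(selected, np.arange(k)).size
print(f"{selected.size} features selected, {recovered}/{k} true features recovered")
```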
Max ERC Funding
1 496 004 €
Duration
Start date: 2012-02-01, End date: 2018-01-31
Project acronym Starget-in-PANR
Project Determination of specific components from “stromal PDAC signature” involved in PDAC Associated Neural Remodeling (PANR) and their use as clinical tool-box
Researcher (PI) Richard Tomasini
Host Institution (HI) INSTITUT NATIONAL DE LA SANTE ET DE LA RECHERCHE MEDICALE
Call Details Starting Grant (StG), LS4, ERC-2011-StG_20101109
Summary Pancreatic ductal adenocarcinoma (PDAC) is the most intractable of human malignancies. The survival rate at 5 years is very low (less than 5%). Patients are most often diagnosed once the disease has already spread, and the benefit of surgical resection is often limited by local recurrence and the lack of efficient chemotherapy. It is therefore urgent to develop new tools to be used by clinicians in order to propose better management of the disease. This project is based on two specific characteristics of PDAC. First, the existence of a prominent tumor stroma compartment (desmoplasia) consisting of non-neoplastic myofibroblastic pancreatic stellate cells, vascular, nerve and immune cells surrounded by immense quantities of ECM, far exceeding that found in most other tumor types. Second, the very specific and almost unique PDAC-associated neural compartment remodeling (PANR), which correlates with the intense neuropathic pain observed in this disease. PANR consists of a modification of nerve fiber structure and density and the presence of tumoral cells within nerve fibers, a phenomenon called peri-neural invasion that is highly correlated with local recurrence of the primary PDAC tumor. We and others hypothesized that, by dialoguing with cancer cells, non-tumoral stromal cells impact PDAC tumor biology by shaping the tumor’s own structure and fostering tumor development. In line with this concept, we hypothesize that the intra-tumoral micro-environment plays an active role in neural compartment remodeling, its associated pain and local recurrence. By integrating preliminary and ongoing results from transcriptomic and proteomic analyses of human PDAC and of an endogenous mouse model developing PDAC, as well as multiple cell line co-culture studies, we aim to determine this “stromal PDAC signature” and the specific components involved in PANR. Such improvements could unravel novel diagnostic, prognostic and therapeutic options for this deadly malignancy.
Max ERC Funding
952 686 €
Duration
Start date: 2011-12-01, End date: 2016-11-30
Project acronym STRESSMETABOL
Project Stress signaling mechanisms in metabolism and inflammation and related disorders
Researcher (PI) Romeo Ricci
Host Institution (HI) CENTRE EUROPEEN DE RECHERCHE EN BIOLOGIE ET MEDECINE
Call Details Starting Grant (StG), LS4, ERC-2011-StG_20101109
Summary "Protein kinases represent key signal transduction components transferring signals to their effectors by phosphorylation. Among other important functions, MAPK-dependent signal transduction is principally required to control inflammation and metabolism in vertebrates. However, chronic MAPK signaling in response to environmental stress contributes to the development of metabolic and inflammatory diseases.
While most studies have investigated functions of p38α of the p38 MAPK family, we have recently identified the first non-redundant in vivo functions for p38δ. We found that p38δ regulates glucose homeostasis by controlling insulin secretion from pancreatic β cells and, more recently, that p38δ is pivotal in the regulation of neutrophil-mediated inflammation. At the molecular level, both functions depend on Protein Kinase D1 (PKD1) activity, which we identified as a direct target that is negatively regulated by p38δ. Overall, our recent work describes a new signaling axis that may be important in diabetes mellitus and inflammatory diseases, respectively.
The future core activity of my laboratory is directed towards elucidating the particular roles of p38δ and PKD1 downstream targets in neutrophils and β cells. We also aim to find common upstream signaling mechanisms that converge on p38δ-PKD1 signaling. Finally, we will explore newly discovered metabolic processes that are potentially regulated by this signaling module. This hypothesis-driven research will be complemented by a more comprehensive proteomic screening approach in liver, focusing on basic metabolic adaptation in response to fasting and feeding. While protein phosphorylation is widely explored in this context, we will globally screen for protein ubiquitination, which has recently emerged as a key event in cellular signaling.
Our work will hopefully lead to the discovery of novel and important cellular and molecular mechanisms in metabolism and inflammation with relevant implications in related human disorders."
Max ERC Funding
1 499 360 €
Duration
Start date: 2012-02-01, End date: 2017-01-31
Project acronym SYSTEMSDENDRITIC
Project Harnessing systems immunology to unravel dendritic cell subset biology
Researcher (PI) Marc Ives Dalod
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), LS6, ERC-2011-StG_20101109
Summary The development of better vaccines against cancer or intracellular pathogens is a major challenge of immunological research. Novel approaches are needed to induce efficient, long-lasting memory CD8 T cell (CTL) responses. The CD8α subset of mouse dendritic cells (mDC) is specialized in this function, excelling at antigen cross-presentation. Their targeting in vivo for vaccination has yielded encouraging results. How to translate this strategy to human health was not obvious, because no human DC (hDC) subset equivalent to CD8α mDC had been discovered. Investigating the relationships between mDC and hDC subsets was hampered by the use of different markers for their identification. To overcome this problem, we used comparative genomics to unravel potential equivalences between mDC and hDC subsets based on the similarities of their gene expression programs. This led us to demonstrate that BDCA3 hDC are professional cross-presenting DC equivalent to CD8α mDC. However, we still do not know the full extent of the physiological functions of different DC subsets, and we largely do not know how they are regulated. We propose a major and innovative effort to investigate in parallel, in mouse and human, the biology of two DC subsets thought to be important for antiviral and antitumoral defence: plasmacytoid DC and professional cross-presenting DC. We have designed a Systems Biology approach to uncover key functions and regulatory pathways conserved in these cells. We will investigate DC subset functions and their regulation in vivo during infectious challenges, using state-of-the-art technology to generate and analyse novel high-throughput data and innovative mutant mice. We will directly translate to human the knowledge obtained in the mouse, through comparative genomics and in vitro experiments on hDC. This approach is the first of its kind, at the forefront of creating new knowledge to understand the pivotal role of DC subsets in immunity and to manipulate them for promoting health.
Max ERC Funding
1 498 822 €
Duration
Start date: 2012-02-01, End date: 2017-07-31
Project acronym TDMET
Project Time-resolving electron dynamics in molecules by time-dependent many-electron theory
Researcher (PI) Lars Bojer Madsen
Host Institution (HI) AARHUS UNIVERSITET
Call Details Starting Grant (StG), PE2, ERC-2011-StG_20101014
Summary The interaction of atoms and molecules with new light sources such as attosecond and free-electron lasers is under strong current experimental investigation. Within the next few years, the interest will shift from relatively simple systems with a few atoms and electrons to bigger systems with many atoms and many electrons. The aim will be to study time-resolved dynamics and chemical reactions on the natural timescales of these processes. To fulfill this ambitious goal, there will be a strong need for new theory to guide the experiments and to analyze and understand the results. Currently there is no satisfactory theory in this research area that can treat more than the nonperturbative response of a single electron in a model potential. It is the purpose of the present project to develop such theory.
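For context, the central theoretical task can be framed as solving the time-dependent many-electron Schrödinger equation in a strong laser field; a schematic form (length gauge, dipole approximation, and not the project's specific formulation) is
\[
i\hbar\,\partial_t \Psi(\mathbf{r}_1,\dots,\mathbf{r}_N,t)
= \Bigg[\sum_{i=1}^{N}\Big(-\tfrac{\hbar^2}{2m}\nabla_i^2 + V_{\mathrm{nuc}}(\mathbf{r}_i) + e\,\mathbf{E}(t)\cdot\mathbf{r}_i\Big)
+ \sum_{i<j}\frac{e^2}{4\pi\varepsilon_0\,|\mathbf{r}_i-\mathbf{r}_j|}\Bigg]\Psi .
\]
The electron-electron interaction term is what places the problem beyond single-active-electron models and motivates the development of time-dependent many-electron theory.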
Max ERC Funding
1 330 305 €
Duration
Start date: 2011-12-01, End date: 2016-11-30
Project acronym TEEMBIO
Project Toward Eco-Evolutionary Models for BIODiversity Scenarios
Researcher (PI) Wilfried Thuiller
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), LS8, ERC-2011-StG_20101109
Summary Given the contemporary biodiversity crisis, effective conservation strategies that offset the threats to ecosystem integrity are crucial for maintaining biodiversity. The development of sound biodiversity scenarios is thus a major challenge for the scientific community. However, current biodiversity models rarely incorporate recent advances in ecological and evolutionary theory, such as: (i) how evolution shapes species’ niches and ranges; (ii) how community assembly rules shape species' ranges; and (iii) how these two processes interact to drive the response of populations and communities to environmental changes. Since the processes considered act at opposite ends of an organisational hierarchy, they have rarely been combined, and no model integrating all of them yet exists. Bridging the gap between local processes and macroecological species-range dynamics requires building upon theoretical and empirical approaches from evolutionary ecology and community ecology, extracting the processes relevant for higher-scale dynamics, and accounting for their interactions to generate biodiversity scenarios and associated services.
The key idea of the proposed project TEEMBIO is thus to fill this gap through four interrelated research axes:
1- Improve our understanding of how evolution shapes species ranges at micro- and macro-evolutionary scales.
2- Improve our understanding of how community assembly rules shape biodiversity and species ranges.
3- Develop, analyze and parameterize comprehensive projection tools (EEM-models) that incorporate both evolutionary dynamics and community assembly rules to predict global change impacts on biodiversity.
4- Develop a set of quantitative scenarios of plant biodiversity and associated ecosystem services using two case studies (forests in the European Alps and grasslands in the French Alps), with a comprehensive assessment of protected area networks.
Max ERC Funding
1 482 705 €
Duration
Start date: 2012-01-01, End date: 2016-12-31
Project acronym TeraGaN
Project GaN Quantum Devices for T-Ray Sources
Researcher (PI) Eva Maria Monroy Fernandez
Host Institution (HI) COMMISSARIAT A L ENERGIE ATOMIQUE ET AUX ENERGIES ALTERNATIVES
Call Details Starting Grant (StG), PE7, ERC-2011-StG_20101014
Summary T-rays, often called terahertz radiation or submillimeter waves, are loosely defined as the wavelengths from 30 µm to 1,000 µm, or the frequencies from 10 THz to 300 GHz. This non-ionizing radiation appears as a harmless alternative to x-rays in medical, biological and security screening. Current solutions for coherent T-ray sources either require cryogenic temperatures or are relatively bulky pieces of equipment based on optically-pumped materials. The existing solid-state option, GaAs-based quantum cascade lasers, presents an intrinsic limitation in operating temperature: the low energy of the longitudinal-optical (LO) phonon in arsenide compounds hinders laser emission beyond 180 K at 4 THz, and forces operation below liquid-nitrogen temperature (< 70 K) for frequencies below 1 THz. Overcoming this limitation requires a technology revolution through the introduction of a new material system. This project aims at exploring a novel semiconductor technology for high-performance photonic devices operating in the T-ray spectral region. The advanced materials that we will investigate consist of nitride-based [GaN/Al(Ga,In)N] superlattices and nanowires, where we can profit from unique properties of III-nitride semiconductors, namely the large LO-phonon energy and the strong electron-phonon interaction. Our target is to adapt the quantum cascade design and fabrication technology to these new materials, which are characterized by intense internal polarization fields. Our project aims at pushing intersubband transitions in this material family to unprecedentedly long wavelengths, in order to cover the whole T-ray spectral gap with coherent solid-state sources operating at room temperature and above.
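As a quick consistency check on the quoted band edges, using only the standard relation between wavelength and frequency:
\[
\nu = \frac{c}{\lambda}:\qquad
\frac{3\times10^{8}\ \mathrm{m/s}}{30\ \mu\mathrm{m}} \approx 10\ \mathrm{THz},
\qquad
\frac{3\times10^{8}\ \mathrm{m/s}}{1000\ \mu\mathrm{m}} \approx 0.3\ \mathrm{THz} = 300\ \mathrm{GHz}.
\]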
Max ERC Funding
1 627 236 €
Duration
Start date: 2012-01-01, End date: 2017-06-30
Project acronym UMWA
Project Ultimate measurement of the W boson mass with ATLAS, at the LHC
Researcher (PI) Maarten Boonekamp
Host Institution (HI) COMMISSARIAT A L ENERGIE ATOMIQUE ET AUX ENERGIES ALTERNATIVES
Call Details Starting Grant (StG), PE2, ERC-2011-StG_20101014
Summary Widely advertised as a discovery machine, the LHC proton collider at CERN will also provide large samples of standard particles, allowing the final, most precise experimental determination of some fundamental parameters of particle physics theory. Together with the expected discoveries, these measurements will make it possible to elucidate the dynamics of electroweak symmetry breaking. The aim of the present project is to organize and realize a precise measurement of the W resonance parameters, and in particular of the W boson mass with a target precision below 0.01%. Most LHC measurements aim at a precision of order 1% and can be performed on a time scale of about one year, within a small group of collaborators. The measurement of the W boson mass is a special case, as its target precision requires perfect understanding of the experimental and physical environment. It therefore requires careful planning. Intermediate measurements of standard processes need to be performed and synchronized, and will play a key role in the final result. The primary tools serving this purpose are studies of the production properties of the J/Psi, W and Z particles in the uncertain LHC environment. These studies will allow us to understand the performance of the ATLAS detector and to constrain the dynamics of proton-proton interactions, which represent a major source of uncertainty. Once the above properties are firmly established, the distributions sensitive to the W boson mass can be precisely predicted, and the fundamental parameters can be determined. The coordinator of this project has recognized expertise in electroweak physics and has led the ATLAS Collaboration's research in this field since 2007. As argued above, the preparation of the measurement of mW requires building and coordinating a team devoted to the realization of this difficult measurement over the coming years. The present grant would provide a unique opportunity to gather all necessary conditions for the success of this project.
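To put the target precision in perspective (taking the world-average W mass of roughly 80.4 GeV as a reference value, a number not stated in the proposal itself):
\[
\frac{\delta m_W}{m_W} < 10^{-4}
\;\Longrightarrow\;
\delta m_W \lesssim 10^{-4}\times 80.4\ \mathrm{GeV} \approx 8\ \mathrm{MeV}.
\]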
Max ERC Funding
1 259 760 €
Duration
Start date: 2011-10-01, End date: 2016-09-30
Project acronym WIQOJO
Project Wideband Quantum Optics with Josephson Junctions
Researcher (PI) Max Hofheinz
Host Institution (HI) COMMISSARIAT A L ENERGIE ATOMIQUE ET AUX ENERGIES ALTERNATIVES
Call Details Starting Grant (StG), PE3, ERC-2011-StG_20101014
Summary Circuit quantum optics (quantum optics with microwave photons in electronic circuits) has made it possible to solve several hard problems of traditional quantum optics and to explore new physics. However, so far only one of the two regimes of circuit quantum optics has been explored, the circuit quantum electrodynamics regime, where photons reside in electrical resonators.
In this project we want to develop the other regime of circuit quantum optics, the wideband regime, where photons are wave packets propagating along transmission lines. To do so we will build devices based on dynamical Coulomb blockade in Josephson junctions, a phenomenon relating the tunneling of Cooper pairs to the emission and absorption of photons. This effect is well understood, but so far only the DC current has been studied. We want to exploit the photonic aspect of dynamical Coulomb blockade: engineering the impedance seen by the junction and applying appropriate voltages makes it possible to select specific single- or multi-photon processes, which we want to use to build single-photon sources, detectors, amplifiers and many other devices. Together they will fully enable wideband circuit quantum optics.
The successful project will also extend the frequency range accessible to circuit quantum optics: current quantum circuits can be operated only in a limited range around 5 GHz due to engineering constraints. Our approach lifts these constraints, and the proposed devices should function in the range from a few GHz up to 1 THz. This extended frequency window will enable the development of hybrid quantum systems coupling quantum circuits to single dopants, molecules, quantum dots or other mesoscopic devices. The output of our project will also be helpful for other domains where radiation in the GHz-to-THz range has to be measured at the single-photon level, for example astronomy.
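For orientation, the underlying energy balance (standard for inelastic Cooper-pair tunneling and quoted here only as an illustrative estimate) is that a Cooper pair crossing a junction biased at a DC voltage V releases the energy 2eV into the electromagnetic environment as one or several photons:
\[
2eV = \sum_k \hbar\omega_k,
\qquad
\nu_{\text{single photon}} = \frac{2eV}{h}
\;\approx\; 4.8\ \mathrm{GHz}\ \text{at}\ V = 10\ \mu\mathrm{V},
\quad \approx 1\ \mathrm{THz}\ \text{at}\ V \approx 2\ \mathrm{mV}.
\]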
Max ERC Funding
1 640 587 €
Duration
Start date: 2012-01-01, End date: 2016-12-31
Project acronym YODA
Project Topographic signaling and spatial landmarks of key polarized neuro-developmental processes
Researcher (PI) Valérie Lucienne Corinne Castellani
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), LS5, ERC-2011-StG_20101109
Summary Polarization, which confers asymmetry at the molecular, cellular and tissue scales, is a fascinating process establishing fundamental features of biological systems. In multicellular organisms, symmetry breaking triggers the specification of embryonic body axes, governing the positioning of subsequent morphogenetic processes. Cells and tissues acquire complex polarity features which, remarkably, are very precisely positioned within the body axes. How polarization processes are spatially oriented remains fully enigmatic. During the formation of the nervous system, some crucial processes are polarized. Likewise, the navigation of neuronal projections in the body is a typical polarized process, axons selecting specific pathways to reach their targets. Studies in this field have established crucial roles for topographic cues in controlling the polarized growth of neuronal projections. Up to now, my lab has focused on axon guidance mechanisms, and while investigating the links between spatial position and neural circuit formation, I became convinced that topographic signalling must be equally required to set other key polarized processes of the developing nervous system. For example, in the neuroepithelium, progenitor division is polarized along the apico-basal axis of the neural tube. Likewise, in the young post-mitotic neuron, precise coordinates along the body axes define the site where the axon emerges. First, we postulate the existence of a topographic signaling that gives neuronal cells (although this might be a more general case) landmarks of the different embryonic axes, so that polarization takes place with the appropriate spatial orientation. Second, we assume that this topographic signalling is ensured by cues initially identified for their role during axon navigation. Our goals are to explore these issues, using as a model the sensorimotor circuits, in which several processes can be investigated to question the interplay between polarity and topography.
Max ERC Funding
1 498 971 €
Duration
Start date: 2012-04-01, End date: 2017-03-31