Project acronym 1stProposal
Project An alternative development of analytic number theory and applications
Researcher (PI) Andrew Granville
Host Institution (HI) UNIVERSITY COLLEGE LONDON
Call Details Advanced Grant (AdG), PE1, ERC-2014-ADG
Summary The traditional (Riemann) approach to analytic number theory uses the zeros of zeta functions. This requires the associated multiplicative function, say f(n), to have special enough properties that the associated Dirichlet series may be analytically continued. In this proposal we continue to develop an approach which requires less of the multiplicative function, linking the original question with the mean value of f. Such techniques have been around for a long time but have generally been regarded as “ad hoc”. In this project we aim to show that one can develop a coherent approach to the whole subject, not only reproving all of the old results but also proving many new ones that appear inaccessible to traditional methods.
Our first goal is to complete a monograph reworking all of the classical theory using these new methods, and then to push forward in new directions. The most important is to extend these techniques to GL(n) L-functions, which we hope will now be feasible having found the correct framework in which to proceed. Since we rarely know how to analytically continue such L-functions, this could be of great benefit to the subject.
We are developing the large sieve so that it can be used for individual moduli, and will determine a strong form of it. We are also developing a new method to give asymptotics for mean values when they are not too small.
We wish to incorporate techniques of analytic number theory into our theory, for example recent advances on mean values of Dirichlet polynomials. The recent breakthroughs in sieve theory also suggest strong links that need further exploration.
Additive combinatorics yields important results in many areas. There are strong analogies between its results and those for multiplicative functions, especially in the theory of the large spectrum and its applications. We hope to develop these analogies further.
Much of this is joint work with K. Soundararajan of Stanford University.
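For orientation, here is an illustrative sketch of the central objects (standard background, not text from the proposal): a multiplicative function f has a Dirichlet series with an Euler product,

\[
F(s) \;=\; \sum_{n \ge 1} \frac{f(n)}{n^{s}} \;=\; \prod_{p\ \mathrm{prime}} \Bigl(1 + \frac{f(p)}{p^{s}} + \frac{f(p^{2})}{p^{2s}} + \cdots \Bigr), \qquad \operatorname{Re}(s) > 1,
\]

and a mean value

\[
M_f(x) \;=\; \frac{1}{x} \sum_{n \le x} f(n).
\]

The classical route studies the analytic continuation and zeros of F(s); the approach described above works directly with mean values such as M_f(x), in the spirit of Halász-type theorems for functions satisfying |f(n)| \le 1.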
Max ERC Funding
2 011 742 €
Duration
Start date: 2015-08-01, End date: 2020-07-31
Project acronym ABLASE
Project Advanced Bioderived and Biocompatible Lasers
Researcher (PI) Malte Christian Gather
Host Institution (HI) THE UNIVERSITY COURT OF THE UNIVERSITY OF ST ANDREWS
Call Details Starting Grant (StG), PE3, ERC-2014-STG
Summary Naturally occurring optical phenomena attract great attention and transform our ability to study biological processes, with “the discovery and development of the green fluorescent protein (GFP)” (Nobel Prize in Chemistry 2008) being a particularly successful example. Although found in only very few species in nature, most organisms can be genetically programmed to produce the brightly fluorescent GFP molecules. Combined with modern fluorescence detection schemes, this has led to entirely new ways of monitoring biological processes. The applicant recently demonstrated a biological laser – a completely novel, living source of coherent light based on a single biological cell bioengineered to produce GFP. Such a laser is intrinsically biocompatible, thus offering unique properties not shared by any existing laser. However, the physical processes involved in lasing from GFP remain poorly understood, and so far biological lasers rely on bulky, impractical external resonators for optical feedback. Within this project, the applicant and his team will develop for the first time an understanding of stimulated emission in GFP and related proteins and create an unprecedented stand-alone single-cell biolaser based on intracellular optical feedback. These lasers will be deployed as microscopic and biocompatible imaging probes, thus opening in vivo microscopy to dense wavelength-multiplexing and enabling unmatched sensing of biomolecules and mechanical pressure. The naturally evolved nano-structure of GFP will also enable novel ways of studying strong light-matter coupling and will bio-inspire advances in synthetic emitters. The proposed project is interdisciplinary by its very nature, bridging photonics, genetic engineering and materials science. The applicant’s previous pioneering work and synergies with work on other lasers developed at the applicant’s host institution provide an exclusive competitive edge. ERC support would transform this into a truly novel field of research.
Max ERC Funding
1 499 875 €
Duration
Start date: 2015-06-01, End date: 2020-05-31
Project acronym ACCORD
Project Algorithms for Complex Collective Decisions on Structured Domains
Researcher (PI) Edith Elkind
Host Institution (HI) THE CHANCELLOR, MASTERS AND SCHOLARS OF THE UNIVERSITY OF OXFORD
Call Details Starting Grant (StG), PE6, ERC-2014-STG
Summary Algorithms for Complex Collective Decisions on Structured Domains.
The aim of this proposal is to substantially advance the field of Computational Social Choice by developing new tools and methodologies that can be used for making complex group decisions in rich and structured environments. We consider settings where each member of a decision-making body has preferences over a finite set of alternatives, and the goal is to synthesise a collective preference over these alternatives, which may take the form of a partial order over the set of alternatives with a predefined structure: examples include selecting a fixed-size set of alternatives, a ranking of the alternatives, a winner and up to two runners-up, etc. We will formulate desiderata that apply to such preference aggregation procedures, design specific procedures that satisfy as many of these desiderata as possible, and develop efficient algorithms for computing them. As the latter step may be infeasible on general preference domains, we will focus on identifying the least restrictive domains that enable efficient computation, and use real-life preference data to verify whether the associated restrictions are likely to be satisfied in realistic preference aggregation scenarios. We will also determine whether our preference aggregation procedures are computationally resistant to malicious behaviour. To lower the cognitive burden on the decision-makers, we will extend our procedures to accept partial rankings as inputs. Finally, to further contribute towards bridging the gap between theory and practice of collective decision making, we will provide open-source software implementations of our procedures, and reach out to potential users to obtain feedback on their practical applicability.
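As a toy illustration of the kind of procedure discussed above (a hypothetical sketch, not one of the procedures to be developed in the project), the following Python snippet selects a fixed-size set of alternatives from voters' rankings using Borda scores:

from collections import defaultdict

def borda_committee(rankings, k):
    """Select k alternatives by total Borda score.
    rankings: each voter's ranking of the same alternatives, most preferred first."""
    scores = defaultdict(int)
    m = len(rankings[0])                        # number of alternatives
    for ranking in rankings:
        for position, alt in enumerate(ranking):
            scores[alt] += m - 1 - position     # top place earns m-1 points, last earns 0
    return sorted(scores, key=scores.get, reverse=True)[:k]

# Three voters rank alternatives a, b, c, d; choose a set of size 2.
votes = [["a", "b", "c", "d"], ["b", "a", "d", "c"], ["a", "c", "b", "d"]]
print(borda_committee(votes, 2))                # -> ['a', 'b']

Procedures of the kind studied in the project must additionally satisfy axiomatic desiderata, exploit structured domains, handle partial rankings, and resist strategic manipulation, which is where the computational questions above arise.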
Max ERC Funding
1 395 933 €
Duration
Start date: 2015-07-01, End date: 2020-06-30
Project acronym ALKENoNE
Project Algal Lipids: the Key to Earth Now and aNcient Earth
Researcher (PI) Jaime Lynn Toney
Host Institution (HI) UNIVERSITY OF GLASGOW
Call Details Starting Grant (StG), PE10, ERC-2014-STG
Summary Alkenones are algal lipids that have been used for decades to reconstruct quantitative past sea surface temperature. Although alkenones are being discovered in an increasing number of lake sites worldwide, only two terrestrial temperature records have been reconstructed so far. The development of this research field is limited by the lack of interdisciplinary research that combines modern biological and ecological algal research with the organic geochemical techniques needed to develop a quantitative biomarker (or molecular fossil) for past lake temperatures. More research is needed for alkenones to become a widely used tool for reconstructing past terrestrial temperature change. The early career Principal Investigator has discovered a new lake alkenone-producing species of haptophyte algae that produces alkenones in high abundances both in the environment and in laboratory cultures. This makes the new species an ideal organism for developing a culture-based temperature calibration and exploring other potential environmental controls. In this project, alkenone production will be manipulated, and monitored using state-of-the-art photobioreactors with real-time detectors for cell density, light, and temperature. The latest algal culture and isolation techniques that are used in microalgal biofuel development will be applied to developing the lake temperature proxy. The objectives will be achieved through the analysis of 90 new Canadian lakes to develop a core-top temperature calibration across a large latitudinal and temperature gradient (Δ latitude = 5°, Δ spring surface temperature = 9°C). The results will be used to assess how regional palaeo-temperature (Uk37), palaeo-moisture (δDwax) and palaeo-evaporation (δDalgal) respond during times of past global warmth (e.g., Medieval Warm Period, 900-1200 AD) to find an accurate analogue for assessing future drought risk in the interior of Canada.
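To illustrate what a core-top calibration involves (a minimal sketch with invented numbers, not project data; the index form and values below are illustrative assumptions), one fits a line relating the alkenone unsaturation index to observed water temperature:

import numpy as np

def uk37_prime(c37_2, c37_3):
    # Standard di-/tri-unsaturated alkenone index; lake studies sometimes use
    # variants that also account for the tetra-unsaturated C37:4 alkenone.
    return c37_2 / (c37_2 + c37_3)

def fit_calibration(index_values, observed_temps):
    # Ordinary least-squares line: T = slope * index + intercept
    slope, intercept = np.polyfit(index_values, observed_temps, 1)
    return slope, intercept

# Hypothetical alkenone abundances (arbitrary units) and observed spring temperatures (degrees C)
c37_2 = np.array([1.2, 2.0, 3.1, 4.4, 6.5])
c37_3 = np.array([8.8, 9.0, 9.3, 9.8, 9.7])
idx = uk37_prime(c37_2, c37_3)
temps = np.array([4.0, 6.5, 8.2, 10.1, 12.8])
slope, intercept = fit_calibration(idx, temps)
print(f"T = {slope:.1f} * index + {intercept:.1f}")

In the project, the calibration would instead be built from measurements on the 90 Canadian lakes, with the culture experiments used to test which environmental variables other than temperature affect the index.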
Max ERC Funding
940 883 €
Duration
Start date: 2015-04-01, End date: 2020-03-31
Project acronym AUTAR
Project A Unified Theory of Algorithmic Relaxations
Researcher (PI) Albert Atserias Peri
Host Institution (HI) UNIVERSITAT POLITECNICA DE CATALUNYA
Call Details Consolidator Grant (CoG), PE6, ERC-2014-CoG
Summary For a large family of computational problems collectively known as constrained optimization and satisfaction problems (CSPs), four decades of research in algorithms and computational complexity have led to a theory that tries to classify them as algorithmically tractable vs. intractable, i.e. polynomial-time solvable vs. NP-hard. However, there remains an important gap in our knowledge in that many CSPs of interest resist classification by this theory. Some such problems of practical relevance include fundamental partition problems in graph theory, isomorphism problems in combinatorics, and strategy-design problems in mathematical game theory. To tackle this gap in our knowledge, the research of the last decade has been driven either by finding hard instances for algorithms that solve tighter and tighter relaxations of the original problem, or by formulating new hardness hypotheses that are stronger but admittedly less robust than NP-hardness.
The ultimate goal of this project is to close the gap between the partial progress that these approaches represent and the original project of classifying problems as tractable vs. intractable. Our thesis is that the field has reached a point where, in many cases of interest, the analysis of the current candidate algorithms that appear to solve all instances could suffice to classify the problem one way or the other, without the need for alternative hardness hypotheses. The novelty in our approach is a program to develop our recent discovery that, in some cases of interest, two methods from different areas match in strength: indistinguishability pebble games from mathematical logic, and hierarchies of convex relaxations from mathematical programming. Thus, we aim to make significant advances in the status of important algorithmic problems by looking for a general theory that unifies and goes beyond the current understanding of its components.
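As a concrete point of reference (standard material, not part of the proposal), the lowest level of the convex-relaxation hierarchies mentioned above is the basic linear-programming relaxation of a CSP: for each variable v with domain D and each constraint C one introduces local distributions μ_v and μ_C subject to

\[
\sum_{a \in D} \mu_v(a) = 1, \qquad
\sum_{\tau \,:\, \tau(v) = a} \mu_C(\tau) = \mu_v(a) \quad \text{for each } v \text{ in the scope of } C,
\]
\[
\mu_v(a) \ge 0, \qquad \mu_C(\tau) \ge 0, \qquad \mu_C(\tau) = 0 \ \text{whenever } \tau \text{ violates } C.
\]

Higher levels of hierarchies such as Sherali-Adams impose consistency on larger and larger sets of variables, and it is the strength of these levels that is compared against indistinguishability pebble games.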
Max ERC Funding
1 725 656 €
Duration
Start date: 2015-06-01, End date: 2020-05-31
Project acronym CatHet
Project New Catalytic Asymmetric Strategies for N-Heterocycle Synthesis
Researcher (PI) John Forwood Bower
Host Institution (HI) UNIVERSITY OF BRISTOL
Call Details Starting Grant (StG), PE5, ERC-2014-STG
Summary Medicinal chemistry requires more efficient and diverse methods for the asymmetric synthesis of chiral scaffolds. Over 60% of the world’s top-selling small-molecule drug compounds are chiral and, of these, approximately 80% are marketed as single enantiomers. There is a compelling correlation between drug candidate “chiral complexity” and the likelihood of progression to the marketplace. Surprisingly, and despite the tremendous advances made in catalysis over the past several decades, the “chiral complexity” of drug discovery libraries has actually decreased, while, at the same time, for the reasons mentioned above, the “chiral complexity” of marketed drugs has increased. Since the mid-1990s, there has been a notable acceleration of this “complexity divergence”. Consequently, there is now an urgent need to provide efficient processes that directly access privileged chiral scaffolds. It is our philosophy that catalysis holds the key here and that new processes should be based upon platforms that can exert control over both absolute and relative stereochemistry. In this proposal we outline the development of a range of N-heteroannulation processes based upon the catalytic generation and trapping of unique or unusual classes of organometallic intermediates derived from transition metal insertion into C-C and C-N sigma-bonds. We will provide a variety of enabling methodologies and demonstrate applicability in flexible total syntheses of important natural product scaffolds. The processes proposed are synthetically flexible, operationally simple and amenable to asymmetric catalysis. Likely starting points, based upon preliminary results, will set the stage for the realisation of aspirational and transformative goals. Through the study of the organometallic intermediates involved here, there is potential to generalise these new catalytic manifolds, such that this research will transcend N-heterocyclic chemistry to provide enabling methods for organic chemistry as a whole.
Max ERC Funding
1 548 738 €
Duration
Start date: 2015-04-01, End date: 2020-03-31
Project acronym CC
Project Combinatorial Construction
Researcher (PI) Peter Keevash
Host Institution (HI) THE CHANCELLOR, MASTERS AND SCHOLARS OF THE UNIVERSITY OF OXFORD
Call Details Consolidator Grant (CoG), PE1, ERC-2014-CoG
Summary Combinatorial Construction is a mathematical challenge with many applications. Examples include the construction of networks that are very sparse but highly connected, or codes that can correct many transmission errors with little overhead in communication costs. For a general class of combinatorial objects, and some desirable property, the fundamental question in Combinatorial Construction is to demonstrate the existence of an object with the property, preferably via an explicit algorithmic construction. Thus it is ubiquitous in Computer Science, including applications to expanders, sorting networks, distributed communication, data storage, codes, cryptography and derandomisation. In popular culture it appears as the unsolved 'lottery problem' of determining the minimum number of tickets that guarantee a prize. In a recent preprint I prove the Existence Conjecture for combinatorial designs, via a new method of Randomised Algebraic Constructions; this result has already attracted considerable attention in the mathematical community. The significance is not only in the solution of a problem posed by Steiner in 1852, but also in the discovery of a powerful new method, which promises to have many further applications in Combinatorics, and more widely in Mathematics and Theoretical Computer Science. I am now poised to resolve many other problems of combinatorial construction.
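For context (a standard statement, not quoted from the proposal): a Steiner system S(t, k, n) is a collection of k-element blocks of an n-element set such that every t-element subset lies in exactly one block. The obvious necessary conditions for existence are the divisibility conditions

\[
\binom{k-i}{t-i} \ \Big|\ \binom{n-i}{t-i} \qquad \text{for all } 0 \le i \le t-1,
\]

and the Existence Conjecture asserts that, for n sufficiently large in terms of k and t, these conditions are also sufficient.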
Max ERC Funding
1 706 729 €
Duration
Start date: 2016-01-01, End date: 2020-12-31
Project acronym CNT-QUBIT
Project Carbon Nanotube Quantum Circuits
Researcher (PI) Mark Robertus Buitelaar
Host Institution (HI) UNIVERSITY COLLEGE LONDON
Call Details Consolidator Grant (CoG), PE3, ERC-2014-CoG
Summary The aim of this proposal is to use spin qubits defined in carbon nanotube quantum dots to demonstrate measurement-based entanglement in an all-electrical and scalable solid-state architecture. The project makes use of spin-orbit interaction to drive spin rotations in the carbon nanotube host system and hyperfine interaction to store quantum information in the nuclear spin states. The proposal builds on techniques developed by the principal investigator for fast and non-invasive read-out of the electron spin qubits using radio-frequency reflectometry and spin-to-charge conversion.
Any quantum computer requires entanglement. One route to achieve entanglement between electron spin qubits in quantum dots is to use the direct interaction of neighbouring qubits due to their electron wavefunction overlap. This approach, however, becomes rapidly impractical for any large scale quantum processor, as distant qubits can only be entangled through the use of qubits in between. Here I propose an alternative strategy which makes use of an intriguing quantum mechanical effect by which two spatially separated spin qubits coupled to a single electrical resonator become entangled if a measurement cannot tell them apart.
The quantum information encoded in the entangled electron spin qubits will be transferred to carbon-13 nuclear spins which are used as a quantum memory with coherence times that exceed seconds. Entanglement with further qubits then proceeds again via projective measurements of the electron spin qubits without risk of losing the existing entanglement. When entanglement of the electron spin qubits is heralded – which might take several attempts – the quantum information is transferred again to the nuclear spin states. This allows for the coupling of large numbers of physically separated qubits, building up so-called graph or cluster states in an all-electrical and scalable solid-state architecture.
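To illustrate the underlying mechanism (a textbook example, not the specific protocol of this proposal): if two qubits are each prepared in the equal superposition |+⟩ = (|0⟩ + |1⟩)/√2 and a joint measurement reveals only their parity, i.e. it cannot tell the qubits apart, then the even-parity outcome projects

\[
\lvert{+}\rangle\lvert{+}\rangle = \tfrac{1}{2}\bigl(\lvert 00\rangle + \lvert 01\rangle + \lvert 10\rangle + \lvert 11\rangle\bigr)
\;\longrightarrow\;
\tfrac{1}{\sqrt{2}}\bigl(\lvert 00\rangle + \lvert 11\rangle\bigr),
\]

while the odd-parity outcome gives (|01⟩ + |10⟩)/√2; either way the two qubits end up maximally entangled without ever interacting directly.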
Max ERC Funding
1 998 574 €
Duration
Start date: 2015-09-01, End date: 2020-08-31
Project acronym COEVOLUTION
Project Black holes and their host galaxies: coevolution across cosmic time
Researcher (PI) Debora Sijacki
Host Institution (HI) THE CHANCELLOR MASTERS AND SCHOLARS OF THE UNIVERSITY OF CAMBRIDGE
Call Details Starting Grant (StG), PE9, ERC-2014-STG
Summary Galaxy formation is one of the most fascinating yet challenging fields of astrophysics. The desire to understand galaxy formation has led to the design of ever more sophisticated telescopes, which show a bewildering variety of galaxies in the Universe. However, the degree to which an interpretation of this wealth of data can succeed depends critically on having accurate and realistic theoretical models of galaxy formation. While cosmological simulations of galaxy formation provide the most powerful technique for calculating the non-linear evolution of cosmic structures, the enormous dynamic range and poorly understood baryonic physics are the main uncertainties of present simulations. This impacts on their predictive power and is the major obstacle to our understanding of observational data. The objective of this proposal is to drastically improve upon the current state of the art by i) including more realistic physical processes, such as those occurring at the sphere of influence of a galaxy’s central black hole, and ii) greatly extending the spatial dynamical range with the aid of a novel technique I have developed.
With this technique I want to address one of the major unsolved issues of galaxy formation: “How do galaxies and their central black holes coevolve?” Specifically, I want to focus on three crucial areas of galaxy formation: a) How and where do the very first black holes form, what are their observational signatures, and when is the coevolution with host galaxies established? b) Is black hole heating solely responsible for the morphological transformation and quenching of massive galaxies, or are other processes important as well? c) What is the impact of supermassive black holes on galaxy clusters, and can we calibrate baryonic physics in clusters to use them as high-precision cosmological probes? The requested funding is for 50% of the PI’s time and three postdoctoral researchers to establish an independent research group at the KICC and IoA, Cambridge.
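For scale (a standard estimate, not a figure from the proposal), the sphere of influence mentioned above is the region within which the black hole dominates the gravitational potential of its host, roughly

\[
r_{\mathrm{infl}} \simeq \frac{G M_{\mathrm{BH}}}{\sigma^{2}}
\approx 10\,\mathrm{pc}\,
\left(\frac{M_{\mathrm{BH}}}{10^{8}\,M_{\odot}}\right)
\left(\frac{\sigma}{200\ \mathrm{km\,s^{-1}}}\right)^{-2},
\]

where σ is the stellar velocity dispersion of the host galaxy; resolving such scales in simulations that also follow megaparsec-scale galaxy clusters is what makes the dynamic-range challenge described above so demanding.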
Max ERC Funding
1 975 062 €
Duration
Start date: 2015-09-01, End date: 2020-08-31
Project acronym complexNMR
Project Structural Dynamics of Protein Complexes by Solid-State NMR
Researcher (PI) Józef Romuald Lewandowski
Host Institution (HI) THE UNIVERSITY OF WARWICK
Call Details Starting Grant (StG), PE4, ERC-2014-STG
Summary Multidrug-resistant bacteria that render the current arsenal of antibiotics worthless are a growing global problem. This grave challenge could be tackled by polyketide synthases (PKSs), which are gigantic modular enzymatic assembly lines for natural products. PKSs could be developed for industry to produce drugs that are chemically difficult to synthesize, but they cannot be harnessed until we understand how they work at the molecular level. However, such understanding is missing because we cannot easily investigate large complexes with current structural biology and modeling methods. A key puzzle is how the function of these multicomponent systems emerges from atomic-scale interactions of their parts. Solving this puzzle requires a holistic approach involving measuring and modeling the relevant interacting parts together.
Our goal is to develop a multidisciplinary approach rooted in solid- and solution-state NMR that will make possible studies of complexes from PKSs. The two main challenges for the NMR of PKSs are increasing sensitivity and resolution. Recent innovations from our lab allow the application of solid-state NMR to study large complexes in 2–10 nanomole quantities. Building on this approach, with a protein-antibody complex as a test case, we will develop new NMR methods that will enable the study of the structure and motions of domains in complexes. We will probe, for the first time, the structural dynamics of the PKSs of enacyloxin and gladiolin, which are antibiotics against life-threatening multidrug-resistant hospital-acquired Acinetobacter baumannii infections and tuberculosis. These studies will guide rational engineering of the PKSs to enable synthetic biology approaches to produce new antibiotics.
If successful, this project will go beyond the state of the art by enabling studies of unknown proteins in large complexes and by providing unique insights into novel mechanisms for controlling biosynthesis in PKSs, turning them into truly programmable synthetic biology devices.
Max ERC Funding
1 999 044 €
Duration
Start date: 2015-05-01, End date: 2020-04-30