Project acronym 1stProposal
Project An alternative development of analytic number theory and applications
Researcher (PI) Andrew Granville
Host Institution (HI) UNIVERSITY COLLEGE LONDON
Call Details Advanced Grant (AdG), PE1, ERC-2014-ADG
Summary The traditional (Riemann) approach to analytic number theory uses the zeros of zeta functions. This requires the associated multiplicative function, say f(n), to have special enough properties that the associated Dirichlet series may be analytically continued. In this proposal we continue to develop an approach which requires less of the multiplicative function, linking the original question with the mean value of f. Such techniques have been around for a long time but have generally been regarded as “ad hoc”. In this project we aim to show that one can develop a coherent approach to the whole subject, not only reproving all of the old results but also proving many new ones that appear inaccessible to traditional methods.
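For orientation, here is the standard notation behind this paragraph (ours, not the proposal's): the traditional approach studies the analytic continuation and zeros of the Dirichlet series of f, while the approach developed here works directly with the mean value of f,

\[
F(s) = \sum_{n \ge 1} \frac{f(n)}{n^{s}} = \prod_{p} \Bigl(1 + \frac{f(p)}{p^{s}} + \frac{f(p^{2})}{p^{2s}} + \cdots \Bigr),
\qquad
M_f(x) = \frac{1}{x} \sum_{n \le x} f(n).
\]

For example, Halász's theorem bounds M_f(x) for any multiplicative f with |f(n)| ≤ 1, with no analytic continuation of F(s) required.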
Our first goal is to complete a monograph yielding a reworking of all the classical theory using these new methods and then to push forward in new directions. The most important is to extend these techniques to GL(n) L-functions, which we hope will now be feasible having found the correct framework in which to proceed. Since we rarely know how to analytically continue such L-functions this could be of great benefit to the subject.
We are developing the large sieve so that it can be used for individual moduli, and will determine a strong form of that result. We are also developing a new method that gives asymptotics for mean values when they are not too small.
We wish to incorporate techniques of analytic number theory into our theory, for example recent advances on mean values of Dirichlet polynomials. The recent breakthroughs on the sieve also suggest strong links that need further exploration.
Additive combinatorics yields important results in many areas. There are strong analogies between its results and those for multiplicative functions, especially in large value spectrum theory and its applications. We hope to develop these further.
Much of this is joint work with K Soundararajan of Stanford University.
Max ERC Funding
2 011 742 €
Duration
Start date: 2015-08-01, End date: 2020-07-31
Project acronym ABLASE
Project Advanced Bioderived and Biocompatible Lasers
Researcher (PI) Malte Christian Gather
Host Institution (HI) THE UNIVERSITY COURT OF THE UNIVERSITY OF ST ANDREWS
Call Details Starting Grant (StG), PE3, ERC-2014-STG
Summary Naturally occurring optical phenomena attract great attention and transform our ability to study biological processes, with “the discovery and development of the green fluorescent protein (GFP)” (Nobel Prize in Chemistry 2008) being a particularly successful example. Although found only in very few species in nature, most organisms can be genetically programmed to produce the brightly fluorescent GFP molecules. Combined with modern fluorescence detection schemes, this has led to entirely new ways of monitoring biological processes. The applicant has recently demonstrated a biological laser – a completely novel, living source of coherent light based on a single biological cell bioengineered to produce GFP. Such a laser is intrinsically biocompatible, thus offering unique properties not shared by any existing laser. However, the physical processes involved in lasing from GFP remain poorly understood and, so far, biological lasers rely on bulky, impractical external resonators for optical feedback. Within this project, the applicant and his team will develop for the first time an understanding of stimulated emission in GFP and related proteins and create an unprecedented stand-alone single-cell biolaser based on intracellular optical feedback. These lasers will be deployed as microscopic and biocompatible imaging probes, thus opening in vivo microscopy to dense wavelength-multiplexing and enabling unmatched sensing of biomolecules and mechanical pressure. The naturally evolved nano-structure of GFP will also enable novel ways of studying strong light-matter coupling and will inspire advances in synthetic emitters. The proposed project is interdisciplinary by its very nature, bridging photonics, genetic engineering and materials science. The applicant’s previous pioneering work and synergies with work on other lasers developed at the applicant’s host institution provide an exclusive competitive edge. ERC support would transform this into a truly novel field of research.
Max ERC Funding
1 499 875 €
Duration
Start date: 2015-06-01, End date: 2020-05-31
Project acronym ACCORD
Project Algorithms for Complex Collective Decisions on Structured Domains
Researcher (PI) Edith Elkind
Host Institution (HI) THE CHANCELLOR, MASTERS AND SCHOLARS OF THE UNIVERSITY OF OXFORD
Call Details Starting Grant (StG), PE6, ERC-2014-STG
Summary Algorithms for Complex Collective Decisions on Structured Domains.
The aim of this proposal is to substantially advance the field of Computational Social Choice, by developing new tools and methodologies that can be used for making complex group decisions in rich and structured environments. We consider settings where each member of a decision-making body has preferences over a finite set of alternatives, and the goal is to synthesise a collective preference over these alternatives, which may take the form of a partial order over the set of alternatives with a predefined structure: examples include selecting a fixed-size set of alternatives, a ranking of the alternatives, a winner and up to two runners-up, etc. We will formulate desiderata that apply to such preference aggregation procedures, design specific procedures that satisfy as many of these desiderata as possible, and develop efficient algorithms for computing them. As the latter step may be infeasible on general preference domains, we will focus on identifying the least restrictive domains that enable efficient computation, and use real-life preference data to verify whether the associated restrictions are likely to be satisfied in realistic preference aggregation scenarios. Also, we will determine whether our preference aggregation procedures are computationally resistant to malicious behaviour. To lower the cognitive burden on the decision-makers, we will extend our procedures to accept partial rankings as inputs. Finally, to further contribute towards bridging the gap between theory and practice of collective decision making, we will provide open-source software implementations of our procedures, and reach out to potential users to obtain feedback on their practical applicability.
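To make the setting concrete, here is a minimal sketch (ours; Borda scoring is a generic positional rule, not a procedure proposed in this project) of aggregating voters' rankings into both a collective ranking and a fixed-size committee:

from collections import defaultdict

def borda_aggregate(rankings, committee_size=2):
    """Aggregate full rankings by Borda scoring: with m alternatives,
    a voter's top choice earns m-1 points, the next m-2, and so on.
    Returns the collective ranking and its top `committee_size` slice."""
    scores = defaultdict(int)
    m = len(rankings[0])  # number of alternatives
    for ranking in rankings:
        for position, alternative in enumerate(ranking):
            scores[alternative] += m - 1 - position
    collective = sorted(scores, key=scores.get, reverse=True)
    return collective, collective[:committee_size]

# Three voters over four alternatives:
voters = [["a", "b", "c", "d"],
          ["b", "a", "d", "c"],
          ["a", "c", "b", "d"]]
print(borda_aggregate(voters))  # (['a', 'b', 'c', 'd'], ['a', 'b'])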
Max ERC Funding
1 395 933 €
Duration
Start date: 2015-07-01, End date: 2020-06-30
Project acronym ALKENoNE
Project Algal Lipids: the Key to Earth Now and aNcient Earth
Researcher (PI) Jaime Lynn Toney
Host Institution (HI) UNIVERSITY OF GLASGOW
Call Details Starting Grant (StG), PE10, ERC-2014-STG
Summary Alkenones are algal lipids that have been used for decades to reconstruct quantitative past sea surface temperature. Although alkenones are being discovered in an increasing number of lake sites worldwide, only two terrestrial temperature records have been reconstructed so far. The development of this research field is limited by the lack of interdisciplinary research that combines modern biological and ecological algal research with the organic geochemical techniques needed to develop a quantitative biomarker (or molecular fossil) for past lake temperatures. More research is needed for alkenones to become a widely used tool for reconstructing past terrestrial temperature change. The early-career Principal Investigator has discovered a new lake alkenone-producing species of haptophyte algae that produces alkenones in high abundances both in the environment and in laboratory cultures. This makes the new species an ideal organism for developing a culture-based temperature calibration and exploring other potential environmental controls. In this project, alkenone production will be manipulated, and monitored using state-of-the-art photobioreactors with real-time detectors for cell density, light, and temperature. The latest algal culture and isolation techniques that are used in microalgal biofuel development will be applied to developing the lake temperature proxy. The objectives will be achieved through the analysis of 90 new Canadian lakes to develop a core-top temperature calibration across a large latitudinal and temperature gradient (Δ latitude = 5°, Δ spring surface temperature = 9°C). The results will be used to assess how regional palaeo-temperature (U^K_37), palaeo-moisture (δD_wax) and palaeo-evaporation (δD_algal) respond during times of past global warmth (e.g., Medieval Warm Period, 900-1200 AD) to find an accurate analogue for assessing future drought risk in the interior of Canada.
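As a numerical sketch of what a core-top calibration involves (illustrative values only, not project data), the alkenone index is regressed linearly against observed temperature, and the fit is then inverted to read temperature off down-core measurements:

import numpy as np

# Illustrative core-top pairs: spring surface temperature (deg C) and the
# alkenone index measured in the same lake's surface sediment (made up).
temperature = np.array([4.0, 6.5, 8.0, 10.5, 13.0])
index = np.array([-0.35, -0.22, -0.15, -0.02, 0.10])

# Linear calibration index = a * T + b, fitted by least squares.
a, b = np.polyfit(temperature, index, 1)

def reconstruct_temperature(index_value):
    # Invert the calibration for a down-core index value.
    return (index_value - b) / a

print(f"calibration: index = {a:.4f} * T + {b:.4f}")
print(f"T at index -0.10: {reconstruct_temperature(-0.10):.1f} C")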
Max ERC Funding
940 883 €
Duration
Start date: 2015-04-01, End date: 2020-03-31
Project acronym AUTAR
Project A Unified Theory of Algorithmic Relaxations
Researcher (PI) Albert Atserias Peri
Host Institution (HI) UNIVERSITAT POLITECNICA DE CATALUNYA
Call Details Consolidator Grant (CoG), PE6, ERC-2014-CoG
Summary For a large family of computational problems collectively known as constrained optimization and satisfaction problems (CSPs), four decades of research in algorithms and computational complexity have led to a theory that tries to classify them as algorithmically tractable vs. intractable, i.e. polynomial-time solvable vs. NP-hard. However, there remains an important gap in our knowledge in that many CSPs of interest resist classification by this theory. Some such problems of practical relevance include fundamental partition problems in graph theory, isomorphism problems in combinatorics, and strategy-design problems in mathematical game theory. To tackle this gap in our knowledge, the research of the last decade has been driven either by finding hard instances for algorithms that solve tighter and tighter relaxations of the original problem, or by formulating new hardness hypotheses that are stronger but admittedly less robust than NP-hardness.
The ultimate goal of this project is to close the gap between the partial progress that these approaches represent and the original classification project into tractable vs. intractable problems. Our thesis is that the field has reached a point where, in many cases of interest, the analysis of the current candidate algorithms that appear to solve all instances could suffice to classify the problem one way or the other, without the need for alternative hardness hypotheses. The novelty in our approach is a program to develop our recent discovery that, in some cases of interest, two methods from different areas match in strength: indistinguishability pebble games from mathematical logic, and hierarchies of convex relaxations from mathematical programming. Thus, we aim at making significant advances in the status of important algorithmic problems by looking for a general theory that unifies and goes beyond the current understanding of its components.
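One concrete instance of the hierarchies mentioned above (standard material, not specific to this proposal) is the level-k Sherali-Adams relaxation of a CSP over a domain D: introduce a variable μ_S(α) for each set S of at most k variables and each assignment α to S, read as a local probability, subject to

\[
\mu_S(\alpha) \ge 0, \qquad \sum_{\alpha} \mu_S(\alpha) = 1, \qquad
\sum_{a \in D} \mu_{S \cup \{v\}}\bigl(\alpha \cup \{v \mapsto a\}\bigr) = \mu_S(\alpha)
\quad \text{for } |S| < k,\ v \notin S.
\]

Increasing k tightens the relaxation, and the matching of strength referred to above is between levels of such hierarchies and pebble games with a corresponding number of pebbles.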
Max ERC Funding
1 725 656 €
Duration
Start date: 2015-06-01, End date: 2020-05-31
Project acronym BlackHoleMaps
Project Mapping gravitational waves from collisions of black holes
Researcher (PI) Mark Douglas Hannam
Host Institution (HI) CARDIFF UNIVERSITY
Call Details Consolidator Grant (CoG), PE2, ERC-2014-CoG
Summary Breakthroughs in numerical relativity in 2005 gave us unprecedented access to the strong-field regime of general relativity, making possible solutions of the full nonlinear Einstein equations for the merger of two black holes. Numerical relativity is also crucial to study fundamental physics with gravitational-wave (GW) observations: numerical solutions allow us to construct models that will be essential to extract physical information from observations in data from Advanced LIGO and Virgo, which will operate from late 2015. Complete signal models will allow us to follow up our first theoretical predictions of the nature of black-hole mergers with their first observational measurements.
The goal of this project is to advance numerical-relativity methods, deepen our understanding of black-hole mergers, and map the parameter space of binary configurations with the most comprehensive and systematic set of numerical calculations performed to date, in order to produce a complete GW signal model. Central to this problem is the purely general-relativistic effect of orbital precession. The inclusion of precession in waveform models is the most challenging and urgent theoretical problem in the build-up to GW astronomy. Simulations must cover a seven-dimensional parameter space of binary configurations, but their computational cost makes a naive covering infeasible. This project capitalizes on a breakthrough preliminary model produced by my team in 2013, with the pragmatic goal of focussing on the physics that will be measurable with GW detectors over the next five years.
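For the dimension count (standard, though not spelled out above): once the total mass is scaled out, a quasi-circular binary is characterised by the mass ratio and the two dimensionless spin vectors,

\[
\bigl(q,\ \vec{\chi}_1,\ \vec{\chi}_2\bigr)
= \Bigl(\frac{m_2}{m_1},\ \frac{\vec{S}_1}{m_1^{2}},\ \frac{\vec{S}_2}{m_2^{2}}\Bigr)
\in \mathbb{R}^{1+3+3},
\]

which is the seven-dimensional parameter space referred to above; spins misaligned with the orbital angular momentum are what drive precession.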
My team at Cardiff is uniquely placed to tackle this problem. Since 2005 I have been at the forefront of black-hole simulations and waveform modelling, and the Cardiff group is a world leader in analysis of GW detector data. This project will consolidate my team to make breakthroughs in strong-field gravity, astrophysics, fundamental physics and cosmology using GW observations.
Max ERC Funding
1 998 009 €
Duration
Start date: 2015-10-01, End date: 2020-09-30
Project acronym BSMFLEET
Project Challenging the Standard Model using an extended Physics program in LHCb
Researcher (PI) Diego Martinez Santos
Host Institution (HI) UNIVERSIDAD DE SANTIAGO DE COMPOSTELA
Call Details Starting Grant (StG), PE2, ERC-2014-STG
Summary We know that the Standard Model (SM) of Particle Physics is not the ultimate theory of Nature. It misses a quantum description of gravity, it does not offer any explanation for the composition of Dark Matter, and the matter-antimatter imbalance of the Universe is predicted to be significantly smaller than what we actually see. Those are fundamental questions that still need an answer. Alternative models to the SM exist, based on ideas such as SuperSymmetry or extra dimensions, and are currently being tested at the Large Hadron Collider (LHC) at CERN. But after the first run of the LHC the SM is as yet unbeaten at accelerators, which imposes severe constraints on Physics beyond the SM (BSM). From here, I see two directions for further work: on one side, we must increase the precision of previous measurements in order to access smaller BSM effects. On the other hand, we should attack the SM with a new fleet of observables sensitive to different BSM scenarios, and make sure that we are making full use of what the LHC offers us. I propose to create a team at Universidade de Santiago de Compostela that will expand the use of LHCb beyond its original design, while also reinforcing the core LHCb analyses in which I have played a leading role so far. LHCb has up to now collected world-leading samples of decays of b and c quarks. My proposal is to use LHCb to collect and analyse world-leading samples of rare s-quark decays as well, complementary to those of NA62. In rare s decays the SM sources of Flavour Violation are more strongly suppressed than anywhere else, and therefore those decays are excellent places to search for new Flavour-Violating sources that would otherwise be hidden behind the SM contributions. It is very important to do this now, since we may not have a similar opportunity for years. In addition, the team will also exploit LHCb to search for μμ resonances predicted in models like the NMSSM, for which LHCb also offers a unique potential that must be used.
Max ERC Funding
1 499 855 €
Duration
Start date: 2015-04-01, End date: 2020-03-31
Project acronym CatHet
Project New Catalytic Asymmetric Strategies for N-Heterocycle Synthesis
Researcher (PI) John Forwood Bower
Host Institution (HI) UNIVERSITY OF BRISTOL
Call Details Starting Grant (StG), PE5, ERC-2014-STG
Summary Medicinal chemistry requires more efficient and diverse methods for the asymmetric synthesis of chiral scaffolds. Over 60% of the world’s top selling small molecule drug compounds are chiral and, of these, approximately 80% are marketed as single enantiomers. There is a compelling correlation between drug candidate “chiral complexity” and the likelihood of progression to the marketplace. Surprisingly, and despite the tremendous advances made in catalysis over the past several decades, the “chiral complexity” of drug discovery libraries has actually decreased, while, at the same time, for the reasons mentioned above, the “chiral complexity” of marketed drugs has increased. Since the mid-1990s, there has been a notable acceleration of this “complexity divergence”. Consequently, there is now an urgent need to provide efficient processes that directly access privileged chiral scaffolds. It is our philosophy that catalysis holds the key here and new processes should be based upon platforms that can exert control over both absolute and relative stereochemistry. In this proposal we outline the development of a range of N-heteroannulation processes based upon the catalytic generation and trapping of unique or unusual classes of organometallic intermediate derived from transition metal insertion into C-C and C-N sigma-bonds. We will provide a variety of enabling methodologies and demonstrate applicability in flexible total syntheses of important natural product scaffolds. The processes proposed are synthetically flexible, operationally simple and amenable to asymmetric catalysis. Likely starting points, based upon preliminary results, will set the stage for the realisation of aspirational and transformative goals. Through the study of the organometallic intermediates involved here, there is potential to generalise these new catalytic manifolds, such that this research will transcend N-heterocyclic chemistry to provide enabling methods for organic chemistry as a whole.
Max ERC Funding
1 548 738 €
Duration
Start date: 2015-04-01, End date: 2020-03-31
Project acronym CC
Project Combinatorial Construction
Researcher (PI) Peter Keevash
Host Institution (HI) THE CHANCELLOR, MASTERS AND SCHOLARS OF THE UNIVERSITY OF OXFORD
Call Details Consolidator Grant (CoG), PE1, ERC-2014-CoG
Summary Combinatorial Construction is a mathematical challenge with many applications. Examples include the construction of networks that are very sparse but highly connected, or codes that can correct many transmission errors with little overhead in communication costs. For a general class of combinatorial objects, and some desirable property, the fundamental question in Combinatorial Construction is to demonstrate the existence of an object with the property, preferably via an explicit algorithmic construction. Thus it is ubiquitous in Computer Science, including applications to expanders, sorting networks, distributed communication, data storage, codes, cryptography and derandomisation. In popular culture it appears as the unsolved ‘lottery problem’ of determining the minimum number of tickets that guarantee a prize. In a recent preprint I prove the Existence Conjecture for combinatorial designs, via a new method of Randomised Algebraic Constructions; this result has already attracted considerable attention in the mathematical community. The significance is not only in the solution of a problem posed by Steiner in 1852, but also in the discovery of a powerful new method, that promises to have many further applications in Combinatorics, and more widely in Mathematics and Theoretical Computer Science. I am now poised to resolve many other problems of combinatorial construction.
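A toy illustration of the lottery problem mentioned above (our exhaustive search, at parameters small enough to be tractable; the problem remains open at realistic sizes): with 6 numbers, tickets and draws of size 3, and a prize for matching at least 2 numbers, the sketch below finds the minimum set of tickets guaranteeing a prize.

from itertools import combinations

def min_guaranteeing_tickets(n=6, k=3, match=2, max_size=5):
    """Smallest family of k-subsets of {1..n} (tickets) meeting every
    possible k-subset (draw) in at least `match` elements.
    Exhaustive search, feasible only for toy parameters."""
    subsets = [frozenset(c) for c in combinations(range(1, n + 1), k)]
    for size in range(1, max_size + 1):
        for candidate in combinations(subsets, size):
            if all(any(len(t & d) >= match for t in candidate)
                   for d in subsets):
                return [sorted(t) for t in candidate]
    return None

tickets = min_guaranteeing_tickets()
print(len(tickets), tickets)  # 2 tickets suffice here: {1,2,3} and {4,5,6}

That two tickets suffice follows from the pigeonhole principle: any 3 drawn numbers split between the blocks {1,2,3} and {4,5,6}, so one block contains at least 2 of them.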
Max ERC Funding
1 706 729 €
Duration
Start date: 2016-01-01, End date: 2020-12-31
Project acronym CNT-QUBIT
Project Carbon Nanotube Quantum Circuits
Researcher (PI) Mark Robertus Buitelaar
Host Institution (HI) UNIVERSITY COLLEGE LONDON
Call Details Consolidator Grant (CoG), PE3, ERC-2014-CoG
Summary The aim of this proposal is to use spin qubits defined in carbon nanotube quantum dots to demonstrate measurement-based entanglement in an all-electrical and scalable solid-state architecture. The project makes use of spin-orbit interaction to drive spin rotations in the carbon nanotube host system and hyperfine interaction to store quantum information in the nuclear spin states. The proposal builds on techniques developed by the principal investigator for fast and non-invasive read-out of the electron spin qubits using radio-frequency reflectometry and spin-to-charge conversion.
Any quantum computer requires entanglement. One route to achieve entanglement between electron spin qubits in quantum dots is to use the direct interaction of neighbouring qubits due to their electron wavefunction overlap. This approach, however, becomes rapidly impractical for any large-scale quantum processor, as distant qubits can only be entangled through the use of qubits in between. Here I propose an alternative strategy which makes use of an intriguing quantum mechanical effect by which two spatially separated spin qubits coupled to a single electrical resonator become entangled if a measurement cannot tell them apart.
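A textbook sketch of the effect being invoked (generic measurement-induced entanglement, abstracted away from the device physics): prepare both qubits in |+⟩ and perform a joint measurement that reveals only the total number of excitations, so that it cannot tell |01⟩ from |10⟩. Conditioned on the outcome "one excitation",

\[
|{+}\rangle|{+}\rangle = \tfrac{1}{2}\bigl(|00\rangle + |01\rangle + |10\rangle + |11\rangle\bigr)
\ \longrightarrow\
\tfrac{1}{\sqrt{2}}\bigl(|01\rangle + |10\rangle\bigr),
\]

a maximally entangled state, obtained with probability 1/2; this is why the heralding over several attempts described next is part of the scheme.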
The quantum information encoded in the entangled electron spin qubits will be transferred to carbon-13 nuclear spins which are used as a quantum memory with coherence times that exceed seconds. Entanglement with further qubits then proceeds again via projective measurements of the electron spin qubits without risk of losing the existing entanglement. When entanglement of the electron spin qubits is heralded – which might take several attempts – the quantum information is transferred again to the nuclear spin states. This allows for the coupling of large numbers of physically separated qubits, building up so-called graph or cluster states in an all-electrical and scalable solid-state architecture.
Max ERC Funding
1 998 574 €
Duration
Start date: 2015-09-01, End date: 2020-08-31