Project acronym 100 Archaic Genomes
Project Genome sequences from extinct hominins
Researcher (PI) Svante PÄÄBO
Host Institution (HI) MAX-PLANCK-GESELLSCHAFT ZUR FORDERUNG DER WISSENSCHAFTEN EV
Call Details Advanced Grant (AdG), LS2, ERC-2015-AdG
Summary Neandertals and Denisovans, an Asian group distantly related to Neandertals, are the closest evolutionary relatives of present-day humans. They are thus of direct relevance for understanding the origin of modern humans and how modern humans differ from their closest relatives. We will generate genome-wide data from a large number of Neandertal and Denisovan individuals from across their geographical and temporal range, as well as from other extinct hominin groups that we may discover. This will be made possible by automating the highly sensitive approaches to ancient DNA extraction and DNA library construction that we have developed, so that they can be applied to many specimens from many sites in order to identify those that contain retrievable DNA. Whenever possible we will sequence whole genomes, and in other cases use DNA capture methods to generate high-quality data from representative parts of the genome. This will allow us to study the population history of Neandertals and Denisovans, elucidate how many times and where these extinct hominins contributed genes to present-day people, and determine the extent to which modern humans and archaic groups contributed genetically to Neandertals and Denisovans. By retrieving DNA from specimens that go back to the Middle Pleistocene we will furthermore shed light on the early history and origins of Neandertals and Denisovans.
Max ERC Funding
2 350 000 €
Duration
Start date: 2016-11-01, End date: 2021-10-31
Project acronym 14Constraint
Project Radiocarbon constraints for models of C cycling in terrestrial ecosystems: from process understanding to global benchmarking
Researcher (PI) Susan Trumbore
Host Institution (HI) MAX-PLANCK-GESELLSCHAFT ZUR FORDERUNG DER WISSENSCHAFTEN EV
Call Details Advanced Grant (AdG), PE10, ERC-2015-AdG
Summary The overall goal of 14Constraint is to enhance the availability and use of radiocarbon data as constraints for process-based understanding of the age distribution of carbon in, and respired by, soils and ecosystems. Carbon enters ecosystems by a single process, photosynthesis. It returns by a range of processes that depend on plant allocation and turnover, the efficiency and rate of litter decomposition, and the mechanisms stabilizing C in soils. Thus the age distribution of respired CO2 and the age of C residing in plants, litter and soils are diagnostic properties of ecosystems that provide key constraints for testing carbon cycle models. Radiocarbon, especially the transit of ‘bomb’ 14C created in the 1960s, is a powerful tool for tracing C exchange on decadal to centennial timescales. 14Constraint will assemble a global database of existing radiocarbon data (WP1) and demonstrate how they can constrain and test ecosystem carbon cycle models. WP2 will fill data gaps and add new data from sites in key biomes that have ancillary data sufficient to construct belowground C and 14C budgets. These detailed investigations will focus on the role of time lags caused by necromass and fine roots, as well as the dynamics of deep soil C. Spatial extrapolation beyond the WP2 sites will require sampling along global gradients designed to explore the relative roles of mineralogy, vegetation and climate on the age of C in, and respired from, soil (WP3). Products of 14Constraint will include the first publicly available global synthesis of terrestrial 14C data, and will add over 5000 new measurements. This project is urgently needed before atmospheric 14C levels decline to below 1950 levels, as expected in the next decade.
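For readers unfamiliar with the radiocarbon constraint the summary relies on, the link between measured 14C content and carbon age follows from simple exponential decay. A minimal sketch (not part of the proposal; the function name and the use of the conventional Libby mean life of 8033 years are editorial choices) of converting fraction modern (F14C) to a conventional radiocarbon age:

```python
import math

# Conventional radiocarbon dating uses the Libby mean life (8033 years),
# i.e. half-life / ln(2) with the historical half-life of 5568 years.
LIBBY_MEAN_LIFE = 8033.0  # years

def radiocarbon_age(f14c: float) -> float:
    """Conventional 14C age (years BP) from fraction modern F14C.

    F14C > 1 (possible for samples containing 'bomb' 14C from 1960s
    atmospheric testing, as mentioned in the summary) yields a negative
    age, i.e. carbon fixed after 1950.
    """
    if f14c <= 0:
        raise ValueError("fraction modern must be positive")
    return -LIBBY_MEAN_LIFE * math.log(f14c)

# A sample with half the modern 14C content is about one Libby
# half-life (~5568 years) old:
age = radiocarbon_age(0.5)
```

This is why the last sentence of the summary matters: once fossil-fuel dilution pushes atmospheric F14C below 1, the distinctive bomb-spike signal that makes decadal tracing possible is lost.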
Max ERC Funding
2 283 747 €
Duration
Start date: 2016-12-01, End date: 2021-11-30
Project acronym 15CBOOKTRADE
Project The 15th-century Book Trade: An Evidence-based Assessment and Visualization of the Distribution, Sale, and Reception of Books in the Renaissance
Researcher (PI) Cristina Dondi
Host Institution (HI) THE CHANCELLOR, MASTERS AND SCHOLARS OF THE UNIVERSITY OF OXFORD
Call Details Consolidator Grant (CoG), SH6, ERC-2013-CoG
Summary The idea that underpins this project is to use the material evidence from thousands of surviving 15th-c. books, as well as unique documentary evidence — the unpublished ledger of a Venetian bookseller in the 1480s which records the sale of 25,000 printed books with their prices — to address four fundamental questions relating to the introduction of printing in the West which have so far eluded scholarship, partly because of a lack of evidence, partly because of a lack of effective tools to deal with existing evidence. The book trade differs from other trades operating in the medieval and early modern periods in that the goods traded survive in considerable numbers. Not only do they survive, but many of them bear stratified evidence of their history in the form of marks of ownership, prices, manuscript annotations, binding and decoration styles. A British Academy pilot project conceived by the PI produced a now internationally used database which gathers together this kind of evidence for thousands of surviving 15th-c. printed books. For the first time, this makes it possible to track the circulation of books, their trade routes and later collecting, across Europe and the USA, and throughout the centuries. The objectives of this project are to examine (1) the distribution and trade routes, national and international, of 15th-c. printed books, along with the identity of the buyers and users (private, institutional, religious, lay, female, male, and by profession) and their reading practices; (2) the books' contemporary market value; (3) the transmission and dissemination of the texts they contain, their survival and their loss (rebalancing potentially skewed scholarship); and (4) the circulation and re-use of the illustrations they contain. Finally, the project will experiment with the application of scientific visualization techniques to represent, geographically and chronologically, the movement of 15th-c. printed books and of the texts they contain.
Max ERC Funding
1 999 172 €
Duration
Start date: 2014-04-01, End date: 2019-03-31
Project acronym 19TH-CENTURY_EUCLID
Project Nineteenth-Century Euclid: Geometry and the Literary Imagination from Wordsworth to Wells
Researcher (PI) Alice Jenkins
Host Institution (HI) UNIVERSITY OF GLASGOW
Call Details Starting Grant (StG), SH4, ERC-2007-StG
Summary This radically interdisciplinary project aims to bring a substantially new field of research – literature and mathematics studies – to prominence as a tool for investigating the culture of nineteenth-century Britain. It will result in three kinds of outcome: a monograph, two interdisciplinary and international colloquia, and a collection of essays. The project focuses on Euclidean geometry as a key element of nineteenth-century literary and scientific culture, showing that it was part of the shared knowledge flowing through elite and popular Romantic and Victorian writing, and figuring notably in the work of very many of the century’s best-known writers. Despite its traditional cultural prestige and educational centrality, geometry has been almost wholly neglected by literary history. This project shows how literature and mathematics studies can draw a new map of nineteenth-century British culture, revitalising our understanding of the Romantic and Victorian imagination through its writing about geometry.
Max ERC Funding
323 118 €
Duration
Start date: 2009-01-01, End date: 2011-10-31
Project acronym 1D-Engine
Project 1D-electrons coupled to dissipation: a novel approach for understanding and engineering superconducting materials and devices
Researcher (PI) Adrian KANTIAN
Host Institution (HI) UPPSALA UNIVERSITET
Call Details Starting Grant (StG), PE3, ERC-2017-STG
Summary Correlated electrons are at the forefront of condensed matter theory. Interacting quasi-1D electrons have seen vast progress in analytical and numerical theory, and thus in fundamental understanding and quantitative prediction. Yet, in the 1D limit fluctuations preclude important technological use, particularly of superconductors. In contrast, high-Tc superconductors in 2D/3D are not precluded by fluctuations, but lack a fundamental theory, making prediction and engineering of their properties, a major goal in physics, very difficult. This project aims to combine the advantages of both areas by making major progress in the theory of quasi-1D electrons coupled to an electron bath, in part building on recent breakthroughs (with the PI's extensive involvement) in simulating 1D and 2D electrons with parallelized density matrix renormalization group (pDMRG) numerics. Such theory will fundamentally advance the study of open electron systems, and show how to use 1D materials as elements of new superconducting (SC) devices and materials: 1) It will enable a new state of matter, 1D electrons with true SC order. Fluctuations from the electronic liquid, such as graphene, could also enable nanoscale wires to appear SC at high temperatures. 2) A new approach for the deliberate engineering of a high-Tc superconductor. In 1D, how electrons pair by repulsive interactions is understood and can be predicted. Stabilization by a reservoir - formed by a parallel array of many such 1D systems - offers a superconductor for which all factors setting Tc are known and can be optimized. 3) Many existing superconductors with repulsive electron pairing, all presently not understood, can be cast as 1D electrons coupled to a bath. Developing chain-DMFT theory based on pDMRG will allow these materials' SC properties to be simulated and understood for the first time.
4) The insights gained will be translated to 2D superconductors to study how they could be enhanced by contact with electronic liquids.
Max ERC Funding
1 491 013 €
Duration
Start date: 2018-10-01, End date: 2023-09-30
Project acronym 1st-principles-discs
Project A First Principles Approach to Accretion Discs
Researcher (PI) Martin Elias Pessah
Host Institution (HI) KOBENHAVNS UNIVERSITET
Call Details Starting Grant (StG), PE9, ERC-2012-StG_20111012
Summary Most celestial bodies, from planets to stars to black holes, gain mass during their lives by means of an accretion disc. Understanding the physical processes that determine the rate at which matter accretes and energy is radiated in these discs is vital for unraveling the formation, evolution, and fate of almost every type of object in the Universe. Despite the fact that magnetic fields have been known to be crucial in accretion discs since the early 1990s, the majority of astrophysical questions that depend on the details of how disc accretion proceeds are still being addressed using the “standard” accretion disc model (developed in the early 1970s), where magnetic fields do not play an explicit role. This has prevented us from fully exploring the astrophysical consequences and observational signatures of realistic accretion disc models, leading to a profound disconnect between observations (usually interpreted with the standard paradigm) and modern accretion disc theory and numerical simulations (where magnetic turbulence is crucial). The goal of this proposal is to use several complementary approaches in order to finally move beyond the standard paradigm. This program has two main objectives: 1) Develop the theoretical framework to incorporate magnetic fields, and the ensuing turbulence, into self-consistent accretion disc models, and investigate their observational implications. 2) Investigate transport and radiative processes in collision-less disc regions, where non-thermal radiation originates, by employing a kinetic particle description of the plasma. In order to achieve these goals, we will use, and build upon, state-of-the-art magnetohydrodynamic and particle-in-cell codes in conjunction with theoretical modeling.
This framework will make it possible to address fundamental questions on stellar and planet formation, binary systems with a compact object, and supermassive black hole feedback in a way that has no counterpart within the standard paradigm.
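For context on the "standard" model the summary refers to: it is the Shakura–Sunyaev alpha-disc, which sidesteps explicit magnetic fields by packing all the unknown turbulent transport into a single dimensionless parameter. A minimal statement of that prescription (standard textbook material, not taken from the proposal):

```latex
% Alpha-prescription: the effective (turbulent) viscosity is written as
\nu = \alpha \, c_s H , \qquad 0 < \alpha \lesssim 1 ,
% where c_s is the local sound speed, H the disc scale height, and
% \alpha a free parameter absorbing the unmodelled turbulent stress.
% For a steady, geometrically thin disc far from its inner edge, the
% accretion rate then follows from the surface density \Sigma as
\dot{M} = 3 \pi \nu \Sigma .
```

The project's point is precisely that $\alpha$ hides the magnetically driven turbulence, so replacing this closure with self-consistent magnetized models is what "moving beyond the standard paradigm" means.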
Max ERC Funding
1 793 697 €
Duration
Start date: 2013-02-01, End date: 2018-01-31
Project acronym 1stProposal
Project An alternative development of analytic number theory and applications
Researcher (PI) ANDREW Granville
Host Institution (HI) UNIVERSITY COLLEGE LONDON
Call Details Advanced Grant (AdG), PE1, ERC-2014-ADG
Summary The traditional (Riemann) approach to analytic number theory uses the zeros of zeta functions. This requires the associated multiplicative function, say f(n), to have special enough properties that the associated Dirichlet series may be analytically continued. In this proposal we continue to develop an approach which requires less of the multiplicative function, linking the original question with the mean value of f. Such techniques have been around for a long time but have generally been regarded as “ad hoc”. In this project we aim to show that one can develop a coherent approach to the whole subject, not only reproving all of the old results, but also many new ones that appear inaccessible to traditional methods.
Our first goal is to complete a monograph yielding a reworking of all the classical theory using these new methods and then to push forward in new directions. The most important is to extend these techniques to GL(n) L-functions, which we hope will now be feasible having found the correct framework in which to proceed. Since we rarely know how to analytically continue such L-functions this could be of great benefit to the subject.
We are developing the large sieve so that it can be used for individual moduli, and will determine a strong form of it. We will also develop a new method to give asymptotics for mean values when they are not too small.
We wish to incorporate techniques of analytic number theory into our theory, for example recent advances on mean values of Dirichlet polynomials. The recent breakthroughs in sieve theory also suggest strong links that need further exploration.
Additive combinatorics yields important results in many areas. There are strong analogies between its results, and those for multiplicative functions, especially in large value spectrum theory, and its applications. We hope to develop these further.
Much of this is joint work with K Soundararajan of Stanford University.
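To make the central object concrete for non-specialists: the "mean value of f" that this approach links to is the average of a multiplicative function over the integers up to x. A small illustrative sketch (editorial, not from the proposal; function names are my own) using the Möbius function, whose mean value tending to 0 is equivalent to the prime number theorem:

```python
def mobius_sieve(n: int) -> list:
    """Return a list mu with mu[k] = Möbius function of k, for 1 <= k <= n,
    via a linear sieve over smallest prime factors."""
    mu = [1] * (n + 1)
    is_composite = [False] * (n + 1)
    primes = []
    for i in range(2, n + 1):
        if not is_composite[i]:
            primes.append(i)
            mu[i] = -1  # a prime has exactly one prime factor
        for p in primes:
            if i * p > n:
                break
            is_composite[i * p] = True
            if i % p == 0:
                mu[i * p] = 0  # p^2 divides i*p, so mu vanishes
                break
            mu[i * p] = -mu[i]  # one extra distinct prime factor flips the sign

    return mu

def mean_value(n: int) -> float:
    """Average of mu over 1..n, i.e. M(n)/n with M the Mertens function."""
    mu = mobius_sieve(n)
    return sum(mu[1:]) / n
```

Techniques in the proposal aim to extract arithmetic information directly from such averages, rather than routing everything through the analytic continuation of the associated Dirichlet series.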
Max ERC Funding
2 011 742 €
Duration
Start date: 2015-08-01, End date: 2020-07-31
Project acronym 2-3-AUT
Project Surfaces, 3-manifolds and automorphism groups
Researcher (PI) Nathalie Wahl
Host Institution (HI) KOBENHAVNS UNIVERSITET
Call Details Starting Grant (StG), PE1, ERC-2009-StG
Summary The scientific goal of the proposal is to answer central questions related to diffeomorphism groups of manifolds of dimension 2 and 3, and to their deformation invariant analogs, the mapping class groups. While the classification of surfaces has been known for more than a century, their automorphism groups have yet to be fully understood. Even less is known about diffeomorphisms of 3-manifolds despite much interest, and the objects here have only been classified recently, by the breakthrough work of Perelman on the Poincaré and geometrization conjectures. In dimension 2, I will focus on the relationship between mapping class groups and topological conformal field theories, with applications to Hochschild homology. In dimension 3, I propose to compute the stable homology of classifying spaces of diffeomorphism groups and mapping class groups, as well as study the homotopy type of the space of diffeomorphisms. I propose moreover to establish homological stability theorems in the wider context of automorphism groups and more general families of groups. The project combines breakthrough methods from homotopy theory with methods from differential and geometric topology. The research team, which I will lead, will consist of 3 PhD students and 4 postdocs.
Max ERC Funding
724 992 €
Duration
Start date: 2009-11-01, End date: 2014-10-31
Project acronym 2D4QT
Project 2D Materials for Quantum Technology
Researcher (PI) Christoph STAMPFER
Host Institution (HI) RHEINISCH-WESTFAELISCHE TECHNISCHE HOCHSCHULE AACHEN
Call Details Consolidator Grant (CoG), PE3, ERC-2018-COG
Summary Since its discovery, graphene has been indicated as a promising platform for quantum technologies (QT). The number of theoretical proposals dedicated to this vision has grown steadily, exploring a wide range of directions, from spin and valley qubits to topologically protected states. The experimental confirmation of these ideas has so far lagged significantly behind, mostly because of material quality problems. The quality of graphene-based devices has however improved dramatically in the past five years, thanks to the advent of so-called van der Waals (vdW) heterostructures - artificial solids formed by mechanically stacking layers of different two-dimensional (2D) materials, such as graphene, hexagonal boron nitride and transition metal dichalcogenides. These advances now finally open the door to putting several of those theoretical proposals to the test.
The goal of this project is to assess experimentally the potential of graphene-based heterostructures for QT applications. Specifically, I will push the development of an advanced technological platform for vdW heterostructures, which will make it possible to give quantitative answers to the following open questions: i) what are the relaxation and coherence times of spin and valley qubits in isotopically purified bilayer graphene (BLG); ii) what is the efficiency of a Cooper-pair splitter based on BLG; and iii) what are the characteristic energy scales of topologically protected quantum states engineered in graphene-based heterostructures.
By the end of this project, I aim to be in a position to say whether graphene is the horse worth betting on that theory predicts, or whether it still hides surprises in terms of fundamental physics. The technological advancements developed in this project for integrating nanostructured layers into vdW heterostructures will reach even beyond this goal, opening the door to new research directions and possible applications.
Max ERC Funding
1 806 250 €
Duration
Start date: 2019-09-01, End date: 2024-08-31
Project acronym 2DHIBSA
Project Nanoscopic and Hierarchical Materials via Living Crystallization-Driven Self-Assembly
Researcher (PI) Ian MANNERS
Host Institution (HI) UNIVERSITY OF BRISTOL
Call Details Advanced Grant (AdG), PE5, ERC-2017-ADG
Summary A key synthetic challenge of widespread interest in chemical science involves the creation of well-defined 2D functional materials that exist on a length scale of nanometers to microns. In this ambitious 5-year proposal we aim to tackle this issue by exploiting the unique opportunities made possible by recent developments with the living crystallization-driven self-assembly (CDSA) platform. Using this solution-processing approach, amphiphilic block copolymers (BCPs) with crystallizable blocks, related amphiphiles, and polymers with charged end groups will be used to predictably construct monodisperse samples of tailored, functional soft matter-based 2D nanostructures with controlled shape, size, and spatially defined chemistries. Many of the resulting nanostructures will also offer unprecedented opportunities as precursors to materials with hierarchical structures through further solution-based “bottom-up” assembly methods. In addition to fundamental studies, the proposed work also aims to make an important impact in the cutting-edge fields of liquid crystals, interface stabilization, catalysis, supramolecular polymers, and hierarchical materials.
Max ERC Funding
2 499 597 €
Duration
Start date: 2018-05-01, End date: 2023-04-30