Project acronym 15CBOOKTRADE
Project The 15th-century Book Trade: An Evidence-based Assessment and Visualization of the Distribution, Sale, and Reception of Books in the Renaissance
Researcher (PI) Cristina Dondi
Host Institution (HI) THE CHANCELLOR, MASTERS AND SCHOLARS OF THE UNIVERSITY OF OXFORD
Call Details Consolidator Grant (CoG), SH6, ERC-2013-CoG
Summary The idea that underpins this project is to use the material evidence from thousands of surviving 15th-c. books, as well as unique documentary evidence — the unpublished ledger of a Venetian bookseller in the 1480s which records the sale of 25,000 printed books with their prices — to address four fundamental questions relating to the introduction of printing in the West which have so far eluded scholarship, partly because of lack of evidence, partly because of the lack of effective tools to deal with existing evidence. The book trade differs from other trades operating in the medieval and early modern periods in that the goods traded survive in considerable numbers. Not only do they survive, but many of them bear stratified evidence of their history in the form of marks of ownership, prices, manuscript annotations, binding and decoration styles. A British Academy pilot project conceived by the PI produced a now internationally-used database which gathers together this kind of evidence for thousands of surviving 15th-c. printed books. For the first time, this makes it possible to track the circulation of books, their trade routes and later collecting, across Europe and the USA, and throughout the centuries. The objectives of this project are to examine (1) the distribution and trade-routes, national and international, of 15th-c. printed books, along with the identity of the buyers and users (private, institutional, religious, lay, female, male, and by profession) and their reading practices; (2) the books' contemporary market value; (3) the transmission and dissemination of the texts they contain, their survival and their loss (rebalancing potentially skewed scholarship); and (4) the circulation and re-use of the illustrations they contain. Finally, the project will experiment with the application of scientific visualization techniques to represent, geographically and chronologically, the movement of 15th-c. printed books and of the texts they contain.
Max ERC Funding
1 999 172 €
Duration
Start date: 2014-04-01, End date: 2019-03-31
Project acronym 2DIR SPECTROMETER
Project A step-change in sensitivity for two dimensional laser infrared spectroscopy
Researcher (PI) Jasper VAN THOR
Host Institution (HI) IMPERIAL COLLEGE OF SCIENCE TECHNOLOGY AND MEDICINE
Call Details Proof of Concept (PoC), PC1, ERC-2013-PoC
Summary "Here, we propose a novel design for a significantly improved detector for the emerging field of coherent two-dimensional infrared (2DIR) spectroscopy, which is an optical analog of Nuclear Magnetic Resonance spectroscopy (NMR). 2DIR is a cutting-edge technique which is rapidly growing and has applications in subjects as diverse as energy sciences, biophysics, biomedical research and physical chemistry. Currently, the single most important technical problem that is generally agreed to limit applications of the methodology is the sensitivity with which the signals are measured. Having worked on multiple stabilisation techniques during the ERC-funded research, we realised that a straightforward design alteration of the infrared detector will improve the sensitivity very significantly, theoretically by more than one order of magnitude. Here, the technical principles are explained and a plan is presented for commercialising the instrument in collaboration with the current market leader, Infrared System Development Corp. (ISDC). We apply for funding to develop the prototype."
Max ERC Funding
149 999 €
Duration
Start date: 2013-11-01, End date: 2014-10-31
Project acronym 2DQP
Project Two-dimensional quantum photonics
Researcher (PI) Brian David GERARDOT
Host Institution (HI) HERIOT-WATT UNIVERSITY
Call Details Consolidator Grant (CoG), PE3, ERC-2016-COG
Summary Quantum optics, the study of how discrete packets of light (photons) and matter interact, has led to the development of remarkable new technologies which exploit the bizarre properties of quantum mechanics. These quantum technologies are primed to revolutionize the fields of communication, information processing, and metrology in the coming years. Similar to contemporary technologies, the future quantum machinery will likely consist of a semiconductor platform to create and process the quantum information. However, to date the demanding requirements on a quantum photonic platform have yet to be satisfied with conventional bulk (three-dimensional) semiconductors.
To surmount these well-known obstacles, a new paradigm in quantum photonics is required. Initiated by the recent discovery of single photon emitters in atomically flat (two-dimensional) semiconducting materials, 2DQP aims to be at the nucleus of a new approach by realizing quantum optics with ultra-stable (coherent) quantum states integrated into devices with electronic and photonic functionality. We will characterize, identify, engineer, and coherently manipulate localized quantum states in this two-dimensional quantum photonic platform. A vital component of 2DQP’s vision is to go beyond the fundamental science and achieve the ideal solid-state single photon device yielding perfect extraction (100% efficiency) of on-demand indistinguishable single photons. Finally, we will exploit this ideal device to implement the critical building block for a photonic quantum computer.
Max ERC Funding
1 999 135 €
Duration
Start date: 2018-01-01, End date: 2022-12-31
Project acronym 321
Project from Cubic To Linear complexity in computational electromagnetics
Researcher (PI) Francesco Paolo ANDRIULLI
Host Institution (HI) POLITECNICO DI TORINO
Call Details Consolidator Grant (CoG), PE7, ERC-2016-COG
Summary Computational Electromagnetics (CEM) is the scientific field at the origin of all new modeling and simulation tools required by the constantly arising design challenges of emerging and future technologies in applied electromagnetics. As in many other technological fields, however, the trend in all emerging technologies in electromagnetic engineering is going towards miniaturized, higher density and multi-scale scenarios. Computationally speaking, this translates into a steep increase in the number of degrees of freedom. Given that the design cost (the cost of a multi-right-hand-side problem dominated by matrix inversion) can scale as badly as cubically with these degrees of freedom, as many have pointed out, this will severely compromise the practical impact of CEM on future and emerging technologies.
For this reason, the CEM scientific community has for years been searching for an FFT-like paradigm shift: a dynamic fast direct solver providing a design cost that would scale only linearly with the degrees of freedom. Such a fast solver is considered today a Holy Grail of the discipline.
The Grand Challenge of 321 will be to tackle this Holy Grail in Computational Electromagnetics by investigating a dynamic Fast Direct Solver for Maxwell Problems that would run in a linear-instead-of-cubic complexity for an arbitrary number and configuration of degrees of freedom.
The failure of all previous attempts will be overcome by a game-changing transformation of the classical CEM problem that leverages a recent breakthrough by the PI. Starting from this, the project will investigate an entirely new paradigm of algorithms to achieve this grand challenge.
The FFT’s quadratic-to-linear paradigm shift showed how reductions in computational complexity can be groundbreaking for applications. The cubic-to-linear paradigm shift at which the 321 project aims will have a comparably disruptive impact on electromagnetic science and technology.
Max ERC Funding
2 000 000 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym A-BINGOS
Project Accreting binary populations in Nearby Galaxies: Observations and Simulations
Researcher (PI) Andreas Zezas
Host Institution (HI) IDRYMA TECHNOLOGIAS KAI EREVNAS
Call Details Consolidator Grant (CoG), PE9, ERC-2013-CoG
Summary "High-energy observations of our Galaxy offer a good, albeit not complete, picture of the X-ray source populations, in particular the accreting binary sources. Recent ability to study accreting binaries in nearby galaxies has shown that we would be short-sighted if we restricted ourselves to our Galaxy or to a few nearby ones. I propose an ambitious project that involves a comprehensive study of all the galaxies within 10 Mpc for which we can study in detail their X-ray sources and stellar populations. The study will combine data from a unique suite of observatories (Chandra, XMM-Newton, HST, Spitzer) with state-of-the-art theoretical modelling of binary systems. I propose a novel approach that links the accreting binary populations to their parent stellar populations and surpasses any current studies of X-ray binary populations, both in scale and in scope, by: (a) combining methods and results from several different areas of astrophysics (compact objects, binary systems, stellar populations, galaxy evolution); (b) using data from almost the whole electromagnetic spectrum (infrared to X-ray bands); (c) identifying and studying the different sub-populations of accreting binaries; and (d) performing direct comparison between observations and theoretical predictions, over a broad parameter space. The project: (a) will answer the long-standing question of the formation efficiency of accreting binaries in different environments; and (b) will constrain their evolutionary paths. As by-products the project will provide eagerly awaited input to the fields of gravitational-wave sources, γ-ray bursts, and X-ray emitting galaxies at cosmological distances and it will produce a heritage multi-wavelength dataset and library of models for future studies of galaxies and accreting binaries."
Max ERC Funding
1 242 000 €
Duration
Start date: 2014-04-01, End date: 2019-03-31
Project acronym AAREA
Project The Archaeology of Agricultural Resilience in Eastern Africa
Researcher (PI) Daryl Stump
Host Institution (HI) UNIVERSITY OF YORK
Call Details Starting Grant (StG), SH6, ERC-2013-StG
Summary "The twin concepts of sustainability and conservation that are so pivotal within current debates regarding economic development and biodiversity protection both contain an inherent temporal dimension, since both refer to the need to balance short-term gains with long-term resource maintenance. Proponents of resilience theory and of development based on ‘indigenous knowledge’ have thus argued for the necessity of including archaeological, historical and palaeoenvironmental components within development project design. Indeed, some have argued that archaeology should lead these interdisciplinary projects on the grounds that it provides the necessary time depth and bridges the social and natural sciences. The project proposed here accepts this logic and endorses this renewed contemporary relevance of archaeological research. However, it also needs to be admitted that moving beyond critiques of the misuse of historical data presents significant hurdles. When presenting results outside the discipline, for example, archaeological projects tend to downplay the poor archaeological visibility of certain agricultural practices, and computer models designed to test sustainability struggle to adequately account for local cultural preferences. This field will therefore not progress unless there is a frank appraisal of archaeology’s strengths and weaknesses. This project will provide this assessment by employing a range of established and groundbreaking archaeological and modelling techniques to examine the development of two East African agricultural systems: one at the abandoned site of Engaruka in Tanzania, commonly seen as an example of resource mismanagement and ecological collapse; and another at the current agricultural landscape in Konso, Ethiopia, described by the UN FAO as one of a select few African “lessons from the past”.
The project thus aims to assess the sustainability of these systems, but will also assess the role archaeology can play in such debates worldwide."
Max ERC Funding
1 196 701 €
Duration
Start date: 2014-02-01, End date: 2018-01-31
Project acronym AB-SWITCH
Project Evaluation of commercial potential of a low-cost kit based on DNA-nanoswitches for the single-step measurement of diagnostic antibodies
Researcher (PI) Francesco RICCI
Host Institution (HI) UNIVERSITA DEGLI STUDI DI ROMA TOR VERGATA
Call Details Proof of Concept (PoC), ERC-2016-PoC
Summary "Antibodies are among the most widely monitored class of diagnostic biomarkers. The immunoassay market now accounts for about one third of the global in-vitro diagnostics market (about $50 billion). However, current methods for the detection of diagnostic antibodies are either qualitative or require cumbersome, resource-intensive laboratory procedures that need hours to provide clinicians with diagnostic information. A new method for fast and low-cost detection of antibodies will have a strong economic impact on the market of in-vitro diagnostics and immunoassays.
During our ERC Starting Grant project ""Nature Nanodevices"" we have developed a novel diagnostic technology for the detection of clinically relevant antibodies in serum and other body fluids. The platform (here named Ab-switch) supports the fluorescent detection of diagnostic antibodies (for example, HIV diagnostic antibodies) in a rapid (<3 minutes), single-step and low-cost fashion.
The goal of this Proof of Concept project is to bring our promising platform to the diagnostics market and exploit its innovative features for commercial purposes. We will focus our initial efforts on the development of rapid kits for the detection of antibodies diagnostic of HIV. We will 1) Fully characterize the Ab-switch product in terms of analytical performance (i.e. sensitivity, specificity, stability etc.) with direct comparison against other commercial kits; 2) Prepare a Manufacturing Plan for producing/testing the Ab-switch; 3) Establish an IP strategy for patent filing and maintenance; 4) Develop a business and commercialization plan."
Max ERC Funding
150 000 €
Duration
Start date: 2017-02-01, End date: 2018-07-31
Project acronym ABEL
Project "Alpha-helical Barrels: Exploring, Understanding and Exploiting a New Class of Protein Structure"
Researcher (PI) Derek Neil Woolfson
Host Institution (HI) UNIVERSITY OF BRISTOL
Call Details Advanced Grant (AdG), LS9, ERC-2013-ADG
Summary "Recently, through de novo peptide design, we have discovered and presented a new protein structure. This is an all-parallel, 6-helix bundle with a continuous central channel of 0.5 – 0.6 nm diameter. We posit that this is one of a broader class of protein structures that we call the alpha-helical barrels. Here, in three Work Packages, we propose to explore these structures and to develop protein functions within them. First, through a combination of computer-aided design, peptide synthesis and thorough biophysical characterization, we will examine the extents and limits of the alpha-helical-barrel structures. Whilst this is curiosity-driven research, it also has practical consequences for the studies that will follow; that is, alpha-helical barrels made from increasing numbers of helices have channels or pores that increase in size in a predictable way. Second, we will use rational and empirical design approaches to engineer a range of functions within these cavities, including binding capabilities and enzyme-like activities. Finally, and taking the programme into another ambitious area, we will use the alpha-helical barrels to template other folds that are otherwise difficult to design and engineer, notably beta-barrels that insert into membranes to render ion-channel and sensor functions."
Max ERC Funding
2 467 844 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym ACOUSEQ
Project Acoustics for Next Generation Sequencing
Researcher (PI) Jonathan Mark Cooper
Host Institution (HI) UNIVERSITY OF GLASGOW
Call Details Proof of Concept (PoC), PC1, ERC-2016-PoC
Summary Since completion of the first human genome sequence, the demand for cheaper and faster sequencing methods has increased enormously. This need has driven the development of second-generation sequencing methods, or next-generation sequencing (also known as NGS or high-throughput sequencing). The creation of these platforms has made sequencing accessible to more laboratories, rapidly increasing the volume of research, including clinical diagnostics and its use in directing treatment (precision medicine). The applications of NGS are also allowing rapid advances in clinically related fields such as public health and epidemiology. Such developments illustrate why sequencing is now the fastest-growing area in genomics (+23% p.a.). The activity is said to be worth $2.5B this year, and poised to reach ~$9B by 2020. In any workflow, prior to the sequencing reactions, a number of pre-sequencing steps are required, including the fragmentation of the DNA into smaller sizes for processing, size selection, library preparation and target enrichment. This proposal is specifically concerned with this latter area, namely DNA fragmentation – now widely acknowledged across the industry as being the most important technological bottleneck in the pre-sequencing workflow. Our new method for DNA fragmentation – based on surface acoustic waves – will enable sample preparation from lower sample volumes using lower powers. It also has the potential to allow the seamless integration of fragmentation into sequencing instrumentation, opening up the possibility of “sample to answer” diagnostics. In the near term this will enable the implementation of sample preparation pre-sequencing steps within the NGS instruments. In the longer term, our techniques will also enable us to develop methods for field-based DNA sequencing – as may be required for determining “microbial resistance” and informing the treatment of infectious disease in the face of the emergence of drug resistance.
Max ERC Funding
149 995 €
Duration
Start date: 2017-05-01, End date: 2018-10-31
Project acronym ACRCC
Project Understanding the atmospheric circulation response to climate change
Researcher (PI) Theodore Shepherd
Host Institution (HI) THE UNIVERSITY OF READING
Call Details Advanced Grant (AdG), PE10, ERC-2013-ADG
Summary Computer models based on known physical laws are our primary tool for predicting climate change. Yet state-of-the-art models exhibit a disturbingly wide range of predictions of future climate change, especially when examined at the regional scale, and this range has not narrowed as the models have become more comprehensive. The reasons for this are not understood. This represents a basic challenge to our fundamental understanding of climate.
The divergence of model projections is presumably related to systematic model errors in the large-scale fluxes of heat, moisture and momentum that control regional aspects of climate. That these errors stubbornly persist in spite of increases in the spatial resolution of the models suggests that they are associated with errors in the representation of unresolved processes, whose effects must be parameterised.
Most attention in climate science has hitherto focused on the thermodynamic aspects of climate. Dynamical aspects, which involve the atmospheric circulation, have received much less attention. However, regional climate, including persistent climate regimes and extremes, is strongly controlled by atmospheric circulation patterns, which exhibit chaotic variability and whose representation in climate models depends sensitively on parameterised processes. Moreover, the dynamical aspects of model projections are much less robust than the thermodynamic ones. There are good reasons to believe that model bias, the divergence of model projections, and chaotic variability are somehow related, although the relationships are not well understood. This calls for studying them together.
My proposed research will focus on this problem, addressing these three aspects of the atmospheric circulation response to climate change in parallel: (i) diagnosing the sources of model error; (ii) elucidating the relationship between model error and the spread in model projections; (iii) understanding the physical mechanisms of atmospheric variability.
Max ERC Funding
2 489 151 €
Duration
Start date: 2014-03-01, End date: 2020-02-29