Project acronym 2DNanoSpec
Project Nanoscale Vibrational Spectroscopy of Sensitive 2D Molecular Materials
Researcher (PI) Renato ZENOBI
Host Institution (HI) EIDGENOESSISCHE TECHNISCHE HOCHSCHULE ZUERICH
Call Details Advanced Grant (AdG), PE4, ERC-2016-ADG
Summary I propose to investigate the nanometer-scale organization of delicate 2-dimensional molecular materials using nanoscale vibrational spectroscopy. 2D structures are of great scientific and technological importance, for example as novel materials (graphene, MoS2, WS2, etc.), and in the form of biological membranes and synthetic 2D polymers. Powerful methods for their analysis and imaging with molecular selectivity and sufficient spatial resolution, however, are lacking. Tip-enhanced Raman spectroscopy (TERS) allows label-free spectroscopic identification of molecular species, with ≈10 nm spatial resolution, and with single-molecule sensitivity for strong Raman scatterers. So far, however, TERS has not been carried out in liquids, which is the natural environment for membranes, and its application to poor Raman scatterers such as components of 2D polymers, lipids, or other membrane compounds (proteins, sugars) is difficult. TERS has the potential to overcome the restrictions of other optical/spectroscopic methods for studying 2D materials, namely (i) the insufficient spatial resolution of diffraction-limited optical methods; (ii) the need for labelling in all methods relying on fluorescence; and (iii) the inability of some methods to work in liquids. I propose to address a number of scientific questions associated with the spatial organization and the occurrence of defects in sensitive 2D molecular materials. The success of these studies will also rely critically on technical innovations in TERS, notably addressing the problem of energy dissipation. This will for the first time allow its application to the study of complex, delicate 2D molecular systems without photochemical damage.
Max ERC Funding
2 311 696 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym 3DIMAGE
Project 3D Imaging Across Lengthscales: From Atoms to Grains
Researcher (PI) Paul Anthony Midgley
Host Institution (HI) THE CHANCELLOR MASTERS AND SCHOLARS OF THE UNIVERSITY OF CAMBRIDGE
Call Details Advanced Grant (AdG), PE4, ERC-2011-ADG_20110209
Summary "Understanding structure-property relationships across lengthscales is key to the design of functional and structural materials and devices. Moreover, the complexity of modern devices extends to three dimensions and as such 3D characterization is required across those lengthscales to provide a complete understanding and enable improvement in the material’s physical and chemical behaviour. 3D imaging and analysis from the atomic scale through to granular microstructure is proposed through the development of electron tomography using (S)TEM, and ‘dual beam’ SEM-FIB, techniques offering complementary approaches to 3D imaging across lengthscales stretching over 5 orders of magnitude.
We propose to extend tomography to include novel methods to determine atom positions in 3D with approaches incorporating new reconstruction algorithms, image processing and complementary nano-diffraction techniques. At the nanoscale, true 3D nano-metrology of morphology and composition is a key objective of the project, minimizing reconstruction and visualization artefacts. Mapping strain and optical properties in 3D are ambitious and exciting challenges that will yield new information at the nanoscale. Using the SEM-FIB, 3D ‘mesoscale’ structures will be revealed: morphology, crystallography and composition can be mapped simultaneously, with ~5 nm resolution and over volumes too large to tackle by (S)TEM and too small for most X-ray techniques. In parallel, we will apply 3D imaging to a wide variety of key materials including heterogeneous catalysts, aerospace alloys, biomaterials, photovoltaic materials, and novel semiconductors.
We will collaborate with many departments in Cambridge and institutes worldwide. The personnel on the proposal will cover all aspects of the tomography proposed using high-end TEMs, including an aberration-corrected Titan, and a Helios dual beam. Importantly, a postdoc is dedicated to developing new algorithms for reconstruction, image and spectral processing."
Max ERC Funding
2 337 330 €
Duration
Start date: 2012-01-01, End date: 2017-12-31
Project acronym A-HERO
Project Anthelmintic Research and Optimization
Researcher (PI) Jennifer Irene Keiser
Host Institution (HI) SCHWEIZERISCHES TROPEN- UND PUBLIC HEALTH-INSTITUT
Call Details Consolidator Grant (CoG), LS7, ERC-2013-CoG
Summary "I propose an ambitious, yet feasible 5-year research project that will fill an important gap in global health. Specifically, I will develop and validate novel approaches for anthelmintic drug discovery and development. My proposal pursues the following five research questions: (i) Is a chip calorimeter suitable for high-throughput screening in anthelmintic drug discovery? (ii) Is combination chemotherapy safe and more efficacious than monotherapy against strongyloidiasis and trichuriasis? (iii) What are the key pharmacokinetic parameters of praziquantel in preschool-aged children and school-aged children infected with Schistosoma mansoni and S. haematobium using a novel and validated technology based on dried blood spotting? (iv) What are the metabolic consequences and clearance of praziquantel treatment in S. mansoni-infected mice and S. mansoni- and S. haematobium-infected children? (v) Which is the ideal compartment to study pharmacokinetic parameters for intestinal nematode infections and does age, nutrition, co-infection and infection intensity influence the efficacy of anthelmintic drugs?
My proposed research is of considerable public health relevance, since it will ultimately result in improved treatments for soil-transmitted helminthiasis and pediatric schistosomiasis. Additionally, by the end of this project, I will have generated comprehensive information on the drug disposition of anthelmintics. A comprehensive database of metabolite profiles following praziquantel treatment will be available. Finally, the proof-of-concept of chip calorimetry in anthelmintic drug discovery will have been established and broadly validated."
Max ERC Funding
1 927 350 €
Duration
Start date: 2014-05-01, End date: 2019-04-30
Project acronym ABC
Project Targeting Multidrug Resistant Cancer
Researcher (PI) Gergely Szakacs
Host Institution (HI) MAGYAR TUDOMANYOS AKADEMIA TERMESZETTUDOMANYI KUTATOKOZPONT
Call Details Starting Grant (StG), LS7, ERC-2010-StG_20091118
Summary Despite considerable advances in drug discovery, resistance to anticancer chemotherapy confounds the effective treatment of patients. Cancer cells can acquire broad cross-resistance to mechanistically and structurally unrelated drugs. P-glycoprotein (Pgp) actively extrudes many types of drugs from cancer cells, thereby conferring resistance to those agents. The central tenet of my work is that Pgp, a universally accepted biomarker of drug resistance, should in addition be considered as a molecular target of multidrug-resistant (MDR) cancer cells. Successful targeting of MDR cells would reduce the tumor burden and would also enable the elimination of ABC transporter-overexpressing cancer stem cells that are responsible for the replenishment of tumors. The proposed project is based on the following observations:
- First, by using a pharmacogenomic approach, I have revealed the hidden vulnerability of MDR cells (Szakács et al. 2004, Cancer Cell 6, 129-37);
- Second, I have identified a series of MDR-selective compounds with increased toxicity to Pgp-expressing cells (Turk et al. 2009, Cancer Res 69(21));
- Third, I have shown that MDR-selective compounds can be used to prevent the emergence of MDR (Ludwig, Szakács et al. 2006, Cancer Res 66, 4808-15);
- Fourth, we have generated initial pharmacophore models for cytotoxicity and MDR-selectivity (Hall et al. 2009, J Med Chem 52, 3191-3204).
I propose a comprehensive series of studies that will address the following critical questions:
- First, what is the scope of MDR-selective compounds?
- Second, what is their mechanism of action?
- Third, what is the optimal therapeutic modality?
Extensive biological, pharmacological and bioinformatic analyses will be utilized to address four major specific aims. These aims address basic questions concerning the physiology of MDR ABC transporters in determining the mechanism of action of MDR-selective compounds, setting the stage for a fresh therapeutic approach that may eventually translate into improved patient care.
Max ERC Funding
1 499 640 €
Duration
Start date: 2012-01-01, End date: 2016-12-31
Project acronym ACCORD
Project Algorithms for Complex Collective Decisions on Structured Domains
Researcher (PI) Edith Elkind
Host Institution (HI) THE CHANCELLOR, MASTERS AND SCHOLARS OF THE UNIVERSITY OF OXFORD
Call Details Starting Grant (StG), PE6, ERC-2014-STG
Summary Algorithms for Complex Collective Decisions on Structured Domains.
The aim of this proposal is to substantially advance the field of Computational Social Choice, by developing new tools and methodologies that can be used for making complex group decisions in rich and structured environments. We consider settings where each member of a decision-making body has preferences over a finite set of alternatives, and the goal is to synthesise a collective preference over these alternatives, which may take the form of a partial order over the set of alternatives with a predefined structure: examples include selecting a fixed-size set of alternatives, a ranking of the alternatives, a winner and up to two runners-up, etc. We will formulate desiderata that apply to such preference aggregation procedures, design specific procedures that satisfy as many of these desiderata as possible, and develop efficient algorithms for computing them. As the latter step may be infeasible on general preference domains, we will focus on identifying the least restrictive domains that enable efficient computation, and use real-life preference data to verify whether the associated restrictions are likely to be satisfied in realistic preference aggregation scenarios. We will also determine whether our preference aggregation procedures are computationally resistant to malicious behaviour. To lower the cognitive burden on the decision-makers, we will extend our procedures to accept partial rankings as inputs. Finally, to further contribute towards bridging the gap between theory and practice of collective decision making, we will provide open-source software implementations of our procedures, and reach out to potential users to obtain feedback on their practical applicability.
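As an illustrative sketch only (not taken from the proposal itself), the kind of preference aggregation described above can be made concrete with a classical scoring rule such as the Borda count; the voter data below is hypothetical.

```python
from collections import defaultdict

def borda_winner(rankings):
    """Aggregate strict rankings into a collective winner via Borda count.

    Each ranking lists alternatives from most to least preferred; an
    alternative in position i of a ranking over m alternatives scores m - 1 - i.
    """
    scores = defaultdict(int)
    for ranking in rankings:
        m = len(ranking)
        for i, alt in enumerate(ranking):
            scores[alt] += m - 1 - i
    # Break ties alphabetically so the outcome is deterministic.
    return max(sorted(scores), key=lambda alt: scores[alt])

votes = [["a", "b", "c"], ["b", "c", "a"], ["b", "a", "c"]]
print(borda_winner(votes))  # prints "b" (b scores 5, a scores 3, c scores 1)
```

Richer outputs of the type the proposal targets (fixed-size committees, full rankings, a winner with runners-up) correspond to returning more of the sorted score order rather than only its maximum.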
Max ERC Funding
1 395 933 €
Duration
Start date: 2015-07-01, End date: 2020-06-30
Project acronym AFTERTHEGOLDRUSH
Project Addressing global sustainability challenges by changing perceptions in catalyst design
Researcher (PI) Graham John Hutchings
Host Institution (HI) CARDIFF UNIVERSITY
Call Details Advanced Grant (AdG), PE4, ERC-2011-ADG_20110209
Summary One of the greatest challenges facing society is the sustainability of resources. At present, a step change in the sustainable use of resources is needed and catalysis lies at the heart of the solution by providing new routes to carbon dioxide mitigation, energy security and water conservation. It is clear that new high efficiency game-changing catalysts are required to meet the challenge. This proposal will focus on excellence in catalyst design by learning from recent step change advances in gold catalysis by challenging perceptions. Intense interest in gold catalysts over the past two decades has accelerated our understanding of gold particle-size effects, gold-support and gold-metal interactions, the interchange between atomic and ionic gold species, and the role of the gold-support interface in creating and maintaining catalytic activity. The field has also driven the development of cutting-edge techniques, particularly in microscopy and transient kinetics, providing detailed structural characterisation on the nano-scale and probing the short-range and often short-lived interactions. By comparison, our understanding of other metal catalysts has remained relatively static.
The proposed programme will engender a step change in the design of supported-metal catalysts, by exploiting the learning and the techniques emerging from gold catalysis. The research will be set out in two themes. In Theme 1, two established key grand challenges will be attacked, namely energy vectors and greenhouse gas control. Theme 2 will address two new and emerging grand challenges in catalysis, namely the effective low-temperature activation of primary carbon-hydrogen bonds and CO2 utilisation, where, instead of treating CO2 as a thermodynamic endpoint, the aim will be to re-use it as a feedstock for bulk chemical and fuel production. The legacy of the research will be the development of a new catalyst design approach that will provide a toolbox for future catalyst development.
Max ERC Funding
2 279 785 €
Duration
Start date: 2012-04-01, End date: 2017-03-31
Project acronym ALCOHOLLIFECOURSE
Project Alcohol Consumption across the Life-course: Determinants and Consequences
Researcher (PI) Anne Rebecca Britton
Host Institution (HI) UNIVERSITY COLLEGE LONDON
Call Details Starting Grant (StG), LS7, ERC-2012-StG_20111109
Summary The epidemiology of alcohol use and related health consequences plays a vital role by monitoring populations’ alcohol consumption patterns and problems associated with drinking. Such studies seek to explain mechanisms linking consumption to harm and ultimately to reduce the health burden. Research needs to consider changes in drinking behaviour over the life-course. The current evidence base lacks the consideration of the complexity of lifetime consumption patterns, the predictors of change and subsequent health risks.
Aims of the study
1. To describe age-related trajectories of drinking in different settings and to determine the extent to which individual and social contextual factors, including socioeconomic position, social networks and life events, influence drinking pattern trajectories.
2. To estimate the impact of drinking trajectories on physical functioning and disease and to disentangle the exposure-outcome associations in terms of a) timing, i.e. health effect of drinking patterns in early, mid and late life; and b) duration, i.e. whether the impact of drinking accumulates over time.
3. To test the bidirectional associations between health and changes in consumption over the life-course in order to estimate the relative importance of these effects and to determine the dominant temporal direction.
4. To explore mechanisms and pathways through which drinking trajectories affect health and functioning in later life and to examine the role played by potential effect modifiers of the association between drinking and poor health.
Several large, longitudinal cohort studies from European countries with repeated measures of alcohol consumption will be combined and analysed to address the aims. A new team will be formed consisting of the PI, a Research Associate and two PhD students. Dissemination will be through journals and conferences, culminating in a one-day workshop for academics, practitioners and policy makers in the alcohol field.
Max ERC Funding
1 032 815 €
Duration
Start date: 2013-01-01, End date: 2017-12-31
Project acronym ALEXANDRIA
Project Large-Scale Formal Proof for the Working Mathematician
Researcher (PI) Lawrence PAULSON
Host Institution (HI) THE CHANCELLOR MASTERS AND SCHOLARS OF THE UNIVERSITY OF CAMBRIDGE
Call Details Advanced Grant (AdG), PE6, ERC-2016-ADG
Summary Mathematical proofs have always been prone to error. Today, proofs can be hundreds of pages long and combine results from many specialisms, making them almost impossible to check. One solution is to deploy modern verification technology. Interactive theorem provers have demonstrated their potential as vehicles for formalising mathematics through achievements such as the verification of the Kepler Conjecture. Proofs done using such tools reach a high standard of correctness.
However, existing theorem provers are unsuitable for mathematics. Their formal proofs are unreadable. They struggle to do simple tasks, such as evaluating limits. They lack much basic mathematics, and the material they do have is difficult to locate and apply.
ALEXANDRIA will create a proof development environment attractive to working mathematicians, utilising the best technology available across computer science. Its focus will be the management and use of large-scale mathematical knowledge, both theorems and algorithms. The project will employ mathematicians to investigate the formalisation of mathematics in practice. Our already substantial formalised libraries will serve as the starting point. They will be extended and annotated to support sophisticated searches. Techniques will be borrowed from machine learning, information retrieval and natural language processing. Algorithms will be treated similarly: ALEXANDRIA will help users find and invoke the proof methods and algorithms appropriate for the task.
ALEXANDRIA will provide (1) comprehensive formal mathematical libraries; (2) search within libraries, and the mining of libraries for proof patterns; (3) automated support for the construction of large formal proofs; (4) sound and practical computer algebra tools.
ALEXANDRIA will be based on legible structured proofs. Formal proofs should be not mere code, but a machine-checkable form of communication between mathematicians.
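The contrast between "mere code" and a legible, machine-checkable proof can be made concrete with a small structured proof. The sketch below uses Lean 4 syntax purely as an illustration (it is not from the project, whose own libraries live in other systems, and `Nat.mul_add` is assumed from the standard library); the point is that the proof reads as mathematics while remaining checkable by machine.

```lean
-- Claim: the sum of two even numbers is even.
theorem even_add_even (m n : Nat)
    (hm : ∃ k, m = 2 * k) (hn : ∃ k, n = 2 * k) :
    ∃ k, m + n = 2 * k :=
  let ⟨a, ha⟩ := hm   -- name the witness: m = 2a
  let ⟨b, hb⟩ := hn   -- name the witness: n = 2b
  -- the combined witness is a + b, since m + n = 2a + 2b = 2(a + b)
  ⟨a + b, by rw [ha, hb, Nat.mul_add]⟩
```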
Max ERC Funding
2 430 140 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym ALGAME
Project Algorithms, Games, Mechanisms, and the Price of Anarchy
Researcher (PI) Elias Koutsoupias
Host Institution (HI) THE CHANCELLOR, MASTERS AND SCHOLARS OF THE UNIVERSITY OF OXFORD
Call Details Advanced Grant (AdG), PE6, ERC-2012-ADG_20120216
Summary The objective of this proposal is to bring together a local team of young researchers who will work closely with international collaborators to advance the state of the art of Algorithmic Game Theory and open new avenues of research at the interface of Computer Science, Game Theory, and Economics. The proposal consists mainly of three intertwined research strands: algorithmic mechanism design, price of anarchy, and online algorithms.
Specifically, we will attempt to resolve some outstanding open problems in algorithmic mechanism design: characterizing the incentive compatible mechanisms for important domains, such as the domain of combinatorial auctions, and resolving the approximation ratio of mechanisms for scheduling unrelated machines. More generally, we will study centralized and distributed algorithms whose inputs are controlled by selfish agents that are interested in the outcome of the computation. We will investigate new notions of mechanisms with strong truthfulness and limited susceptibility to externalities that can facilitate modular design of mechanisms for complex domains.
We will expand the current research on the price of anarchy to time-dependent games where the players can select not only how to act but also when to act. We also plan to resolve outstanding questions on the price of stability and to build a robust approach to these questions, similar to smooth analysis. For repeated games, we will investigate convergence of simple strategies (e.g., fictitious play), online fairness, and strategic considerations (e.g., metagames). More generally, our aim is to find a productive formulation of playing unknown games by drawing on the fields of online algorithms and machine learning.
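The price of anarchy has a standard minimal illustration, Pigou's two-link routing example, sketched below to make the quantity concrete (this example is classical, not taken from the proposal): it is the ratio of the selfish (Nash) outcome's social cost to the optimal social cost.

```python
# Pigou's example: one unit of traffic chooses between two parallel links.
# Per-user costs: link 1 costs x (x = fraction of traffic on it), link 2 costs 1.
def total_cost(x):
    """Social cost when a fraction x of the traffic takes link 1."""
    return x * x + (1 - x) * 1.0

# Nash equilibrium: link 1 never costs a user more than 1, so all traffic takes it.
nash = total_cost(1.0)                                   # = 1.0

# Social optimum: minimise total cost over all splits (grid search).
opt = min(total_cost(i / 10000) for i in range(10001))   # ≈ 0.75, at x = 0.5

poa = nash / opt                                         # ≈ 4/3
```

The resulting 4/3 is in fact the worst-case price of anarchy for affine cost functions; quantifying such gaps, and how they change when the timing of actions also becomes strategic, is the kind of question this strand targets.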
Max ERC Funding
2 461 000 €
Duration
Start date: 2013-04-01, End date: 2019-03-31
Project acronym ALGILE
Project Foundations of Algebraic and Dynamic Data Management Systems
Researcher (PI) Christoph Koch
Host Institution (HI) ECOLE POLYTECHNIQUE FEDERALE DE LAUSANNE
Call Details Starting Grant (StG), PE6, ERC-2011-StG_20101014
Summary "Contemporary database query languages are ultimately founded on logic and feature an additive operation – usually a form of (multi)set union or disjunction – that is asymmetric in that additions or updates do not always have an inverse. This asymmetry puts a greater part of the machinery of abstract algebra for equation solving outside the reach of databases. However, such equation solving would be a key functionality that problems such as query equivalence testing and data integration could be reduced to: In the current scenario of the presence of an asymmetric additive operation they are undecidable. Moreover, query languages with a symmetric additive operation (i.e., which has an inverse and is thus based on ring theory) would open up databases for a large range of new scientific and mathematical applications.
The goal of the proposed project is to reinvent database management systems with a foundation in abstract algebra and specifically in ring theory. The presence of an additive inverse makes it possible to cleanly define differences between queries. This gives rise to a database analog of differential calculus that leads to radically new incremental and adaptive query evaluation algorithms that substantially outperform state-of-the-art techniques. These algorithms enable a new class of systems which I call Dynamic Data Management Systems. Such systems can maintain continuously fresh query views at extremely high update rates and have important applications in interactive Large-scale Data Analysis. There is a natural connection between differences and updates, motivating the group-theoretic study of updates that will lead to better ways of creating out-of-core data processing algorithms for new storage devices. Basing queries on ring theory leads to a new class of systems, Algebraic Data Management Systems, which herald a convergence of database systems and computer algebra systems."
Max ERC Funding
1 480 548 €
Duration
Start date: 2012-01-01, End date: 2016-12-31
Project acronym AlgoRNN
Project Recurrent Neural Networks and Related Machines That Learn Algorithms
Researcher (PI) Juergen Schmidhuber
Host Institution (HI) UNIVERSITA DELLA SVIZZERA ITALIANA
Call Details Advanced Grant (AdG), PE6, ERC-2016-ADG
Summary Recurrent neural networks (RNNs) are general parallel-sequential computers. Some learn their programs or weights. Our supervised Long Short-Term Memory (LSTM) RNNs were the first to win pattern recognition contests, and recently enabled best known results in speech and handwriting recognition, machine translation, etc. They are now available to billions of users through the world's most valuable public companies including Google and Apple. Nevertheless, in many real-world tasks RNNs do not yet live up to their full potential. Although universal in theory, in practice they fail to learn important types of algorithms. This ERC project will go far beyond today's best RNNs through novel RNN-like systems that address some of the biggest open RNN problems and hottest RNN research topics: (1) How can RNNs learn to control (through internal spotlights of attention) separate large short-memory structures such as sub-networks with fast weights, to improve performance on many natural short-term memory-intensive tasks which are currently hard to learn by RNNs, such as answering detailed questions on recently observed videos? (2) How can such RNN-like systems metalearn entire learning algorithms that outperform the original learning algorithms? (3) How to achieve efficient transfer learning from one RNN-learned set of problem-solving programs to new RNN programs solving new tasks? In other words, how can one RNN-like system actively learn to exploit algorithmic information contained in the programs running on another? We will test our systems on existing benchmarks, and create new, more challenging multi-task benchmarks. This will be supported by a rather cheap, GPU-based mini-brain for implementing large RNNs.
Max ERC Funding
2 500 000 €
Duration
Start date: 2017-10-01, End date: 2022-09-30
Project acronym ALUNIF
Project Algorithms and Lower Bounds: A Unified Approach
Researcher (PI) Rahul Santhanam
Host Institution (HI) THE CHANCELLOR, MASTERS AND SCHOLARS OF THE UNIVERSITY OF OXFORD
Call Details Consolidator Grant (CoG), PE6, ERC-2013-CoG
Summary One of the fundamental goals of theoretical computer science is to understand the possibilities and limits of efficient computation. This quest has two dimensions. The theory of algorithms focuses on finding efficient solutions to problems, while computational complexity theory aims to understand when and why problems are hard to solve. These two areas have different philosophies and use different sets of techniques. However, in recent years there have been indications of deep and mysterious connections between them.
In this project, we propose to explore and develop the connections between algorithmic analysis and complexity lower bounds in a systematic way. On the one hand, we plan to use complexity lower bound techniques as inspiration to design new and improved algorithms for Satisfiability and other NP-complete problems, as well as to analyze existing algorithms better. On the other hand, we plan to strengthen implications yielding circuit lower bounds from non-trivial algorithms for Satisfiability, and to derive new circuit lower bounds using these stronger implications.
This project has potential for massive impact in both the areas of algorithms and computational complexity. Improved algorithms for Satisfiability could lead to improved SAT solvers, and the new analytical tools would lead to a better understanding of existing heuristics. Complexity lower bound questions are fundamental but notoriously difficult, and new lower bounds would open the way to unconditionally secure cryptographic protocols and derandomization of probabilistic algorithms. More broadly, this project aims to initiate greater dialogue between the two areas, with an exchange of ideas and techniques which leads to accelerated progress in both, as well as a deeper understanding of the nature of efficient computation.
Max ERC Funding
1 274 496 €
Duration
Start date: 2014-03-01, End date: 2019-02-28
Project acronym AMSEL
Project Atomic Force Microscopy for Molecular Structure Elucidation
Researcher (PI) Leo Gross
Host Institution (HI) IBM RESEARCH GMBH
Call Details Consolidator Grant (CoG), PE4, ERC-2015-CoG
Summary Molecular structure elucidation is of great importance in synthetic chemistry, pharmacy, life sciences, energy and environmental sciences, and technology applications. To date structure elucidation by atomic force microscopy (AFM) has been demonstrated for a few, small and mainly planar molecules. In this project high-risk, high-impact scientific questions will be solved using structure elucidation with the AFM employing a novel tool and novel methodologies.
A combined low-temperature scanning tunneling microscope/atomic force microscope (LT-STM/AFM) with high throughput and an in situ electrospray deposition method will be developed. Chemical resolution will be achieved by novel measurement techniques, in particular the use of different and novel tip functionalizations and their combination with Kelvin probe force microscopy. Elements will be identified using substructure recognition provided by a database that will be established and by refined theory and simulations.
The developed tools and techniques will be applied to molecules of increasing fragility, complexity, size, and three-dimensionality. In particular, samples that are challenging to characterize with conventional methods will be studied. Complex molecular mixtures will be investigated molecule-by-molecule, taking advantage of the single-molecule sensitivity. The absolute stereochemistry of molecules will be determined, resolving molecules with multiple stereocenters. The operation of single molecular machines such as nanocars and molecular gears will be investigated. Reactive intermediates generated with atomic manipulation will be characterized and their on-surface reactivity will be studied by AFM.
Max ERC Funding
2 000 000 €
Duration
Start date: 2016-06-01, End date: 2021-05-31
Project acronym ANGLE
Project Accelerated design and discovery of novel molecular materials via global lattice energy minimisation
Researcher (PI) Graeme Matthew Day
Host Institution (HI) UNIVERSITY OF SOUTHAMPTON
Call Details Starting Grant (StG), PE4, ERC-2012-StG_20111012
Summary The goal of crystal engineering is the design of functional crystalline materials in which the arrangement of basic structural building blocks imparts desired properties. The engineering of organic molecular crystals has, to date, relied largely on empirical rules governing the intermolecular association of functional groups in the solid state. However, many materials properties depend intricately on the complete crystal structure, i.e. the unit cell, space group and atomic positions, which cannot be predicted solely using such rules. Therefore, the development of computational methods for crystal structure prediction (CSP) from first principles has been a goal of computational chemistry that could significantly accelerate the design of new materials. It is only recently that the necessary advances in the modelling of intermolecular interactions and developments in algorithms for identifying all relevant crystal structures have come together to provide predictive methods that are becoming reliable and affordable on a timescale that could usefully complement an experimental research programme. The principal aim of the proposed work is to establish the use of state-of-the-art crystal structure prediction methods as a means of guiding the discovery and design of novel molecular materials.
This research proposal both continues the development of the computational methods for CSP and, by developing a computational framework for screening of potential molecules, develops the application of these methods for materials design. The areas on which we will focus are organic molecular semiconductors with high charge carrier mobilities and, building on our recently published results in Nature [1], the development of porous organic molecular materials. The project will deliver both novel materials and improvements in the reliability of computational methods that will find widespread applications in materials chemistry.
[1] Nature 2011, 474, 367-371.
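The core computational step, ranking candidate structures by lattice energy, can be caricatured in one dimension. The toy below is not the project's method (real CSP searches full unit cells, space groups and accurate intermolecular potentials); it merely shows the principle of scoring candidates, here spacings of a 1-D Lennard-Jones chain, and keeping the global minimum over a grid.

```python
# Lattice energy per atom of a 1-D chain with spacing a, summed over
# neighbour shells of a Lennard-Jones pair potential (reduced units).
def lattice_energy(a, shells=50):
    return sum(2.0 * (1.0 / (k * a) ** 12 - 1.0 / (k * a) ** 6)
               for k in range(1, shells + 1))

# "Global minimisation" over candidate structures: here just a grid of spacings.
candidates = [0.8 + 0.001 * i for i in range(600)]   # spacings 0.8 .. 1.4
best = min(candidates, key=lattice_energy)           # lowest-energy structure
```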
Max ERC Funding
1 499 906 €
Duration
Start date: 2012-10-01, End date: 2017-09-30
Project acronym Antibodyomics
Project Vaccine profiling and immunodiagnostic discovery by high-throughput antibody repertoire analysis
Researcher (PI) Sai Tota Reddy
Host Institution (HI) EIDGENOESSISCHE TECHNISCHE HOCHSCHULE ZUERICH
Call Details Starting Grant (StG), LS7, ERC-2015-STG
Summary Vaccines and immunodiagnostics have been vital for public health and medicine; however, a quantitative molecular understanding of vaccine-induced antibody responses is lacking. Antibody research is currently going through a big-data-driven revolution, largely due to progress in next-generation sequencing (NGS) and bioinformatic analysis of antibody repertoires. A main advantage of high-throughput antibody repertoire analysis is that it provides a wealth of quantitative information not possible with other classical methods of antibody analysis (i.e., serum titers); this information includes: clonal distribution and diversity, somatic hypermutation patterns, and lineage tracing. In preliminary work my group has established standardized methods for antibody repertoire NGS, including an experimental-bioinformatic pipeline for error and bias correction that enables highly accurate repertoire sequencing and analysis. The overall goal of this proposal will be to apply high-throughput antibody repertoire analysis for quantitative vaccine profiling and discovery of next-generation immunodiagnostics. Using mouse subunit vaccination as our model system, we will answer for the first time, a fundamental biological question within the context of antibody responses - what is the link between genotype (antibody repertoire) and phenotype (serum antibodies)? We will expand upon this approach for improved rational vaccine design by quantitatively determining the impact of a comprehensive set of subunit vaccination parameters on complete antibody landscapes. Finally, we will develop advanced bioinformatic methods to discover immunodiagnostics based on antibody repertoire sequences. In summary, this proposal lays the foundation for fundamentally new approaches in the quantitative analysis of antibody responses, which long-term will promote the development of next-generation vaccines and immunodiagnostics.
Max ERC Funding
1 492 586 €
Duration
Start date: 2016-06-01, End date: 2021-05-31
Project acronym Antivessel-T-Cells
Project Development of Vascular-Disrupting Lymphocyte Therapy for Tumours
Researcher (PI) Georgios Coukos
Host Institution (HI) CENTRE HOSPITALIER UNIVERSITAIRE VAUDOIS
Call Details Advanced Grant (AdG), LS7, ERC-2012-ADG_20120314
Summary T cell engineering with chimeric antigen receptors has opened the door to effective immunotherapy. CARs are fusion genes encoding receptors whose extracellular domain comprises a single chain variable fragment (scFv) antibody that binds to a tumour surface epitope, while the intracellular domain comprises the signalling module of CD3ζ along with powerful costimulatory domains (e.g. CD28 and/or 4-1BB). CARs are a major breakthrough, since they allow bypassing HLA restrictions or loss, and they can incorporate potent costimulatory signals tailored to optimize T cell function. However, solid tumours present challenges, since they are often genetically unstable, and the tumour microenvironment impedes T cell function. The tumour vasculature is a much more stable and accessible target, and its disruption has catastrophic consequences for tumours. Nevertheless, the lack of affinity reagents has impeded progress in this area. The objectives of this proposal are to develop the first potent and safe vascular-disrupting tumour immunotherapy using scFvs and CARs uniquely available in my laboratory.
I propose to use these innovative CARs to understand for the first time the molecular mechanisms underlying the interactions between anti-vascular CAR-T cells and tumour endothelium, and exploit them to maximize tumour vascular destruction. I also intend to employ innovative engineering approaches to minimize the chance of reactivity against normal vasculature. Lastly, I propose to manipulate the tumour damage mechanisms ensuing anti-vascular therapy, to maximize tumour rejection through immunomodulation. We are poised to elucidate critical interactions between tumour endothelium and anti-vascular T cells, and bring to bear cancer therapy of unparalleled power. The impact of this work could be transforming, given the applicability of tumour-vascular disruption across most common tumour types.
Max ERC Funding
2 500 000 €
Duration
Start date: 2013-08-01, End date: 2018-07-31
Project acronym AOC
Project Adversary-Oriented Computing
Researcher (PI) Rachid Guerraoui
Host Institution (HI) ECOLE POLYTECHNIQUE FEDERALE DE LAUSANNE
Call Details Advanced Grant (AdG), PE6, ERC-2013-ADG
Summary "Recent technological evolutions, including the cloud, the multicore, the social and the mobile ones, are turning computing ubiquitously distributed. Yet, building high-assurance distributed programs is notoriously challenging. One of the main reasons is that these systems usually seek to achieve several goals at the same time. In short, they need to be efficient, responding effectively in various average-case conditions, as well as reliable, behaving correctly in severe, worst-case conditions. As a consequence, they typically intermingle different strategies, each to cope with some specific condition, e.g., with or without node failures, message losses, time-outs, contention, cache misses, over-sizing, malicious attacks, etc. The resulting programs end up hard to design, prove, verify, implement, test and debug. Not surprisingly, there is anecdotal evidence of the fragility of the most celebrated distributed systems.
The goal of this project is to contribute to building high-assurance distributed programs by introducing a new dimension for separating and isolating their concerns, as well as a new scheme for composing and reusing them in a modular manner. In short, the project will explore the inherent power and limitations of a novel paradigm, Adversary-Oriented Computing (AOC). Sub-programs, each implementing a specific strategy to cope with a given adversary, modelling a specific working condition, are designed, proved, verified, implemented, tested and debugged independently. They are then composed, possibly dynamically, as black-boxes within the same global program. The AOC project is ambitious and it seeks to fundamentally revisit the way distributed algorithms are designed and distributed systems are implemented. The gain expected in comparison with today's approaches is substantial, and I believe it will be proportional to the degree of difficulty of the distributed problem at hand."
Max ERC Funding
2 147 012 €
Duration
Start date: 2014-06-01, End date: 2019-05-31
Project acronym APPLAUSE
Project Adolescent Precursors to Psychiatric Disorders – Learning from Analysis of User-Service Engagement
Researcher (PI) Sara Evans
Host Institution (HI) LONDON SCHOOL OF ECONOMICS AND POLITICAL SCIENCE
Call Details Starting Grant (StG), LS7, ERC-2013-StG
Summary APPLAUSE’s aim is to produce a body of evidence that illustrates how young people with mental health problems currently interact with both formal mental health services and informal social and familial support structures. Careful analysis of data gathered in the UK and Brazil will allow formulation of globally relevant insights into mental health care delivery for young people, which will be presented internationally as a resource for future health care service design.
APPLAUSE will allow the collection of an important data set that does not currently exist in this field, and will look to other disciplines for innovative approaches to data analysis. Whilst standard analysis may allow for snapshots of health service use, using innovative life course methods will allow us to characterise patterns of complete service use of each individual participant’s experience of accessing mental health care and social support.
Adolescence is a critical period in mental health development, which has been largely neglected by public health efforts. Psychiatric disorders rank as the primary cause of disability among individuals aged 10-24 years, worldwide. Moreover, many health risk behaviours emerge during adolescence and 70% of adult psychiatric disorders are preceded by mental health problems during adolescent years. However, delays to receiving care for psychiatric disorders, following disorder onset, average more than ten years, and little is known about factors which impede access to and continuity of care among young people with mental health problems. APPLAUSE will analyse current access models, reports of individual experiences of positive and negative interactions with health care services, and the culturally embedded social factors that impact on such access. Addressing this complex problem from a global perspective will advance the development of a more diverse and innovative set of strategies for improving earlier access to care.
Max ERC Funding
1 499 948 €
Duration
Start date: 2014-01-01, End date: 2018-12-31
Project acronym ASAP
Project Adaptive Security and Privacy
Researcher (PI) Bashar Nuseibeh
Host Institution (HI) THE OPEN UNIVERSITY
Call Details Advanced Grant (AdG), PE6, ERC-2011-ADG_20110209
Summary With the prevalence of mobile computing devices and the increasing availability of pervasive services, ubiquitous computing (Ubicomp) is a reality for many people. This reality is generating opportunities for people to interact socially in new and richer ways, and to work more effectively in a variety of new environments. More generally, Ubicomp infrastructures – controlled by software – will determine users’ access to critical services.
With these opportunities come higher risks of misuse by malicious agents. Therefore, the role and design of software for managing use and protecting against misuse is critical, and the engineering of software that is functionally effective while safeguarding user assets from harm is a key challenge. Indeed, the very nature of Ubicomp means that software must adapt to the changing needs of users and their environment, and, more critically, to the different threats to users’ security and privacy.
ASAP proposes to radically re-conceptualise software engineering for Ubicomp in ways that are cognisant of the changing functional needs of users, of the changing threats to user assets, and of the changing relationships between them. We propose to deliver adaptive software capabilities for supporting users in managing their privacy requirements, and adaptive software capabilities to deliver secure software that underpin those requirements. A key novelty of our approach is its holistic treatment of security and human behaviour. To achieve this, it draws upon contributions from requirements engineering, security & privacy engineering, and human-computer interaction. Our aim is to contribute to software engineering that empowers and protects Ubicomp users. Underpinning our approach will be the development of representations of security and privacy problem structures that capture user requirements, the context in which those requirements arise, and the adaptive software that aims to meet those requirements.
Max ERC Funding
2 499 041 €
Duration
Start date: 2012-10-01, End date: 2018-09-30
Project acronym ASSIMILES
Project Advanced Spectroscopy and Spectrometry for Imaging Metabolism using Isotopically-Labeled Endogenous Substrates
Researcher (PI) Arnaud Comment
Host Institution (HI) THE CHANCELLOR MASTERS AND SCHOLARS OF THE UNIVERSITY OF CAMBRIDGE
Call Details Consolidator Grant (CoG), PE4, ERC-2015-CoG
Summary A technological revolution is currently taking place making it possible to noninvasively study metabolism in mammals (incl. humans) in vivo with unprecedented temporal and spatial resolution. Central to these developments is the phenomenon of hyperpolarization, which transiently enhances the magnetic resonance (MR) signals so much that real-time metabolic imaging and spectroscopy becomes possible. The first clinical translation of hyperpolarization MR technology has recently been demonstrated with prostate cancer patients.
I have played an active role in these exciting developments, through design and construction of hyperpolarization MR setups that are defining the cutting-edge for in vivo preclinical metabolic studies. However, important obstacles still exist for the technology to fulfill its enormous potential.
With this highly interdisciplinary proposal, I will overcome the principal drawbacks of current hyperpolarization technology, namely: 1) A limited time window for hyperpolarized MR detection; 2) The conventional use of potentially toxic polarizing agents; 3) The necessity to use supra-physiological doses of metabolic substrates to reach detectable MR signal.
I will develop a novel hyperpolarization instrument making use of photoexcited compounds as polarizing agents to produce hyperpolarized solutions containing exclusively endogenous compounds. It will become possible to deliver hyperpolarized solutions in a quasi-continuous manner, permitting infusion of physiological doses and greatly increasing sensitivity. I will also use a complementary isotope imaging technique, the so-called CryoNanoSIMS (developed at my institution over the last year), which can image isotopic distributions in frozen tissue sections and reveal the localization of injected substrates and their metabolites with subcellular spatial resolution. Case studies will include liver and brain cancer mouse models. This work is pioneering and will create a new frontier in molecular imaging.
Max ERC Funding
2 199 146 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym AsthmaPhenotypes
Project Understanding asthma phenotypes: going beyond the atopic/non-atopic paradigm
Researcher (PI) Neil Pearce
Host Institution (HI) LONDON SCHOOL OF HYGIENE AND TROPICAL MEDICINE ROYAL CHARTER
Call Details Advanced Grant (AdG), LS7, ERC-2014-ADG
Summary Fifteen years ago it was widely believed that asthma was an allergic/atopic disease caused by allergen exposure in infancy; this produced atopic sensitization, and continued exposure resulted in eosinophilic airways inflammation, bronchial hyper-responsiveness and reversible airflow obstruction. It is now clear that this model is at best incomplete. Less than one-half of asthma cases involve allergic (atopic) mechanisms, and most asthma in low-and-middle income countries is non-atopic. Westernization may be contributing to the global increases in asthma prevalence, but this process appears to involve changes in asthma susceptibility rather than increased exposure to “established” asthma risk factors. Understanding why these changes are occurring is essential in order to halt the growing global asthma epidemic. This will require a combination of epidemiological, clinical and basic science studies in a variety of environments.
A key task is to reclassify asthma phenotypes. These are important to: (i) better understand the aetiological mechanisms of asthma; (ii) identify new causes; and (iii) identify new therapeutic measures. There are major opportunities to address these issues using new techniques for sample collection from the airways (sputum induction, nasal lavage), new methods of analysis (microbiome, epigenetics), and new bioinformatics methods for integrating data from multiple sources and levels. There is an unprecedented potential to go beyond the old atopic/non-atopic categorization of phenotypes.
I will therefore conduct analyses to re-examine and reclassify asthma phenotypes. The key features are the inclusion of: (i) both high and low prevalence centres from both high income countries and low-and-middle income countries; (ii) much more detailed biomarker information than has been used for previous studies of asthma phenotypes; and (iii) new bioinformatics methods for integrating data from multiple sources and levels.
Max ERC Funding
2 348 803 €
Duration
Start date: 2016-01-01, End date: 2020-12-31
Project acronym ATTOLIQ
Project Attosecond X-ray spectroscopy of liquids
Researcher (PI) Hans Jakob WÖRNER
Host Institution (HI) EIDGENOESSISCHE TECHNISCHE HOCHSCHULE ZUERICH
Call Details Consolidator Grant (CoG), PE4, ERC-2017-COG
Summary Charge and energy transfer are the key steps underlying most chemical reactions and biological transformations. The purely electronic dynamics that control such processes take place on attosecond time scales. A complete understanding of these dynamics on the electronic level therefore calls for new experimental methods with attosecond resolution that are applicable to aqueous environments. We propose to combine the element sensitivity of X-ray spectroscopy with attosecond temporal resolution and ultrathin liquid microjets to study electronic dynamics of relevance to chemical, biological and photovoltaic processes. We will build on our recent achievements in demonstrating femtosecond time-resolved measurements in liquid water, attosecond photoelectron spectroscopy on a liquid microjet, and measuring and controlling attosecond charge migration in isolated molecules. We will first concentrate on liquid water to study its electronic dynamics following outer-valence ionization, the formation pathway of the solvated electron, and the time scales and intermolecular Coulombic decay following inner-valence or core-level ionization. Second, we will turn to solvated species and measure electronic dynamics and charge migration in solvated molecules, transition-metal complexes and photoexcited nanoparticles. These goals will be achieved by developing several innovative experimental techniques. We will develop a source of isolated attosecond pulses covering the water window (285-538 eV) and combine it with a flat liquid microjet to realize attosecond transient absorption in liquids. We will complement these measurements with attosecond X-ray emission spectroscopy, Auger spectroscopy and a novel heterodyne-detected variant of resonant inelastic Raman scattering, exploiting the large bandwidth that is naturally available from attosecond X-ray sources.
Max ERC Funding
2 750 000 €
Duration
Start date: 2018-04-01, End date: 2023-03-31
Project acronym ATTOSCOPE
Project Measuring attosecond electron dynamics in molecules
Researcher (PI) Hans Jakob Wörner
Host Institution (HI) EIDGENOESSISCHE TECHNISCHE HOCHSCHULE ZUERICH
Call Details Starting Grant (StG), PE4, ERC-2012-StG_20111012
Summary "The goal of the present proposal is to realize measurements of electronic dynamics in polyatomic molecules with attosecond temporal resolution (1 as = 10^-18 s). We propose to study electronic rearrangements following photoexcitation, charge migration in a molecular chain induced by ionization, and non-adiabatic multi-electron dynamics in an intense laser field. The grand question addressed by this research is the characterization of electron correlations, which control the shape, properties and function of molecules. In all three proposed projects, a time-domain approach appears to be the most suitable, since it reduces complex molecular dynamics to the purely electronic dynamics by exploiting the hierarchy of motional time scales. Experimentally, we propose to realize an innovative experimental setup. A few-cycle infrared (IR) pulse will be used to generate attosecond pulses in the extreme ultraviolet (XUV) by high-harmonic generation. The IR pulse will be separated from the XUV by means of an innovative interferometer. Additionally, it will permit the introduction of a controlled attosecond delay between the two pulses. We propose to use the attosecond pulses as a tool to look inside individual IR- or UV-field cycles to better understand light-matter interactions. Time-resolved pump-probe experiments will be carried out on polyatomic molecules by detecting the energy and angular distribution of photoelectrons in a velocity-map imaging spectrometer. These experiments are expected to provide new insights into the dynamics of multi-electron systems, along with new results for the validation and improvement of theoretical models. Multi-electron dynamics is indeed a very complex subject on its own, and even more so in the presence of strong laser fields. The proposed experiments directly address these challenges and are expected to provide new insights that will be beneficial to a wide range of scientific research areas."
Max ERC Funding
1 999 992 €
Duration
Start date: 2012-09-01, End date: 2017-08-31
Project acronym BALANCE
Project Mapping Dispersion Spectroscopically in Large Gas-Phase Molecular Ions
Researcher (PI) Peter CHEN
Host Institution (HI) EIDGENOESSISCHE TECHNISCHE HOCHSCHULE ZUERICH
Call Details Advanced Grant (AdG), PE4, ERC-2018-ADG
Summary We use IR spectroscopy of trapped ions in a cryogenic FT-ICR spectrometer to probe non-covalent, “dispersion” interactions in large, gas-phase molecular ions. We will measure conformational equilibria by N-H frequency shifts, and correlate gas-phase IR frequency to the N-H-N bond angle in an ionic H-bond. Substituents on “onium” cations can adopt various conformations, whose energies map interaction potentials. Substituents on their proton-bound dimers interact non-covalently through dispersion forces, whose quantitative evaluation in large molecules has remained difficult despite dispersion becoming increasingly cited as a design principle in the construction of catalysts and materials. The non-covalent interactions bend the N-H-N bond, leading to large shifts in the IR frequency. The proton-bound dimer acts like a molecular balance where the non-covalent interaction is set against the bending potential in an ionic hydrogen bond. Despite encouragingly accurate calculations for small molecules, experimental benchmarks for large molecules in the gas phase remain scarce, and there is evidence that the good results for small molecules may not extrapolate reliably to large molecules. The present proposal introduces a new experimental probe of non-covalent interactions, providing a sensitive test of the diverging results coming from various computational methods and other experiments. The experiment must be done on isolated molecules in the gas phase, as previous work has shown that solvation substantially cancels out the attractive potential. Accordingly, the proposed experimental design, which involves a custom-built spectrometer, newly available tunable IR sources, chemical synthesis of custom substrates, and quantum calculations up to coupled-cluster levels of theory, showcases how an interdisciplinary approach combining physical and organic chemistry can solve a fundamental problem that impacts how we understand steric effects in organic chemistry.
Max ERC Funding
2 446 125 €
Duration
Start date: 2019-05-01, End date: 2024-04-30
Project acronym BARCODE
Project The use of genetic profiling to guide prostate cancer targeted screening and cancer care
Researcher (PI) Rosalind Anne Eeles
Host Institution (HI) THE INSTITUTE OF CANCER RESEARCH: ROYAL CANCER HOSPITAL
Call Details Advanced Grant (AdG), LS7, ERC-2013-ADG
Summary "Prostate cancer is the commonest solid cancer in men in the European Community. There is evidence for genetic predisposition to the development of prostate cancer and our group has found the largest number of such genetic variants described to date worldwide. The next challenge is to harness these discoveries to advance the clinical care of populations and prostate cancer patients to improve screening and target treatments. This proposal, BARCODE, aims to be ground-breaking in this area. BARCODE has two components (1) to profile a population in England using the current 77 genetic variant profile and compare screening outcomes with those from population based screening studies to determine if genetics can target screening more effectively in this disease by identifying prostate cancer that more often needs treatment and (2) genetically profiling men with prostate cancer in the uro-oncology clinic for a panel of genes which predict for worse outcome so that these men can be offered more intensive staging and treatment within clinical trials. This will use next generation sequencing technology using a barcoding system which we have developed to speed up throughput and reduce costs. The PI will spend 35% of her time on this project and she will not charge for her time spent on this grant as she is funded by The University of London UK. The research team at The Institute Of Cancer Research, London, UK is a multidisciplinary team which leads the field of genetic predisposition to prostate cancer and its clinical application and so is well placed to deliver on this research. This application will have a dramatic impact on other researchers as it is ground –breaking and state of the art in its application of genetic findings to public health and cancer care. It will therefore influence the work being undertaken in both these areas to integrate genetic profiling and gene panel analysis into population screening and cancer care respectively."
Max ERC Funding
2 499 123 €
Duration
Start date: 2014-10-01, End date: 2019-09-30
Project acronym BATNMR
Project Development and Application of New NMR Methods for Studying Interphases and Interfaces in Batteries
Researcher (PI) Clare GREY
Host Institution (HI) THE CHANCELLOR MASTERS AND SCHOLARS OF THE UNIVERSITY OF CAMBRIDGE
Call Details Advanced Grant (AdG), PE4, ERC-2018-ADG
Summary The development of longer lasting, higher energy density and cheaper rechargeable batteries represents one of the major technological challenges of our society, batteries representing the limiting components in the shift from gasoline-powered to electric vehicles. They are also required to enable the use of more (typically intermittent) renewable energy, to balance demand with generation. This proposal seeks to develop and apply new NMR metrologies to determine the structure and dynamics of the multiple electrode-electrolyte interfaces and interphases that are present in these batteries, and how they evolve during battery cycling. New dynamic nuclear polarization (DNP) techniques will be exploited to extract structural information about the interface between the battery electrode and the passivating layers that grow on the electrode materials (the solid electrolyte interphase, SEI) and that are inherent to the stability of the batteries. The role of the SEI (and ceramic interfaces) in controlling lithium metal dendrite growth will be determined in liquid based and all solid state batteries.
New DNP approaches will be developed that are compatible with the heterogeneous and reactive species that are present in conventional, all-solid state, Li-air and redox flow batteries. Method development will run in parallel with the use of DNP approaches to determine the structures of the various battery interfaces and interphases, testing the stability of conventional biradicals in these harsh oxidizing and reducing conditions, modifying the experimental approaches where appropriate. The final result will be a significantly improved understanding of the structures of these phases and how they evolve on cycling, coupled with strategies for designing improved SEI structures. The nature of the interface between a lithium metal dendrite and a ceramic composite will be determined, providing much-needed insight into how these (unwanted) dendrites grow in all solid state batteries. DNP approaches coupled with electron spin resonance will be used, where possible in situ, to determine the reaction mechanisms of organic molecules such as quinones in organic-based redox flow batteries in order to help prevent degradation of the electrochemically active species.
This proposal involves NMR method development specifically designed to explore a variety of battery chemistries. Thus, this proposal is interdisciplinary, containing both a strong emphasis on materials characterization, electrochemistry and electronic structures of materials, interfaces and nanoparticles, and on analytical and physical chemistry. Some of the methodology will be applicable to other materials and systems including (for example) other electrochemical technologies such as fuel cells and solar fuels and the study of catalysts (to probe surface structure).
Max ERC Funding
3 498 219 €
Duration
Start date: 2019-10-01, End date: 2024-09-30
Project acronym BAYES-KNOWLEDGE
Project Effective Bayesian Modelling with Knowledge before Data
Researcher (PI) Norman Fenton
Host Institution (HI) QUEEN MARY UNIVERSITY OF LONDON
Call Details Advanced Grant (AdG), PE6, ERC-2013-ADG
Summary This project aims to improve evidence-based decision-making. What makes it radical is that it plans to do this in situations (common for critical risk assessment problems) where there is little or even no data, and hence where traditional statistics cannot be used. To address this problem Bayesian analysis, which enables domain experts to supplement observed data with subjective probabilities, is normally used. As real-world problems typically involve multiple uncertain variables, Bayesian analysis is extended using a technique called Bayesian networks (BNs). But, despite many great benefits, BNs have been under-exploited, especially in areas where they offer the greatest potential for improvements (law, medicine and systems engineering). This is mainly because of widespread resistance to relying on subjective knowledge. To address this problem much current research assumes sufficient data are available to make the expert’s input minimal or even redundant; with such data it may be possible to ‘learn’ the underlying BN model. But this approach offers nothing when there is limited or no data. Even when ‘big’ data are available the resulting models may be superficially objective but fundamentally flawed as they fail to capture the underlying causal structure that only expert knowledge can provide.
Our solution is to develop a method to systemize the way expert-driven causal BN models can be built and used effectively either in the absence of data or as a means of determining what future data is really required. The method involves a new way of framing problems and extensions to BN theory, notation and tools. Working with relevant domain experts, along with cognitive psychologists, our methods will be developed and tested experimentally on real-world critical decision-problems in medicine, law, forensics, and transport. As the work complements current data-driven approaches, it will lead to improved BN modelling both when extensive data are available and when there are none.
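The core mechanism the proposal builds on, inference in a BN whose probabilities come entirely from expert judgement rather than data, can be sketched minimally. The two-variable disease/test network and all of its numbers below are hypothetical illustrations, not from the proposal:

```python
# A two-node Bayesian network, Disease -> TestResult, with conditional
# probability tables supplied entirely by subjective expert judgement.
p_disease = 0.01                  # expert prior P(D=1)
p_pos_given = {1: 0.95, 0: 0.02}  # P(T=pos | D): sensitivity and false-positive rate

def posterior_disease_given_pos():
    """Exact inference by enumeration: P(D=1 | T=pos) via Bayes' rule."""
    joint = {d: (p_disease if d else 1.0 - p_disease) * p_pos_given[d]
             for d in (0, 1)}
    return joint[1] / (joint[0] + joint[1])

p = posterior_disease_given_pos()  # roughly 0.32: a positive test is far from conclusive
```

Even this toy network shows the point made above: no dataset is consulted, yet the model yields a quantitative, auditable answer from expert-elicited inputs alone.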
Max ERC Funding
1 572 562 €
Duration
Start date: 2014-04-01, End date: 2018-03-31
Project acronym BDE
Project Beyond Distance Estimates: A New Theory of Heuristics for State-Space Search
Researcher (PI) Malte HELMERT
Host Institution (HI) UNIVERSITAT BASEL
Call Details Consolidator Grant (CoG), PE6, ERC-2018-COG
Summary "Many problems in computer science can be cast as state-space search, where the
objective is to find a path from an initial state to a goal state in a
directed graph called a ""state space"". State-space search is challenging due
to the state explosion problem a.k.a. ""curse of dimensionality"": interesting
state spaces are often astronomically large, defying brute-force exploration.
State-space search has been a core research problem in Artificial Intelligence
since its early days and is alive as ever. Every year, a substantial fraction
of research published at the ICAPS and SoCS conferences is concerned with
state-space search, and the topic is very active at general AI conferences
such as IJCAI and AAAI.
Algorithms in the A* family, dating back to 1968, are still the go-to approach
for state-space search. A* is a graph search algorithm whose only
""intelligence"" stems from a so-called ""heuristic function"", which estimates
the distance from a state to the nearest goal state. The efficiency of A*
depends on the accuracy of this estimate, and decades of research have pushed
the envelope in devising increasingly accurate estimates.
In this project, we question the ""A* + distance estimator"" paradigm and
explore three new directions that go beyond the classical approach:
1. We propose a new paradigm of declarative heuristics, where heuristic
information is not represented as distance estimates, but as properties of
solutions amenable to introspection and general reasoning.
2. We suggest moving the burden of creativity away from the human expert by
casting heuristic design as a meta-optimization problem that can be solved
automatically.
3. We propose abandoning the idea of exploring sequential paths in state
spaces, instead transforming state-space search into combinatorial
optimization problems with no explicit sequencing aspect. We argue that the
""curse of sequentiality"" is as bad as the curse of dimensionality and must
be addressed head-on."
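The classical "A* + distance estimator" paradigm that the proposal questions can be sketched in a few lines. The toy 5x5 grid problem and Manhattan-distance heuristic below are illustrative choices, not part of the proposal:

```python
import heapq

def astar(start, goal, neighbors, h):
    """A* search: expand states by f(n) = g(n) + h(n), where h estimates
    the remaining distance to the nearest goal state."""
    frontier = [(h(start), 0, start, [start])]  # (f, g, state, path)
    best_g = {start: 0}
    while frontier:
        f, g, state, path = heapq.heappop(frontier)
        if state == goal:
            return path
        for nxt, cost in neighbors(state):
            g2 = g + cost
            if g2 < best_g.get(nxt, float("inf")):
                best_g[nxt] = g2
                heapq.heappush(frontier, (g2 + h(nxt), g2, nxt, path + [nxt]))
    return None  # goal unreachable

# Toy instance: 4-connected 5x5 grid, unit step costs,
# admissible Manhattan-distance heuristic.
def grid_neighbors(s):
    x, y = s
    return [((x + dx, y + dy), 1)
            for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
            if 0 <= x + dx < 5 and 0 <= y + dy < 5]

goal = (4, 4)
manhattan = lambda s: abs(s[0] - goal[0]) + abs(s[1] - goal[1])
path = astar((0, 0), goal, grid_neighbors, manhattan)  # optimal 8-step path
```

All of the algorithm's "intelligence" sits in the `h` argument; the three research directions above replace or bypass exactly that component.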
Max ERC Funding
1 997 510 €
Duration
Start date: 2019-02-01, End date: 2024-01-31
Project acronym BIGBAYES
Project Rich, Structured and Efficient Learning of Big Bayesian Models
Researcher (PI) Yee Whye Teh
Host Institution (HI) THE CHANCELLOR, MASTERS AND SCHOLARS OF THE UNIVERSITY OF OXFORD
Call Details Consolidator Grant (CoG), PE6, ERC-2013-CoG
Summary As datasets grow ever larger in scale, complexity and variety, there is an increasing need for powerful machine learning and statistical techniques that are capable of learning from such data. Bayesian nonparametrics is a promising approach to data analysis that is increasingly popular in machine learning and statistics. Bayesian nonparametric models are highly flexible models with infinite-dimensional parameter spaces that can be used to directly parameterise and learn about functions, densities, conditional distributions, etc., and have been successfully applied to regression, survival analysis, language modelling, time series analysis, and visual scene analysis among others. However, to successfully use Bayesian nonparametric models to analyse the high-dimensional and structured datasets now commonly encountered in the age of Big Data, we will have to overcome a number of challenges. Namely, we need to develop Bayesian nonparametric models that can learn rich representations from structured data, and we need computational methodologies that can scale effectively to the large and complex models of the future. We will ground our developments in relevant applications, particularly to natural language processing (learning distributed representations for language modelling and compositional semantics) and genetics (modelling genetic variations arising from population, genealogical and spatial structures).
Max ERC Funding
1 918 092 €
Duration
Start date: 2014-05-01, End date: 2019-04-30
Project acronym BIGCODE
Project Learning from Big Code: Probabilistic Models, Analysis and Synthesis
Researcher (PI) Martin Vechev
Host Institution (HI) EIDGENOESSISCHE TECHNISCHE HOCHSCHULE ZUERICH
Call Details Starting Grant (StG), PE6, ERC-2015-STG
Summary The goal of this proposal is to fundamentally change the way we build and reason about software. We aim to develop new kinds of statistical programming systems that provide probabilistically likely solutions to tasks that are difficult or impossible to solve with traditional approaches.
These statistical programming systems will be based on probabilistic models of massive codebases (also known as "Big Code") built via a combination of advanced programming languages and powerful machine learning and natural language processing techniques. To solve a particular challenge, a statistical programming system will query a probabilistic model, compute the most likely predictions, and present those to the developer.
Based on probabilistic models of "Big Code", we propose to investigate new statistical techniques in the context of three fundamental research directions: i) statistical program synthesis where we develop techniques that automatically synthesize and predict new programs, ii) statistical prediction of program properties where we develop new techniques that can predict important facts (e.g., types) about programs, and iii) statistical translation of programs where we investigate new techniques for statistical translation of programs (e.g., from one programming language to another, or to a natural language).
We believe the research direction outlined in this interdisciplinary proposal opens a new and exciting area of computer science. This area will combine sophisticated statistical learning and advanced programming language techniques for building the next-generation statistical programming systems.
We expect the results of this proposal to have an immediate impact upon millions of developers worldwide, triggering a paradigm shift in the way tomorrow's software is built, as well as a long-lasting impact on scientific fields such as machine learning, natural language processing, programming languages and software engineering.
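The "query a probabilistic model, compute the most likely predictions, present those to the developer" loop can be illustrated with a deliberately tiny bigram model over code tokens, a stand-in for the far richer models the proposal envisions; the toy corpus is invented:

```python
from collections import Counter, defaultdict

def train_bigram(token_streams):
    """Fit bigram counts over tokenized programs from a (toy) code corpus."""
    counts = defaultdict(Counter)
    for tokens in token_streams:
        for prev, nxt in zip(tokens, tokens[1:]):
            counts[prev][nxt] += 1
    return counts

def predict_next(counts, prev):
    """Return the most likely next token after `prev` (the top suggestion)."""
    return counts[prev].most_common(1)[0][0] if counts[prev] else None

corpus = [
    ["for", "i", "in", "range", "(", "n", ")", ":"],
    ["for", "x", "in", "xs", ":"],
    ["if", "x", "in", "xs", ":"],
]
counts = train_bigram(corpus)
suggestion = predict_next(counts, "in")  # "xs" is the most frequent successor
```

A real "Big Code" system would condition on program structure and semantics rather than a single preceding token, but the interaction pattern, rank candidates by probability and surface the best one, is the same.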
Max ERC Funding
1 500 000 €
Duration
Start date: 2016-04-01, End date: 2021-03-31
Project acronym BIMPC
Project Biologically-Inspired Massively-Parallel Computation
Researcher (PI) Stephen Byram Furber
Host Institution (HI) THE UNIVERSITY OF MANCHESTER
Call Details Advanced Grant (AdG), PE6, ERC-2012-ADG_20120216
Summary "We aim to establish a world-leading research capability in Europe for advancing novel models of asynchronous computation based upon principles inspired by brain function. This work will accelerate progress towards an understanding of how the potential of brain-inspired many-core architectures may be harnessed. The results will include new brain-inspired models of asynchronous computation and new brain- inspired approaches to fault-tolerance and reliability in complex computer systems.
Many-core processors are now established as the way forward for computing from embedded systems to supercomputers. An emerging problem with leading-edge silicon technology is a reduction in the yield and reliability of modern processors due to high variability in the manufacture of the components and interconnect as transistor geometries shrink towards atomic scales. We are faced with the longstanding problem of how to make use of a potentially large array of parallel processors, but with the new constraint that the individual elements are the system are inherently unreliable.
The human brain remains as one of the great frontiers of science – how does this organ upon which we all depend so critically actually do its job? A great deal is known about the underlying technology – the neuron – and we can observe large-scale brain activity through techniques such as magnetic resonance imaging, but this knowledge barely starts to tell us how the brain works. Something is happening at the intermediate levels of processing that we have yet to begin to understand, but the essence of the brain's massively-parallel information processing capabilities and robustness to component failure lies in these intermediate levels.
These two issues draw us towards two high-level research questions:
• Can our growing understanding of brain function point the way to more efficient parallel, fault-tolerant computing?
• Can massively parallel computing resources accelerate our understanding of brain function?"
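The brain-inspired models of asynchronous computation described above typically build on simple spiking-neuron primitives. As a hedged illustration (a generic leaky integrate-and-fire model with illustrative parameters, not the project's own architecture), the following sketch shows the threshold-and-reset dynamics such models use:

```python
def lif_spikes(current, steps=200, dt=1.0, tau=20.0, v_thresh=1.0, v_reset=0.0):
    """Simulate a leaky integrate-and-fire neuron; return spike times (ms)."""
    v, spikes = 0.0, []
    for t in range(steps):
        # Forward-Euler step of dv/dt = (-v + I) / tau
        v += dt * (-v + current) / tau
        if v >= v_thresh:          # threshold crossing emits a spike
            spikes.append(t * dt)
            v = v_reset            # membrane potential resets after the spike
    return spikes

# A stronger input current drives the neuron to spike more often
weak, strong = lif_spikes(1.2), lif_spikes(3.0)
```

The event-driven character of the output (sparse spike times rather than a dense signal) is what makes such models a natural fit for asynchronous many-core hardware.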
Max ERC Funding
2 399 761 €
Duration
Start date: 2013-03-01, End date: 2018-02-28
Project acronym Bio-Phononics
Project Advanced Microfluidics & Diagnostics using Acoustic Holograms – Bio-Phononics
Researcher (PI) Jonathan Cooper
Host Institution (HI) UNIVERSITY OF GLASGOW
Call Details Advanced Grant (AdG), LS7, ERC-2013-ADG
Summary This proposal seeks to develop a novel technique for fluid and particle manipulations, based upon exploiting the mechanical interactions between acoustic waves and phononic lattices. The new platform involves generating surface acoustic waves (SAWs) on piezoelectric chips, but, unlike previous work, the ultrasonic waves are first coupled into a phononic lattice placed in their path. The phononic lattice comprises a miniaturised array of mechanical elements which modulates the sound in a manner analogous to how light is “patterned” using a hologram. However, whilst in an optical hologram the pattern is created by exploiting the differences in refractive indices of the elements of the structure, here the ultrasonic field is modulated both by the elastic contrast between the elements in the array and by the dimensions of the array and its surrounding matrix (including the size and pitch of the features within the array). The result of passing the acoustic wave through a phononic crystal is the formation of new and complex ultrasonic landscapes.
As part of the proposed work we aim to understand the physics of this technology and to exploit its development in a range of medical devices. We will show that by using phononic crystals it is possible to create highly controllable patterns of acoustic field intensities, which propagate into the fluid, creating pressure differences that result in unique flow patterns and enable a new platform for applications including biological sample processing, medical diagnostics, drug delivery and blood clotting devices – all on low cost disposable devices. Different frequencies of ultrasound will interact with different phononic structures to perform different functions, providing a versatile toolbox. Just as in electronics, where discrete components are combined to create circuits, so we propose to combine different phononic lattices to create fluidic microcircuits with important new applications.
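The role of elastic contrast can be made concrete with the textbook intensity reflection coefficient at a planar interface, R = ((Z2 − Z1)/(Z2 + Z1))², where Z = ρc is the characteristic acoustic impedance. The material values below are standard approximations for water and silicon, chosen for illustration only (they are not figures from the proposal):

```python
def acoustic_impedance(density, speed):
    """Characteristic acoustic impedance Z = rho * c (kg m^-2 s^-1)."""
    return density * speed

def intensity_reflection(z1, z2):
    """Fraction of incident acoustic intensity reflected at a planar interface."""
    return ((z2 - z1) / (z2 + z1)) ** 2

z_water = acoustic_impedance(1000.0, 1480.0)   # water, ~1.5 MRayl
z_solid = acoustic_impedance(2330.0, 8430.0)   # silicon, ~20 MRayl

# Large impedance mismatch: most of the incident intensity is reflected,
# which is what lets an array of solid elements reshape the sound field.
r = intensity_reflection(z_water, z_solid)
```

With these values roughly three quarters of the incident intensity is reflected at each element, which is why the geometry and pitch of the array can sculpt the transmitted field so strongly.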
Max ERC Funding
2 208 594 €
Duration
Start date: 2014-04-01, End date: 2019-03-31
Project acronym BiocatSusChem
Project Biocatalysis for Sustainable Chemistry – Understanding Oxidation/Reduction of Small Molecules by Redox Metalloenzymes via a Suite of Steady State and Transient Infrared Electrochemical Methods
Researcher (PI) Kylie VINCENT
Host Institution (HI) THE CHANCELLOR, MASTERS AND SCHOLARS OF THE UNIVERSITY OF OXFORD
Call Details Consolidator Grant (CoG), PE4, ERC-2018-COG
Summary Many significant global challenges in catalysis for energy and sustainable chemistry have already been solved in nature. Metalloenzymes within microorganisms catalyse the transformation of carbon dioxide into simple carbon building blocks or fuels, the reduction of dinitrogen to ammonia under ambient conditions and the production and utilisation of dihydrogen. Catalytic sites for these reactions are necessarily based on metals that are abundant in the environment, including iron, nickel and molybdenum. However, attempts to generate biomimetic catalysts have largely failed to reproduce the high activity, stability and selectivity of enzymes. Proton and electron transfer and substrate binding are all finely choreographed, and we do not yet understand how this is achieved. This project develops a suite of new experimental infrared (IR) spectroscopy tools to probe and understand mechanisms of redox metalloenzymes in situ during electrochemically-controlled steady state turnover, and during electron-transfer-triggered transient studies. The ability of IR spectroscopy to report on the nature and strength of chemical bonds makes it ideally suited to follow the activation and transformation of small molecule reactants at metalloenzyme catalytic sites, binding of inhibitors, and protonation of specific sites. By extending to the far-IR, or introducing mid-IR-active probe amino acids, redox and structural changes in biological electron relay chains also become accessible. Taking as models the enzymes nitrogenase, hydrogenase, carbon monoxide dehydrogenase and formate dehydrogenase, the project sets out to establish a unified understanding of central concepts in small molecule activation in biology. It will reveal precise ways in which chemical events are coordinated inside complex multicentre metalloenzymes, propelling a new generation of bio-inspired catalysts and uncovering new chemistry of enzymes.
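The claim that IR band positions report on bond strength follows from the harmonic-oscillator relation ν̃ = (1/2πc)√(k/μ). As a generic illustration (the force constant is a textbook approximation, not project data), a value of roughly 1860 N/m reproduces the well-known ~2143 cm⁻¹ stretch of carbon monoxide, and the isotope dependence through the reduced mass falls out of the same formula:

```python
import math

AMU = 1.66054e-27      # atomic mass constant, kg
C_CM = 2.99792458e10   # speed of light, cm/s

def stretch_wavenumber(k, m1_amu, m2_amu):
    """Harmonic-oscillator band position (cm^-1) from force constant k (N/m)."""
    mu = (m1_amu * m2_amu) / (m1_amu + m2_amu) * AMU  # reduced mass, kg
    return math.sqrt(k / mu) / (2 * math.pi * C_CM)

# Carbon monoxide: k ~ 1860 N/m gives a band near the observed CO stretch
nu_12co = stretch_wavenumber(1860.0, 12.0, 15.995)
nu_13co = stretch_wavenumber(1860.0, 13.003, 15.995)  # heavier isotope shifts lower
```

The isotope shift on the second line is the same physics that lets isotopic labelling assign bands in metalloenzyme spectra.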
Max ERC Funding
1 997 286 €
Duration
Start date: 2019-03-01, End date: 2024-02-29
Project acronym BioDisOrder
Project Order and Disorder at the Surface of Biological Membranes.
Researcher (PI) Alfonso DE SIMONE
Host Institution (HI) IMPERIAL COLLEGE OF SCIENCE TECHNOLOGY AND MEDICINE
Call Details Consolidator Grant (CoG), PE4, ERC-2018-COG
Summary Heterogeneous biomolecular mechanisms at the surface of cellular membranes are often fundamental to generate function and dysfunction in living systems. These processes are governed by transient and dynamical macromolecular interactions that pose tremendous challenges to current analytical tools, as the majority of these methods perform best in the study of well-defined and poorly dynamical systems. This proposal aims at a radical innovation in the characterisation of complex processes that are dominated by structural order and disorder, including those occurring at the surface of biological membranes such as cellular signalling, the assembly of molecular machinery, or the regulation of vesicular trafficking.
I outline a programme to realise a vision where the combination of experiments and theory can delineate a new analytical platform to study complex biochemical mechanisms at a multiscale level, and to elucidate their role in physiological and pathological contexts. To achieve this ambitious goal, my research team will develop tools based on the combination of nuclear magnetic resonance (NMR) spectroscopy and molecular simulations, which will enable probing the structure, dynamics, thermodynamics and kinetics of complex protein-protein and protein-membrane interactions occurring at the surface of cellular membranes. The ability to advance both the experimental and theoretical sides, and their combination, is fundamental to define the next generation of methods to achieve our transformative aims. We will provide evidence of the innovative nature of the proposed multiscale approach by addressing some of the great questions in neuroscience and elucidate the details of how functional and aberrant biological complexity is achieved via the fine tuning between structural order and disorder at the neuronal synapse.
Max ERC Funding
1 999 945 €
Duration
Start date: 2019-06-01, End date: 2024-05-31
Project acronym BIOIONS
Project Biological ions in the gas-phase: New techniques for structural characterization of isolated biomolecular ions
Researcher (PI) Caroline Dessent
Host Institution (HI) UNIVERSITY OF YORK
Call Details Starting Grant (StG), PE4, ERC-2007-StG
Summary Recent intensive research on the laser spectroscopy of neutral gas-phase biomolecules has yielded a detailed picture of their structures and conformational preferences away from the complications of the bulk environment. In contrast, work on ionic systems has been sparse despite the fact that many important molecular groups are charged under physiological conditions. To address this problem, we have developed a custom-built laser spectrometer, which incorporates a distinctive electrospray ionisation (ESI) cluster ion source, dedicated to producing biological anions (ATP, oligonucleotides) and their microsolvated clusters for structural characterization. Many previous laser spectrometers with ESI sources have suffered from producing "hot" congested spectra as the ions were produced at ambient temperatures. This is a particularly serious limitation for spectroscopic studies of biomolecules, since these systems can possess high internal energies due to the presence of numerous low frequency modes. Our spectrometer overcomes this problem by exploiting the newly developed physics technique of "buffer gas cooling" to produce cold ESI molecular ions. In this proposal, we now seek to exploit the new laser spectrometer to perform detailed spectroscopic interrogations of ESI-generated biomolecular anions and clusters. In addition to traditional ion-dissociation spectroscopies, we propose to develop two new laser spectroscopy techniques (two-color tuneable IR spectroscopy and dipole-bound excited state spectroscopy) to give the broadest possible structural characterizations of the systems of interest. Studies will focus on ATP/GTP anions, oligonucleotides, and sulphated and carboxylated sugars. These methodologies will provide a general approach for performing temperature-controlled spectroscopic characterizations of isolated biological ions, with measurements on the corresponding micro-solvated clusters providing details of how the molecules are perturbed by solvent.
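The spectral congestion from low-frequency modes, and the benefit of buffer-gas cooling, can be quantified with the mean thermal occupation of a harmonic mode, ⟨n⟩ = 1/(exp(hcν̃/kBT) − 1). The 100 cm⁻¹ mode and the 300 K / 10 K temperatures below are illustrative choices, not values from the proposal:

```python
import math

C2 = 1.4388  # second radiation constant h*c/k_B, in cm*K

def mean_quanta(wavenumber_cm, temperature_k):
    """Mean number of thermal quanta in a harmonic vibrational mode."""
    return 1.0 / math.expm1(C2 * wavenumber_cm / temperature_k)

# A 100 cm^-1 "floppy" mode: heavily populated at room temperature,
# essentially frozen out after buffer-gas cooling to ~10 K.
hot = mean_quanta(100.0, 300.0)
cold = mean_quanta(100.0, 10.0)
```

At room temperature such a mode carries more than one quantum on average (hence hot-band congestion), while after cooling its excited-state population is negligible, which is why the cold spectra are so much cleaner.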
Max ERC Funding
1 250 000 €
Duration
Start date: 2008-10-01, End date: 2015-06-30
Project acronym BIOMOL. SIMULATION
Project Development of multi-scale molecular models, force fields and computer software for biomolecular simulation
Researcher (PI) Willem Frederik Van Gunsteren
Host Institution (HI) EIDGENOESSISCHE TECHNISCHE HOCHSCHULE ZUERICH
Call Details Advanced Grant (AdG), PE4, ERC-2008-AdG
Summary During the past decades the PI has helped shape the research field of computer simulation of biomolecular systems at the atomic level. He has carried out one of the first molecular dynamics (MD) simulations of proteins, and has since then contributed many different methodological improvements and developed one of the major atomic-level force fields for simulations of proteins, carbohydrates, nucleotides and lipids. Methodology and force field have been implemented in a set of programs called GROMOS (GROningen MOlecular Simulation package), which is currently used in hundreds of academic and industrial research groups from over 50 countries on all continents. It is proposed to develop a next generation of molecular models, force fields, multi-scaling simulation methodology and software for biomolecular simulations which is at least an order of magnitude more accurate in terms of energetics, and which is 1000 times more efficient through the use of coarse-grained molecular models than the currently available software and models.
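As a minimal sketch of the molecular dynamics machinery the summary describes (illustrative reduced units, not GROMOS itself), the following integrates a Lennard-Jones dimer with the standard velocity Verlet scheme; a near-constant total energy along the trajectory is the usual sanity check for such integrators:

```python
def lj_force(r, eps=1.0, sigma=1.0):
    """Lennard-Jones pair force along the separation axis (positive = repulsive)."""
    s6 = (sigma / r) ** 6
    return 24 * eps * (2 * s6 * s6 - s6) / r

def lj_energy(r, eps=1.0, sigma=1.0):
    """Lennard-Jones pair potential energy."""
    s6 = (sigma / r) ** 6
    return 4 * eps * (s6 * s6 - s6)

def velocity_verlet(r, v, dt=0.001, steps=2000, m=1.0):
    """Integrate the relative coordinate of an LJ dimer; return (r, E_total) pairs."""
    f = lj_force(r)
    traj = []
    for _ in range(steps):
        r += v * dt + 0.5 * f / m * dt * dt   # position update
        f_new = lj_force(r)
        v += 0.5 * (f + f_new) / m * dt       # velocity update with averaged force
        f = f_new
        traj.append((r, 0.5 * m * v * v + lj_energy(r)))
    return traj

# Released from rest at r = 1.3, the dimer oscillates in the potential well
traj = velocity_verlet(r=1.3, v=0.0)
```

Production force fields like GROMOS add bonded terms, electrostatics, and thousands of particles, but the update loop above is the same symplectic scheme at their core.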
Max ERC Funding
1 320 000 €
Duration
Start date: 2008-11-01, End date: 2014-09-30
Project acronym BIONET
Project Network Topology Complements Genome as a Source of Biological Information
Researcher (PI) Natasa Przulj
Host Institution (HI) UNIVERSITY COLLEGE LONDON
Call Details Starting Grant (StG), PE6, ERC-2011-StG_20101014
Summary Genetic sequences have had an enormous impact on our understanding of biology. The expectation is that biological network data will have a similar impact. However, progress is hindered by a lack of sophisticated graph theoretic tools that will mine these large networked datasets.
In recent breakthrough work at the boundary of computer science and biology, supported by my USA NSF CAREER award, I developed sensitive network analysis, comparison and embedding tools which demonstrated that protein-protein interaction networks of eukaryotes are best modeled by geometric graphs. These tools also established an unprecedented, phenotypically validated link between network topology and biological function and disease. Now I propose to substantially extend these preliminary results and design sensitive and robust network alignment methods that will lead to uncovering unknown biology and evolutionary relationships. The potential ground-breaking impact of such network alignment tools could parallel that of the BLAST family of sequence alignment tools, which revolutionized our understanding of biological systems and therapeutics. Furthermore, I propose to develop additional sophisticated graph theoretic techniques to mine network data and hence complement the biological information that can be extracted from sequence. I propose to exploit these new techniques for biological applications in collaboration with experimentalists at Imperial College London: 1. aligning biological networks of species whose genomes are closely related, but that have very different phenotypes, in order to uncover systems-level factors that contribute to pronounced differences; 2. comparing and contrasting stress response pathways and metabolic pathways in bacteria in a unified systems-level framework and exploiting the findings for: (a) bioengineering of micro-organisms for industrial applications (production of bio-fuels, bioremediation, production of biopolymers); (b) biomedical applications.
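One reason geometric graphs fit protein-protein interaction networks is a standard diagnostic: a random geometric graph has far higher clustering than an Erdős–Rényi graph of the same density, mirroring the high clustering observed in real PPI data. The sketch below (illustrative sizes and radius, stdlib only) makes that comparison:

```python
import random

random.seed(42)

def geometric_graph(n, radius):
    """Random geometric graph: points in the unit square, edge if closer than radius."""
    pts = [(random.random(), random.random()) for _ in range(n)]
    return {(i, j) for i in range(n) for j in range(i + 1, n)
            if (pts[i][0] - pts[j][0]) ** 2 + (pts[i][1] - pts[j][1]) ** 2 < radius ** 2}

def er_graph(n, m):
    """Erdos-Renyi graph with exactly m edges, for a density-matched comparison."""
    all_pairs = [(i, j) for i in range(n) for j in range(i + 1, n)]
    return set(random.sample(all_pairs, m))

def mean_clustering(n, edges):
    """Average local clustering coefficient over all n nodes."""
    adj = {i: set() for i in range(n)}
    for i, j in edges:
        adj[i].add(j)
        adj[j].add(i)
    total = 0.0
    for i in range(n):
        nbrs = list(adj[i])
        k = len(nbrs)
        if k < 2:
            continue
        # Count edges among the neighbours of i (closed triangles)
        links = sum(1 for a in range(k) for b in range(a + 1, k)
                    if nbrs[b] in adj[nbrs[a]])
        total += 2 * links / (k * (k - 1))
    return total / n

n = 200
geo = geometric_graph(n, 0.15)
er = er_graph(n, len(geo))   # same node and edge counts, different wiring
```

Comparing `mean_clustering(n, geo)` with `mean_clustering(n, er)` shows the geometric model's neighbours-of-neighbours locality, which Erdős–Rényi wiring lacks.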
Max ERC Funding
1 638 175 €
Duration
Start date: 2012-01-01, End date: 2017-12-31
Project acronym BioNet
Project Dynamical Redesign of Biomolecular Networks
Researcher (PI) Edina ROSTA
Host Institution (HI) KING'S COLLEGE LONDON
Call Details Starting Grant (StG), PE4, ERC-2017-STG
Summary Enzymes created by Nature are still more selective and can be orders of magnitude more efficient than man-made catalysts, in spite of recent advances in the design of de novo catalysts and in enzyme redesign. The optimal engineering of either small-molecule or complex biological catalysts requires both (i) accurate quantitative computational methods capable of a priori assessing catalytic efficiency, and (ii) molecular design principles and corresponding algorithms to achieve, understand and control biomolecular catalytic function and mechanisms. Presently, the computational design of biocatalysts is challenging due to the need for accurate yet computationally-intensive quantum mechanical calculations of bond formation and cleavage, as well as the requirement for proper statistical sampling over very many degrees of freedom. Pioneering enhanced sampling and analysis methods have been developed to address crucial challenges bridging the gap between the accessible simulation length and the biologically relevant timescales. However, biased simulations do not generally permit the direct calculation of kinetic information. Recently, I and others pioneered simulation tools that enable accurate calculations not only of free energies but also of the intrinsic molecular kinetics and the underlying reaction mechanisms. I propose to develop more robust, automatic, and system-tailored sampling algorithms that are optimal in each case. I will use our kinetics-based methods to develop a novel theoretical framework to address catalytic efficiency, to establish molecular design principles for key design problems in new bio-inspired nanocatalysts, and to identify and characterize small molecule modulators of enzyme activity. This is a highly interdisciplinary project that will enable fundamental advances in molecular simulations and will unveil the physical principles that lead to design and control of catalysis with Nature-like efficiency.
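As a generic illustration of extracting kinetics from unbiased trajectories (a simple transition-counting estimate in the spirit of Markov-state modelling, not the project's own methods), the sketch below recovers the transition probabilities of a two-state jump process from a simulated trajectory:

```python
import random

random.seed(1)

def simulate_two_state(p_ab, p_ba, steps):
    """Discrete-time two-state jump process; returns the state trajectory."""
    s, traj = 0, []
    for _ in range(steps):
        r = random.random()
        if s == 0 and r < p_ab:
            s = 1
        elif s == 1 and r < p_ba:
            s = 0
        traj.append(s)
    return traj

def estimated_exit_probs(traj):
    """Maximum-likelihood per-step exit probabilities from pair counts."""
    counts = {(a, b): 0 for a in (0, 1) for b in (0, 1)}
    for a, b in zip(traj, traj[1:]):
        counts[(a, b)] += 1
    return {a: counts[(a, 1 - a)] / (counts[(a, 0)] + counts[(a, 1)])
            for a in (0, 1)}

# Simulate with known rates, then try to recover them from the trajectory alone
traj = simulate_two_state(p_ab=0.02, p_ba=0.05, steps=200000)
est = estimated_exit_probs(traj)
```

For real molecular systems the states come from clustering configurations and the trajectories from MD, but the estimation principle (counting observed transitions) is the same; the difficulty the summary highlights is that biased-sampling trajectories break this direct counting.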
Max ERC Funding
1 499 999 €
Duration
Start date: 2018-02-01, End date: 2023-01-31
Project acronym BioProbe
Project VERTICAL MICROFLUIDIC PROBE: A nanoliter "Swiss army knife" for chemistry and physics at biological interfaces
Researcher (PI) Govindkrishna Govind Kaigala
Host Institution (HI) IBM RESEARCH GMBH
Call Details Starting Grant (StG), LS7, ERC-2012-StG_20111109
Summary Life is fundamentally characterised by order, compartmentalisation and biochemical reactions, which occur at the right place and the right time – within, on the surface of, and between cells. Only a proportion of life processes can be addressed with contemporary approaches like liquid encapsulations (e.g. droplets) or engineered compartments (e.g. scaffolds). I believe these approaches are severely limited. I am convinced that a technique to study, work with and locally probe adherent cells & tissues at micrometer distances from cell surfaces in “open space” would represent a major advance for the biology of biointerfaces. I therefore propose a non-contact, scanning technology which spatially confines nanoliter volumes of chemicals for interacting with cells at the µm-length scale. This technology, called the vertical microfluidic probe (vMFP) – which I developed at IBM-Zurich – shapes liquid on surfaces hydrodynamically and is compatible with samples on Petri dishes & microtiter plates. The project is organized in 4 themes:
(1) Advancing the vMFP by understanding the interaction of liquid flows with biointerfaces, integrating functional elements (e.g. heaters/electrodes, cell traps) & precision control.
(2) Developing a higher resolution method to stain tissue sections for multiple markers & better quality information.
(3) Retrieving rare elements such as circulating tumor cells from biologically diverse libraries.
(4) Patterning cells for applications in regenerative medicine.
Since cells & tissues will no longer be limited by closed systems, the vMFP will enable a completely new range of experiments to be performed in a highly interactive, versatile & precise manner – this approach departs from classical “closed” microfluidics. It is very likely that such a tool, by providing multifunctional capabilities akin to the proverbial ‘Swiss army knife’, will be a unique facilitator for investigations of previously unapproachable problems in cell biology & the life sciences.
Max ERC Funding
1 488 600 €
Duration
Start date: 2013-01-01, End date: 2017-12-31
Project acronym BRCA-ERC
Project Understanding cancer development in BRCA 1/2 mutation carriers for improved Early detection and Risk Control
Researcher (PI) Martin WIDSCHWENDTER
Host Institution (HI) UNIVERSITY COLLEGE LONDON
Call Details Advanced Grant (AdG), LS7, ERC-2016-ADG
Summary Recent evidence demonstrates that cancer is overtaking cardiovascular disease as the number one cause of mortality in Europe. This is largely due to the lack of preventative measures for common (e.g. breast) or highly fatal (e.g. ovarian) human cancers. Most cancers are multifactorial in origin. The core hypothesis of this research programme is that the extremely high risk of BRCA1/2 germline mutation carriers to develop breast and ovarian cancer is a net consequence of cell-autonomous (direct effect of BRCA mutation in cells at risk) and cell non-autonomous (produced in distant organs and affecting organs at risk) factors which both trigger epigenetic, cancer-initiating effects.
The project’s aims are centered around the principles of systems medicine and built on a large cohort of BRCA mutation carriers and controls who will be offered newly established cancer screening programmes. We will uncover how ‘cell non-autonomous’ factors work, provide detail on the epigenetic changes in at-risk tissues and investigate whether these changes are mechanistically linked to cancer, and study whether we can neutralise this process and measure success in the organs at risk, and ideally in easy-to-access samples such as blood, buccal and cervical cells.
In my Department for Women’s Cancer we have assembled a powerful interdisciplinary team including computational biologists, functionalists, immunologists and clinician scientists linked to leading patient advocacy groups, which is extremely well placed to lead this pioneering project to develop a fundamental understanding of cancer development in women with BRCA mutations. Resetting the epigenome, re-establishing normal cell identity and consequently reducing cancer risk without the need for surgery, while being able to monitor efficacy using multicellular epigenetic outcome predictors, would be a major scientific and medical breakthrough, possibly applicable to other chronic diseases.
Max ERC Funding
2 497 841 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym BroadSem
Project Induction of Broad-Coverage Semantic Parsers
Researcher (PI) Ivan Titov
Host Institution (HI) THE UNIVERSITY OF EDINBURGH
Call Details Starting Grant (StG), PE6, ERC-2015-STG
Summary In the last one or two decades, language technology has achieved a number of important successes, for example, producing functional machine translation systems and beating humans in quiz games. The key bottleneck which prevents further progress in these and many other natural language processing (NLP) applications (e.g., text summarization, information retrieval, opinion mining, dialog and tutoring systems) is the lack of accurate methods for producing meaning representations of texts. Accurately predicting such meaning representations on an open domain with an automatic parser is a challenging and unsolved problem, primarily because of language variability and ambiguity. The reason for the unsatisfactory performance is reliance on supervised learning (learning from annotated resources), with the amounts of annotation required for accurate open-domain parsing exceeding what is practically feasible. Moreover, representations defined in these resources typically do not provide abstractions suitable for reasoning.
In this project, we will induce semantic representations from large amounts of unannotated data (i.e. text which has not been labeled by humans) while guided by information contained in human-annotated data and other forms of linguistic knowledge. This will allow us to scale our approach to many domains and across languages. We will specialize meaning representations for reasoning by modeling relations (e.g., facts) appearing across sentences in texts (document-level modeling), across different texts, and across texts and knowledge bases. Learning to predict this linked data is closely related to learning to reason, including learning the notions of semantic equivalence and entailment. We will jointly induce semantic parsers (e.g., log-linear feature-rich models) and reasoning models (latent factor models) relying on this data, thus ensuring that the semantic representations are informative for applications requiring reasoning.
Max ERC Funding
1 457 185 €
Duration
Start date: 2016-05-01, End date: 2021-04-30
Project acronym BUNGEE-TOOLS
Project Building Next-Generation Computational Tools for High Resolution Neuroimaging Studies
Researcher (PI) Juan Eugenio Iglesias
Host Institution (HI) UNIVERSITY COLLEGE LONDON
Call Details Starting Grant (StG), PE6, ERC-2015-STG
Summary Recent advances in magnetic resonance (MR) acquisition technology are providing us with images of the human brain of increasing detail and resolution. While these images hold promise to greatly increase our understanding of such a complex organ, the neuroimaging community relies on tools (e.g. SPM, FSL, FreeSurfer) which, being over a decade old, were designed to work at much lower resolutions. These tools do not consider brain substructures that are visible in present-day scans, and this inability to capitalize on the vast improvement of MR is hampering progress in the neuroimaging field.
In this ambitious project, which lies at the nexus of medical histology, neuroscience, biomedical imaging, computer vision and statistics, we propose to build a set of next-generation computational tools that will enable neuroimaging studies to take full advantage of the increased resolution of modern MR technology. The core of the tools will be an ultra-high resolution probabilistic atlas of the human brain, built upon multimodal data combining histology and ex vivo MR. The resulting atlas will be used to analyze in vivo brain MR scans, which will require the development of Bayesian segmentation methods beyond the state of the art.
The developed tools, which will be made freely available to the scientific community, will enable the analysis of MR data at a superior level of structural detail, opening completely new opportunities of research in neuroscience. Therefore, we expect the tools to have a tremendous impact on the quest to understand the human brain (in health and in disease), and ultimately on public health and the economy.
Max ERC Funding
1 450 075 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym C-POS
Project Children's Palliative care Outcome Scale
Researcher (PI) RICHARD HARDING-SWALE
Host Institution (HI) KING'S COLLEGE LONDON
Call Details Consolidator Grant (CoG), LS7, ERC-2017-COG
Summary Person-centred care is a core health value of modern health care. The overarching aim of C-POS (Children's Palliative care Outcome Scale) is to develop and validate a person-centred outcome measure for children and young people (CYP) and their families affected by life-limiting & life-threatening conditions (LLLTC). International systematic reviews and clinical guides have highlighted that currently none exists. This novel study will draw together a unique multidisciplinary collaboration to pioneer new methods, enabling engagement in outcome measurement by a population currently neglected in research.
C-POS builds on an international program of work. The sequential mixed-methods design will collect substantive data through objectives to determine: i) the primary concerns of CYP and their families affected by LLLTC & preferences to enable participation in ethical person-centred measurement (n=50); ii) the views of clinicians and commissioners on optimal implementation methods (national Delphi study); iii) a systematic review of current data collection tools for CYP regardless of condition; iv) integration of objectives i-iii to develop a tool (C-POS) with face and content validity; v) cognitive interviews to determine interpretability (n=40); vi) a longitudinal cohort of CYP and families to determine test-retest reliability, internal consistency, construct validity and responsiveness (n=151); vii) development of resources for routine implementation; and viii) translation and interpretation protocols for international adoption.
C-POS is an ambitious study that, for the first time, will enable measurement of person-centred outcomes of care. This will be a turning point in the scientific study of a hitherto neglected group.
Max ERC Funding
1 799 820 €
Duration
Start date: 2018-09-01, End date: 2023-02-28
Project acronym CAN-IT-BARRIERS
Project Disruption of systemic and microenvironmental barriers to immunotherapy of antigenic tumors
Researcher (PI) Douglas HANAHAN
Host Institution (HI) ECOLE POLYTECHNIQUE FEDERALE DE LAUSANNE
Call Details Advanced Grant (AdG), LS7, ERC-2018-ADG
Summary The frontier in cancer therapy of orchestrating the immune system to attack tumors is producing unprecedented survival benefit in some patients. The corollary is lack of efficacy both in ostensibly responsive tumor types as well as others that are mostly non-responsive. The basis lies in pre-existing and adaptive resistance mechanisms that circumvent induction of tumor-reactive cytotoxic T cells (CTLs) capable of infiltrating solid tumors and eliminating cancer cells. A priori, cancers induced by expression of human papillomavirus oncogenes should be responsive to immunotherapy: these cancers encode immunogenic neo-antigens – the oncoproteins E6/7 – necessary for their manifestation. Rather, such tumors are poorly responsive to immunotherapies. Results from my lab and others using mouse models of HPV-induced cancer have established an actionable hypothesis: during tumorigenesis, such tumors erect multiple barriers to the induction, infiltration, and killing of cancer cells by tumor antigen-reactive CTLs. These include overarching systemic antigen-nonspecific immunosuppression mediated by expanded populations of myeloid cells in spleen and lymph nodes, complemented by immune response-impairing barriers operative in the tumor microenvironment. A spectrum of models will probe these barriers, genetically and pharmacologically, establishing their functional importance, alone and in concert. A major focus will be on how oncogene-expressing keratinocytes elicit a marked expansion of immunosuppressive myeloid cells in spleen and lymph nodes, and how these myeloid cells in turn inhibit development and activation of CD8 T cells and antigen-presenting dendritic cells. Then we’ll assess the therapeutic potential of barrier-breaking strategies combined with immuno-stimulatory modalities. This project will deliver new knowledge about multi-faceted barriers to immunotherapy in these refractory cancers, helping lay the groundwork for efficacious immunotherapy.
Max ERC Funding
2 500 000 €
Duration
Start date: 2020-01-01, End date: 2024-12-31
Project acronym CANCEREVO
Project Deciphering and predicting the evolution of cancer cell populations
Researcher (PI) Marco Helmut GERLINGER
Host Institution (HI) THE INSTITUTE OF CANCER RESEARCH: ROYAL CANCER HOSPITAL
Call Details Consolidator Grant (CoG), LS7, ERC-2018-COG
Summary The fundamental evolutionary nature of cancer is well recognized, but an understanding of the dynamic evolutionary changes occurring throughout a tumour’s lifetime and their clinical implications is in its infancy. Current approaches that reveal cancer evolution by sequencing multiple biopsies remain of limited use in the clinic due to sample access problems in multi-metastatic disease. Circulating tumour DNA (ctDNA) is thought to comprehensively sample subclones across metastatic sites. However, available technologies either have high sensitivity but are restricted to the analysis of small gene panels, or they allow sequencing of large target regions such as exomes but with too limited sensitivity to detect rare subclones. We developed a novel error-corrected sequencing technology that will be applied to perform deep exome sequencing on longitudinal ctDNA samples from highly heterogeneous metastatic gastro-oesophageal carcinomas. This will track the evolution of the entire cancer cell population over the lifetime of these tumours, from metastatic disease through drug therapy to end-stage disease, and enable ground-breaking insights into the rules and mechanisms of cancer population evolution. Specifically, we will: 1. Define the genomic landscape and drivers of metastatic and end-stage disease. 2. Understand the rules of cancer evolutionary dynamics of entire cancer cell populations. 3. Predict cancer evolution and define the limits of predictability. 4. Rapidly identify drug resistance mechanisms to chemo- and immunotherapy based on signals of Darwinian selection such as parallel and convergent evolution. Our sequencing technology and analysis framework will also transform the way cancer evolution metrics can be accessed and interpreted in the clinic, which will have major impacts, ranging from better biomarkers to predict cancer evolution to the identification of drug targets that drive disease progression and therapy resistance.
Max ERC Funding
2 000 000 €
Duration
Start date: 2019-03-01, End date: 2024-02-29
Project acronym CancerExomesInPlasma
Project Non-invasive genomic analysis of cancer using circulating tumour DNA
Researcher (PI) Nitzan Rosenfeld
Host Institution (HI) THE CHANCELLOR MASTERS AND SCHOLARS OF THE UNIVERSITY OF CAMBRIDGE
Call Details Starting Grant (StG), LS7, ERC-2013-StG
Summary Non-invasive genomic analysis of cancer can revolutionize the study of tumour evolution, heterogeneity, and drug resistance. Clinically applied, this can transform current practice in cancer diagnosis and management. Cell-free DNA in plasma contains tumour-specific sequences. This circulating tumour DNA (ctDNA) is a promising source of genomic and diagnostic information, readily accessible non-invasively. The study of ctDNA is therefore timely and of great importance. But it is also very challenging. Measurement can be complex, and high-quality samples are not easily obtained. Though progress has been made, much remains to be discovered.
My lab pioneered the use of targeted sequencing to analyse mutations in ctDNA. We recently developed a ground-breaking paradigm for analysing evolving cancer genomes in plasma DNA, combining ctDNA quantification with exome-sequencing of serial plasma samples. Applied to extensive sets of clinical samples my lab has characterized, this will enable large-scale exploration of acquired drug resistance with unprecedented resolution. CancerExomesInPlasma aims to use ctDNA for genome-wide analysis of tumour evolution, as a means for non-invasive, unbiased discovery of genes and pathways involved in resistance to cancer therapy.
Max ERC Funding
1 769 380 €
Duration
Start date: 2014-05-01, End date: 2019-04-30
Project acronym CANCERINNOVATION
Project Using novel methodologies to target and image cancer invasion and therapeutic resistance
Researcher (PI) Margaret Frame
Host Institution (HI) THE UNIVERSITY OF EDINBURGH
Call Details Advanced Grant (AdG), LS7, ERC-2011-ADG_20110310
Summary We aim to develop and apply a suite of new technologies in a novel cancer discovery platform that will link high-definition cancer biology, via state-of-the-art disease imaging and pathway modelling, with the development of novel interrogative and therapeutic interventions to test in models of cancer that closely resemble human disease. The work will lead to a new understanding of cancer invasion, how to treat advanced disease in the metastatic niche, how to monitor therapeutic responses, and the compensatory mechanisms that cause acquired resistance. Platform development will be based on combined, cross-informing technologies that will enable us to predict optimal ‘maintenance therapies’ for metastatic disease by targeting cancer evolution and spread through combination therapy. A key strand of the platform is the development of quantitative multi-modal imaging in vivo using optical window technology to inform a detailed understanding of disease and drug mechanisms and the predictive capability of pathway biomarkers. Innovative methodologies are urgently needed to address declining approval rates of novel medicines and the unmet clinical need of treating cancer patients in the advanced disease setting, where tumour spread and survival generally continue unchecked by current therapies. This work will be largely pre-clinical, but will always be mindful of the clinical problem of managing late-stage human disease through the rational design of combination therapies with companion diagnostic tests. The cancer survival statistics will change if we can curb the continuing spread of aggressive, metastatic disease and resistance to therapy by taking smarter combined approaches that make best use of emerging technologies in an innovative way, particularly where they are more predictive of clinical efficacy.
Max ERC Funding
2 499 000 €
Duration
Start date: 2012-08-01, End date: 2017-07-31
Project acronym CAPER/BREAST CANCE
Project CAPER in Invasive Breast Cancer
Researcher (PI) Michael Lisanti
Host Institution (HI) THE UNIVERSITY OF MANCHESTER
Call Details Advanced Grant (AdG), LS7, ERC-2008-AdG
Summary
Breast cancer is a major cause of death in the United States and the Western World. Advanced medical technologies and therapeutic strategies are necessary for the successful detection, diagnosis, and treatment of breast cancer. Here, we propose to use novel technologies (tissue microarrays (TMA) and automated quantitative bioimaging (AQUA)) to identify new therapeutic and prognostic markers for human breast cancer. More specifically, we will study the activation status of a new signaling pathway which we have implicated in breast cancer pathogenesis, using both mouse models and cells in culture. For this purpose, we will study the association of CAPER expression with pre-malignant lesions and progression from pre-malignancy to full-blown breast cancer. We expect that this new molecular marker will allow us to improve diagnostic accuracy for individual patients, enhancing both prognostic predictions and the prediction of drug responsiveness for a given patient.
Max ERC Funding
1 500 000 €
Duration
Start date: 2010-01-01, End date: 2014-12-31
Project acronym CAPRI
Project Chemical and photochemical dynamics of reactions in solution
Researcher (PI) Andrew John Orr-Ewing
Host Institution (HI) UNIVERSITY OF BRISTOL
Call Details Advanced Grant (AdG), PE4, ERC-2011-ADG_20110209
Summary
Ultrafast laser methods will be employed to examine the dynamics of chemical and photochemical reactions in liquid solutions. By contrasting the solution-phase dynamics with those observed for isolated collisions in the gas phase, the fundamental role of the solvent in chemical pathways will be explored at a molecular level. The experimental studies will be complemented by computational simulations that explicitly treat the effects of solvent on reaction energy pathways and on reactant and product motions.
The research addresses a major challenge in chemistry: to understand the role of the solvent in the mechanisms of chemical reactions. Questions that will be examined include how the solvent modifies reaction barriers and other regions of the reaction potential energy surfaces (PESs), alters the couplings between PESs (most importantly at conical intersections between electronic states), influences and constrains the dynamical stereochemistry of passage through transition states, and dissipates excess product energy.
The experimental strategy will be to obtain absorption spectra of transient species with lifetimes of ~100 fs – 1000 ps using broad-bandwidth light sources in the infrared, visible and ultraviolet regions. The time evolution of such spectra reveals the formation and decay of short-lived species that might be highly reactive radicals or internally (vibrationally and electronically) excited molecules. The transient species decay by reaction or by energy loss to the solvent. Statistical mechanical theories of reactions in solution treat such processes using linear response theory, but the experimental data will challenge this paradigm by seeking evidence for breakdown of the linear response interaction of solvent and solute on short timescales, because of microscopic chemical dynamics that perturb the solvent structure. The work will build on our pioneering experiments at the Rutherford Appleton Laboratory that prove the feasibility of the methods.
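As a purely illustrative sketch (not part of the proposal), the "formation and decay" behaviour of a transient intermediate seen in such time-resolved spectra can be modelled with sequential first-order kinetics A → B → C. The rate constants below are hypothetical, chosen only to mimic sub-picosecond formation and few-picosecond decay to the solvent:

```python
import math

def intermediate_population(t, k1, k2):
    # Sequential first-order kinetics A --k1--> B --k2--> C (k1 != k2):
    # [B](t)/[A]0 = k1/(k2 - k1) * (exp(-k1*t) - exp(-k2*t))
    # The transient signal rises as B forms and falls as it decays.
    return k1 / (k2 - k1) * (math.exp(-k1 * t) - math.exp(-k2 * t))

# Hypothetical rate constants (times in ps): formation in ~200 fs,
# decay by reaction or energy loss to the solvent in ~5 ps.
k1 = 1.0 / 0.2   # 1/ps, formation of the transient
k2 = 1.0 / 5.0   # 1/ps, decay of the transient

# The transient concentration peaks at t_max = ln(k1/k2) / (k1 - k2).
t_max = math.log(k1 / k2) / (k1 - k2)
```

Fitting such biexponential profiles to the measured time-dependent absorption bands is one standard way to extract formation and decay timescales of the short-lived species.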
Max ERC Funding
2 666 684 €
Duration
Start date: 2012-02-01, End date: 2017-01-31
Project acronym CAPRI
Project Children and Adolescents with PaRental mental Illness: Understanding the ‘who’ and ‘how’ of targeting interventions
Researcher (PI) Kathryn Mary Francis Abel
Host Institution (HI) THE UNIVERSITY OF MANCHESTER
Call Details Consolidator Grant (CoG), LS7, ERC-2015-CoG
Summary
At least 10% of mothers and 5% of fathers have a mental illness. The family, educational and social lives of children and adolescents with parental mental illness (CAPRI) are disrupted by deprivation and repeated hospitalisation. This is an urgent political and public health concern. The Child and Adolescent Mental Health in Europe (CAMHEE) report urges us ‘to acknowledge and attend to the needs of children and families with parental mental health... ’, recommending better information on CAPRI risks and resilience so interventions can target those at highest risk. This groundbreaking interdisciplinary programme exploits my unique combination of expertise in epidemiology and neuroscience to deliver on the CAMHEE objectives for CAPRI.
Previous work has focused on these ‘high-risk’ children primarily to examine the heritability of mental illness. In a crucial departure from this, Work Packages (WP) 1 and 2 exploit my collaborations in Sweden and Australia to create unique linkage across three population datasets. This will detail CAPRI numbers and a broad range of life outcomes, disentangling the effects of social adversity over time. But population epidemiology alone cannot reveal how risk creates effects in individuals. To understand ‘how’, and to identify ‘who’ we target for costly interventions, WP 3 links the epidemiology with powerful neuroimaging (near-infrared spectroscopy, NIRS) to discover which at-risk infants of mothers with severe mental illness show abnormal cognitive development at the level of the individual brain.
This work capitalises on my role at the University of Manchester, one of the leading academic psychiatry and imaging centres in the UK, to create a new Centre in Bioepidemiology. My future aim is that epidemiological profiling combined with NIRS biomarkers of cognition in individuals will identify which high-risk children need which intervention. Future work can then evaluate different interventions, and fits seamlessly with my research goal of improving the life outcomes of CAPRI.
Max ERC Funding
1 999 338 €
Duration
Start date: 2016-10-01, End date: 2021-09-30