Project acronym CELLFITNESS
Project Active Mechanisms of Cell Selection: From Cell Competition to Cell Fitness
Researcher (PI) Eduardo Moreno Lampaya
Host Institution (HI) FUNDACAO D. ANNA SOMMER CHAMPALIMAUD E DR. CARLOS MONTEZ CHAMPALIMAUD
Call Details Consolidator Grant (CoG), LS3, ERC-2013-CoG
Summary The molecular mechanisms that mediate cell competition, cell fitness and cell selection are gaining interest. With innovative approaches, molecules and ground-breaking hypotheses, this field of research can help understand several biological processes such as development, cancer and tissue degeneration. The project has 3 clear and ambitious objectives: 1. We propose to identify all the key genes mediating cell competition and their molecular mechanisms. To reach this objective we will use data from two whole-genome screens in Drosophila in which we have identified 7 key genes. By the end of this CoG grant, we should have no big gaps in our knowledge of how slow-dividing cells are recognised and eliminated in Drosophila. 2. In addition, we will explore how general the cell competition pathways are and how they can impact biomedical research, with a focus on cancer and tissue degeneration. The interest in cancer is based on experiments in Drosophila and mice in which we and others have found that an active process of cell selection determines tumour growth. Preliminary results suggest that the pathways identified play important roles not only in the elimination of slow-dividing cells, but also during cancer initiation and progression. 3. We will further explore the role of cell competition in neuronal selection, especially during neurodegeneration, development of the retina and adult brain regeneration in Drosophila. This proposal is interdisciplinary in nature because it takes a basic cellular mechanism (the genetic pathways that select cells within tissues) and crosses boundaries between different fields of research: development, cancer, regeneration and tissue degeneration. In this ERC CoG proposal, we are committed to continuing our efforts from basic science to biomedical approaches. The phenomenon of cell competition and its participating genes have the potential to reveal novel biomarkers and therapeutic strategies against cancer and tissue degeneration.
Max ERC Funding
1 968 062 €
Duration
Start date: 2014-06-01, End date: 2019-05-31
Project acronym Danger ATP
Project Regulation of inflammatory response by extracellular ATP and P2X7 receptor signalling: through and beyond the inflammasome
Researcher (PI) Pablo Pelegrin Vivancos
Host Institution (HI) FUNDACION PARA LA FORMACION E INVESTIGACION SANITARIAS DE LA REGION DE MURCIA
Call Details Consolidator Grant (CoG), LS6, ERC-2013-CoG
Summary Inflammatory diseases affect over 80 million people worldwide and accompany many diseases of industrialized countries, the majority of them infection-free conditions. There are few efficient anti-inflammatory drugs to treat chronic inflammation, and thus there is an urgent need to validate novel targets. We now know that innate immunity is the main coordinator and driver of inflammation. Recently, we and others have shown that the activation of purinergic P2X7 receptors (P2X7R) in immune cells is a novel and increasingly validated pathway to initiate inflammation through the activation of the NLRP3 inflammasome and the release of the IL-1β and IL-18 cytokines. However, how NLRP3 senses P2X7R activation is not fully understood. Furthermore, extracellular ATP, the physiological P2X7R agonist, is a crucial danger signal released by injured cells, and one of the most important mediators of infection-free inflammation. We have also identified novel signalling roles for P2X7R independent of the NLRP3 inflammasome, including the release of proteases or inflammatory lipids. Therefore, P2X7R has generated increasing interest as a therapeutic target, with drug-like P2X7R antagonists now in clinical trials for inflammatory diseases. However, the functionality of P2X7R in vivo is often questioned, since extracellular ATP levels are thought to be below the threshold needed to activate P2X7R. The overall aim of this proposal is to elucidate how extracellular ATP controls host defence in vivo, ultimately depicting P2X7R signalling through and beyond inflammasome activation. We foresee that our results will generate innovative knowledge about in vivo extracellular ATP signalling during the host response to infection and sterile danger.
Max ERC Funding
1 794 948 €
Duration
Start date: 2014-09-01, End date: 2019-08-31
Project acronym EpiMechanism
Project Mechanisms of Chromatin-based Epigenetic Inheritance
Researcher (PI) Lars Jansen
Host Institution (HI) FUNDACAO CALOUSTE GULBENKIAN
Call Details Consolidator Grant (CoG), LS3, ERC-2013-CoG
Summary Epigenetic mechanisms heritably maintain gene expression states and chromosome organization across cell division. These include chromatin-based factors that are propagated independently of local DNA sequence elements, are critical for normal development and prevent reprogramming, e.g. during induction of pluripotency. We focus on the role of nucleosomes, the histone-DNA complexes that make up chromatin. While prominently implicated in epigenetic memory, how histones and their local modifications can actually be inherited is largely unknown. We take aim at three fundamental aspects that we argue are central to this problem: stability of the epigenetic mark, self-templated duplication, and cell cycle coupling.
We developed a unique pulse-labeling strategy to determine whether silent and active chromatin can be inherited and how this relates to transcription, both in cancer cells and in vitro differentiating stem cells. By coupling this strategy to an imaging-based RNAi screen, we aim to identify components controlling nucleosome assembly and heritability. We achieve this by focusing on the human centromere, the chromosome locus essential for chromosome segregation, which serves as an ideal model for epigenetic memory. This locus is specified by nucleosomes carrying the histone H3 variant CENP-A, which we have previously shown to be highly stable in cycling cells and to be replicated in a strictly cell-cycle-coupled manner. We build on our previous successes to uncover the molecular mechanism and cellular consequences of the coupling between CENP-A propagation and the cell cycle, which, we postulate, ensures proper centromere size and mitotic fidelity. Furthermore, by genome engineering we developed a strategy to delete an endogenous centromere to determine how centromeres can form de novo and how CENP-A chromatin, once formed, can template its own duplication. With this multi-faceted approach we aim to uncover general mechanistic principles of chromatin-based memory.
Max ERC Funding
1 621 400 €
Duration
Start date: 2014-06-01, End date: 2019-05-31
Project acronym Forecasting
Project New Methods and Applications for Forecast Evaluation
Researcher (PI) Barbara Rossi
Host Institution (HI) UNIVERSIDAD POMPEU FABRA
Call Details Consolidator Grant (CoG), SH1, ERC-2013-CoG
Summary Forecasting is a fundamental tool in Economics, Statistics, Business and other sciences. Judging whether forecasts are good and robust is of great importance since forecasts are used every day to guide policymakers' and practitioners' decisions. The proposal aims to address four important issues that researchers encounter in practice.
A first issue is how to assess whether forecasts are optimal in the presence of instabilities. Optimality is an important property of models’ forecasts: if forecasts are not optimal, then the model can be improved. Existing methods to assess forecast optimality are not robust to the presence of instabilities, which are widespread in the data. How to obtain such robust methods and what they tell us about widely used economic models’ forecasts is the first task of this project.
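As background, one classical optimality check (not the robust methods this project will develop) is the Mincer-Zarnowitz regression: regress realizations on forecasts and test whether the intercept is 0 and the slope is 1. A minimal sketch on simulated data, with all numbers hypothetical:

```python
import numpy as np

# Mincer-Zarnowitz regression: a textbook check of forecast optimality,
# shown only as an illustration of the idea. Under optimality, regressing
# realized values y_t on forecasts f_t should give intercept ~0, slope ~1.
rng = np.random.default_rng(0)
T = 500
f = rng.normal(size=T)                      # hypothetical forecasts
y = f + rng.normal(scale=0.5, size=T)       # realizations = forecast + noise

X = np.column_stack([np.ones(T), f])        # regressors: [1, f_t]
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
intercept, slope = beta
# For an optimal forecast, (intercept, slope) should be close to (0, 1).
print(round(intercept, 2), round(slope, 2))
```

Such tests are known to break down under the parameter instabilities that motivate this project's first task.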
A second problem faced by forecasters in practice is to evaluate density forecasts. Density forecasts are important tools for policymakers since they quantify uncertainty around forecasts. However, existing methodologies focus on a null hypothesis that is not necessarily the one of interest to the forecaster. The second task is to develop tests for forecast density evaluation that address forecasters’ needs.
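For context, the classical diagnostic underlying much density-forecast evaluation is the probability integral transform (PIT): if the forecast density is correct, the transformed realizations are uniform on (0, 1). A hedged sketch, using a standard normal forecast density purely as a stand-in:

```python
import math
import random

# Probability integral transform (PIT): if y_t truly follows the forecast
# density F_t, then u_t = F_t(y_t) is uniform on (0, 1). Illustration only;
# the project develops tests that go beyond this classical device.
def norm_cdf(x, mu=0.0, sigma=1.0):
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

random.seed(1)
ys = [random.gauss(0.0, 1.0) for _ in range(2000)]  # realizations
pit = [norm_cdf(y) for y in ys]                     # correct density forecast

# Under a correct density forecast, the PIT values have mean ~0.5.
mean_pit = sum(pit) / len(pit)
print(round(mean_pit, 2))
```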
A third, important question is “Why Do We Use Forecast Tests To Evaluate Models’ Performance?”. The third task of this project is to understand the relationship between traditional in-sample and forecast evaluation tests, and develop a framework that helps to understand under which circumstances forecast tests are more useful than typical in-sample tests.
A final question is how researchers can improve models that do not forecast well. Model misspecification is widespread, yet economists are often left wondering exactly which parts of their models are misspecified. The fourth task is to propose an empirical framework for addressing this issue. By estimating time-varying wedges, we assess where misspecification is located, and how important it is.
Max ERC Funding
501 860 €
Duration
Start date: 2014-07-01, End date: 2019-06-30
Project acronym GALACTICNUCLEUS
Project The Fingerprint of a Galactic Nucleus: A Multi-Wavelength, High-Angular Resolution, Near-Infrared Study of the Centre of the Milky Way
Researcher (PI) Rainer Schödel
Host Institution (HI) AGENCIA ESTATAL CONSEJO SUPERIOR DEINVESTIGACIONES CIENTIFICAS
Call Details Consolidator Grant (CoG), PE9, ERC-2013-CoG
Summary Galactic stellar nuclei are very common in all types of galaxies and are marked by the presence of nuclear star clusters, the densest and most massive star clusters in the present-day Universe. Their formation is still an unresolved puzzle. The centre of the Milky Way contains a massive black hole and a stellar nucleus and is orders of magnitude closer than any comparable target. It is the only galactic nucleus and the most extreme astrophysical environment that we can examine on scales of milli-parsecs. It is therefore a crucial laboratory for studying galactic nuclei and their role in the context of galaxy evolution. Yet, suitable data that would allow us to examine the stellar component of the Galactic Centre exist for less than 1% of its projected area. Moreover, the well-explored regions are extraordinary, like the central parsec around the massive black hole, and therefore probably not representative of the overall environment. Fundamental questions on the stellar population, structure and assembly history of the Galactic Centre therefore remain unanswered. This project aims to address the open questions by obtaining accurate, high-angular-resolution, multi-wavelength near-infrared photometry for an area of several 100 pc^2, a more than ten-fold increase compared to the current state of affairs. The Galactic Centre presents unique observational challenges because of a combination of high extinction and extreme stellar crowding. It is therefore not adequately covered by existing or upcoming imaging surveys. I present a project that is specifically tailored to overcome these observational challenges. In particular, I have developed a key technique to obtain the necessary sensitive, high-angular-resolution images with a stable point spread function over large, crowded fields. It works with a range of existing ground-based instruments and will serve to complement existing data to provide a global and detailed picture of the stellar nucleus of the Milky Way.
Max ERC Funding
1 547 657 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym HAPDEGMT
Project Harmonic Analysis, Partial Differential Equations and Geometric Measure Theory
Researcher (PI) Jose Maria Martell Berrocal
Host Institution (HI) AGENCIA ESTATAL CONSEJO SUPERIOR DEINVESTIGACIONES CIENTIFICAS
Call Details Consolidator Grant (CoG), PE1, ERC-2013-CoG
Summary The origin of Harmonic Analysis goes back to the study of heat diffusion, modeled by a differential equation, and the claim made by Fourier that every periodic function can be represented as a series of sines and cosines. In this statement we can find the motivation for many of the advances that have been made in this field. Partial Differential Equations model many phenomena from the natural, economic and social sciences. Existence, uniqueness, convergence to the boundary data, regularity of solutions, a priori estimates, etc., can be studied for a given PDE. Often, Harmonic Analysis plays an important role in such problems and, when the scenarios are not very friendly, Harmonic Analysis turns out to be fundamental. Not-very-friendly scenarios are those where one lacks smoothness either in the coefficients of the PDE or in the domains where the PDE is solved. Some of these problems lead to proving the boundedness of certain singular integral operators, and this drives one to the classical and modern Calderón-Zygmund theory, the paradigm of Harmonic Analysis. When studying the behavior of the solutions of a given PDE near the boundary, one needs to understand the geometrical features of the domains, and then Geometric Measure Theory jumps into the picture.
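Fourier's claim, in its modern textbook form for a sufficiently regular 2π-periodic function f, reads:

```latex
f(x) \sim \frac{a_0}{2} + \sum_{n=1}^{\infty} \bigl( a_n \cos nx + b_n \sin nx \bigr),
\qquad
a_n = \frac{1}{\pi} \int_{-\pi}^{\pi} f(x) \cos nx \, dx,
\quad
b_n = \frac{1}{\pi} \int_{-\pi}^{\pi} f(x) \sin nx \, dx.
```

Convergence of this series (in what sense, and for which f) is precisely the kind of question that drove the development of the field.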
This ambitious project lies at the interface of three areas: Harmonic Analysis, PDE and Geometric Measure Theory. It seeks deep results motivated by elliptic PDE using techniques from Harmonic Analysis and Geometric Measure Theory. This project is built upon results obtained by the applicant in these three areas. Some of them are very recent and have gone significantly beyond the state of the art. The methods to be used have been shown to be very robust, and they may therefore be applicable in other regimes. Crucial to this project is the use of Harmonic Analysis, where the applicant has already made important contributions.
Max ERC Funding
1 429 790 €
Duration
Start date: 2014-01-01, End date: 2018-12-31
Project acronym HISTORICALDATABASE
Project The Swedish historical database project
Researcher (PI) Per Einar Pettersson Lidbom
Host Institution (HI) STOCKHOLMS UNIVERSITET
Call Details Consolidator Grant (CoG), SH1, ERC-2013-CoG
Summary The Swedish historical database project will put together and make publicly available highly disaggregated data, on roughly a yearly basis, for about 2,500 Swedish administrative districts over the period 1749-1952. The finished data set will consist of comprehensive and detailed information on economic activity, political characteristics, vital statistics, occupational structure, education, social and agricultural statistics and infrastructure investments (e.g., railway construction). The comprehensiveness and complete coverage of historical data at the local administrative level is what makes this project unique from an international perspective. Since Sweden has the longest continuous and reliable data series on population and vital statistics in the world, starting as early as 1749, it is possible to construct a comprehensive panel data set over all 2,500 Swedish local administrative units covering a 200-year period. Consequently, the total number of observations for each variable can be as large as 0.5 million (N=2500×T=200). With this type of rich and disaggregated historical data it becomes possible to get a better understanding of economic growth, structural transformation and economic development. Also, within-country variation allows for more convincing empirical identification strategies such as instrumental variables, regression discontinuities or difference-in-differences estimation. As a case in point, I have demonstrated the potential usefulness of the Swedish historical data by addressing the question of whether redistribution of resources towards the poor differs between types of democracy after democratization. The identification strategy is based on a regression-discontinuity design where the type of democracy is partly a function of population size. This paper is currently "revise and resubmit" (2nd round) at Econometrica. After collecting the new data, we intend to study a number of questions related to economic development and growth.
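The regression-discontinuity logic described above can be illustrated with a stylized simulation (the threshold, sample sizes and effect size are all hypothetical, not the paper's actual design): treatment switches at a population cutoff, and the outcome jump at the cutoff recovers the treatment effect.

```python
import numpy as np

# Stylized RD sketch: treatment ("type of democracy") switches at a
# hypothetical population threshold; comparing mean outcomes just above
# and just below the cutoff recovers the true jump of 2.0.
rng = np.random.default_rng(42)
cutoff = 1500
pop = rng.uniform(500, 2500, size=4000)             # running variable
treated = (pop >= cutoff).astype(float)             # treatment assignment
outcome = 0.0005 * pop + 2.0 * treated + rng.normal(scale=0.5, size=4000)

# Naive local comparison in a narrow window around the cutoff.
window = 100
below = outcome[(pop >= cutoff - window) & (pop < cutoff)].mean()
above = outcome[(pop >= cutoff) & (pop < cutoff + window)].mean()
effect = above - below    # estimate of the jump at the cutoff
print(round(effect, 2))
```

In practice, local-linear estimation rather than a raw window comparison would be used, but the identifying idea is the same.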
Max ERC Funding
1 200 000 €
Duration
Start date: 2014-04-01, End date: 2019-03-31
Project acronym InanoMOF
Project Multifunctional micro- and nanostructures assembled from nanoscale metal-organic frameworks and inorganic nanoparticles
Researcher (PI) Daniel Maspoch Comamala
Host Institution (HI) FUNDACIO INSTITUT CATALA DE NANOCIENCIA I NANOTECNOLOGIA
Call Details Consolidator Grant (CoG), PE5, ERC-2013-CoG
Summary In InanoMOF, we aim to develop frontier Supramolecular and Nanochemistry methodologies for the synthesis of a novel class of structures via controlled assembly of nanoscale metal-organic frameworks (nanoMOFs) and inorganic nanoparticles (INPs). These methods will embody the premise that “controlled object-by-object nano-assembly is a ground-breaking approach to explore for producing systems of higher complexity with advanced functions”. The resulting hybrid nanoMOF@INPs will marry the unique properties of INPs (magnetism of iron oxide NPs and optics of Au NPs) to the functional porosity of MOFs.
The first part of InanoMOF encompasses the design, synthesis-assembly and characterisation of nanoMOF@INPs - advanced MOF-based sorbents that incorporate the functionality of the INPs used: magnetically controlled movement, in vivo detectability, enhanced biocompatibility and porosity, pollutant removal, or controlled sorption/delivery. The second part of InanoMOF entails studying the physicochemical properties of the synthesised nanoMOF@INPs and ascertaining their utility as drug-delivery/theranostic systems and as magnetic sorbents for pollutant removal. Specifically, we will study their stability in working media and determine their drug or pollutant sorption/delivery capacities. As proof-of-concept, we will study their toxicity in vitro and in vivo; the enhancement of their in vitro therapeutic efficacy; and their capacity to remove pollutants (in real water and gasoline/diesel fuel samples) via magnetic assistance.
In InanoMOF we will endeavour to establish the synthetic bases for controlling the spatial ordering of nanoMOF crystals, whether alone or combined with other nanomaterials (e.g. INPs, graphene, etc.). We are confident that our work will ultimately enable researchers to create MOF-based composites having cooperative and synergistic properties and functions for myriad applications (e.g. heterogeneous catalysis, sensing and separation).
Max ERC Funding
1 942 665 €
Duration
Start date: 2014-04-01, End date: 2019-03-31
Project acronym IR-DC
Project Individual Robustness in Development and Cancer
Researcher (PI) Benjamin Lehner
Host Institution (HI) FUNDACIO CENTRE DE REGULACIO GENOMICA
Call Details Consolidator Grant (CoG), LS2, ERC-2013-CoG
Summary Biological systems are robust to perturbations, with many genetic, stochastic and environmental challenges having little or no phenotypic consequence. However, the extent of this robustness varies across individuals; for example, the same mutation or treatment may affect only a subset of individuals. The overall objective of this project is to understand the cellular and molecular mechanisms that confer this robustness and why it varies across individuals.
We will address three specific questions:
1. Why do inherited mutations have different outcomes in different individuals, even when they are genetically identical and share a common environment?
2. What are the mechanisms during development that confer robustness to mechanical deformation?
3. How can the loss of robustness be exploited to specifically kill cancer cells?
To address the first two questions, we will use live-imaging procedures we have developed that make the C. elegans embryo a unique animal system for linking early inter-individual variation in gene expression and cellular behaviour to later variation in phenotypes. To address the third question, we will apply our understanding of genetic robustness and genetic interaction networks in model organisms to the comprehensive analysis of cancer genome datasets. The predictions from these hypothesis-driven computational analyses will then be evaluated using wet-lab experiments.
Understanding and predicting variation in robustness is both a fundamental challenge for biology and one that is central to the development of personalised and predictive medicine. A patient does not want to know the typical outcome of a mutation or treatment; they want to know what will actually happen to them. The work outlined here will contribute to our basic understanding of robustness and its variation among individuals, and it will also directly tackle the problem of predicting and targeting variation in robustness as a strategy to kill tumour cells.
Max ERC Funding
1 996 812 €
Duration
Start date: 2014-06-01, End date: 2019-05-31
Project acronym LATTAL
Project The Latin Talmud and its Influence on Christian-Jewish Polemic
Researcher (PI) Alexander Fidora
Host Institution (HI) UNIVERSITAT AUTONOMA DE BARCELONA
Call Details Consolidator Grant (CoG), SH5, ERC-2013-CoG
Summary While polemics and dialogue between Judaism and Christianity are as old as the Christian religion itself, one can clearly distinguish different periods, trends and intensities in the relations between the faiths. A significant landmark in this long and complex history is the Latin translation of large sections of the Talmud, the most important Jewish post-biblical text and the basis for the development of Rabbinic Judaism.
When during the 13th century Christian theologians started to examine and translate the Talmud from Hebrew and Aramaic into Latin, they were faced with a huge body of texts which represented centuries of legalistic and homiletic materials. The discovery of this immense post-biblical Jewish literature became a source of fascination for Christians who believed that this text, which encompasses every aspect of Jewish life, was fundamental both for refuting the Jewish faith and for substantiating the truth of Christianity. This realization heralded a rethinking of the place of Jews in Christian society and redefined Christian-Jewish dialogue and polemic.
The purpose of our project is to edit and publish the largest extant collection of Talmudic passages translated from Hebrew into Latin, that is, the "Extractiones de Talmud", while studying this ground-breaking document in the context of the trial and burning of the Talmud in 1240-42 and its aftermath.
This project addresses vital questions of Jewish and Christian identity, still relevant to the 21st century, and can only be carried out by a transdisciplinary research team including specialists from Latin Philology, Hebrew Studies and History.
Max ERC Funding
1 292 700 €
Duration
Start date: 2014-10-01, End date: 2018-09-30