Project acronym BeyondtheElite
Project Beyond the Elite: Jewish Daily Life in Medieval Europe
Researcher (PI) Elisheva Baumgarten
Host Institution (HI) THE HEBREW UNIVERSITY OF JERUSALEM
Call Details Consolidator Grant (CoG), SH6, ERC-2015-CoG
Summary The two fundamental challenges of this project are the integration of medieval Jewries and their histories within the framework of European history without undermining their distinct communal status, and the creation of a history of everyday medieval Jewish life that includes those who were not part of the learned elite. The study will focus on the Jewish communities of northern Europe (roughly modern Germany, northern France and England) from 1100 to 1350. From the mid-thirteenth century these communities were subject to growing persecution. The approaches proposed for accessing daily praxis seek to highlight tangible dimensions of religious life rather than the ideologies that have dominated study to date. This task is complex because the extant sources, in Hebrew as well as in Latin and the vernacular, were written by the learned elite; it will therefore require a broad survey of multiple textual and material sources.
Four main strands will be examined and combined:
1. An outline of the strata of Jewish society, better defining the elites and other groups.
2. A study of select communal and familial spaces, such as the house, the synagogue and the marketplace, which have yet to be examined as social spaces.
3. Ritual and urban rhythms, especially the annual cycle, connecting Jewish and Christian environments.
4. Material culture, as objects were used by Jews and Christians alike.
Aspects of material culture, the physical environment and urban rhythms are often described as “neutral”, yet they will be mined to demonstrate how they exemplified difference while being simultaneously ubiquitous in local cultures. The deterioration of relations between Jews and Christians will provide a gauge for examining change during this period. The final stage of the project will include comparative case studies of other Jewish communities. I expect my findings will inform scholars of medieval culture at large and promote comparative methodologies for studying other minority ethnic groups.
Four main strands will be examined and combined:
1. An outline of the strata of Jewish society, better defining the elites and other groups.
2. A study of select communal and familial spaces such as the house, the synagogue, the market place have yet to be examined as social spaces.
3. Ritual and urban rhythms especially the annual cycle, connecting between Jewish and Christian environments.
4. Material culture, as objects were used by Jews and Christians alike.
Aspects of material culture, the physical environment and urban rhythms are often described as “neutral” yet will be mined to demonstrate how they exemplified difference while being simultaneously ubiquitous in local cultures. The deterioration of relations between Jews and Christians will provide a gauge for examining change during this period. The final stage of the project will include comparative case studies of other Jewish communities. I expect my findings will inform scholars of medieval culture at large and promote comparative methodologies for studying other minority ethnic groups
Max ERC Funding
1 941 688 €
Duration
Start date: 2016-11-01, End date: 2021-10-31
Project acronym CentrioleBirthDeath
Project Mechanism of centriole inheritance and maintenance
Researcher (PI) Monica BETTENCOURT CARVALHO DIAS
Host Institution (HI) FUNDACAO CALOUSTE GULBENKIAN
Call Details Consolidator Grant (CoG), LS3, ERC-2015-CoG
Summary Centrioles assemble centrosomes and cilia/flagella, critical structures for cell division, polarity, motility and signalling, which are often deregulated in human disease. Centriole inheritance, in particular the preservation of their copy number and position in the cell, is critical in many eukaryotes. I propose to investigate, in an integrative and quantitative way, how centrioles are formed in the right numbers at the right time and place, and how they are maintained to ensure their function and inheritance. We first ask how centrioles guide their own assembly position and centriole copy number. Our recent work highlighted several properties of the system, including positive and negative feedbacks and spatial cues. We explore critical hypotheses through a combination of biochemistry, quantitative live-cell microscopy and computational modelling. We then ask how centriole biogenesis and the cell cycle are coordinated. We recently identified the triggering event in centriole biogenesis and showed how its regulation is akin to the cell-cycle control of DNA replication and centromere assembly. We will explore new hypotheses to understand how assembly timing is coupled to the cell cycle. Lastly, we ask how centriole maintenance is regulated. By studying centriole disappearance in the female germline we uncovered that centrioles need to be actively maintained by their surrounding matrix. We propose to investigate how that matrix provides stability to the centrioles, whether this is differently regulated in different cell types, and the possible consequences of its misregulation for the organism (infertility and ciliopathy-like symptoms). We will take advantage of several experimental systems (in silico, ex vivo, flies and human cells), tailoring the assay to the question and allowing for comparisons across experimental systems to provide a deeper understanding of the process and its regulation.
Max ERC Funding
2 000 000 €
Duration
Start date: 2017-01-01, End date: 2021-12-31
Project acronym CODECHECK
Project CRACKING THE CODE BEHIND MITOTIC FIDELITY: the roles of tubulin post-translational modifications and a chromosome separation checkpoint
Researcher (PI) Helder Jose Martins Maiato
Host Institution (HI) INSTITUTO DE BIOLOGIA MOLECULAR E CELULAR-IBMC
Call Details Consolidator Grant (CoG), LS3, ERC-2015-CoG
Summary During a human lifetime, some 10,000 trillion cell divisions take place to ensure tissue homeostasis and several vital functions in the organism. Mitosis is the process that ensures that dividing cells preserve the chromosome number of their progenitors; deviation from this number, a condition known as aneuploidy, is the most common feature of human cancers. Here we will test two original concepts with strong implications for chromosome segregation fidelity. The first concept is based on the “tubulin code” hypothesis, which predicts that molecular motors “read” tubulin post-translational modifications on spindle microtubules. Our proof-of-concept experiments demonstrate that tubulin detyrosination works as a navigation system that guides chromosomes towards the cell equator. Thus, in addition to regulating the motors required for chromosome motion, the cell might regulate the tracks on which they move. We will combine proteomics, super-resolution and live-cell microscopy, and in vitro reconstitutions to perform a comprehensive survey of the tubulin code and the respective implications for motors involved in chromosome motion, mitotic spindle assembly and correction of kinetochore-microtubule attachments. The second concept is centered on the recently uncovered chromosome separation checkpoint mediated by a midzone-associated Aurora B gradient, which delays nuclear envelope reformation in response to incompletely separated chromosomes. We aim to identify Aurora B targets involved in the spatiotemporal regulation of the anaphase-telophase transition. We will establish powerful live-cell microscopy assays and a novel mammalian model system to dissect how this checkpoint allows the detection and correction of lagging/long chromosomes and DNA bridges that would otherwise contribute to genomic instability.
Overall, this work will establish a paradigm shift in our understanding of how spatial information is conveyed to faithfully segregate chromosomes during mitosis.
Max ERC Funding
2 323 468 €
Duration
Start date: 2016-07-01, End date: 2021-06-30
Project acronym ColloQuantO
Project Colloidal Quantum Dot Quantum Optics
Researcher (PI) Dan Oron
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE LTD
Call Details Consolidator Grant (CoG), PE4, ERC-2015-CoG
Summary Colloidal semiconductor nanocrystals have already found significant use in various arenas, including bioimaging, displays, lighting, photovoltaics and catalysis. Here we aim to harness the extremely broad synthetic toolbox of colloidal semiconductor quantum dots in order to utilize them as unique sources of quantum states of light, extending well beyond the present attempts to use them as single photon sources. By tailoring the shape, size, composition and the organic ligand layer of quantum dots, rods and platelets, we propose their use as sources exhibiting a deterministic number of emitted photons upon saturated excitation and as tunable sources of correlated and entangled photon pairs. The versatility afforded in their fabrication by colloidal synthesis, rather than by epitaxial growth, presents a potential pathway to overcome some of the significant limitations of present-day solid state sources of nonclassical light, including color tunability, fidelity and ease of assembly into devices.
This program is a concerted effort both on colloidal synthesis of complex multicomponent semiconductor nanocrystals and on cutting-edge photophysical studies at the single-nanocrystal level. This should enable new types of emitters of nonclassical light, as well as provide a platform for the implementation of recently suggested schemes in quantum optics which have never been experimentally demonstrated. These include room-temperature sources of exactly two (or more) photons, correlated photon pairs from quantum dot molecules, and entanglement based on time reordering. Fulfilling the optical and material requirements of this type of system, including photostability, control of carrier-carrier interactions, and a large quantum yield, will inevitably reveal some of the fundamental properties of coupled carriers in strongly confined structures.
Max ERC Funding
2 000 000 €
Duration
Start date: 2016-05-01, End date: 2021-04-30
Project acronym ErgComNum
Project Ergodic theory and additive combinatorics
Researcher (PI) Tamar Ziegler
Host Institution (HI) THE HEBREW UNIVERSITY OF JERUSALEM
Call Details Consolidator Grant (CoG), PE1, ERC-2015-CoG
Summary The last decade has witnessed a new spring for dynamical systems. The field - initiated by Poincaré in the study of the N-body problem - has become essential to the understanding of seemingly far-off fields such as combinatorics, number theory and theoretical computer science. In particular, ideas from ergodic theory have played an important role in the resolution of long-standing open problems in combinatorics and number theory. A striking example is the role of dynamics on nilmanifolds in the recent proof of Hardy-Littlewood estimates for the number of solutions to systems of linear equations of finite complexity in the prime numbers. The interplay between ergodic theory, number theory and additive combinatorics has proved very fruitful; it is a fast-growing area in mathematics attracting many young researchers. We propose to tackle central open problems in the area.
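The Hardy-Littlewood-type estimates mentioned above predict, for instance, the asymptotic count of three-term arithmetic progressions among the primes (the solutions of the linear equation p1 + p3 = 2·p2). A toy brute-force count, purely illustrative and not part of the proposal, shows the quantity being estimated (the function name is mine):

```python
from math import isqrt

def count_prime_3aps(n):
    """Count arithmetic progressions p, p+d, p+2d (d > 0) of primes <= n."""
    # Sieve of Eratosthenes up to n.
    sieve = bytearray([1]) * (n + 1)
    sieve[0:2] = b"\x00\x00"
    for i in range(2, isqrt(n) + 1):
        if sieve[i]:
            sieve[i * i :: i] = bytearray(len(range(i * i, n + 1, i)))
    primes = [i for i in range(2, n + 1) if sieve[i]]
    pset = set(primes)
    count = 0
    # For each pair a < b of primes, check whether 2b - a completes the AP.
    for a in primes:
        for b in primes:
            if b > a:
                c = 2 * b - a
                if c <= n and c in pset:
                    count += 1
    return count
```

For example, up to 10 the only such progression is (3, 5, 7); up to 15 the progression (3, 7, 11) also appears.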
Max ERC Funding
1 342 500 €
Duration
Start date: 2016-05-01, End date: 2021-04-30
Project acronym LDMThExp
Project Going Beyond the WIMP: From Theory to Detection of Light Dark Matter
Researcher (PI) Tomer Volansky
Host Institution (HI) TEL AVIV UNIVERSITY
Call Details Consolidator Grant (CoG), PE2, ERC-2015-CoG
Summary The identity of dark matter (DM) is still unknown. For more than three decades, significant theoretical and experimental efforts have been directed towards the search for a Weakly Interacting Massive Particle (WIMP), often overlooking other possibilities. The lack of an unambiguous positive signal, at indirect- and direct-detection experiments and at the LHC, stresses the need to expand on other theoretical possibilities, and more importantly, to develop new experimental capabilities. Indeed it is conceivable that the WIMP paradigm has been misleading, and other theoretically motivated scenarios must be explored vigorously.
This proposal focuses on light, sub-GeV dark matter. In addition to novel theoretical paradigms that point to DM in the low-mass regime, several new strategies to directly detect dark matter particles with MeV to GeV mass, far below standard direct detection capabilities, are studied. In particular, techniques to search for ionized electrons or chemical bond-breaking are considered. The latter possibility is revolutionary and requires new dedicated technologies and experiments. Sensitivity to one or few electrons, on the other hand, has been established and the PI has recently derived the first direct-detection limits on MeV to GeV dark matter using XENON10 data, demonstrating proof-of-principle. Significant efforts are required to lay the theoretical foundation of light DM and to study in depth and develop the various possibilities to directly detect it. The proposal is centered around these efforts.
The innovative theoretical paradigms and novel avenues to experimentally detect sub-GeV DM open up a new and groundbreaking field of research. The proposal at hand takes the necessary steps, and offers the opportunity to pave the way and enable the discovery of such a particle, if it exists.
Max ERC Funding
1 822 083 €
Duration
Start date: 2016-03-01, End date: 2022-02-28
Project acronym MAPLE
Project Measuring and Analysing the Politicisation of Europe before and after the Eurozone Crisis
Researcher (PI) Marina Castelo Branco da Costa Lobo
Host Institution (HI) INSTITUTO DE CIENCIAS SOCIAIS
Call Details Consolidator Grant (CoG), SH2, ERC-2015-CoG
Summary The Eurozone crisis forces us to reconsider the conventional wisdom that “Europe” has little effect on national electoral politics. MAPLE’s central goal is to analyse the degree of politicisation the European issue has acquired following the Eurozone crisis, in Belgium, Germany, Greece, Ireland, Portugal and Spain in 2000-2016, and to focus on its consequences for voting behaviour. Our main thesis is that a fundamental shift has occurred in the vote function as a result of this politicisation: short-term factors of voting behaviour, such as economic perceptions and leader effects, may have been structurally diminished in the countries which have seen bailouts and where citizens increasingly perceive the main policy decisions as being directed from Brussels. To measure politicisation of the EU we will analyse both parliamentary debates and media outlets, coding for salience and polarisation of the European issue. These measurements will contribute to understanding how politicisation of the EU has underpinned political changes between 2000 and 2016 in the countries concerned. The analysis of voting behaviour will employ a social-psychological methodology in order to test the relationship between increased politicisation of the EU and short-term effects. MAPLE will create datasets for 12 newspapers, more than 60 political parties and 26 elections, as well as conduct 12 web panel surveys of representative samples of voters in the countries concerned. MAPLE is interdisciplinary: it combines approaches from social psychology and political science. It includes qualitative data collection (coding of newspapers and parliamentary debates) followed by qualitative and quantitative data analysis. MAPLE will ultimately illuminate the way in which Europe has decisively entered national electoral politics, and with what consequences for the vote calculus.
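The two quantities coded above are commonly operationalised as follows: salience as the share of statements addressing the EU, and polarisation as the spread of party mean positions on the issue. A hypothetical sketch with invented field names and a deliberately simple operationalisation, not MAPLE's actual codebook:

```python
from statistics import pstdev

def politicisation_indices(statements):
    """statements: list of dicts with keys 'party', 'about_eu' (bool) and,
    for EU-related statements, 'position' (float in [-1, 1], anti- to pro-EU).
    Returns (salience, polarisation)."""
    total = len(statements)
    eu = [s for s in statements if s["about_eu"]]
    # Salience: share of all coded statements that address the EU.
    salience = len(eu) / total if total else 0.0
    # Polarisation: spread (population std. dev.) of mean party positions.
    by_party = {}
    for s in eu:
        by_party.setdefault(s["party"], []).append(s["position"])
    means = [sum(v) / len(v) for v in by_party.values()]
    polarisation = pstdev(means) if len(means) > 1 else 0.0
    return salience, polarisation
```

With two parties at opposite poles and three of four statements about the EU, this yields a salience of 0.75 and a polarisation of 1.0.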
Max ERC Funding
1 592 859 €
Duration
Start date: 2016-08-01, End date: 2021-07-31
Project acronym MorphoNotch
Project Multi-scale analysis of the interplay between cell morphology and cell-cell signaling
Researcher (PI) David Sprinzak
Host Institution (HI) TEL AVIV UNIVERSITY
Call Details Consolidator Grant (CoG), LS3, ERC-2015-CoG
Summary Signaling, genetic regulatory circuits, and tissue morphology are inherently coupled to each other during embryonic development. Although changes in cellular and tissue morphology are commonly treated as a downstream consequence of cell fate decision processes, there are multiple examples where morphological changes occur concurrently with differentiation processes. This suggests that a feedback between cell morphology and regulatory processes can play an important role in coordinating tissue development. Currently, however, we lack the experimental, theoretical, and conceptual tools to understand this interplay between cell morphology, signaling, and regulatory circuits. In particular, we need to understand (1) how intercellular signaling depends on cellular morphology and on the properties of the boundary between cells, and (2) how intercellular signaling, genetic circuits, and cell morphology integrate to generate robust differentiation patterns. Here, I propose to combine quantitative in vitro and in vivo experiments with mathematical modeling to address these questions in the context of Notch signaling and Notch-mediated patterning, typically used for coordinating differentiation between neighboring cells during development. We will utilize novel reporters and micropatterning technology to analyze Notch signaling between pairs of cells. We will elucidate how the geometry and the molecular composition of the boundary between cells affect signaling. At the tissue level, we will study how the interplay between cell morphology and Notch signaling gives rise to robust patterning in the mammalian inner ear. We will use cochlear explant imaging to track the transition from a disordered undifferentiated state to the ordered pattern of hair and supporting cells in the cochlea.
Together with a novel hybrid modeling approach, we will provide the foundation for a systems level understanding of development that interconnects morphology and regulatory circuits.
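Notch-mediated patterning between neighboring cells of the kind invoked above is classically captured by the Collier et al. (1996) lateral-inhibition model: a neighbour's Delta activates a cell's Notch, and Notch represses the cell's own Delta. A minimal two-cell Euler-integration sketch with standard illustrative parameters, not the proposal's hybrid model:

```python
def simulate_lateral_inhibition(steps=2000, dt=0.05, v=1.0):
    """Two coupled cells: Notch activity n[i] is driven by the neighbour's
    Delta, and Delta d[i] is repressed by the cell's own Notch activity.
    Returns the final (n, d) levels."""
    def f(x):  # Notch activation by neighbour's Delta (Hill function, k=2)
        return x * x / (0.01 + x * x)
    def g(x):  # Delta repression by own Notch activity (Hill function, h=2)
        return 1.0 / (1.0 + 100.0 * x * x)
    n = [0.5, 0.5]
    d = [0.51, 0.49]  # tiny initial bias between otherwise identical cells
    for _ in range(steps):
        dn = [f(d[1]) - n[0], f(d[0]) - n[1]]
        dd = [v * (g(n[0]) - d[0]), v * (g(n[1]) - d[1])]
        n = [n[i] + dt * dn[i] for i in range(2)]
        d = [d[i] + dt * dd[i] for i in range(2)]
    return n, d
```

In this parameter regime the homogeneous state is unstable, so the small initial bias is amplified into two distinct fates: one high-Delta "sender" cell and one high-Notch "receiver" cell, the binary outcome underlying hair-cell/supporting-cell patterns.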
Max ERC Funding
2 000 000 €
Duration
Start date: 2016-06-01, End date: 2021-05-31
Project acronym MPM
Project Modern Pattern Matching
Researcher (PI) Ely Porat
Host Institution (HI) BAR ILAN UNIVERSITY
Call Details Consolidator Grant (CoG), PE6, ERC-2015-CoG
Summary The advances in technology over the last decade and the massive amount of data passing through the internet have intrigued and challenged computer scientists, as the old models of computation used before this era are now less relevant or too slow. New computational models have been suggested to tackle these technological advances. In the most basic sense, these modern models allow one to scan the input only once, possibly with small auxiliary memory. Moreover, modern techniques have been introduced, such as sparse recovery, which has proven to be a very useful tool for dealing with modern challenges, and the very popular notion of conditional lower bounds, which has provided evidence of hardness for various algorithmic tasks based on widely believed conjectures.
Pattern matching plays a crucial role in many computing applications seen in day-to-day life. However, its research community has only recently started gaining insight into what can be done in modern models, and is lagging behind in this respect. In particular, no pattern matching algorithms have yet utilized ideas from sparse recovery, and only recently has there been progress in proving conditional lower bounds for string problems. Furthermore, conditional lower bounds suffer from a lack of hardness conjectures that address time/space tradeoffs.
This proposal will close this gap for many important pattern matching problems within the new models of computation, and will be the first to utilize modern algorithmic techniques, such as sparse recovery, and adapt them to the pattern matching world. Furthermore, this proposal will focus on developing a theory for proving conditional time/space lower bounds based on new hardness conjectures. This will greatly influence not only the pattern matching sub-field, but the entire algorithmic field at large.
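To make the single-pass model concrete, here is a toy matcher that consumes the text as a stream using a Karp-Rabin rolling hash. Note that this sketch still buffers the last m characters to roll the hash and verify hits, so it does not achieve the sublinear space of true streaming pattern matching algorithms (such as Porat-Porat); it only illustrates the one-pass access model described above:

```python
# Toy single-pass matcher over a character stream, via a Karp-Rabin
# rolling hash. The deque holds the last m characters; genuinely
# streaming algorithms avoid even this buffer.
from collections import deque

def stream_matches(pattern, stream, base=256, mod=(1 << 61) - 1):
    """Yield end positions where `pattern` occurs in the character stream."""
    m = len(pattern)
    p_hash = 0
    for c in pattern:
        p_hash = (p_hash * base + ord(c)) % mod
    pow_m = pow(base, m - 1, mod)   # weight of the oldest char in the window
    window, w_hash = deque(), 0
    for i, c in enumerate(stream):
        if len(window) == m:        # slide: drop the oldest character
            old = window.popleft()
            w_hash = (w_hash - ord(old) * pow_m) % mod
        window.append(c)
        w_hash = (w_hash * base + ord(c)) % mod
        # verify on hash hit to rule out collisions
        if len(window) == m and w_hash == p_hash and ''.join(window) == pattern:
            yield i                 # end index of a match

list(stream_matches("aba", iter("abababa")))  # matches end at 2, 4, 6
```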
Max ERC Funding
1 994 609 €
Duration
Start date: 2017-07-01, End date: 2022-06-30
Project acronym NATURAL_BAT_NAV
Project Neural basis of natural navigation: Representation of goals, 3-D spaces and 1-km distances in the bat hippocampal formation – the role of experience
Researcher (PI) Nachum Ulanovsky
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Consolidator Grant (CoG), LS5, ERC-2015-CoG
Summary The mammalian hippocampal formation contains place cells, grid cells, head-direction cells and border cells, which collectively represent the animal’s position (‘map’), distance traveled (‘odometer’) and direction (‘compass’), and are thought to underlie navigation. These neurons are typically studied in rodents running on linear tracks or in small empty boxes, ~1×1 m in size. However, real-world navigation differs dramatically from typical laboratory setups, in at least three ways – which we plan to study:
(1) The world is not empty, but contains objects and goals. Almost nothing is known about how neural circuits represent goal location – which is essential for navigating towards the goal. We will record single-neuron activity in bats flying towards spatial goals, in search of cells that encode vectorial information about the direction and distance to the goal. Preliminary results support the existence of such cells in the bat hippocampal formation. This new functional cell class of vectorial goal-encoding neurons may underlie goal-directed navigation.
(2) The world is not flat, but three-dimensional (3-D). We will train bats to fly in a large flight-room and examine 3-D grid cells and 3-D border cells.
(3) The world is not 1-m in size, and both rodents and bats navigate over kilometer-scale distances. Nothing is known about how the brain supports such real-life navigation. We will utilize a 1-km long test facility at the Weizmann Institute of Science, and record place cells and grid cells in bats navigating over biologically relevant spatial scales. Further, we will compare neural codes for space in wild-born bats versus bats born in the lab – which have never experienced a 1-km distance – to illuminate the role of experience in mammalian spatial cognition.
Taken together, this set of studies will bridge the gap – a conceptual gap and a gap in spatial scale – between hippocampal laboratory studies and real-world natural navigation.
Max ERC Funding
2 000 000 €
Duration
Start date: 2016-11-01, End date: 2021-10-31
Project acronym OCLD
Project Tracking the Dynamics of Human Metabolism using Spectroscopy-Integrated Liver-on-Chip Microdevices
Researcher (PI) Yaakov Nahmias
Host Institution (HI) THE HEBREW UNIVERSITY OF JERUSALEM
Call Details Consolidator Grant (CoG), LS7, ERC-2015-CoG
Summary The liver is the main organ responsible for the systemic regulation of human metabolism, responding to hormonal stimulation, nutritional challenges, and circadian rhythms using fast enzymatic processes and slow transcriptional mechanisms. This regulatory complexity limits our ability to create efficient pharmaceutical interventions for metabolic diseases such as fatty liver disease and diabetes. In addition, circadian changes in drug metabolism can impact pharmacokinetics and pharmacodynamics, affecting our ability to optimize drug dosage or properly assess chronic liver toxicity.
The challenge in rationally designing efficient drug interventions stems from the current reliance on end-point assays and animal models that provide intermittent information with limited human relevance. Therefore, there is a need for systems capable of tracking the transcriptional and metabolic dynamics of human tissue with high resolution, preferably in real time. Over the past 5 years, we have established state-of-the-art models of human hepatocytes, oxygen nanosensors, and cutting-edge liver-on-chip devices, making us uniquely suited to address this challenge.
We aim to develop a platform capable of tracking the metabolism of tissue-engineered livers in real time, enabling an accurate assessment of chronic liver toxicity (e.g. repeated dose response) and the deconstruction of complex metabolic regulation during nutritional events. Our approach is to integrate liver-on-chip devices with real-time measurements of oxygen uptake, infrared microspectroscopy, and continuous MS/MS analysis. This innovative endeavour capitalizes on advances in nanotechnology and chemical characterization, offering the ability to non-invasively monitor the metabolic state of the cells (e.g. steatosis) while tracking minute changes in metabolic pathways. This project has the short-term potential to replace animal models in toxicity studies and the long-term potential to elucidate critical aspects of metabolic homeostasis.
Max ERC Funding
2 118 175 €
Duration
Start date: 2016-10-01, End date: 2021-09-30
Project acronym PhageResist
Project Beyond CRISPR: Systematic characterization of novel anti-phage defense systems in the microbial pan-genome
Researcher (PI) Rotem Sorek
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE LTD
Call Details Consolidator Grant (CoG), LS2, ERC-2015-CoG
Summary The perpetual arms race between bacteria and phage has resulted in the evolution of efficient resistance systems that protect bacteria from phage infection. Such systems, which include restriction enzymes and CRISPR-Cas, have a major influence on the evolution of both bacteria and phage, and have also proven to be invaluable for molecular and biotechnological applications. Although much has been learned about the biology of bacterial defense against phage, more than half of all sequenced bacteria do not contain CRISPR-Cas, and it is estimated that many additional, yet-uncharacterized anti-phage defense systems are encoded in bacterial genomes.
The goal of this project is to systematically understand the arsenal of defense mechanisms that are at the disposal of microbes in their struggle against phages. The project combines computational genomics, synthetic biology, high-throughput robotic assays, and deep genetic and biochemical experiments to discover, verify, and study the properties of anti-phage defense systems.
Max ERC Funding
2 000 000 €
Duration
Start date: 2016-07-01, End date: 2021-06-30
Project acronym SpeedInfTradeoff
Project Speed-Information Tradeoffs: Beyond Quasi-Entropy Analysis
Researcher (PI) Nir Yosef Ailon
Host Institution (HI) TECHNION - ISRAEL INSTITUTE OF TECHNOLOGY
Call Details Consolidator Grant (CoG), PE6, ERC-2015-CoG
Summary The starting point of this research proposal is a recent result by the PI, making progress on a half-century-old,
notoriously open problem. In the mid-1960s, Tukey and Cooley discovered the Fast Fourier Transform, an
algorithm for performing one of the most important linear transformations in science and engineering, the
(discrete) Fourier transform, in time complexity O(n log n).
In spite of its importance, a super-linear lower bound has been elusive for many years, with only very limited
results. Very recently the PI managed to show that, roughly speaking, a faster Fourier transform must result
in information loss, in the form of numerical accuracy. The result can be seen as a type of computational
uncertainty principle, whereby faster computation increases uncertainty in data. The mathematical argument
is established by defining a type of matrix quasi-entropy, generalizing Shannon’s measure of information
(entropy) to “quasi-probabilities” (which can be negative, more than 1, or even complex).
This result, which is not believed to be tight, does not close the book on Fourier complexity. More importantly,
the vision proposed by the PI here reaches far beyond Fourier computation. The computation-information
tradeoff underlying the result suggests a novel view of complexity theory as a whole. We can now revisit
some classic complexity theoretical problems with a fresh view. Examples of these problems include better
understanding of the complexity of polynomial multiplication, integer multiplication, auto-correlation and
cross-correlation computation, dimensionality reduction via the Fast Johnson-Lindenstrauss Transform (FJLT;
also discovered and developed by the PI), large scale linear algebra (linear regression, Principal Component
Analysis - PCA, compressed sensing, matrix multiplication) as well as binary functions such as integer multiplication.
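For readers outside the area, the O(n log n) bound follows from the divide-and-conquer recurrence T(n) = 2T(n/2) + O(n). A textbook radix-2 Cooley-Tukey sketch (limited to power-of-two lengths, and unrelated to the PI's lower-bound machinery) makes this concrete:

```python
# Textbook recursive radix-2 FFT; len(a) must be a power of two.
# Each level halves the problem size and does O(n) combine work,
# giving the O(n log n) running time discussed above.
import cmath

def fft(a):
    """Return the discrete Fourier transform of sequence a."""
    n = len(a)
    if n == 1:
        return list(a)
    even, odd = fft(a[0::2]), fft(a[1::2])   # two half-size subproblems
    out = [0j] * n
    for k in range(n // 2):                  # O(n) combine step
        t = cmath.exp(-2j * cmath.pi * k / n) * odd[k]
        out[k] = even[k] + t
        out[k + n // 2] = even[k] - t
    return out
```

For example, `fft([1, 2, 3, 4])` agrees term-by-term with the naive O(n^2) DFT sum.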
Max ERC Funding
1 515 801 €
Duration
Start date: 2016-06-01, End date: 2021-05-31
Project acronym TiDrugArchitectures
Project Highly Competent and Safe Titanium(IV) Therapeutic Frameworks that are Cancer Targeted based on Complex 1, 2, and 3D Chemical Architectures
Researcher (PI) Edit Yehudit Tshuva Goldberg
Host Institution (HI) THE HEBREW UNIVERSITY OF JERUSALEM
Call Details Consolidator Grant (CoG), PE5, ERC-2015-CoG
Summary This proposal aims to develop custom designed anticancer therapeutic frameworks that are effective, stable, safe, and tumor targeted, based on the biocompatible TiIV metal. The Tshuva group has established that water stable phenolato TiIV complexes are especially effective as anticancer agents both in vitro and in vivo, with markedly reduced side effects. Optimal derivatives will be developed to combine activity, stability, and biological accessibility, by maintaining small steric bulk while incorporating strong binding donors and hydrophilicity. The mechanism of action will be investigated by chemical and biological methods, including analyzing bio-distribution, cellular pathways and targets, and interaction with bio-molecules. Specifically, the active metal centers will be linked to bioactive moieties through redox-sensitive S–S bonds to enable tumor targeting. Cell penetrating peptides will facilitate cellular penetration for redox-dependent release of the active species selectively in cancer cells; steroid moieties will direct selectivity to hormone-dependent cancer cell types. Since the combination of TiIV- and Pt-based drugs has shown synergistic effects, multi-active entities will include two or more metal centers, possibly also linked to a transport unit. In addition to linear conjugates, polymeric and dendritic assemblies, exploiting the enhanced permeability of cancer cells, will be constructed with theoretically unlimited options for targeted delivery of multiple active sites. Most importantly, flexible well-defined redox-sensitive cages, as well as rigid pH sensitive complex cages, constructed with customized 3D geometries, will enable specific targeting of any active compound or conjugate and selective dissociation only where desired.
This study should yield superior anticancer drugs, while unraveling the mystery of their complex biochemistry, and will contribute to the development of novel chemical and medicinal research directions and applications.
Max ERC Funding
2 000 000 €
Duration
Start date: 2016-06-01, End date: 2021-05-31
Project acronym Tolerome
Project Evolution of antibiotic tolerance in the 'wild': A quantitative approach
Researcher (PI) Nathalie Balaban
Host Institution (HI) THE HEBREW UNIVERSITY OF JERUSALEM
Call Details Consolidator Grant (CoG), LS8, ERC-2015-CoG
Summary Bacterial ability to evolve strategies for evading antibiotic treatment is a fascinating example of an evolutionary process, as well as a major health threat. Despite efforts to understand treatment failure, we lack the means to prevent evolution of resistance when a new drug is released to the market. Most efforts are directed towards understanding the mechanisms of antibiotic resistance. Whereas ‘resistance’ is due to mutations that enable microorganisms to grow even at high concentrations of the drug, ‘tolerance’ is the ability to sustain a transient treatment, for example by entering a mode of transient dormancy. The importance of tolerance in the clinic has not been investigated as thoroughly as resistance. The presence of tolerant bacteria is not detected in the clinic because of the inherent difficulty of tracking dormant bacteria that often constitute only a minute fraction of the bacterial population. I hypothesize that bacterial dormancy may evolve quickly in the host under antibiotic treatment. This hypothesis is strengthened by our recent results demonstrating the rapid evolution of dormancy leading to tolerance in vitro, and by the increasing number of cases of treatment failure in the clinic not explained by resistance. My goal is to develop a multidisciplinary approach to detect, quantify and characterize tolerant bacteria in the clinic. Using my background in quantitative single-cell analyses, I will develop microfluidic devices for the rapid detection of tolerant bacteria in the clinic, and systems biology tools to isolate and analyze dormant sub-populations directly from clinical isolates. I will search for the genetic mutations leading to tolerance, building what I term here the ‘tolerome’. The results will be analyzed in a mathematical framework of tolerance evolution. This approach should reveal the role of tolerance in the clinic and may lead to a paradigm shift in the way bacterial infections are characterized and treated.
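The signature of tolerance without resistance is a biphasic kill curve: rapid killing of the bulk population followed by slow killing of a small dormant subpopulation. A minimal sketch with hypothetical parameters (illustrative values, not measurements from this project) shows why even a persister fraction of 1e-4 dominates survival within hours:

```python
# Biphasic kill-curve sketch (hypothetical rates): normal cells die fast
# under a bactericidal antibiotic, a small dormant ("persister") fraction
# dies slowly, producing the two-slope survival curve characteristic of
# tolerance without resistance.
import math

def survivors(t, n0=1e8, persister_frac=1e-4, k_normal=3.0, k_persister=0.05):
    """Total surviving cells after t hours of treatment (per-hour kill rates)."""
    normal = n0 * (1 - persister_frac) * math.exp(-k_normal * t)
    dormant = n0 * persister_frac * math.exp(-k_persister * t)
    return normal + dormant
```

In this toy model, by roughly 6 hours the fast phase has eliminated essentially all normal cells and the curve's slope is set by the persisters alone; assays that only score growth inhibition, rather than killing dynamics over time, cannot see this second phase.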
Max ERC Funding
1 978 750 €
Duration
Start date: 2016-05-01, End date: 2021-04-30
Project acronym TSGPs-of-CFSs
Project Role of Tumour Suppressor Gene Products of Common Fragile Sites in Human Diseases
Researcher (PI) Rami Aqeilan
Host Institution (HI) THE HEBREW UNIVERSITY OF JERUSALEM
Call Details Consolidator Grant (CoG), LS4, ERC-2015-CoG
Summary Common fragile sites (CFSs) are large chromosomal regions identified by conventional cytogenetics as sequences prone to breakage in cells subjected to replication stress. The interest in CFSs stems from their key role in DNA damage, resulting in chromosomal rearrangements. The instability of CFSs was correlated with genome instability in precancerous lesions and during tumour progression. Two opposing views dominate the discussion regarding the role of CFSs. One school of thought suggested that genomic instability during cancer progression causes collateral damage to genes residing within CFSs, such as WWOX and FHIT. These genes are proposed to be unselected ‘‘passenger’’ mutations. The counter argument is that deletions and other genomic alterations in CFSs occur early in cancer development. Cancer cells with deletions in genes that span CFSs are then selectively expanded due to loss of tumour suppressor functions such as protection of genome stability, coordination of cell cycle or apoptosis.
Recent observations from my lab clearly suggest that gene products from CFSs play driver roles in cancer transformation. Moreover, we have evidence for the involvement of DNA damage and Wwox in pancreatic β-cells in the context of diabetes. Here, I propose to investigate the role of tumour suppressor gene products (TSGPs) of CFSs in human diseases. Three approaches will be taken to tackle this question. First, molecular functions of TSGPs of CFSs will be determined using state-of-the-art genetic tools in vitro. Second, novel transgenic mouse tools will be used to study CFSs and their associated TSGs in preneoplastic lesions and tumours in vivo, with confirmatory studies in human material. Third, we will examine the potential involvement of CFSs and their TSGPs in type-2 diabetes (T2D).
The expected outcome is a detailed molecular understanding of CFSs and their associated TSGPs in genomic instability as well as their roles in cancer and metabolic diseases.
Max ERC Funding
2 000 000 €
Duration
Start date: 2016-05-01, End date: 2021-04-30
Project acronym Virocellsphere
Project Host-virus chemical arms race during algal bloom in the ocean at a single cell resolution
Researcher (PI) Asaf Vardi
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Consolidator Grant (CoG), LS8, ERC-2015-CoG
Summary Phytoplankton blooms are ephemeral events of exceptionally high primary productivity that regulate the flux of carbon across marine food webs. The cosmopolitan coccolithophore Emiliania huxleyi (Haptophyta) is a unicellular eukaryotic alga responsible for the largest oceanic algal blooms covering thousands of square kilometers. These annual blooms are frequently terminated by a specific large dsDNA E. huxleyi virus (EhV).
Despite the huge ecological importance of host-virus interactions, the ability to assess their ecological impact is limited by current approaches, which focus mainly on quantification of viral abundance and diversity. At the molecular level, a major challenge in the current understanding of host-virus interactions in the marine environment is the ability to decode the wealth of "omics" data and translate it into cellular mechanisms that mediate host susceptibility and resistance to viral infection.
In the current proposal we intend to provide novel functional insights into the molecular mechanisms that regulate host-virus interactions at the single-cell level by unravelling phenotypic heterogeneity within infected populations. By using physiological markers and single-cell transcriptomics, we propose to distinguish between host subpopulations and define their different "metabolic states", in order to map them onto different modes of susceptibility and resistance. By using advanced metabolomic approaches, we also aim to define the infochemical microenvironment generated during viral infection and examine how it can shape host phenotypic plasticity. Mapping the transcriptomic and metabolic footprints of viral infection will provide a meaningful tool to assess the dynamics of active viral infection during natural E. huxleyi blooms. Our novel approaches will pave the way for unprecedented quantification of the "viral shunt" that drives nutrient fluxes in marine food webs, from the single-cell level to the population and eventually ecosystem levels.
Max ERC Funding
2 749 901 €
Duration
Start date: 2016-11-01, End date: 2021-10-31