Project acronym ContactLube
Project Highly-lubricated soft contact lenses
Researcher (PI) Jacob Klein
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Proof of Concept (PoC), PC1, ERC-2014-PoC
Summary The object of this proof of concept project is to modify soft contact lenses to render them more lubricated, and so make them far more comfortable to use. We propose to greatly improve the lubricity of soft contact lenses, used by over 100 million people worldwide with a market of around $7.5bn, and for which acute discomfort may arise from high friction at the interfaces between the lenses and the eyelid or cornea. According to an established clinical hypothesis, ocular comfort is related to the level of friction between the anterior side of the contact lens and the inner eyelid during the blinking process, and boundary lubrication is the key to providing user comfort during extended wearing of soft contact lenses. High lens friction can, for a substantial part of the user population, limit the extent to which soft lenses may be used, and can also aggravate pathologies such as dry eye syndrome. Thus soft contact lenses that are much better lubricated than those currently in use have clear economic and health-related benefits. The current project, working through 5 work-packages, will establish feasibility, carry out a competitive analysis, explore the commercialization process and the IPR position, and seek contacts with appropriate industrial partners to further develop the commercialization of this idea.
Max ERC Funding
150 000 €
Duration
Start date: 2015-01-01, End date: 2016-06-30
Project acronym CoPathoPhage
Project Pathogen-phage cooperation during mammalian infection
Researcher (PI) Anat Herskovits
Host Institution (HI) TEL AVIV UNIVERSITY
Call Details Consolidator Grant (CoG), LS6, ERC-2018-COG
Summary Most bacterial pathogens are lysogens, namely, they carry the DNA of active phages within their genome, referred to as prophages. While these prophages have the potential to turn under stress into infective viruses which kill their host bacterium in a matter of minutes, it is unclear how pathogens manage to survive this internal threat under the stresses imposed by their invasion into mammalian cells. In the proposed project, we will study the hypothesis that a complex bacteria-phage cooperative adaptation supports virulence during mammalian infection while preventing inadvertent killing by phages. Several years ago, we uncovered a novel pathogen-phage interaction, in which an infective prophage promotes the virulence of its host, the bacterial pathogen Listeria monocytogenes (Lm), via adaptive behaviour. More recently, we discovered that the prophage, though fully infective, is non-autonomous: completely dependent on regulatory factors derived from inactive prophage remnants that reside in the Lm chromosome. These findings led us to propose that the intimate cross-regulatory interactions between all phage elements within the genome (infective and remnant) are crucial in promoting bacteria-phage patho-adaptive behaviours in the mammalian niche and thereby bacterial virulence. In the proposed project, we will investigate specific cross-regulatory and cooperative mechanisms of all the phage elements, study the domestication of phage remnant-derived regulatory factors, and examine the hypothesis that they collectively form an auxiliary phage-control system that tempers infective phages. Finally, we will examine the premise that the mammalian niche drives the evolution of temperate phages into patho-adaptive phages, and that phages that lack this adaptation may kill host pathogens during infection. This work is expected to provide novel insights into bacteria-phage coexistence in mammalian environments and to facilitate the development of innovative phage therapy strategies.
Max ERC Funding
2 200 000 €
Duration
Start date: 2019-10-01, End date: 2024-09-30
Project acronym CORALWARM
Project Corals and global warming: The Mediterranean versus the Red Sea
Researcher (PI) Zvy Dubinsky
Host Institution (HI) BAR ILAN UNIVERSITY
Call Details Advanced Grant (AdG), LS8, ERC-2009-AdG
Summary CoralWarm will generate for the first time projections of temperate and subtropical coral survival by integrating sublethal temperature increase effects on metabolic and skeletal processes in Mediterranean and Red Sea key species. CoralWarm's unique approach is from the nano- to the macro-scale, correlating molecular events to environmental processes. This will show new pathways for future investigations of cellular mechanisms linking environmental factors to final phenotype, potentially improving prediction power and paleoclimatological interpretation. Biological and chemical expertise will merge, producing new interdisciplinary approaches for ecophysiology and biomineralization. Field transplantations will be combined with controlled experiments under IPCC scenarios. Corals will be grown in aquaria, exposing the Mediterranean species native to cooler waters to higher temperatures, and the Red Sea ones to seawater gradually warming above ambient. Virtually all state-of-the-art methods will be used, by uniquely combining the investigators' expertise. Expected results include responses of algal symbionts' photosynthesis; host, symbiont and holobiont respiration; biomineralization rates and patterns, including colony architecture; and reproduction to temperature and pH gradients and combinations. Integration of molecular aspects of potential replacement of symbiont clades, and changes in skeletal crystallography, with biochemical and physiological aspects of temperature response will lead to a novel mechanistic model predicting changes in coral ecology and survival prospects. High-temperature-tolerant clades and species will be revealed, allowing future bioremediation actions and the establishment of coral refuges, saving corals and coral reefs for future generations.
Max ERC Funding
3 332 032 €
Duration
Start date: 2010-06-01, End date: 2016-05-31
Project acronym COSMICEXPLOSIONS
Project The nature of cosmic explosions
Researcher (PI) Avishay Gal-Yam
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Starting Grant (StG), PE9, ERC-2012-StG_20111012
Summary Cosmic explosions, the violent deaths of stars, play a crucial role in many of the most interesting open questions in physics today. These events serve as “cosmic accelerators” for ultra-high-energy particles that are beyond the reach of even the most powerful terrestrial accelerators, as well as distant sources of elusive neutrinos. Explosions leave behind compact neutron star and black hole remnants, natural laboratories to study strong gravity. Acting as cosmic furnaces, these explosions drive the chemical evolution of the Universe. Cosmic explosions trigger and inhibit star formation processes, and drive galactic evolution (“feedback”). Distances measured using supernova explosions as standard candles brought about the modern revolution in our view of the accelerating Universe, driven by enigmatic “dark energy”. Understanding the nature of cosmic explosions of all types is thus an extremely well-motivated endeavour. I have been studying cosmic explosions for over a decade, and since the earliest stages of my career, have followed an ambition to figure out the nature of cosmic explosions of all types, and to search for new types of explosions. Having already made several key discoveries, I now propose to undertake a comprehensive program to systematically tackle this problem. I review below the progress made in this field and the breakthrough results we have achieved so far, and propose to climb the next step in this scientific and technological ladder, combining powerful new surveys with comprehensive multi-wavelength and multi-disciplinary (observational and theoretical) analysis. My strategy is based on a combination of two main approaches: detailed studies of single objects which serve as keys to specific questions; and systematic studies of large samples, some that I have, for the first time, been able to assemble and analyze, and those expected from forthcoming efforts. Both approaches have already yielded tantalizing results.
Max ERC Funding
1 499 302 €
Duration
Start date: 2012-09-01, End date: 2017-08-31
Project acronym CoupledNC
Project Coupled Nanocrystal Molecules: Quantum coupling effects via chemical coupling of colloidal nanocrystals
Researcher (PI) Uri BANIN
Host Institution (HI) THE HEBREW UNIVERSITY OF JERUSALEM
Call Details Advanced Grant (AdG), PE4, ERC-2016-ADG
Summary Coupling of atoms is the basis of chemistry, yielding the beauty and richness of molecules and materials. Herein I introduce nanocrystal chemistry: the use of semiconductor nanocrystals (NCs) as artificial atoms to form NC molecules that are chemically, structurally and physically coupled. The unique emergent quantum mechanical consequences of the NC coupling will be studied and tailored to yield a chemical-quantum palette: coherent coupling of NC exciton states; dual-color single photon emitters functional also as photo-switchable chromophores in super-resolution fluorescence microscopy; electrically switchable single NC photon emitters for utilization as taggants for neuronal activity and as chromophores in displays; new NC structures for lasing; and coupled quasi-1D NC chains manifesting mini-band formation, tailored for a quantum-cascade effect for IR photon emission. A novel methodology of controlled oriented attachment of NC building blocks (in particular of core/shell NCs) will be presented to realize the coupled NC molecules. For this, a new type of Janus NC building block will be developed, and used as an element in a Lego-type construction of double quantum dots (dimers), heterodimers coupling two different types of NCs, and more complex coupled NC quantum structures. To realize this NC chemistry approach, surface control is essential, which will be achieved via investigation of the chemical and dynamical properties of the NCs' surface ligand layer. As an outcome, I expect to decipher NC surface chemistry and dynamics, including their size dependence, and to introduce Janus NCs with chemically distinct and selectively modified surface faces. From this I will develop a new step-wise approach for the synthesis of coupled NC molecules and reveal the consequences of quantum coupling in them. This will inspire theoretical and further experimental work and will set the stage for the development of the diverse potential applications of coupled NC molecules.
Max ERC Funding
2 499 750 €
Duration
Start date: 2017-11-01, End date: 2022-10-31
Project acronym CrackEpitranscriptom
Project Cracking the epitranscriptome
Researcher (PI) Schraga SCHWARTZ
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Starting Grant (StG), LS2, ERC-2016-STG
Summary Over 100 types of distinct modifications are catalyzed on RNA molecules post-transcriptionally. In an analogous manner to well-studied chemical modifications on proteins or DNA, modifications on RNA - and particularly on mRNA - harbor the exciting potential of regulating the complex and interlinked life cycle of these molecules. The most abundant modification in mammalian and yeast mRNA is N6-methyladenosine (m6A). We have pioneered approaches for mapping m6A in a transcriptome-wide manner, and we and others have identified factors involved in encoding and decoding m6A. While experimental disruption of these factors is associated with severe phenotypes, the role of m6A remains enigmatic. No single methylated site has been shown to causally underlie any physiological or molecular function. This proposal aims to establish a framework for systematically deciphering the molecular function of a modification and its underlying mechanisms, and to uncover the physiological role of the modification in regulation of a cellular response. We will apply this framework to m6A in the context of meiosis in budding yeast, as m6A dynamically accumulates on meiotic mRNAs and as the methyltransferase catalyzing m6A is essential for meiosis. We will (1) aim to elucidate the physiological targets of methylation governing entry into meiosis, (2) seek to elucidate the function of m6A at the molecular level, and understand its impact on the various steps of the mRNA life cycle, and (3) seek to understand the mechanisms underlying its effects. These aims will provide a comprehensive framework for understanding how the epitranscriptome, an emerging post-transcriptional layer of regulation, fine-tunes gene regulation and impacts cellular decision making in a dynamic response, and will set the stage towards dissecting the roles of m6A and of an expanding set of mRNA modifications in more complex and disease-related systems.
Max ERC Funding
1 402 666 €
Duration
Start date: 2016-11-01, End date: 2021-10-31
Project acronym CRISPR-EVOL
Project The eco-evolutionary costs and benefits of CRISPR-Cas systems, and their effect on genome diversity within populations
Researcher (PI) Uri Gophna
Host Institution (HI) TEL AVIV UNIVERSITY
Call Details Advanced Grant (AdG), LS8, ERC-2017-ADG
Summary CRISPR-Cas systems are microbial defense systems that provide prokaryotes with acquired and heritable DNA-based immunity against selfish genetic elements, primarily viruses. However, the full scope of benefits that these systems can provide, as well as their costs, remains unknown. Specifically, it is unclear whether the benefits against viral infection outweigh the continual costs incurred even in the absence of parasitic elements, and whether CRISPR-Cas systems affect microbial genome diversity in nature.
Since CRISPR-Cas systems can impede lateral gene transfer, it is often assumed that they reduce genetic diversity. Conversely, our recent results suggest the exact opposite: that these systems generate a high level of genomic diversity within populations. We have recently combined genomics of environmental strains and experimental genetics to show that archaea frequently acquire CRISPR immune memory, known as spacers, from chromosomes of related species in the environment. The presence of these spacers reduces gene exchange between lineages, indicating that CRISPR-Cas contributes to diversification. We have also shown that such inter-species mating events induce the acquisition of spacers against a strain's own replicons, supporting a role for CRISPR-Cas systems in generating deletions in natural plasmids and unessential genomic loci, again increasing genome diversity within populations.
Here we aim to test our hypothesis that CRISPR-Cas systems increase within-population diversity, and quantify their benefits to both cells and populations, using large-scale genomics and experimental evolution. We will explore how these systems alter the patterns of recombination within and between species, and explore the potential involvement of CRISPR-associated proteins in cellular DNA repair.
This work will reveal the eco-evolutionary role of CRISPR-Cas systems in shaping microbial populations, and open new research avenues regarding additional roles beyond anti-viral defense.
Max ERC Funding
2 495 625 €
Duration
Start date: 2018-05-01, End date: 2023-04-30
Project acronym CRISPRsition
Project Developing CRISPR adaptation platforms for basic and applied research
Researcher (PI) Ehud Itzhak Qimron
Host Institution (HI) TEL AVIV UNIVERSITY
Call Details Consolidator Grant (CoG), LS2, ERC-2018-COG
Summary The CRISPR-Cas system has been extensively studied for its ability to cleave DNA. In contrast, studies of the ability of the system to acquire and integrate new DNA from invaders, as a form of prokaryotic adaptive immunity, have lagged behind. This delay reflects the extreme enthusiasm surrounding the potential of using the system’s cleavage capabilities as a genome editing tool. However, the enormous potential of the adaptation process can and should arouse a similar degree of enthusiasm. My lab has pioneered studies on the CRISPR adaptation process by establishing new methodologies, and applying them to demonstrate the essential role of the proteins and DNA elements, as well as the molecular mechanisms, operating in this process. In this project, I will establish novel platforms for studying adaptation and develop them into biotechnological applications and research tools. These tools will allow me to identify the first natural and synthetic inhibitors of the adaptation process. This, in turn, will provide genetic tools to control adaptation, as well as advance the understanding of the arms race between bacteria and their invaders. I will also harness the adaptation process as a platform for diversifying genetic elements for phage display, and for extending phage recognition to a wide range of hosts. Lastly, I will provide the first evidence for an association between the CRISPR adaptation system and gene repression. This linkage will form the basis of a molecular scanner and recorder platform that I will develop, and that can be used to identify crucial genetic elements in phage genomes as well as novel regulatory circuits in the bacterial genome. Together, my findings will represent a considerable leap in the understanding of CRISPR adaptation with respect to the process, its potential applications, and its intriguing evolutionary significance.
Max ERC Funding
2 000 000 €
Duration
Start date: 2019-12-01, End date: 2024-11-30
Project acronym CRISS
Project CRISPR Gene Correction for Severe Combined Immunodeficiency Caused by Mutations in Recombination-activating gene 1 and 2 (RAG1 and RAG2)
Researcher (PI) Ayal Hendel
Host Institution (HI) BAR ILAN UNIVERSITY
Call Details Starting Grant (StG), LS7, ERC-2017-STG
Summary The severe combined immunodeficiencies (SCIDs) are a set of life-threatening genetic diseases in which patients are born with mutations in single genes and are unable to develop functional immune systems. While allogeneic bone marrow transplantation can be curative for these diseases, there remain significant limitations to this approach. Gene therapy using viral vectors containing a corrective transgene is being developed for some of these disorders, most successfully for ADA-SCID. However, for other SCID disorders, such as those caused by genetic mutations in RAG1 and RAG2, the transgene needs to be expressed in a precise, developmental- and lineage-specific manner to achieve functional gene correction and to avoid the risks of cellular transformation. In contrast to using viral vectors to deliver transgenes in an uncontrolled fashion, we are working towards using genome editing by homologous recombination (HR) to correct the disease-causing mutation by precisely modifying the genome. We have shown that by using the clustered, regularly interspaced, short palindromic repeats (CRISPR) and CRISPR-associated protein 9 (Cas9) system we can stimulate genome editing by HR at frequencies that should be therapeutically beneficial (>10%) in hematopoietic stem and progenitor cells (HSPCs). The overall focus of the proposal is to translate our basic science studies to use in RAG-SCID patient-derived HSPCs in a methodical, careful, and pre-clinically relevant fashion. The fundamental approach is to develop a highly active, functional genome editing system using CRISPR-Cas9 for RAG-SCIDs and to complete pre-clinical efficacy and safety studies showing that the approach has a clear path towards future clinical trials. Our goal with this proposal is to develop the next wave of curative therapies for SCIDs and other hematopoietic disorders using genome editing.
Max ERC Funding
1 372 839 €
Duration
Start date: 2017-10-01, End date: 2022-09-30
Project acronym CRYOMATH
Project Cryo-electron microscopy: mathematical foundations and algorithms
Researcher (PI) Yoel SHKOLNISKY
Host Institution (HI) TEL AVIV UNIVERSITY
Call Details Consolidator Grant (CoG), PE1, ERC-2016-COG
Summary The importance of understanding the functions of the basic building blocks of life, such as proteins, cannot be overstated (as asserted by two recent Nobel prizes in Chemistry), as this understanding unravels the mechanisms that control all organisms. The critical step towards such an understanding is to reveal the structures of these building blocks. A leading method for resolving such structures is cryo-electron microscopy (cryo-EM), in which the structure of a molecule is recovered from its images taken by an electron microscope, using sophisticated mathematical algorithms (to which my group has made several key mathematical and algorithmic contributions). Due to hardware breakthroughs in the past three years, cryo-EM has made a giant leap forward, introducing capabilities that until recently were unimaginable and opening an opportunity to revolutionize our biological understanding. As extracting information from cryo-EM experiments relies completely on mathematical algorithms, the method’s deep mathematical challenges that have emerged must be solved as soon as possible. Only then could cryo-EM realize its nearly inconceivable potential. These challenges, for which no adequate solutions exist (or none at all), focus on integrating information from huge sets of extremely noisy images reliably and efficiently. Based on the experience of my research group in developing algorithms for cryo-EM data processing, gained during the past eight years, we will address the three key open challenges of the field – a) deriving reliable and robust reconstruction algorithms from cryo-EM data, b) developing tools to process heterogeneous cryo-EM data sets, and c) devising validation and quality measures for structures determined from cryo-EM data.
The fourth goal of the project, which ties all goals together and promotes the broad interdisciplinary impact of the project, is to merge all our algorithms into a software platform for state-of-the-art processing of cryo-EM data.
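Not part of the proposal itself, but a minimal toy sketch of the statistical principle behind challenge (a): combining many extremely noisy copies of the same signal suppresses noise roughly as 1/sqrt(N). All names here are illustrative, and real cryo-EM reconstruction is far harder, since each particle image has an unknown 3-D orientation that must be estimated before any such averaging can take place.

```python
# Toy illustration: averaging N noisy copies of one underlying signal
# shrinks the noise level by roughly a factor of sqrt(N).
import numpy as np

rng = np.random.default_rng(0)
truth = np.sin(np.linspace(0.0, 2.0 * np.pi, 1000))  # stand-in "projection image"
sigma = 1.0                                          # heavy noise, as in cryo-EM

def noisy_copy():
    # One simulated micrograph particle: signal buried in Gaussian noise.
    return truth + rng.normal(0.0, sigma, truth.shape)

single = noisy_copy()
averaged = np.mean([noisy_copy() for _ in range(400)], axis=0)

def rmse(x):
    # Root-mean-square error against the ground-truth signal.
    return float(np.sqrt(np.mean((x - truth) ** 2)))

print(rmse(single), rmse(averaged))  # averaged error is roughly 20x smaller
```

With 400 copies the residual error drops by about a factor of 20, which is why integrating huge image sets, as the proposal describes, is both essential and computationally demanding.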
Max ERC Funding
1 751 250 €
Duration
Start date: 2017-03-01, End date: 2022-02-28
Project acronym CRYOPRESERVATION
Project Improved Cryopreservation using Ice Binding Proteins
Researcher (PI) Ido Braslavsky
Host Institution (HI) THE HEBREW UNIVERSITY OF JERUSALEM
Call Details Starting Grant (StG), LS9, ERC-2011-StG_20101109
Summary Several organisms have evolved specialized ice binding proteins (IBPs) that prevent their body fluids from freezing (antifreeze proteins, AFPs), inhibit recrystallization of ice in frozen tissues, or initiate freezing at moderate supercooling temperatures (ice nucleating proteins, INPs). These proteins have many potential applications in agriculture, food preservation, cryobiology, and biomedical science. The ubiquitous presence of IBPs in such organisms indicates the power of these molecules to enable survival under cold conditions. Despite this key role in nature, however, IBPs have been effectively exploited in only one cryopreservation application, namely, recrystallization inhibition in ice cream. Several terrestrial organisms, including insects, have developed very active forms of AFPs. These hyperactive AFPs (hypAFPs) have not been utilized significantly thus far in cryopreservation techniques. The gap between the obvious potential of IBPs and their actual applications stems from a lack of knowledge regarding the mechanisms by which IBPs interact with ice surfaces and how these proteins can assist in cryoprotection. I propose to investigate the mechanism by which IBPs inhibit ice crystallization and the use of such proteins for cryopreserving cells, tissues, and organisms. My group has a strong record in the study of the interactions between IBPs and ice using novel methods that we have developed, including fluorescence microscopy techniques combined with cooled microfluidic devices. We will investigate the interactions of AFPs with ice and the use of hypAFPs in cryopreservation procedures. This research will contribute to an understanding of the mechanisms by which IBPs act, and apply the acquired knowledge to cryopreservation. 
The successful implementation of IBPs in cryopreservation would revolutionize the field of cryobiology, with enormous implications for cryopreservation applications in general and the frozen and chilled food industry in particular.
Max ERC Funding
1 500 000 €
Duration
Start date: 2011-11-01, End date: 2016-10-31
Project acronym CSG
Project C° symplectic geometry
Researcher (PI) Lev Buhovski
Host Institution (HI) TEL AVIV UNIVERSITY
Call Details Starting Grant (StG), PE1, ERC-2017-STG
Summary The objective of this proposal is to study "continuous" (or C^0) objects, as well as C^0 properties of smooth objects, in the field of symplectic geometry and topology. C^0 symplectic geometry has seen spectacular progress in recent years, drawing the attention of mathematicians from various backgrounds. The proposed study aims to discover fascinating new C^0 phenomena in symplectic geometry.
One circle of questions concerns symplectic and Hamiltonian homeomorphisms. Recent studies indicate that these objects possess both rigidity and flexibility, appearing in surprising and counter-intuitive ways. Our understanding of symplectic and Hamiltonian homeomorphisms is far from satisfactory, and here we intend to study questions related to the action of symplectic homeomorphisms on submanifolds. Other questions concern Hamiltonian homeomorphisms in relation to the celebrated Arnold conjecture. The PI suggests studying spectral invariants of continuous Hamiltonian flows, which allow one to formulate the C^0 Arnold conjecture in higher dimensions. Another central problem that the PI will work on is the C^0 flux conjecture.
A second circle of questions concerns the Poisson bracket operator and its functional-theoretic properties. The first question concerns the lower bound for the Poisson bracket invariant of a cover, conjectured by L. Polterovich, who indicated relations between this problem and quantum mechanics. Another direction aims to study the C^0 rigidity versus flexibility of the L_p norm of the Poisson bracket. Despite recent progress in dimension two showing rigidity, very little is known in higher dimensions. The PI proposes to use a combination of tools from topology and from hard analysis to address this question, whose solution would be a big step towards understanding functional-theoretic properties of the Poisson bracket operator.
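For reference (standard background, not a claim of the proposal), the Poisson bracket whose L_p-norm rigidity is discussed above is given, in canonical coordinates (q_1, …, q_n, p_1, …, p_n) on R^{2n}, by

```latex
\{f, g\} \;=\; \sum_{i=1}^{n} \left( \frac{\partial f}{\partial q_i}\,\frac{\partial g}{\partial p_i} \;-\; \frac{\partial f}{\partial p_i}\,\frac{\partial g}{\partial q_i} \right).
```

The C^0 rigidity questions above ask, roughly, how this operator, defined through derivatives, nevertheless constrains behavior under uniform (C^0) limits.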
Max ERC Funding
1 345 282 €
Duration
Start date: 2017-10-01, End date: 2022-09-30
Project acronym CT-PROBES
Project Protease Activated X-Ray Contrast Agents for Molecular Imaging of Vulnerable Atherosclerotic Plaques and Cancer Development using Spectral CT
Researcher (PI) Galia Blum
Host Institution (HI) THE HEBREW UNIVERSITY OF JERUSALEM
Call Details Starting Grant (StG), LS7, ERC-2013-StG
Summary The major causes of death in the Western world are cardiovascular diseases and cancer. More accurate detection of these diseases will improve clinical outcomes. Thus, we will develop unique X-ray contrast reagents for use in spectral computerized tomography (CT) that bind active proteases to reveal the exact location and stage of cancer and atherosclerosis.
Activity-based probes (ABPs) are small molecules that covalently bind to active proteases. Based on our success in developing optical ABPs for non-invasive optical detection of cancer and atherosclerosis, we will focus on two novel types of reagents: (1) ABPs conjugated to various contrast elements that can be visualized by X-rays; (2) “smart probes” conjugated to different contrast reagents on each side of the molecule to overcome clearance limitations. Proteases found in diseased tissue will selectively bind and remove a part of the molecule, changing the physical properties of the bound probe. Thus, different signals from bound and unbound probes can be detected by photon-counting spectral CT scanners.
Our initial targets, the cysteine cathepsin proteases, are overexpressed and activated in cancer and atherosclerosis. The level of active cathepsins correlates with progression of both diseases, making them promising biomarkers for these pathologies. The “smart probes” are an innovative type of spectral CT agent that will enable high-resolution, rapid imaging in humans before probe clearance.
Our probes increase imaging sensitivity since the contrast element remains at the desired site. Moreover, the levels of active cathepsins will reveal critical information regarding disease progression, yielding more accurate diagnoses and improved personalized treatment. For example, these reagents can distinguish between a vulnerable and a stable atherosclerotic plaque. Thus, our novel probes will directly reduce cancer and cardiovascular disease mortality by enabling earlier and more accurate disease detection.
Max ERC Funding
1 499 780 €
Duration
Start date: 2013-12-01, End date: 2018-11-30
Project acronym CuHypMECH
Project New Nuclear Medicine Imaging Radiotracer 64Cu(II) for diagnosing Hypoxia Conditions Based on the Cellular Copper Cycle
Researcher (PI) Sharon RUTHSTEIN
Host Institution (HI) BAR ILAN UNIVERSITY
Call Details Starting Grant (StG), LS7, ERC-2017-STG
Summary Imaging of hypoxia is important in many disease states in oncology, cardiology, and neurology. Hypoxia is a common condition encountered within the tumour microenvironment that drives proliferation, angiogenesis, and resistance to therapy. Despite ongoing efforts to identify hypoxia, there is still no clinically approved imaging biomarker, due to both low tumour uptake and a low signal-to-background (S/B) ratio that degrades imaging quality. Nuclear medicine uses labelled radioisotopes for PET/CT and SPECT imaging. These radiotracers probe metabolic processes in the body. Among these tracers, 18F-FDG is the most routinely used, as a marker of glucose metabolism. However, not all tumours consume glucose, and glucose consumption is not specific to malignant tumours, which limits its application. Copper is a nutritional metal recently examined as a radiotracer for hypoxia, owing to its sensitivity to the oxidising environment. Clinical and in-vivo studies on various 64Cu(II)-PET radiotracers have produced controversial reports on the specificity of the current tracers for hypoxia imaging, due to non-selective bio-distribution and a low S/B ratio. This multidisciplinary proposal focuses on the discovery of comprehensive signalling pathways of the cellular copper cycle using advanced biophysical methods and a proprietary design of a 64Cu(II) radiotracer. This radiotracer will be incorporated into the cellular copper cycle and will enable selective targeting of the oxidising environment in tumours. The design of the new radiotracer is based on systematic structural and functional mapping of the copper binding sites of the various copper proteins and on visualisation of the transfer mechanism. This new copper tracer should increase the selectivity of tumour uptake, improve stability, and improve bio-distribution. This project assimilates cold and hot chemistry and biology, while addressing the unmet clinical need for metal-based radiotracers that form stable complexes.
Max ERC Funding
1 499 345 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym cureCD
Project Function of long non-coding RNA in Crohn Disease Ulcer Pathogenesis
Researcher (PI) Yael HABERMAN ZIV
Host Institution (HI) MEDICAL RESEARCH INFRASTRUCTURE DEVELOPMENT AND HEALTH SERVICES FUND BY THE SHEBA MEDICAL CENTER
Call Details Starting Grant (StG), LS4, ERC-2017-STG
Summary The Inflammatory Bowel Diseases (IBD), Crohn’s Disease (CD) and Ulcerative Colitis (UC), are chronic, relapsing disorders that affect over six million individuals worldwide. Mucosal ulcers, the hallmark of CD, are the result of a complex interaction between microbiota, immune cells, and gut epithelia. Healing of mucosal ulcers is associated with better outcomes, but is achieved in less than half of cases. Past attempts to suppress central and conserved nodes of the immune system failed due to opposing off-target deleterious effects on epithelial renewal. Therefore, there is a critical need to identify more tissue-specific targets that lead to mucosal healing and improved outcomes.
Using mRNAseq of intestinal biopsies, we identified widespread dysregulation of long non-coding RNAs (lncRNA) in the ileum of treatment-naïve pediatric CD patients. Importantly, we identified significant correlations between lncRNA and mucosal ulcers. CD lncRNA, after careful mechanistic exploration, are highly promising targets for potential future intervention, as they regulate diverse cellular functions and exhibit more tissue-specific expression than protein-coding genes. The core goal of this proposal is to understand the role of CD lncRNA in ulcer pathogenesis, focusing on granulocyte and epithelial functions in the context of their interactions with the microbiota.
I plan to utilize state-of-the-art informatics, RNAseq and microbiome profiles together with advanced, novel experimental lab models and co-culture systems, prospectively collected patient-derived tissues, and gut microbiota to explore the role of CD lncRNA in mediating healing of mucosal ulcers. This work carries the potential to guide novel therapeutic strategies for mucosal healing with minimal off-target effects. From a broader perspective, this work will expand our relatively limited understanding of the role of lncRNA in mediating human diseases.
Max ERC Funding
1 500 000 €
Duration
Start date: 2018-05-01, End date: 2023-04-30
Project acronym CYTOTOXICTISALANS
Project Salan Ti(IV) Complexes as Novel Anti-Cancer Chemotherapeutics
Researcher (PI) Edit TSHUVA (GOLDBERG)
Host Institution (HI) THE HEBREW UNIVERSITY OF JERUSALEM
Call Details Proof of Concept (PoC), PC1, ERC-2011-PoC
Summary Cisplatin is a platinum-based complex used widely as a chemotherapeutic drug. However, due to its limitations, including a narrow activity range and severe side effects, other transition metal complexes are being investigated as antitumour agents. Titanium complexes that previously reached clinical trials showed fair activity, and were mainly limited by rapid decomposition in water, within minutes. Titanium complexes of diamino bis(phenolato) ("salan") ligands studied in our laboratory under the ERC grant demonstrate extremely high stability, for up to weeks, in water solutions. Importantly, they have shown great potency as anti-tumour agents towards a number of cell lines in vitro. Their activity exceeds that of cisplatin by up to two orders of magnitude. In particular, compounds halogenated at specific positions demonstrate a combination of exceptional hydrolytic stability and particularly high cytotoxicity, and are covered in a patent application. In addition, only a minor effect on normal cells was observed. As titanium has previously been shown, based on in vivo testing, to be significantly less toxic than platinum (and is commonly found in food and cosmetics), several pharmaceutical companies have expressed interest in our compounds. However, in vivo activity data and some mechanistic insights (required for FDA approval) are fundamental for allowing commercialization. Therefore, we propose to conduct a series of experiments that will bring this research to a more "mature" level, suitable to be undertaken by pharmaceutical companies. These include in vivo testing in mice to establish safety and activity towards different cell types and via different administration techniques, including IV. Additionally, as solubility is an issue, different formulations will be assessed. Basic mechanistic studies will include evaluating potential interactions with DNA.
All these studies are not covered by the ERC grant, and will be conducted in collaboration with a biologist expert in cancer research.
Max ERC Funding
149 599 €
Duration
Start date: 2012-03-01, End date: 2013-05-31
Project acronym DCENSY
Project Doping, Charge Transfer and Energy Flow in Hybrid Nanoparticle Systems
Researcher (PI) Uri Banin
Host Institution (HI) THE HEBREW UNIVERSITY OF JERUSALEM
Call Details Advanced Grant (AdG), PE4, ERC-2009-AdG
Summary We target a frontier in nanocrystal science: combining disparate materials into a single hybrid nanosystem. This offers an intriguing route to engineer nanomaterials with multiple functionalities in ways that are not accessible in bulk materials or in molecules. Such control of novel material combinations, on a single nanoparticle or in a super-structure of assembled nanoparticles, presents, alongside the synthesis challenges, fundamental questions concerning the physical attributes of nanoscale systems. My goals are to create new highly controlled hybrid nanoparticle systems, focusing on combinations of semiconductors and metals, and to decipher the fundamental principles governing doping in nanoparticles and charge- and energy-transfer processes among components of the hybrid systems. The research addresses several key challenges. First, in synthesis: combining disparate material components into one hybrid nanoparticle system. Second, in self-assembly: organizing a combination of semiconductor (SC) and metal nanoparticle building blocks into hybrid systems with controlled architecture. Third, in fundamental physico-chemical questions pertaining to the unique attributes of the hybrid systems, which constitute a key component of the research. A first aspect concerns doping of SC nanoparticles with metal atoms. A second aspect concerns light-induced charge transfer between the SC and metal parts of the hybrid constructs. A third, related aspect concerns energy-transfer processes between the SC and metal components and the interplay between near-field enhancement and fluorescence-quenching effects. Due to the new properties, significant impact on nanocrystal applications in solar energy harvesting, biological tagging, sensing, optics and electro-optics is expected.
Max ERC Funding
2 499 000 €
Duration
Start date: 2010-06-01, End date: 2015-05-31
Project acronym DCM
Project Distributed Cryptography Module
Researcher (PI) Yehuda Lindell
Host Institution (HI) BAR ILAN UNIVERSITY
Call Details Proof of Concept (PoC), PC1, ERC-2014-PoC
Summary The DCM (Distributed Crypto Module) is a unique security system that provides a significant boost in server-side security, which will benefit almost every organisation today. The technology relies on a novel approach to protecting the cryptographic keys and authentication credentials that form the backbone of network and data security. Currently, the cryptographic keys and authentication credentials that reside on servers inside networks constitute single points of failure: all cryptographic techniques rely on the secrecy of the key, so it suffices for an attacker to obtain it and all is lost. Indeed, cryptography is rarely broken (even by the NSA); rather, it is bypassed by stealing the key. Server breaches are ubiquitous, and novel defenses are an acute need in industry and government.
In the DCM, the key is first split and shared amongst two or more servers (using known secret-sharing technology), and then, using our novel approach, the necessary cryptographic operations are carried out without ever bringing the parts of the secret together. Rather, the servers run a secure protocol, based on secure multiparty computation, which guarantees that even if an attacker breaks into all but one of the servers, and can run any malicious code it wishes, it still cannot learn anything about the secret key or credential. By configuring the DCM servers independently (different OS, different admins, different defenses, etc.), a very high level of security is achieved.
The scope of the Proof of Concept encapsulates the steps needed to bring this groundbreaking technology to market. A full business plan and market survey will be developed for the construction of a new company that will develop the DCM application and bring it to market. The first full version of the DCM will be ready for market a year after the company has been established (with limited versions available earlier).
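The splitting step described above can be illustrated with a minimal n-out-of-n XOR secret-sharing sketch. This shows only sharing and reconstruction; the DCM's actual contribution, computing with the shares via secure multiparty computation without ever reconstructing the key, is not shown, and all function names here are illustrative:

```python
import os

def share_secret(secret: bytes, n: int) -> list[bytes]:
    """Split `secret` into n XOR shares; all n shares are needed to rebuild it."""
    shares = [os.urandom(len(secret)) for _ in range(n - 1)]
    last = secret
    for s in shares:
        # XOR the running value with each random share.
        last = bytes(a ^ b for a, b in zip(last, s))
    shares.append(last)
    return shares

def reconstruct(shares: list[bytes]) -> bytes:
    """XOR all shares together to recover the secret."""
    out = shares[0]
    for s in shares[1:]:
        out = bytes(a ^ b for a, b in zip(out, s))
    return out

key = os.urandom(16)           # a stand-in for a server-side key
shares = share_secret(key, 3)  # each share would live on a different server
assert reconstruct(shares) == key
```

Any strict subset of the shares is uniformly random and reveals nothing about the key, which is why a breach of fewer than all servers yields nothing.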
Max ERC Funding
149 776 €
Duration
Start date: 2014-11-01, End date: 2016-04-30
Project acronym DEADSEA_ECO
Project Modelling Anthropocene Trophic Cascades of the Judean Desert Ecosystem: A Hidden Dimension in the History of Human-Environment Interactions
Researcher (PI) Nimrod MAROM
Host Institution (HI) UNIVERSITY OF HAIFA
Call Details Starting Grant (StG), SH6, ERC-2018-STG
Summary This project aims to explore the effects of human settlement intensity on desert ecological community structure, focusing on the hitherto unstudied phenomenon of trophic cascades in antiquity. Its key research question is whether human-induced changes in arid-land biodiversity can feed back to affect natural resources important for human subsistence, such as pasture and wood. The role of such feedback effects in ecological systems has been increasingly acknowledged in recent years in the biological literature, but has not been addressed in the study of the human past. The research question will be approached using bioarchaeological methods applied to the uniquely preserved material record from the middle and late Holocene settlement sequence (approximately 4,500 BCE to 700 CE) of the Dead Sea Ein Gedi Oasis, and to the contemporary palaeontological assemblages from caves located in the surrounding Judean Desert. The proposed research is expected to bridge aspects of current thinking on ecosystem dynamics and the study of the human past by exploring the role of trophic cascades as an invisible dimension of Anthropocene life in marginal environments. The study of the history of human impact on such environments is important to resource-management planning across a rapidly expanding ecological frontier on Earth, as climate deterioration brings more people in contact with life-sustaining and sensitive arid-land ecosystems.
Max ERC Funding
1 499 563 €
Duration
Start date: 2019-01-01, End date: 2023-12-31
Project acronym DEATHSWITCHING
Project Identifying genes and pathways that drive molecular switches and back-up mechanisms between apoptosis and autophagy
Researcher (PI) Adi Kimchi
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Advanced Grant (AdG), LS3, ERC-2012-ADG_20120314
Summary A cell’s decision to die is governed by multiple input signals received from a complex network of programmed cell death (PCD) pathways, including apoptosis and programmed necrosis. Additionally, under some conditions, autophagy, whose function is mainly pro-survival, may act as a back-up death pathway. We propose to apply new approaches to study the molecular basis of two important questions that await resolution in the field: a) how the cell switches from a pro-survival autophagic response to an apoptotic response and b) whether and how pro-survival autophagy is converted to a death mechanism when apoptosis is blocked. To address the first issue, we will screen for direct physical interactions between autophagic and apoptotic proteins, using the protein fragment complementation assay. Validated pairs will be studied in depth to identify built-in molecular switches that activate apoptosis when autophagy fails to restore homeostasis. As a pilot case to address the concept of molecular ‘sensors’ and ‘switches’, we will focus on the previously identified Atg12/Bcl-2 interaction. In the second line of research we will categorize autophagy-dependent cell death triggers into those that directly result from autophagy-dependent degradation, either by excessive self-digestion or by selective protein degradation, and those that utilize the autophagy machinery to activate programmed necrosis. We will identify the genes regulating these scenarios by whole genome RNAi screens for increased cell survival. In parallel, we will use a cell library of annotated fluorescent-tagged proteins for measuring selective protein degradation. These will be the starting point for identification of the molecular pathways that convert survival autophagy to a death program. Finally, we will explore the physiological relevance of back-up death mechanisms and the newly identified molecular mechanisms to developmental PCD during the cavitation process in early stages of embryogenesis.
Max ERC Funding
2 500 000 €
Duration
Start date: 2013-03-01, End date: 2018-02-28
Project acronym DecodingInfection
Project Decoding the host-pathogen interspecies crosstalk at a multiparametric single-cell level
Researcher (PI) Roi AVRAHAM
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Starting Grant (StG), LS6, ERC-2017-STG
Summary Bacterial pathogens remain a significant threat to global health, necessitating a better understanding of host-pathogen biology. While various lines of evidence point to early infection as a key event in the eventual progression to disease, our recent preliminary data show that during this stage, highly adaptable and dynamic host cells and bacteria engage in complex, diverse interactions that contribute to the well-documented heterogeneous outcomes of infection. However, current methodologies rely on measurements of bulk populations, thereby overlooking this diversity, which can trigger different outcomes. This application focuses on understanding heterogeneity during the first stages of infection in order to reduce the complexity of these interactions into informative readouts of population physiology and predictors of infection outcome. We will apply multiparametric single-cell analysis to obtain an accurate and complete description of macrophage infection with the enteric intracellular pathogen Salmonella in vitro, and in the early stages of mouse colonization. We will characterize the molecular details that underlie the distinct infection outcomes of individual encounters, to reconstruct the repertoire of host and pathogen strategies that prevail at critical stages of early infection.
We propose the following three objectives: (1) develop methodologies to simultaneously profile host and pathogen transcriptional changes at the single-cell level; (2) characterize the molecular details that underlie the formation of subpopulations during macrophage infection; and (3) determine how host-pathogen encounters in vivo result in the emergence of specialized subpopulations, the recruitment of immune cells, and pathogen dissemination.
We anticipate that this work will fundamentally shift our paradigms of infectious disease pathogenesis and lay the groundwork for the development of a new generation of therapeutic agents targeting the specific host-pathogen interactions ultimately driving disease.
Max ERC Funding
1 499 999 €
Duration
Start date: 2017-10-01, End date: 2022-09-30
Project acronym DeepFace
Project Understanding Deep Face Recognition
Researcher (PI) Lior Wolf
Host Institution (HI) TEL AVIV UNIVERSITY
Call Details Consolidator Grant (CoG), PE6, ERC-2016-COG
Summary Face recognition is a fascinating domain: no other domain seems to present as much value when analysing casual photos; it is one of the few domains in machine learning in which millions of classes are routinely learned; and the trade-off between subtle inter-identity variations and pronounced intra-identity variations forms a unique challenge.
The advent of deep learning has brought machines to what is considered a human level of performance. However, many research questions are left open. At the topmost level, we ask two questions: what is unique about faces in comparison to other recognition tasks that also employ deep networks, and how can we make the next leap in performance of automatic face recognition?
We consider three domains of research. The first is the study of methods that promote effective transfer learning. This is crucial since all state-of-the-art face recognition methods rely on transfer learning. The second domain is the study of the tradeoffs that govern the optimal utilization of the training data, and how the properties of the training data affect the optimal network design. The third domain is the post-transfer utilization of the learned deep networks, where, given the representations of a pair of face images, we seek to compare them in the most accurate way.
Throughout this proposal, we put an emphasis on theoretical reasoning. I aim to support the developed methods with a theoretical framework that both justifies their usage and provides concrete guidelines for using them. My goal of achieving a leap forward in performance through a level of theoretical analysis that is unparalleled in object recognition makes our research agenda truly high-risk/high-gain. I have been at the forefront of face recognition for the last 8 years, and my lab's recent achievements in deep learning suggest that we will be able to carry out this research. To further support its feasibility, we present very promising initial results.
Max ERC Funding
1 696 888 €
Duration
Start date: 2017-05-01, End date: 2022-04-30
Project acronym DeepInternal
Project Going Deep and Blind with Internal Statistics
Researcher (PI) Michal IRANI
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Advanced Grant (AdG), PE6, ERC-2017-ADG
Summary Unsupervised visual inference can often be performed by exploiting the internal redundancy inside a single visual datum (an image or a video). The strong repetition of patches inside a single image/video provides a powerful data-specific prior for solving a variety of vision tasks in a “blind” manner: (i) Blind in the sense that sophisticated unsupervised inferences can be made with no prior examples or training; (ii) Blind in the sense that complex ill-posed Inverse-Problems can be solved, even when the forward degradation is unknown.
While the above fully unsupervised approach achieved impressive results, it relies on internal data alone, and hence cannot enjoy the “wisdom of the crowd” which Deep Learning (DL) so effectively extracts from external collections of images, yielding state-of-the-art (SOTA) results. Nevertheless, DL requires huge amounts of training data, which restricts its applicability. Moreover, some internal image-specific information, which is clearly visible, remains unexploited by today's DL methods. One such example is shown in Fig.1.
We propose to combine the power of these two complementary approaches, unsupervised internal data recurrence and Deep Learning, to obtain the best of both worlds. If successful, this will have several important outcomes, including:
• A wide range of low-level & high-level inferences (image & video).
• A continuum between Internal & External training – a platform to explore theoretical and practical tradeoffs between the amount of available training data and the optimal Internal-vs-External training.
• Enable totally unsupervised DL when no training data are available.
• Enable supervised DL with modest amounts of training data.
• New applications, disciplines and domains, which are enabled by the unified approach.
• A platform for substantial progress in video analysis (which has been lagging behind so far due to the strong reliance on exhaustive supervised training data).
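The internal patch redundancy underpinning this prior can be made concrete with a small, purely illustrative sketch (not the project's method): for every patch in an image, measure how close its best match at a different location in the same image is. Function and parameter names are hypothetical:

```python
import numpy as np

def internal_patch_matches(img: np.ndarray, patch_size: int = 5) -> np.ndarray:
    """For each patch, return the squared distance to its best match at a
    different location in the same image: a crude measure of internal
    patch redundancy."""
    H, W = img.shape
    ps = patch_size
    # Flatten every ps-by-ps patch into a row vector.
    patches = np.array([
        img[i:i + ps, j:j + ps].ravel()
        for i in range(H - ps + 1)
        for j in range(W - ps + 1)
    ])
    # All-pairs squared distances between patches.
    d = ((patches[:, None, :] - patches[None, :, :]) ** 2).sum(axis=-1)
    np.fill_diagonal(d, np.inf)  # a patch may not match itself
    return d.min(axis=1)

# A perfectly periodic image: every patch recurs elsewhere,
# so every best-match distance is exactly zero.
img = np.tile(np.arange(4.0).reshape(2, 2), (4, 4))
print(internal_patch_matches(img).max())  # -> 0.0
```

Natural images are not periodic, but small patches still recur strongly within and across scales, which is what makes such image-specific statistics usable as a prior.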
Max ERC Funding
2 466 940 €
Duration
Start date: 2018-05-01, End date: 2023-04-30
Project acronym DELPHI
Project Computing Answers to Complex Questions in Broad Domains
Researcher (PI) Jonathan Berant
Host Institution (HI) TEL AVIV UNIVERSITY
Call Details Starting Grant (StG), PE6, ERC-2018-STG
Summary The explosion of information around us has democratized knowledge and transformed its availability for
people around the world. Still, since information is mediated through automated systems, access is bounded
by their ability to understand language.
Consider an economist asking “What fraction of the top-5 growing countries last year raised their co2 emission?”.
While the required information is available, answering such complex questions automatically is
not possible. Current question answering systems can answer simple questions in broad domains, or complex
questions in narrow domains. However, broad and complex questions are beyond the reach of state-of-the-art.
This is because systems are unable to decompose questions into their parts, and find the relevant information
in multiple sources. Further, as answering such questions is hard for people, collecting large datasets to train
such models is prohibitive.
In this proposal I ask: Can computers answer broad and complex questions that require reasoning over
multiple modalities? I argue that by synthesizing the advantages of symbolic and distributed representations
the answer will be “yes”. My thesis is that symbolic representations are suitable for meaning composition, as
they provide interpretability, coverage, and modularity. Complementarily, distributed representations (learned
by neural nets) excel at capturing the fuzziness of language. I propose a framework where complex questions
are symbolically decomposed into sub-questions, each is answered with a neural network, and the final answer
is computed from all gathered information.
This research tackles foundational questions in language understanding. What is the right representation
for reasoning in language? Can models learn to perform complex actions in the face of paucity of data?
Moreover, my research, if successful, will transform how we interact with machines, and define a role for
them as research assistants in science, education, and our daily life.
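The proposed decompose-then-answer framework can be sketched in a few lines of code. Everything below is a hypothetical stand-in for illustration: the symbolic decomposer is hard-wired to the running CO2 example, and the “neural” answerer is an injected callable rather than an actual network:

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class SubQuestion:
    text: str

def decompose(question: str) -> List[SubQuestion]:
    # Symbolic step: a real system would parse the question into a small
    # program; here the running example is hard-wired for illustration.
    return [
        SubQuestion("top-5 growing countries last year"),
        SubQuestion("which of these raised their CO2 emissions"),
    ]

def answer_complex(question: str, neural_answer: Callable) -> float:
    """Decompose symbolically, answer each sub-question with a (stubbed)
    neural module, then compose the final answer from the gathered info."""
    answers: list = []
    for sq in decompose(question):
        # each sub-question may condition on answers gathered so far
        answers.append(neural_answer(sq.text, answers))
    top5, raised = answers
    return len(raised) / len(top5)  # the fraction the question asks for
```

A stub answerer backed by a lookup table already exercises this control flow; swapping in real neural modules for each sub-question would leave the symbolic composition unchanged.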
Max ERC Funding
1 499 375 €
Duration
Start date: 2019-04-01, End date: 2024-03-31
Project acronym DEPENDENTCLASSES
Project Model theory and its applications: dependent classes
Researcher (PI) Saharon Shelah
Host Institution (HI) THE HEBREW UNIVERSITY OF JERUSALEM
Call Details Advanced Grant (AdG), PE1, ERC-2013-ADG
Summary Model theory deals with general classes of structures (called models).
Specific examples of such classes are: the class of rings or the class of
algebraically closed fields.
It turns out that counting the so-called complete types over models in the
class has an important role in the development of model theory in general and
stability theory in particular.
Stable classes are those with relatively few complete types (over structures
from the class); understanding stable classes has been central in model theory
and its applications.
Recently, I have proved a new dichotomy among the unstable classes:
Instead of counting all the complete types, they are counted up to conjugacy.
Classes which have few types up to conjugacy are proved to be so-called
``dependent'' classes (which have also been called NIP classes).
I have developed (under reasonable restrictions) a ``recounting theorem'',
parallel to the basic theorems of stability theory.
I have started to develop some of the basic properties of this new approach.
The goal of the current project is to develop systematically the theory of
dependent classes. The above-mentioned results give a strong indication that this
new theory can eventually be as useful as the (by now classical) stability
theory. In particular, it covers many well-known classes which stability theory
cannot treat.
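For reference, the standard definition underlying the “dependent” (NIP) terminology can be stated as follows:

```latex
% A theory T is dependent (NIP) when no formula has the independence property.
\begin{definition}[Independence property]
A formula $\varphi(x,y)$ has the \emph{independence property} with respect to a
theory $T$ if in some model of $T$ there are tuples $(a_i)_{i<\omega}$ and
$(b_S)_{S\subseteq\omega}$ such that
\[
  \models \varphi(a_i, b_S) \quad\Longleftrightarrow\quad i \in S.
\]
$T$ is \emph{dependent} (NIP) if no formula has the independence property.
\end{definition}
```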
Max ERC Funding
1 748 000 €
Duration
Start date: 2014-03-01, End date: 2019-02-28
Project acronym DEPICT
Project Design principles and controllability of protein circuits
Researcher (PI) Uri Alon
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Advanced Grant (AdG), LS2, ERC-2009-AdG
Summary Cells use circuits of interacting proteins to respond to their environment. In the past decades, molecular biology has provided detailed knowledge on the proteins in these circuits and their interactions. Fully understanding circuit function requires, in addition to molecular knowledge, new concepts that explain how multiple components work together to perform systems-level functions. Our lab has been a leader in defining such concepts, based on combined experimental and theoretical study of well-characterized circuits in bacteria and human cells. In this proposal we aim to find novel principles on how circuits resist fluctuations and errors, and how they can be controlled by drugs: (1) Why do key regulatory systems use bifunctional enzymes that catalyze antagonistic reactions (e.g. both kinase and phosphatase)? We will test the role of bifunctional enzymes in making circuits robust to variations in protein levels. (2) Why are some genes regulated by a repressor and others by an activator? We will test this in the context of reduction of errors in transcription control. (3) Are there principles that describe how drugs combine to affect protein dynamics in human cells? We will use a novel dynamic proteomics approach developed in our lab to explore how protein dynamics can be controlled by drug combinations. This research will define principles that unite our understanding of seemingly distinct biological systems, and explain their particular design in terms of systems-level functions. This understanding will help form the basis for a future medicine that rationally controls the state of the cell based on a detailed blueprint of its circuit design, and quantitative principles for the effects of drugs on this circuitry.
Max ERC Funding
2 261 440 €
Duration
Start date: 2010-03-01, End date: 2015-02-28
Project acronym DG-PESP-CS
Project Deterministic Generation of Polarization Entangled single Photons Cluster States
Researcher (PI) David Gershoni
Host Institution (HI) TECHNION - ISRAEL INSTITUTE OF TECHNOLOGY
Call Details Advanced Grant (AdG), PE2, ERC-2015-AdG
Summary Measurement-based quantum computing is one of the most fault-tolerant architectures proposed for quantum information processing. It opens the possibility of performing quantum computing tasks using linear optical systems. An efficient route for measurement-based quantum computing utilizes highly entangled states of photons, called cluster states. Propagating and processing quantum information is made possible this way using only single-qubit measurements. It is highly resilient to qubit losses. In addition, single-qubit measurements of polarization qubits are easily performed with high fidelity using standard optical tools. These features make photonic clusters excellent platforms for quantum information processing.
Constructing photonic cluster states, however, is a formidable challenge, attracting vast research efforts. While in principle it is possible to build up cluster states using interferometry, such a method is of a probabilistic nature and entails a large overhead of resources. The use of entangled photon pairs reduces this overhead by a small factor only.
We outline a novel route for constructing a deterministic source of photonic cluster states using a device based on a semiconductor quantum dot. Our proposal follows a suggestion by Lindner and Rudolph. We use repeated optical excitations of a long-lived coherent spin confined in a single semiconductor quantum dot and demonstrate for the first time a practical realization of their proposal. Our preliminary demonstration represents a breakthrough in quantum technology, since a deterministic source of photonic cluster states reduces the resources needed for quantum information processing. It may have revolutionary prospects for technological applications as well as for our fundamental understanding of quantum systems.
We propose to capitalize on this recent breakthrough and concentrate on R&D which will further advance this forefront field of science and technology by utilizing the horizons that it opens.
Max ERC Funding
2 502 974 €
Duration
Start date: 2016-06-01, End date: 2021-05-31
Project acronym DIAG-CANCER
Project Diagnosis, Screening and Monitoring of Cancer Diseases via Exhaled Breath Using an Array of Nanosensors
Researcher (PI) Hossam Haick
Host Institution (HI) TECHNION - ISRAEL INSTITUTE OF TECHNOLOGY
Call Details Starting Grant (StG), LS7, ERC-2010-StG_20091118
Summary Cancer is rapidly becoming the greatest health hazard of our days. The most widespread cancers are lung cancer (LC), breast cancer (BC), colorectal cancer (CC), and prostate cancer (PC). The impact of the various techniques used for diagnosis, screening and monitoring
these cancers is either uncertain or inconvenient for the patients. This proposal aims to create a low-cost, easy-to-use and noninvasive screening method for LC, BC, CC, and PC based on breath testing with a novel nanosensor approach. With this in mind, we propose to:
(a) modify an array of nanosensors based on Au nanoparticles for obtaining highly-sensitive detection levels of breath biomarkers of cancer; and
(b) investigate the use of the developed array in a clinical study.
Towards this end, we will collect suitable breath samples from patients and healthy controls in a clinical trial and test the feasibility of the device to detect LC, BC, CC, and PC, also in the presence of other diseases.
We will then investigate possible ways to identify the stage of the disease, monitor the response to cancer
treatment, and to identify cancer subtypes. Further, we propose that the device can be used for monitoring of cancer patients during and after treatment. The chemical nature of the cancer biomarkers will be identified through spectrometry techniques.
The proposed approach would be used outside specialist settings and could considerably lessen the burden on the health budgets, both through the low cost of the proposed all-inclusive cancer test, and through earlier and, hence, more cost-effective cancer treatment.
Max ERC Funding
1 200 000 €
Duration
Start date: 2011-01-01, End date: 2014-12-31
Project acronym DIASPORAINTRANSITION
Project A Diaspora in Transition - Cultural and Religious Changes in Western Sephardic Communities in the Early Modern Period
Researcher (PI) Yosef Mauricio Kaplan
Host Institution (HI) THE HEBREW UNIVERSITY OF JERUSALEM
Call Details Advanced Grant (AdG), SH6, ERC-2011-ADG_20110406
Summary The communities of the Western Sephardic Diaspora were founded in the 16th and 17th centuries by New Christians from Iberia who returned to the Judaism that had been abandoned by their ancestors in the late Middle Ages. This project will concentrate on the changes in the religious conceptions and behavior as well as the cultural patterns of the communities of Amsterdam, Hamburg, Leghorn, London, and Bordeaux. We will analyze the vigorous activity of their leaders to set the boundaries of their new religious identity in comparison to the policy of several Christian “communities of belief,” which went into exile following religious persecution in their homelands. We will also examine the changes in the attitude toward Judaism during the 17th century in certain segments of the Sephardic Diaspora: rather than a normative system covering every area of life, Judaism came to be seen as a system of faith restricted to the religious sphere. We will seek to explain the extent to which this significant change influenced their institutions and social behaviour. This study will provide us with a better understanding of the place of the Jews in European society. At the same time, we will subject a central series of concepts in the historiographical discourse of the Early Modern Period to critical analysis: confessionalization, disciplinary revolution, civilizing process, affective individualism, etc. This phase of the research will be based on qualitative and quantitative analysis of many hundreds of documents, texts and the material remains of these communities. Using sociological and anthropological models, we will analyze ceremonies and rituals described at length in the sources, the social and cultural meaning of the architecture of the Sephardic synagogues of that time, and of other visual symbols.
Max ERC Funding
1 671 200 €
Duration
Start date: 2012-03-01, End date: 2018-02-28
Project acronym DIFFOP
Project Nonlinear Data and Signal Analysis with Diffusion Operators
Researcher (PI) Ronen TALMON
Host Institution (HI) TECHNION - ISRAEL INSTITUTE OF TECHNOLOGY
Call Details Starting Grant (StG), PE6, ERC-2018-STG
Summary Nowadays, extensive collection and storage of massive data sets have become routine in multiple disciplines and in everyday life. These large amounts of intricate data often make arithmetic on data samples and basic comparisons problematic, raising new challenges to traditional data analysis objectives such as filtering and prediction. Furthermore, the availability of such data constantly pushes the boundaries of data analysis to new emerging domains, ranging from neuronal and social network analysis to multimodal sensor fusion. The combination of evolved data and new domains drives a fundamental change in the field of data analysis. Indeed, many classical model-based techniques have become obsolete since their models do not embody the richness of the collected data. Today, one notable avenue of research is the development of nonlinear techniques that transition from data to creating representations, without deriving models in closed form. The vast majority of such existing data-driven methods operate directly on the data, a hard task by itself when the data are large and elaborate. The goal of this research is to develop a fundamentally new methodology for high dimensional data analysis with diffusion operators, making use of recent transformative results in manifold and geometry learning. More concretely, shifting the focus from processing the data samples themselves and considering instead structured data through the lens of diffusion operators will introduce new powerful “handles” to data, capturing their complexity efficiently. We will study the basic theory behind this nonlinear analysis, develop new operators for this purpose, and devise efficient data-driven algorithms. In addition, we will explore how our approach can be leveraged for devising efficient solutions to a broad range of open real-world data analysis problems, involving intrinsic representations, sensor fusion, time-series analysis, network connectivity inference, and domain adaptation.
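As a concrete sketch of the “lens of diffusion operators”, the classical diffusion-maps construction of Coifman and Lafon builds a Markov operator from pairwise affinities and embeds the data with its leading eigenvectors. The minimal implementation below is our own illustration of that standard technique, not code from the project:

```python
import numpy as np

def diffusion_embedding(X, eps=1.0, dim=2, t=1):
    """Embed the rows of X via the eigenvectors of a diffusion operator."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / eps)                     # Gaussian affinity kernel
    d = K.sum(axis=1)                         # degrees
    # Symmetric conjugate of the row-stochastic operator P = D^{-1} K,
    # so a stable symmetric eigensolver can be used.
    A = K / np.sqrt(d[:, None] * d[None, :])
    vals, vecs = np.linalg.eigh(A)            # ascending eigenvalue order
    vals, vecs = vals[::-1], vecs[:, ::-1]    # descending
    psi = vecs / np.sqrt(d)[:, None]          # right eigenvectors of P
    # Skip the trivial constant eigenvector (eigenvalue 1).
    return psi[:, 1:dim + 1] * (vals[1:dim + 1] ** t)
```

On data with cluster structure, the first nontrivial diffusion coordinate is approximately piecewise constant and separates the clusters; this is the sense in which the operator's spectrum exposes structure that raw pairwise comparisons obscure.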
Max ERC Funding
1 260 000 €
Duration
Start date: 2019-02-01, End date: 2024-01-31
Project acronym DIGITALBABY
Project The emergence of understanding from the combination of innate mechanisms and visual experience
Researcher (PI) Shimon Ullman
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Advanced Grant (AdG), SH4, ERC-2010-AdG_20100407
Summary The goal of this research initiative is to construct large-scale computational modeling of how knowledge of the world emerges from the combination of innate mechanisms and visual experience. The ultimate goal is a ‘digital baby’ model which, through perception and interaction with the world, develops on its own representations of complex concepts that allow it to understand the world around it, in terms of objects, object categories, events, agents, actions, goals, social interactions, etc. A wealth of empirical research in the cognitive sciences has studied how natural concepts in these domains are acquired spontaneously and efficiently from perceptual experience, but a major open challenge is an understanding of the processes and computations involved, via rigorous, testable models.
To deal with this challenge we propose a novel methodology based on two components. The first, ‘computational Nativism’, is a computational theory of cognitively and biologically plausible innate structures, which guide the system along specific paths through its acquisition of knowledge, to continuously acquire meaningful concepts, which can be significant to the observer, but statistically inconspicuous in the sensory input. The second, ‘embedded interpretation’, is a new way of acquiring extended learning and interpretation processes. This is obtained by placing perceptual inference mechanisms within a broader perception-action loop, where the actions in the loop are not overt actions, but internal operations over internal representations. The results will provide new modeling and understanding of the age-old problem of how innate mechanisms and perception are combined in human cognition, and may lay the foundation for a major research direction dealing with computational cognitive development.
Max ERC Funding
1 647 175 €
Duration
Start date: 2011-06-01, End date: 2016-05-31
Project acronym DigitalValues
Project The Construction of Values in Digital Spheres
Researcher (PI) Limor Shifman
Host Institution (HI) THE HEBREW UNIVERSITY OF JERUSALEM
Call Details Consolidator Grant (CoG), SH3, ERC-2018-COG
Summary In recent decades, social media has emerged as a central arena for the construction of values. Artifacts such as YouTube videos, Facebook posts, and tweets reflect and shape what people across the globe consider important, desirable, or reprehensible. Understanding this pervasive value ecology is key to deciphering the political, cultural, and social processes governing the twenty-first century. In this project, I will conduct the first comprehensive study of values in social media. I will explore the following over-arching questions: How are values constructed through social media? Which values are emphasized in these spheres? To what extent are social media platforms associated with the globalization of values? In addressing these fundamental issues, I will apply an entirely new approach for the conceptualization and study of values.
Carried out comparatively in five languages, DigitalValues will explore the interaction between three facets of value construction: (a) explicit uses of the terms “value” and “values”; (b) the implicit construction of values in genres of user-generated content; and (c) users’ interpretation and evaluation of values through both private meaning-making and public social practices of commenting, sharing, and liking. The project is theoretically, empirically, and methodologically groundbreaking in a number of ways: (1) it will be a pioneering large-scale study employing inductive methods to explore the construction of values through everyday cultural artifacts; (2) as a foundational study of values in social media, it will yield a novel theory of value construction as an intersection between individuals, technologies, and sociocultural contexts; (3) it will generate new methods for inferring values from verbal texts, combining qualitative, quantitative, and automated analyses; (4) finally, it will yield a comprehensive map of values as expressed across languages and platforms, leading to a new understanding of the globalization of values.
Max ERC Funding
1 985 000 €
Duration
Start date: 2019-08-01, End date: 2024-07-31
Project acronym DIMENSION
Project High-Dimensional Phenomena and Convexity
Researcher (PI) Boaz Binyamin Klartag
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Starting Grant (StG), PE1, ERC-2012-StG_20111012
Summary High-dimensional problems with a geometric flavor appear in quite a few branches of mathematics, mathematical physics and theoretical computer science. A priori, one would think that the diversity and the rapid increase of the number of configurations would make it impossible to formulate general, interesting theorems that apply to large classes of high-dimensional geometric objects. The underlying theme of the proposed project is that the contrary is often true. Mathematical developments of the last decades indicate that high dimensionality, when viewed correctly, may create remarkable order and simplicity, rather than complication. For example, Dvoretzky's theorem demonstrates that any high-dimensional convex body has nearly-Euclidean sections of a high dimension. Another example is the central limit theorem for convex bodies due to the PI, according to which any high-dimensional convex body has approximately Gaussian marginals. There are a number of strong motifs in high-dimensional geometry, such as the concentration of measure, which seem to compensate for the vast amount of different possibilities. Convexity is one of the ways in which to harness these motifs and thereby formulate clean, non-trivial theorems. The scientific goals of the project are to develop new methods for the study of convexity in high dimensions beyond the concentration of measure, to explore emerging connections with other fields of mathematics, and to solve the outstanding problems related to the distribution of volume in high-dimensional convex sets.
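The central limit theorem for convex bodies mentioned in the abstract can be illustrated numerically in a special case. The sketch below (ours, added for illustration; not part of the proposal) samples the high-dimensional cube uniformly and projects onto the diagonal direction, where the classical CLT already forces an approximately Gaussian marginal:

```python
import random
from math import sqrt

random.seed(0)
n, samples = 100, 20000  # dimension and number of sample points

# Sample the cube [-1/2, 1/2]^n uniformly and project each point onto
# the unit diagonal direction (1, ..., 1)/sqrt(n).  By the classical CLT
# this one-dimensional marginal is approximately Gaussian with mean 0
# and variance 1/12 (the variance of a single uniform coordinate).
proj = [sum(random.uniform(-0.5, 0.5) for _ in range(n)) / sqrt(n)
        for _ in range(samples)]

mean = sum(proj) / samples
var = sum((p - mean) ** 2 for p in proj) / samples
print(f"mean = {mean:.4f}, variance = {var:.4f}  (1/12 = {1/12:.4f})")
```

The theorem referred to in the abstract asserts much more: for an arbitrary high-dimensional convex body in isotropic position, most directions (not just the diagonal of a product body, as here) yield approximately Gaussian marginals.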
Max ERC Funding
998 000 €
Duration
Start date: 2013-01-01, End date: 2018-12-31
Project acronym DIRECTEDINFO
Project Investigating Directed Information
Researcher (PI) Haim Permuter
Host Institution (HI) BEN-GURION UNIVERSITY OF THE NEGEV
Call Details Starting Grant (StG), PE7, ERC-2013-StG
Summary This research investigates a new measure that arises in information theory
called directed information. Recent advances, including our preliminary results, show that
directed information arises in communication as the maximum rate that can be transmitted reliably
in channels with feedback. The directed information is a multi-letter expression and is therefore
very hard to optimize or compute.
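To make the "multi-letter" difficulty concrete: directed information is defined as $I(X^n \to Y^n) = \sum_{i=1}^{n} I(X^i; Y_i \mid Y^{i-1})$, a sum of conditional mutual informations over growing sequence prefixes. The brute-force computation below (an illustrative sketch of ours; the variable names and toy channel are hypothetical, not the project's methodology) evaluates it exactly for a small binary example, and shows why direct evaluation scales exponentially in $n$: every marginal is a sum over exponentially many sequences.

```python
from itertools import product
from math import log2

def marginal(joint, kx, ky):
    """Marginalize a joint pmf over (x^n, y^n) down to (x^kx, y^ky)."""
    m = {}
    for (x, y), p in joint.items():
        key = (x[:kx], y[:ky])
        m[key] = m.get(key, 0.0) + p
    return m

def directed_information(joint, n):
    """I(X^n -> Y^n) = sum_i I(X^i; Y_i | Y^{i-1}), in bits."""
    di = 0.0
    for i in range(1, n + 1):
        pxy  = marginal(joint, i, i)      # p(x^i, y^i)
        pxy1 = marginal(joint, i, i - 1)  # p(x^i, y^{i-1})
        py   = marginal(joint, 0, i)      # p(y^i)
        py1  = marginal(joint, 0, i - 1)  # p(y^{i-1})
        for (x, y), p in pxy.items():
            if p > 0:
                di += p * log2(p * py1[((), y[:-1])]
                               / (pxy1[(x, y[:-1])] * py[((), y)]))
    return di

# Toy example: i.i.d. uniform inputs over a memoryless binary symmetric
# channel with crossover probability 0.1 and no feedback; here directed
# information reduces to mutual information, n * (1 - H(0.1)) bits.
n, eps = 2, 0.1
joint = {}
for x in product((0, 1), repeat=n):
    for y in product((0, 1), repeat=n):
        p = 0.25  # uniform i.i.d. inputs
        for xi, yi in zip(x, y):
            p *= (1 - eps) if xi == yi else eps
        joint[(x, y)] = p
print(f"I(X^2 -> Y^2) = {directed_information(joint, n):.3f} bits")
```

With feedback, directed information and mutual information differ, and no such closed-form shortcut exists, which is what motivates the dynamic programming and convex optimization machinery described below.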
Our plan is, first of all, to find an efficient methodology for optimizing the measure using the
dynamic programming framework and convex optimization tools. An important by-product of
finding the fundamental limits is finding coding schemes that achieve those limits. Second, we
plan to find new roles for directed information in communication, especially in networks with
bi-directional communication and in data compression with causal conditions. Third, encouraged by
preliminary work on the interpretation of directed information in economics and estimation theory,
we plan to show that directed information has interpretations in additional fields such as
statistical physics. We plan to show that there is a duality relation between different fields with
causal constraints; due to this duality, insights and breakthroughs in one problem will lead to new
insights in other problems. Finally, we will apply directed information to the statistical
inference of causal dependence. We will show how to estimate the directed information and use the
estimator to measure causal influence between two or more processes. In particular, one of the
questions we plan to answer is the influence of industrial activities (e.g., $\text{CO}_2$
volumes) on global warming.
Our main focus will be to develop a deeper understanding of the mathematical properties of
directed information, a process that is instrumental to each problem. Due to their theoretical
proximity and their interdisciplinary nature, progress in one problem will lead to new insights
in other problems. A common set of mathematical tools developed in
Max ERC Funding
1 224 600 €
Duration
Start date: 2013-08-01, End date: 2019-07-31
Project acronym DLGAPS
Project Dynamics of Lie group actions on parameter spaces
Researcher (PI) Barak Weiss
Host Institution (HI) TEL AVIV UNIVERSITY
Call Details Starting Grant (StG), PE1, ERC-2011-StG_20101014
Summary There are many parallels between Lie group actions on homogeneous spaces and the action of $\mathrm{SL}_2(\mathbb{R})$ and its subgroups on strata of translation or half-translation surfaces. I propose to investigate these two spaces in parallel, focusing on the dynamical
behavior, and more specifically, the description of orbit-closures.
I intend to utilize existing and emerging measure rigidity results, and to develop new topological
approaches. These should also shed light on the geometry and topology of the spaces. I propose to apply results concerning these spaces to the study of diophantine approximations (approximation on fractals), geometry of numbers (Minkowski's conjecture), interval exchanges, and rational billiards.
Max ERC Funding
850 000 €
Duration
Start date: 2011-10-01, End date: 2016-09-30
Project acronym DMMCA
Project Discrete Mathematics: methods, challenges and applications
Researcher (PI) Noga Alon
Host Institution (HI) TEL AVIV UNIVERSITY
Call Details Advanced Grant (AdG), PE1, ERC-2008-AdG
Summary Discrete Mathematics is a fundamental mathematical discipline as well as an essential component of many mathematical areas, and its study has experienced an impressive growth in recent years. Some of the main reasons for this growth are the broad applications of tools and techniques from extremal and probabilistic combinatorics in the rapid development of theoretical Computer Science, in the spectacular recent results in Additive Number Theory and in the study of basic questions in Information Theory. While in the past many of the basic combinatorial results were obtained mainly by ingenuity and detailed reasoning, the modern theory has grown out of this early stage, and often relies on deep, well developed tools, like the probabilistic method, algebraic, topological and geometric techniques. The work of the principal investigator, partly jointly with several collaborators and students, and partly in individual efforts, has played a significant role in the introduction of powerful algebraic, probabilistic, spectral and geometric techniques that influenced the development of modern combinatorics. In the present project he aims to try and further develop such tools, trying to tackle some basic open problems in Combinatorics, as well as significant questions in Additive Combinatorics, Information Theory, and theoretical Computer Science. Progress on the problems mentioned in this proposal, and the study of related ones, is expected to provide new insights on these problems and to lead to the development of novel fruitful techniques that are likely to be useful in Discrete Mathematics as well as in related areas.
Max ERC Funding
1 061 300 €
Duration
Start date: 2008-12-01, End date: 2013-11-30
Project acronym DMR-CODE
Project Decoding the Mammalian transcriptional Regulatory code in development and stimulatory responses
Researcher (PI) Ido Amit
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Starting Grant (StG), LS2, ERC-2012-StG_20111109
Summary Transcription factors (TFs) regulate genome function by controlling gene expression. Comprehensive characterization of the in vivo binding of TFs to DNA in relevant primary models is a critical step towards a global understanding of the human genome. Recent advances in high-throughput genomic technologies provide an extraordinary opportunity to develop and apply systematic approaches to learn the underlying principles and mechanisms of mammalian transcriptional networks. The premise of this proposal is that a tractable set of rules governs how cells commit to a specific cell type or respond to the environment, and that these rules are encoded in regulatory elements in the genome. Currently, our understanding of the mammalian regulatory code is hampered by the difficulty of directly measuring the in vivo binding of large numbers of TFs to DNA across multiple primary cell types and their natural responses to physiological stimuli.
Here, we overcome this bottleneck by (1) systematically mapping the genomic binding network of all relevant TFs of key hematopoietic cells, in both steady state and under relevant stimuli; (2) following the changes in TF networks as cells differentiate; and (3) using these models to engineer cell states and responses. To achieve these goals, we developed a new method for automated high-throughput ChIP coupled to sequencing (HT-ChIP-Seq). We used this method to measure the binding of 40 TFs at 4 time points following stimulation of dendritic cells with pathogen components. We find that TFs vary substantially in their binding dynamics, genomic localization, number of binding events, and degree of interaction with other TFs. The analysis of these data suggests that the TF network is hierarchically organized and composed of different types of TFs: cell differentiation factors, factors that prime for gene induction, and factors that bind more specifically and dynamically. This proposal revisits and challenges the current understanding of the mammalian regulatory code.
Max ERC Funding
1 500 000 €
Duration
Start date: 2012-10-01, End date: 2017-09-30
Project acronym Domognostics
Project Intelligent Building Automation Diagnostics
Researcher (PI) Marios POLYCARPOU
Host Institution (HI) UNIVERSITY OF CYPRUS
Call Details Proof of Concept (PoC), PC1, ERC-2016-PoC
Summary The emergence of networked cyber-physical systems, in which sensor/actuator networks are integrated with software algorithms, facilitates the development of advanced Building Management Systems (BMS) aimed at enhancing energy efficiency in buildings, which account for 40% of the energy consumption in the EU. When a fault arises in some of the components, or an unexpected event occurs in the building, this may lead to a serious degradation in performance or, even worse, to situations that endanger people’s lives. Studies estimate that 20% of the energy consumed in commercial buildings for heating, ventilation, air conditioning, lighting and water heating can be attributed to various faults. Therefore, there is a market need for an intelligent building automation diagnostic system that integrates with existing BMS to facilitate continuous and effective monitoring of buildings.
The objective of the proposed proof of concept is to develop the Domognostics platform, a novel solution for monitoring building automation systems, detecting and diagnosing any component faults and/or unexpected events, and providing remedial reconfiguration actions, aiming at improving operational efficiency. Domognostics will interoperate with existing BMS to extend their capabilities, and will integrate directly with heterogeneous sensor types, such as IoT devices, mobile sensors, wearables, etc., to increase redundancy of the available information and measurements. The Domognostics platform will utilise intelligent fault diagnosis algorithms with machine learning capabilities to boost its capacity to learn from experience, and semantically enhanced reasoning to facilitate the flexibility of adding new sensors or replacing faulty components, as needed. The theoretical foundations of these techniques were developed as part of the ERC Advanced Grant project Fault-Adaptive, which started in April 2012, and is currently being carried out at the University of Cyprus.
Max ERC Funding
150 000 €
Duration
Start date: 2017-05-01, End date: 2018-10-31
Project acronym DPI
Project Deep Packet Inspection to Next Generation Network Devices
Researcher (PI) Anat Bremler-Barr
Host Institution (HI) INTERDISCIPLINARY CENTER (IDC) HERZLIYA
Call Details Starting Grant (StG), PE7, ERC-2010-StG_20091028
Summary Deep packet inspection (DPI) lies at the core of contemporary Network Intrusion Detection/Prevention Systems and Web Application Firewalls. DPI aims to identify various malware (including spam and viruses) by inspecting both the header and the payload of each packet and comparing them to a known set of patterns. DPI is often performed on the critical path of packet processing; thus the overall performance of these security tools is dominated by the speed of DPI.
Traditionally, DPI considered only exact string patterns. However, in modern network devices, patterns are often represented by regular expressions due to their superior expressiveness. Matching both exact strings and regular expressions is a well-studied area in Computer Science; however, all well-known solutions are insufficient for current network demands. First, current solutions do not scale in terms of speed, memory and power requirements: while current network devices work at 10-100 Gbps and hold thousands of patterns, traditional solutions suffer from exponential memory size or exponential time and induce prohibitive power consumption. Second, non-clear-text traffic, such as compressed traffic, is becoming a dominant portion of Internet traffic and is clearly harder to inspect.
In this research we design new algorithms and schemes that cope with today's demands. This is an evolving area in both academia and industry, and currently there is no adequate solution.
We intend to use recent advances in hardware to cope with these demanding requirements. More specifically, we plan to use Ternary Content-Addressable Memories (TCAMs), which have become a standard commodity in contemporary network devices. TCAMs can compare a key against all rules in memory in parallel and thus provide high throughput. We believ
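For the exact-string side of DPI, the classical software baseline is the Aho-Corasick automaton, which matches every pattern in a single pass over the payload; TCAM-based designs instead perform such comparisons in parallel in hardware. The sketch below is ours, for illustration only, and is not the scheme proposed by the project:

```python
from collections import deque

class AhoCorasick:
    """Multi-pattern exact matcher: one pass over the text regardless of
    the number of patterns (the classical software baseline for DPI)."""
    def __init__(self, patterns):
        self.goto = [{}]   # per-node transitions of the pattern trie
        self.fail = [0]    # failure links
        self.out = [[]]    # patterns recognized at each node
        for pat in patterns:
            node = 0
            for ch in pat:
                if ch not in self.goto[node]:
                    self.goto.append({}); self.fail.append(0); self.out.append([])
                    self.goto[node][ch] = len(self.goto) - 1
                node = self.goto[node][ch]
            self.out[node].append(pat)
        # BFS: compute failure links level by level (depth-1 nodes keep 0).
        queue = deque(self.goto[0].values())
        while queue:
            u = queue.popleft()
            for ch, v in self.goto[u].items():
                queue.append(v)
                f = self.fail[u]
                while f and ch not in self.goto[f]:
                    f = self.fail[f]
                self.fail[v] = self.goto[f].get(ch, 0)
                self.out[v] += self.out[self.fail[v]]

    def search(self, text):
        """Return (start_index, pattern) for every match in text."""
        node, hits = 0, []
        for i, ch in enumerate(text):
            while node and ch not in self.goto[node]:
                node = self.fail[node]
            node = self.goto[node].get(ch, 0)
            for pat in self.out[node]:
                hits.append((i - len(pat) + 1, pat))
        return hits

ac = AhoCorasick(["he", "she", "his", "hers"])
print(ac.search("ushers"))  # [(1, 'she'), (2, 'he'), (2, 'hers')]
```

Each payload byte is processed once, so throughput is independent of the number of patterns; the challenge the proposal addresses is preserving this property, and extending it to regular expressions and compressed traffic, at 10-100 Gbps line rates.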
Max ERC Funding
990 400 €
Duration
Start date: 2010-11-01, End date: 2016-10-31
Project acronym DrugSense
Project Ribo-regulators that sense trace antibiotics
Researcher (PI) Rotem SOREK
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Proof of Concept (PoC), PC1, ERC-2015-PoC
Summary Over-usage of antibiotics in the clinic and in agriculture has resulted not only in increased drug resistance among pathogenic bacteria, but also in the spread of antibiotic metabolites in the environment and in our food. This poses multiple significant threats, including the development and expansion of multi-drug-resistant pathogens.
The health and safety risks posed by the presence of antibiotics in food, drinking water, and environmental waters make continuous monitoring of trace antibiotic levels in multiple media a strong necessity. The EU now obliges food manufacturers to test for antibiotic traces in their products, but current technologies for antibiotic sensing do not provide a complete solution. There is a strong need for antibiotic sensors that would accurately, rapidly and inexpensively report on the presence of antibiotics in various environments.
In our ERC-StG project we discovered new RNA leaders (ribo-regulators) that sense very low concentrations of antibiotics, leading to the activation of antibiotic resistance genes. These ribo-regulators thus function as efficient antibiotic sensors. Within the current PoC project we will develop a prototype for a highly sensitive biosensor, capable of rapid detection of trace levels of multiple antibiotics in food, water and other substances in a cost-effective manner.
Our PoC project involves both prototype development and business development. Within the prototype development we will utilize our earlier discoveries to bio-engineer the antibiotic sensor. Within the business development arm we will perform thorough market research to identify the market needs, map the competition and pinpoint market segments where our biosensor product would have an advantage over the competition. Our aim is to achieve an IP-protected proof-of-concept prototype that will attract further external investments, leading to the spawning of a start-up company that will bring our technology to the market.
Max ERC Funding
150 000 €
Duration
Start date: 2016-05-01, End date: 2017-04-30
Project acronym DYNA-MIC
Project Deep non-invasive imaging via scattered-light acoustically-mediated computational microscopy
Researcher (PI) Ori Katz
Host Institution (HI) THE HEBREW UNIVERSITY OF JERUSALEM
Call Details Starting Grant (StG), PE7, ERC-2015-STG
Summary Optical microscopy, perhaps the most important tool in biomedical investigation and clinical diagnostics, is currently held back by the assumption that it is not possible to noninvasively image microscopic structures more than a fraction of a millimeter deep inside tissue. The governing paradigm is that high-resolution information carried by light is lost due to random scattering in complex samples such as tissue. While non-optical imaging techniques, employing non-ionizing radiation such as ultrasound, allow deeper investigations, they possess drastically inferior resolution and do not permit microscopic studies of cellular structures, crucial for accurate diagnosis of cancer and other diseases.
I propose a new kind of microscope, one that can peer deep inside visually opaque samples, combining the sub-micron resolution of light with the penetration depth of ultrasound. My novel approach is based on our discovery that information on microscopic structures is contained in random scattered-light patterns. It breaks current limits by exploiting the randomness of scattered light rather than struggling to fight it.
We will transform this concept into a breakthrough imaging platform by combining ultrasonic probing and modulation of light with advanced digital signal processing algorithms, extracting the hidden microscopic structure by two complementary approaches: 1) By exploiting the stochastic dynamics of scattered light using methods developed to surpass the diffraction limit in optical nanoscopy and for compressive sampling, harnessing nonlinear effects. 2) Through the analysis of intrinsic correlations in scattered light that persist deep inside scattering tissue.
This proposal is formed by bringing together novel insights on the physics of light in complex media, advanced microscopy techniques, and ultrasound-mediated imaging. It is made possible by the new ability to digitally process vast amounts of scattering data, and has the potential to impact many fields.
Max ERC Funding
1 500 000 €
Duration
Start date: 2016-04-01, End date: 2021-03-31
Project acronym Dynamic Delegation
Project Implications of the Dynamic Nature of Portfolio Delegation
Researcher (PI) Ron Kaniel
Host Institution (HI) INTERDISCIPLINARY CENTER (IDC) HERZLIYA
Call Details Starting Grant (StG), SH1, ERC-2012-StG_20111124
Summary The asset management industry is a 60 trillion euro industry worldwide, with a ratio of assets under management to GDP of around 100 percent. Despite the prominence of financial intermediaries in financial markets, our understanding of the portfolio delegation relationship, and of its equilibrium asset pricing and contracting implications, is in its infancy. The recent financial crisis has further underscored the importance of better understanding the incentives of financial intermediaries, the distortions induced by these incentives, the contracts that can help mitigate these distortions, and the impact of their trading on asset pricing dynamics.
One key feature at the core of the asset management relationship is its dynamic nature: investors can, and do, periodically re-allocate funds between managers and between funds and other investment vehicles. The magnitude of fund flows, both over time and across funds at a given point in time, has been shown to be quantitatively large relative to assets under management. The ability of investors to quickly pull money out of funds at a time of crisis can have significant ramifications for the stability of the financial system.
Understanding implications of the dynamic nature of the delegation relationship is imperative in order to understand multiple aspects related to delegation and financial markets at large, including: risk taking behavior by funds; welfare implications for investors who invest in funds; what regulatory restrictions should be imposed on contracts; the evolution, past and future, of the asset management industry; securities return dynamics.
The objective is to develop models that incorporate dynamic flows in settings that allow studying implications and deriving empirical predictions along multiple dimensions: portfolio choice; optimal contracting; distribution of assets across funds; equilibrium asset pricing dynamics.
Max ERC Funding
728 436 €
Duration
Start date: 2013-01-01, End date: 2017-12-31
Project acronym EcCRISPR
Project Novel roles, components, and mechanisms of the Escherichia coli CRISPR/Cas system
Researcher (PI) Ehud Itzhak Qimron
Host Institution (HI) TEL AVIV UNIVERSITY
Call Details Starting Grant (StG), LS2, ERC-2013-StG
Summary A novel type of defense system was recently identified in bacteria: the CRISPR array and its associated gene products (Cas). The system inserts short DNA sequences, called spacers, derived from foreign nucleic acid molecules in between direct repeats, thus forming the CRISPR array. The transcribed spacers eventually serve as molecular guides for Cas proteins that monitor and destroy nucleic acids having sequences similar to those spacers. Thorough mapping of the functional components and regulators of the system in a single model organism will be extremely valuable for understanding its mechanism of action. Studying the interactions between bacteria and phages should highlight the evolutionary role of the system and its consequences for shaping ecological systems. These insights will lead to novel ways of exploiting the system to improve molecular biology tools, to protect fermenting bacteria from phage spoilage, to equip phages with anti-CRISPR warfare to fight bacteria, and to prevent horizontal gene transfer between pathogens. Here, I intend to systematically seek out new roles of the system and to identify fundamental mechanisms and components that allow the system to function efficiently. I will address fundamental questions such as how the system avoids sampling self DNA into the CRISPR array. In addition, I will pursue two revolutionary possibilities. One, that the CRISPR/Cas system is not merely an adaptive defense system against phages, but that one of its roles is to serve as molecular machinery for silencing specific harmful genes by generating small silencing RNAs without the need for Cas proteins. The other is to test the system’s ability to prevent horizontal gene transfer of antibiotic resistance genes in an effort to study the system’s ecological value, potentially for applicative uses. 
My proposed studies will allow deeper understanding of the system, and enable breakthroughs from both basic and applicative aspects of the CRISPR field studies.
Max ERC Funding
1 499 000 €
Duration
Start date: 2013-12-01, End date: 2018-11-30
Project acronym EchoGreen
Project Photoacoustic instrument for quantification of photosynthesis and health of corals and aquatic plants
Researcher (PI) Zvy DUBINSKY
Host Institution (HI) BAR ILAN UNIVERSITY
Call Details Proof of Concept (PoC), PC1, ERC-2011-PoC
Summary EchoGreen is an innovative photoacoustics-based instrument designed to determine the health status of aquatic plants and corals in response to environmental change and stress. The method is very sensitive to changes in the photosynthetic efficiency of algae and other aquatic plants. It represents a new, hitherto unavailable class of laboratory and handheld submersible instruments. EchoGreen’s novel technology was invented to enable the activities of the ERC-funded CoralWarm project because no other technology could reliably and accurately evaluate the effects of ocean warming and acidification on the future of corals and coral reefs.
EchoGreen will enable timely detection of the deterioration of reef communities and the destabilization of marine and aquatic ecosystems more broadly. The health of coral reefs is crucial for sustaining tourism, fishing and human activity in more than 50 tropical countries. EchoGreen is also well suited for early warning of contamination-caused deterioration of drinking water in lakes, rivers and reservoirs by following the health status of aquatic plants. By using EchoGreen, the algal conversion of solar energy into biofuel can be optimized, and so can algal culturing for pharmaceuticals and fine chemicals.
The proposed project includes activities aiming at bringing EchoGreen to the market. Commercialisation activities include market analysis, financial and business planning, legal workup, IPR protection and contracting with manufacturers and investors. The project also covers the finalisation of the development of a submersible prototype, a user friendly interface, and testing and validation.
The innovative technology embedded in EchoGreen will find a market with coastal authorities responsible for monitoring algal and coral populations, marine labs and nature reserves, water-supply authorities, industries mass-culturing algae for the production of biodiesel and fine chemicals, and installations based on high-rate algal sewage-treatment systems.
Max ERC Funding
149 983 €
Duration
Start date: 2012-04-01, End date: 2013-09-30
Project acronym ECOSTRESS
Project Physiological Reaction to Predation- A General Way to Link Individuals to Ecosystems
Researcher (PI) Dror Hawlena
Host Institution (HI) THE HEBREW UNIVERSITY OF JERUSALEM
Call Details Starting Grant (StG), LS8, ERC-2013-StG
Summary This proposal aims to advance a new general theory that links plasticity in prey responses to predation and biogeochemical processes to explain context-dependent variations in ecosystem functioning. The physiological reaction of prey to predation involves allocating resources from production to support emergency functions. An example of such a reaction is an increase in maintenance respiration concomitant with higher carbohydrate and lower N demand. Such changes in prey energy and elemental budget should alter the role prey play in regulating the quality of detrital inputs to soils. Nutrient content of detritus is an important determinant of the way soil communities regulate ecosystem processes. Thus, the physiological reaction of prey to predation can potentially explicate changes in ecosystem functioning. My first empirical examination of a few selected mechanisms of this theory has yielded very promising insights.
The main objectives of this proposal are: (1) to systematically test whether prey reactions to predation are consistent with the proposed theory’s predictions across species and ecosystems; (2) to examine the interface between stress physiology and anti-predatory behaviors in explaining predator-induced diet shifts; and (3) to evaluate how predator-induced responses at the individual level regulate ecosystem processes. To address these objectives, I propose combining manipulative field experiments, highly controlled laboratory and garden experiments, and stable-isotope pulse-chase approaches. I will examine individual prey responses and the emerging patterns across five food chains that represent phylogenetically distant taxa and disparate ecosystems. The proposed study is expected to revolutionize our understanding of the mechanisms by which aboveground predators regulate ecosystem processes. Promoting such a mechanistic understanding is crucial to predict how human-induced changes in biodiversity will affect life-supporting ecosystem services.
Max ERC Funding
1 379 600 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym EDUCAGE
Project The EDUCAGE: A Behavioral Platform for Naturalistic Learning
Researcher (PI) Adi Mizrahi
Host Institution (HI) THE HEBREW UNIVERSITY OF JERUSALEM
Call Details Proof of Concept (PoC), PC1, ERC-2014-PoC
Summary Understanding behavior is still one of the holy grails of the natural and social sciences. Behavior is often utterly complex because it results from an extremely rich set of past experiences, the present state of the animal, and the animal’s predictions about the future, all of which affect learning, decision making and, consequently, behavior. Given this complexity, most researchers work in the realm of highly simplified learning tasks, but these test only very basic attributes of learning and constrain the discovery of more sophisticated behavior. To date, few platforms are available for the rigorous study of complex behavioral paradigms in experimental animals, either in basic science or in biomedical research. Our goal is to bring to completion (and potential commercialization) a novel platform for analyzing complex animal behavior, named “The Educage”, that allows fully automatic, hands-free assessment of higher cognitive functions in freely behaving animals. Potential customers are research labs and the biomedical industry. The Educage will allow researchers to study behavior at unprecedented resolution, 24/7, for any duration of time. The learning paradigms can be tailored to the specific task of interest. The Educage has many advantages over existing technologies, allowing rigorous statistical assessment of complex behaviors in laboratory animals. It gives researchers the flexibility to monitor, analyze and manipulate the experiment during the behavior. Our system can be reliably used to analyze perceptual learning in mice and is well suited to serve as a new and rigorous behavioral platform. It has great potential to become a central tool to fuel discovery in animal research, in both biology and biomedical research.
Max ERC Funding
150 000 €
Duration
Start date: 2015-01-01, End date: 2016-06-30
Project acronym EDUCATION-LONG-RUN
Project Long-Run Effects of Education Interventions: Evidence from Randomized Trials
Researcher (PI) Haim Victor Lavy
Host Institution (HI) THE HEBREW UNIVERSITY OF JERUSALEM
Call Details Advanced Grant (AdG), SH1, ERC-2012-ADG_20120411
Summary The vast majority of published research on the impact of school interventions has examined their effects on short-run outcomes, primarily test scores. While important, a possibly deeper question of interest to society is the impact of such interventions on long-run life outcomes. This is a critical question because the ultimate goal of education is to improve lifetime well-being. Recent research has begun to look at this issue but much work remains to be done, particularly with regard to the long-term effects of interventions explicitly targeting improvement in general quality and students’ educational attainment. This proposal examines the impact of seven different schooling interventions – teachers’ quality, school quality, remedial education, school choice, teacher incentive payments, students' conditional cash transfers and an experiment with an increase in the return to schooling – on long-run life outcomes, including educational attainment, employment, income, marriage and fertility, crime and welfare dependency. To address this important question I will exploit unique data from seven experimental programs and natural experiments implemented simultaneously at different schools in Israel. All programs were successful in achieving their short-term objectives, though the cost of the programs varied. This undertaking presents a unique context with unusual data and very compelling empirical settings. I will examine whether these programs also achieved a longer-term measure of success by improving students’ life outcomes. Another unique feature of the proposed study is that the interventions vary widely and touch on some emergent educational trends. 
The body of empirical evidence from this study will provide a more complete picture of the individual and social returns from these educational interventions, and will allow policymakers to make more informed decisions when deciding which educational programs lead to the most beneficial use of limited school resources.
Max ERC Funding
1 519 000 €
Duration
Start date: 2013-05-01, End date: 2019-04-30
Project acronym EffectiveTG
Project Effective Methods in Tame Geometry and Applications in Arithmetic and Dynamics
Researcher (PI) Gal BINYAMINI
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Starting Grant (StG), PE1, ERC-2018-STG
Summary Tame geometry studies structures in which every definable set has a
finite geometric complexity. The study of tame geometry spans several
interrelated mathematical fields, including semialgebraic,
subanalytic, and o-minimal geometry. The past decade has seen the
emergence of a spectacular link between tame geometry and arithmetic
following the discovery of the fundamental Pila-Wilkie counting
theorem and its applications in unlikely diophantine
intersections. The P-W theorem itself relies crucially on the
Yomdin-Gromov theorem, a classical result of tame geometry with
fundamental applications in smooth dynamics.
It is natural to ask whether the complexity of a tame set can be
estimated effectively in terms of the defining formulas. While a large
body of work is devoted to answering such questions in the
semialgebraic case, surprisingly little is known concerning more
general tame structures - specifically those needed in recent
applications to arithmetic. The nature of the link between tame
geometry and arithmetic is such that any progress toward effectivizing
the theory of tame structures will likely lead to effective results
in the domain of unlikely intersections. Similarly, a more effective
version of the Yomdin-Gromov theorem is known to imply important
consequences in smooth dynamics.
The proposed research will approach effectivity in tame geometry from
a fundamentally new direction, bringing to bear methods from the
theory of differential equations which have until recently never been
used in this context. Toward this end, our key goals will be to gain
insight into the differential algebraic and complex analytic structure
of tame sets; and to apply this insight in combination with results
from the theory of differential equations to effectivize key results
in tame geometry and its applications to arithmetic and dynamics. I
believe that my preliminary work in this direction amply demonstrates
the feasibility and potential of this approach.
Max ERC Funding
1 155 027 €
Duration
Start date: 2018-09-01, End date: 2023-08-31
Project acronym ELEGANSFUSION
Project Mechanisms of cell fusion in eukaryotes
Researcher (PI) Benjamin Podbilewicz
Host Institution (HI) TECHNION - ISRAEL INSTITUTE OF TECHNOLOGY
Call Details Advanced Grant (AdG), LS3, ERC-2010-AdG_20100317
Summary Membrane fusion is a universal process essential inside cells (endoplasmic) and between cells in fertilization and organ formation (exoplasmic). With the exception of SNARE-mediated endoplasmic fusion, the proteins that mediate cellular fusion (fusogens) are unknown. Despite many years of research, little is known about the mechanism of cell-cell fusion. Our studies of developmental cell fusion in the nematode C. elegans have led to the discovery of the first family of eukaryotic fusogens (FF). These fusogens, EFF-1 and AFF-1, are type I membrane glycoproteins that are essential for cell fusion and can fuse cells when ectopically expressed on the membranes of C. elegans and heterologous cells.
Our main goals are:
(1) To determine the physicochemical mechanism of cell membrane fusion mediated by FF proteins.
(2) To find the missing fusogens that act in cell fusion events across all kingdoms of life.
We hypothesize that FF proteins fuse membranes by a mechanism analogous to viral or endoplasmic fusogens and that unidentified fusogens fuse cells following the same principles as FF proteins.
Our specific aims are:
AIM 1 Determine the mechanism of FF-mediated cell fusion: A paradigm for cell membrane fusion
AIM 2 Find the sperm-egg fusion proteins (fusogens) in C. elegans
AIM 3 Identify the myoblast fusogens in mammals
AIM 4 Test fusogens using functional cell fusion assays in heterologous systems
Identifying critical domains required for FF fusion, intermediates in membrane remodeling, and atomic structures of FF proteins will advance the fundamental understanding of the mechanisms of eukaryotic cell fusion. We propose to find the Holy Grail of fertilization and mammalian myoblast fusion. We estimate that this project, if successful, will bring a breakthrough to the sperm-egg and muscle fusion fields with potential applications in basic and applied biomedical sciences.
Max ERC Funding
2 380 000 €
Duration
Start date: 2011-05-01, End date: 2016-04-30
Project acronym ELENA
Project Electrochemical LEctin and glycan biochips integrated with NAnostructures
Researcher (PI) Ján Tkác
Host Institution (HI) CHEMICKY USTAV SLOVENSKEJ AKADEMIEVIED
Call Details Starting Grant (StG), LS9, ERC-2012-StG_20111109
Summary "Glycomics is currently one of the most rapidly evolving scientific fields, owing to ever-growing evidence that glycans (sugars) are involved in many aspects of cell physiology and pathology. Glycans are information-rich molecules responsible for the sophisticated storage and coding of “commands” the cell must execute to stay “fit” and to deal with uninvited pathogens. It is therefore vital that the “glycocode” be correctly deciphered for the cell to stay healthy, yet pathogens have developed cunning tricks to crack the “glycocode” to their benefit, stealing the glycan identity of the host to remain unrecognised until it is too late. A better understanding of these processes can help to develop new, potent, nature-based vaccines and drugs.
Glycomics has lagged behind advances in genomics and proteomics, but with the advent of high-throughput biochips it is catching up quickly. The two biochip formats available to study the challenging and complex field of glycomics are based either on immobilised glycans (glycan biochips) or on glycan-recognising molecules – lectins (lectin biochips). Both technologies have proved a success story in revealing remarkably precise “glycocode” reading, but so far biochips do not operate under conditions resembling the natural process of glycan deciphering.
The aim of the project is to develop biochips for a fundamental study of how precisely tuned ligand (glycan and lectin) density, the presence of mixed glycans, and glycan length affect glycan biorecognition. This task will be executed with the aid of nanotechnology to control these aspects at the nanoscale. Moreover, novel label-free electrochemical detection strategies will be used to mimic natural glycan recognition, which proceeds without any label. Finally, advanced patterning protocols and novel detection platforms will be integrated to develop fully robust biochips for functional assays of patient samples in the search for particular disease biomarkers."
Max ERC Funding
1 155 970 €
Duration
Start date: 2013-01-01, End date: 2017-12-31
Project acronym ELIMINATESENESCENT
Project The Role of Elimination of Senescent Cells in Cancer Development
Researcher (PI) Valery Krizhanovsky
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Starting Grant (StG), LS4, ERC-2012-StG_20111109
Summary Cellular senescence, a terminal cell cycle arrest, is a potent tumor suppressor mechanism that limits cancer initiation and progression; it also limits the tissue damage response. While senescence is protective in a cell-autonomous manner, senescent cells secrete a variety of factors that lead to inflammation and tissue destruction and promote tumorigenesis and metastasis at the sites of their presence. Here we propose a unique approach: to eliminate senescent cells from tissues in order to prevent their deleterious cell-non-autonomous effects. We will use our understanding of immune surveillance of senescent cells, and of the cell-intrinsic molecular pathways regulating cell viability, to identify the molecular “Achilles’ heel” of senescent cells. We will identify the mechanisms of interaction of senescent cells with NK cells and other immune cells, and harness these mechanisms for the elimination of senescent cells. Components of the main pathways regulating cell viability, apoptosis and autophagy, will then be evaluated for their specific contribution to the viability of senescent cells.
The molecular players identified by these approaches will be readily implemented for the elimination of senescent cells in vivo. We will consequently be able to evaluate the impact of eliminating senescent cells on tumor progression in mouse models in which these cells are present during the initial stages of tumorigenesis. Additionally, we will develop a novel mouse model that will allow identification of senescent cells in vivo in real time. This model is particularly challenging and valuable due to the absence of a single molecular marker for senescent cells.
The ability to eliminate senescent cells will lead to an understanding of the role senescent cells play in tissues and of the mechanisms regulating their viability. This may suggest novel ways of preventing and treating cancer.
Max ERC Funding
1 500 000 €
Duration
Start date: 2012-11-01, End date: 2017-10-31
Project acronym EMbRACe
Project Effective Multidrug Cocktails for Cancer
Researcher (PI) Uri ALON
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Proof of Concept (PoC), PC1, ERC-2015-PoC
Summary Cancer is a global epidemic that affects all ages and socio-economic groups. In turn, tremendous resources are being invested in the prevention, diagnosis, and treatment of cancer. For instance, over 1,000 anticancer drugs are currently in various phases of development and pre-approval testing, more than the number for heart disease, stroke, and mental illness combined. Finding multi-drug combinations for cancer is an increasingly pressing therapeutic challenge. However, screening all possible drug combinations is an impossible task because the number of experiments grows exponentially with the number of different drugs and doses. Therefore, highly effective combinations of already approved drugs likely exist that have never been tested before at the appropriate doses, due to the astronomical number of wet-lab tests required to find these combinations. Motivated by this challenge, we have developed a novel method for computing the effects of high-order combinations of drugs on cancer cells and predicting the best drug for a given tumor based on only a very small number of experiments. Accordingly, the goals of our PoC project are to further validate the potential of our formula by means of numerous rigorous tests and to establish the business potential of our idea. If successful, this PoC project will pave the way to the development and adoption of highly personalized drug cocktails that are designed based on only a limited number of measurements performed on patient-derived tumor material.
Max ERC Funding
150 000 €
Duration
Start date: 2016-11-01, End date: 2018-04-30
Project acronym EMERGE
Project Reconstructing the emergence of the Milky Way’s stellar population with Gaia, SDSS-V and JWST
Researcher (PI) Dan Maoz
Host Institution (HI) TEL AVIV UNIVERSITY
Call Details Advanced Grant (AdG), PE9, ERC-2018-ADG
Summary Understanding how the Milky Way arrived at its present state requires a large volume of precision measurements of our Galaxy’s current makeup, as well as an empirically based understanding of the main processes involved in the Galaxy’s evolution. Such data are now about to arrive in the flood of quality information from Gaia and SDSS-V. The demography of the stars and of the compact stellar remnants in our Galaxy, in terms of phase-space location, mass, age, metallicity, and multiplicity are data products that will come directly from these surveys. I propose to integrate this information into a comprehensive picture of the Milky Way’s present state. In parallel, I will build a Galactic chemical evolution model, with input parameters that are as empirically based as possible, that will reproduce and explain the observations. To get those input parameters, I will measure the rates of supernovae (SNe) in nearby galaxies (using data from past and ongoing surveys) and in high-redshift proto-clusters (by conducting a SN search with JWST), to bring into sharp focus the element yields of SNe and the distribution of delay times (the DTD) between star formation and SN explosion. These empirically determined SN metal-production parameters will be used to find the observationally based reconstruction of the Galaxy’s stellar formation history and chemical evolution that reproduces the observed present-day Milky Way stellar population. The population census of stellar multiplicity with Gaia+SDSS-V, and particularly of short-orbit compact-object binaries, will hark back to the rates and the element yields of the various types of SNe, revealing the connections between various progenitor systems, their explosions, and their rates. The plan, while ambitious, is feasible, thanks to the data from these truly game-changing observational projects. 
My team will perform all steps of the analysis and will combine the results to obtain the clearest picture of how our Galaxy came to be.
Max ERC Funding
1 859 375 €
Duration
Start date: 2019-10-01, End date: 2024-09-30
Project acronym EMODHEBREW
Project The emergence of Modern Hebrew as a case-study of linguistic discontinuity
Researcher (PI) Edit Doron
Host Institution (HI) THE HEBREW UNIVERSITY OF JERUSALEM
Call Details Advanced Grant (AdG), SH4, ERC-2016-ADG
Summary The pioneering enterprise I propose is the study of a particular type of linguistic discontinuity – language revival – inspired by the revival of Hebrew at the end of the 19th century. The historical and sociocultural dimensions of the revival have been studied before, but not its linguistic dimensions. My main aim is to construct a model of the linguistic factors which have shaped the revival of Hebrew. I expect this model to provide clues for the understanding of the process of language revival in general. For a language to be revived, a new grammar must be created by its native speakers. I hypothesize that the new grammar is formed by some of the general principles which also govern other better-known cases of linguistic discontinuity (creoles, mixed languages, emergent sign languages, etc.). The model I will develop will lay the foundation for a new subfield within the study of discontinuity – the study of language revival. I will start with the careful work of documenting the development of the grammar of Modern Hebrew, in particular its syntax, something which has not been done systematically before. One product of the project will be a linguistic application for the documentation and annotation of the novel syntactic constructions of Modern Hebrew, their sources in previous stages of Hebrew and in the languages with which Modern Hebrew was in contact at the time of the revival, and the development of these constructions from the beginning of the revival to the present. The linguistic application will be made available on the web for other linguists to use and contribute to.
The creation of an expanding database of the syntactic innovations of Modern Hebrew, comprising both documentation/annotation and theoretical modeling applicable to other languages, makes this an extremely ambitious proposal with potentially wide-reaching ramifications for the revival and revitalization of the languages of ethno-linguistic minorities worldwide.
Max ERC Funding
2 498 750 €
Duration
Start date: 2017-10-01, End date: 2022-09-30
Project acronym Emotions in Conflict
Project Direct and Indirect Emotion Regulation as a New Path of Conflict Resolution
Researcher (PI) Eran Halperin
Host Institution (HI) INTERDISCIPLINARY CENTER (IDC) HERZLIYA
Call Details Starting Grant (StG), SH4, ERC-2013-StG
Summary Intractable conflicts are one of the gravest challenges to both humanity and science. These conflicts are initiated and perpetuated by people; therefore changing people's hearts and minds constitutes a huge step towards resolution. Research on emotions in conflicts has led to the realization that intergroup emotions are critical to conflict dynamics. This project’s central question is whether and how intergroup emotions can be regulated to alter attitudes and behavior towards peace. I offer an innovative path, using two strategies of emotion regulation. The first is Direct Emotion Regulation, in which traditional, effective emotion regulation strategies are used to change intergroup emotional experiences, and subsequently political positions, in conflict situations. The second, Indirect Emotion Regulation, serves to implicitly alter concrete cognitive appraisals, thus changing attitudes by changing discrete emotions. This is the first attempt ever to integrate aggregated psychological knowledge on emotion regulation with conflict resolution. I propose 16 studies, conducted in the context of the intractable Israeli-Palestinian conflict. Seven studies will focus on direct emotion regulation, reducing intergroup anger and hatred, while 9 studies will focus on indirect regulation, aspiring to reduce fear and despair. In both paths, correlational and in-lab experimental studies will be used to refine adequate strategies of down-regulating destructive emotions, the results of which will be used to develop innovative, theory-driven education and media interventions that will be tested utilizing wide-scale experience sampling methodology. This project aspires to bridge the gap between basic and applied science, creating a pioneering, interdisciplinary framework which contributes to existing knowledge on emotion regulation in conflict and implements ways to apply it in real-world circumstances.
Max ERC Funding
1 499 344 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym ENCODE
Project Design Principles in Encoding Complex Noisy Environments
Researcher (PI) Alon Zaslaver
Host Institution (HI) THE HEBREW UNIVERSITY OF JERUSALEM
Call Details Starting Grant (StG), LS2, ERC-2013-StG
Summary Animals constantly face complex environments consisted of multiple fluctuating cues. Accurate detection and efficient integration of such perplexing information are essential as animals’ fitness and consequently survival depend on making the right behavioral decisions. However, little is known about how multifaceted stimuli are integrated by neural systems, and how this information flows in the neural network in a single-neuron resolution.
Here we aim to address these fundamental questions using C. elegans worms as a model system. With a compact and fully-mapped neural network, C. elegans offers a unique opportunity of generating novel breakthroughs and significantly advance the field.
To study functional dynamics on a network-wide scale with an unprecedented single-neuron resolution, we will construct a comprehensive library of transgenic animals expressing state-of-the-art optogenetic tools and Calcium indicators in individual neurons. Moreover, we will study the entire encoding process, beginning with the sensory layer, through integration in the neural network, to behavioral outputs. At the sensory level, we aim to reveal how small sensory systems efficiently encode the complex external world. Next, we will decipher the design principles by which neural circuits integrate and process information. The optogenetic transgenic animals will allow us interrogating computational roles of various circuits by manipulating individual neurons in the network. At the end, we will integrate the gathered knowledge to study how encoding eventually translates to decision making behavioral outputs.
Throughout this project, we will use a combination of cutting-edge experimental techniques coupled with extensive computational analyses, modelling and theory. The aims of this interdisciplinary project, together with its system-level approaches, place it at the forefront of research in the field of systems biology.
Max ERC Funding
1 498 400 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym ENGVASC
Project Engineering Vascularized Tissues
Researcher (PI) Shulamit Levenberg
Host Institution (HI) TECHNION - ISRAEL INSTITUTE OF TECHNOLOGY
Call Details Starting Grant (StG), LS7, ERC-2011-StG_20101109
Summary Vascularization, the process in which new blood vessels assemble, is fundamental to tissue vitality. Vessel network assembly within 3D tissues can be induced in vitro by means of multicellular culturing of endothelial cells (EC), fibroblasts and cells specific to the tissue of interest. This approach supports the formation of endothelial vessels and promotes EC and tissue-specific cell interactions. Such EC-dependent tube-like openings may also form the basis for improved media penetration into the inner regions of thick 3D constructs, allowing for enhanced construct survival and for effective engineering of large complex tissues in the lab. Moreover, our own breakthrough results describe the beneficial impact of in vitro prevascularization of engineered muscle tissue on its survival and vascularization upon implantation. These studies have also demonstrated that implanted vascular networks of in vitro engineered constructs can anastomose with host vasculature and form functional blood vessels in vivo. However, the mechanisms underlying enhanced vascularization of endothelialized engineered constructs and implant-host vessel integration remain unclear. In this proposal, our research objectives are (1) to uncover the mechanisms governing in vitro vessel network formation in engineered 3D tissues and (2) to elucidate the process of graft-host vessel network integration and implant vessel-stimulated promotion of neovascularization in vivo. In addition, the impact of construct prevascularization on implant survival and function will be explored in animal disease models. While there are still many challenges ahead, should we succeed, our research could lay the foundation for significantly enhanced tissue-construct vascularization procedures and for their application in regenerative medicine. In addition, it may provide alternative models for studying vascularization processes in embryogenesis and disease.
Max ERC Funding
1 500 000 €
Duration
Start date: 2012-10-01, End date: 2017-09-30
Project acronym ER ARCHITECTURE
Project Uncovering the Mechanisms of Endoplasmic Reticulum Sub-Domain Creation and Maintenance
Researcher (PI) Maya Benyamina Schuldiner
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Starting Grant (StG), LS3, ERC-2010-StG_20091118
Summary The endoplasmic reticulum (ER) is the cellular organelle that serves as the entry site into the secretory pathway. Although the ER has a single continuous membrane, it is functionally divided into subdomains (SDs). These specialized regions allow the ER to carry out a multitude of functions such as folding, maturation, quality control and export of all secreted and most membrane-bound proteins; lipid biosynthesis; ion homeostasis; and communication with all other organelles. The ER is therefore not only the largest single-copy organelle in most eukaryotic cells but, thanks to the presence of SDs, also one of the most functionally diverse and structurally complex.
Changes in ER functions have been shown to contribute to the progression of many diseases such as heart disease, neurodegeneration and diabetes. Moreover, a robustly functioning ER is required for development of dedicated secretory cells such as antibody producing plasma cells and insulin secreting pancreatic cells. The past years have brought about a revolution in our understanding of basic ER functions and the homeostatic responses coordinating them. However, despite their obvious importance for robust activity of the ER, we still know very little about SD biogenesis and function. Therefore, the time is now ripe to extend our understanding by facing the next challenges in the field.
Specifically, it is now of major importance to understand how cells ensure accurate SD biogenesis and function. This proposal tackles this question through three independent but complementary screens, each aimed at revealing one aspect of SDs: their structure/function, biogenesis or dynamics. The merging of all three aspects of information will give us a holistic picture of this process – one that could not have been attained from the pixelated view of any single piece of data. We propose to explore these facets in both yeast and mammals, utilizing systematic tools such as high-content microscopy screens followed by the creation of genetic interaction maps and follow-up hypothesis-based biochemical and genetic experiments. By combining several approaches and different organisms, we hope to enable a more efficient reconstruction of this complex process.
When completed, this proposal will have shed light on a little-explored but central question in cell biology. More broadly, the mechanisms that emerge as guiding SD biogenesis may help us understand how membrane domains form in general. Due to the novelty of our approach and the cutting-edge tools used to tackle this fundamental problem in cell biology, this work will provide a paradigm for addressing complex biological questions in eukaryotic cells. It may very well be this aspect of the proposal that ultimately has the broadest impact on the biological community.
Max ERC Funding
1 499 999 €
Duration
Start date: 2010-09-01, End date: 2015-08-31
Project acronym ErgComNum
Project Ergodic theory and additive combinatorics
Researcher (PI) Tamar Ziegler
Host Institution (HI) THE HEBREW UNIVERSITY OF JERUSALEM
Call Details Consolidator Grant (CoG), PE1, ERC-2015-CoG
Summary The last decade has witnessed a new spring for dynamical systems. The field - initiated by Poincaré in the study of the N-body problem - has become essential to the understanding of seemingly far-off fields such as combinatorics, number theory and theoretical computer science. In particular, ideas from ergodic theory have played an important role in the resolution of long-standing open problems in combinatorics and number theory. A striking example is the role of dynamics on nilmanifolds in the recent proof of Hardy-Littlewood estimates for the number of solutions to systems of linear equations of finite complexity in the prime numbers. The interplay between ergodic theory, number theory and additive combinatorics has proved very fruitful; it is a fast-growing area of mathematics attracting many young researchers. We propose to tackle central open problems in the area.
Max ERC Funding
1 342 500 €
Duration
Start date: 2016-05-01, End date: 2021-04-30
Project acronym ERGODICNONCOMPACT
Project Ergodic theory on non compact spaces
Researcher (PI) Omri Moshe Sarig
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Starting Grant (StG), PE1, ERC-2009-StG
Summary The proposal is to look for, and investigate, new ergodic theoretic types of behavior for dynamical systems which act on non compact spaces. These could include transience and non-trivial ways of escape to infinity, critical phenomena similar to phase transitions, and new types of measure rigidity. There are potential applications to smooth ergodic theory (non-uniform hyperbolicity), algebraic ergodic theory (actions on homogeneous spaces), and probability theory (weakly dependent stochastic processes).
Max ERC Funding
539 479 €
Duration
Start date: 2009-10-01, End date: 2014-09-30
Project acronym ERNBPTC
Project Expression regulatory networks: beyond promoters and transcription control
Researcher (PI) Yitzhak Pilpel
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Starting Grant (StG), LS2, ERC-2007-StG
Summary Gene expression in living cells is a most intricate molecular process, occurring in stages, each of which is regulated by a diversity of mechanisms. Among the various stages leading to gene expression, only transcription is relatively well understood, thanks to genomics and bioinformatics. In contrast to the vast amounts of genome-wide data and a growing understanding of the structure of networks controlling transcription, we still lack quantitative, genome-wide knowledge of the mechanisms underlying regulation of mRNA degradation and translation. Among the unknowns are the identity of the regulators, their kinetic modes of action, and their means of interaction with the sequence features that make up their targets; how these targets combine to produce a higher-level "grammar" is also unknown. An important part of the project is dedicated to generating genome-wide experimental data that will form the basis for quantitative and more comprehensive analysis of gene expression. Specifically, the primary objectives of our proposed research plan are: 1) to advance our understanding of the transcriptome by deciphering the code regulating mRNA decay; 2) to break the code which controls protein translation efficiency; 3) to understand how mRNA degradation and translation efficiency determine noise in protein expression levels. The proposed strategy is based on an innovative combination of computational prediction, synthetic gene design, and genome-wide data acquisition, all culminating in extensive data analysis, mathematical modeling and focused experiments. This highly challenging, multidisciplinary project is likely to greatly enhance our knowledge of the various modes by which organisms regulate expression of their genomes, how these regulatory mechanisms are interrelated, how they generate precise responses to environmental challenges and how they have evolved over time.
Max ERC Funding
1 320 000 €
Duration
Start date: 2008-09-01, End date: 2013-08-31
Project acronym ESCAPE_COPD
Project Elimination of Senescent Cells Approach for treatment of COPD
Researcher (PI) Valery KRIZHANOVSKY
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Proof of Concept (PoC), ERC-2016-PoC, ERC-2016-PoC
Summary Chronic Obstructive Pulmonary Disease (COPD) is estimated to affect up to 600 million people worldwide, and by 2020 it will become the third most frequent cause of death. In Europe alone, COPD affects up to 10% of people (i.e. more people than breast cancer and diabetes) and takes the lives of around 300,000 Europeans each year. Current therapies are associated with a variety of side effects, some of which can be acute and even life-threatening. Moreover, none of the existing medications for COPD has been shown conclusively to modify the long-term decline in lung function; thus, COPD remains a disease with a significant unmet medical need.
Our approach focuses on the pharmacological elimination of senescent cells (i.e. cells that have stopped dividing but still affect their microenvironment), which accumulate in tissues with age and contribute to multiple age-related diseases, including COPD. In particular, we recently discovered that, with the use of a particular molecule, we could efficiently target the molecular mechanisms responsible for the viability of senescent cells, leading to the specific elimination of these cells from tissues. The goal of the PoC project is twofold. (1) The first goal is to establish the technical feasibility of our idea by testing the effect of the identified molecule on disease development and progression using the COPD mouse model we developed. (2) The second goal is to establish the business feasibility of our revolutionary approach by taking the necessary steps towards its commercialization, focusing on the creation of strategic alliances with key private-sector companies.
Max ERC Funding
150 000 €
Duration
Start date: 2017-05-01, End date: 2018-10-31
Project acronym ETASECS
Project Extremely Thin Absorbers for Solar Energy Conversion and Storage
Researcher (PI) Avner Rothschild
Host Institution (HI) TECHNION - ISRAEL INSTITUTE OF TECHNOLOGY
Call Details Consolidator Grant (CoG), PE8, ERC-2013-CoG
Summary ETASECS aims at making a breakthrough in the development of photoelectrochemical (PEC) cells for solar-powered water splitting that can be readily integrated with PV cells to provide storage capacity in the form of hydrogen. It builds upon our recent invention of resonant light trapping in ultrathin films of iron oxide (a-Fe2O3), which enables overcoming the deleterious trade-off between light absorption and charge-carrier collection efficiency. Although we recently broke the water photo-oxidation record of any a-Fe2O3 photoanode reported to date, the losses are still high and there is plenty of room for further improvements that will lead to a remarkable enhancement in the performance of our photoanodes, reaching quantum efficiency levels similar to state-of-the-art PV cells. ETASECS aims at reaching this ambitious goal, which is essential for demonstrating the competitiveness of PEC+PV tandem systems for solar energy conversion and storage. Towards this end, WP1 will combine theory, modelling and simulations, state-of-the-art experimental methods and advanced diagnostic techniques in order to identify and quantify the different losses in our devices. This work will guide the optimization work in WP2, which will suppress the losses at the photoanode and ensure optimal electrical and optical coupling of the PEC and PV cells. We will also explore advanced photon-management schemes that go beyond our current light-trapping scheme by combining synergistic optical and nanophotonic effects. WP3 will integrate the PEC and PV cells and test their properties and performance. WP4 will disseminate our progress and achievements in professional and public forums. The innovations that emerge from this frontier research will be further pursued in proof-of-concept follow-up investigations that will demonstrate the feasibility of this technology.
Success along these lines holds exciting promise for groundbreaking progress towards large-scale deployment of solar energy.
Max ERC Funding
2 150 000 €
Duration
Start date: 2014-09-01, End date: 2019-08-31
Project acronym EURO-NEUROSTRESS
Project Dissecting the Central Stress Response: Bridging the Genotype-Phenotype Gap
Researcher (PI) Alon Chen
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Starting Grant (StG), LS5, ERC-2010-StG_20091118
Summary The biological response to stress is concerned with the maintenance of homeostasis in the presence of real or perceived challenges. This process requires numerous adaptive responses involving changes in the central nervous and neuroendocrine systems. When a situation is perceived as stressful, the brain activates many neuronal circuits linking centers involved in sensory, motor, autonomic, neuroendocrine, cognitive, and emotional functions in order to adapt to the demand. However, the details of the pathways by which the brain translates stressful stimuli into the final, integrated biological response are presently incompletely understood. Nevertheless, it is clear that dysregulation of these physiological responses to stress can have severe psychological and physiological consequences, and there is much evidence to suggest that inappropriate regulation, disproportional intensity, or chronic and/or irreversible activation of the stress response is linked to the etiology and pathophysiology of anxiety disorders and depression.
Understanding the neurobiology of stress by focusing on the brain circuits and genes that are associated with, or altered by, the stress response will provide important insights into the brain mechanisms by which stress affects psychological and physiological disorders. This is an integrated, multidisciplinary project from gene to behavior using state-of-the-art mouse genetics and animal models. We will employ integrated molecular, biochemical, physiological and behavioral methods, focusing on the generation of mouse genetic models as an in vivo tool, in order to study the central pathways and molecular mechanisms mediating the stress response. Defining the contributions of known and novel gene products to the maintenance of stress-linked homeostasis may improve our ability to design therapeutic interventions for, and thus manage, stress-related disorders.
Max ERC Funding
1 500 000 €
Duration
Start date: 2011-04-01, End date: 2016-03-31
Project acronym EUROEMP
Project Employment in Europe
Researcher (PI) Christoforos Pissarides
Host Institution (HI) UNIVERSITY OF CYPRUS
Call Details Advanced Grant (AdG), SH1, ERC-2012-ADG_20120411
Summary "The first part of this project is about employment in Europe, including the new members of the European Union. Both the level of employment and the type of jobs created will be examined. A thorough study of institutional structures and policies is proposed, with a view to arriving at conclusions about their influence on job creation and about the best policy needed to achieve national or European-level employment objectives. Job creation is investigated at the two-digit level, and male and female employment, wage inequality and the role of policy will be studied in depth. The research will build on solid theoretical microfoundations taking into account the choices available to firms and workers/consumers about working at home or in the market and buying domestic or foreign goods. The project has a second part about unemployment, with special emphasis on recession. The same emphasis on institutions and policies as for employment is given to this part. A key component of the project is new theory on the evolution of institutions and policies in markets with friction, and on the impact that the policy changes that took place after the recession of the 1980s have had on the responses of European labour markets to the recent recession. Special attention will be given to the formerly planned economies and the reasons for their slow convergence to the western economies."
Max ERC Funding
2 200 143 €
Duration
Start date: 2013-06-01, End date: 2018-05-31
Project acronym EVODEVOPATHS
Project Evolution of Developmental Gene Pathways
Researcher (PI) Itai Yanai
Host Institution (HI) TECHNION - ISRAEL INSTITUTE OF TECHNOLOGY
Call Details Starting Grant (StG), LS8, ERC-2012-StG_20111109
Summary The staggering diversity of the living world is a testament to the amount of variation available to the agency of natural selection. While it has been assumed that variation is entirely uniform and unbiased, recent work has challenged this notion. Evolutionary developmental biology seeks to understand the biases on variation imposed by developmental processes and their distinction from selective constraints. Metazoan development is best described by developmental gene pathways, which are composed of transcription factors, signaling molecules, and terminal differentiation genes. A systematic comparison of such pathways across species would reveal the patterns of conservation and divergence; however, this has not yet been achieved. In the EvoDevoPaths project we will develop a new approach to unravel pathways using both single-cell and tissue-specific transcriptomics. Our aim is to elucidate the evolution of developmental gene pathways using intricate embryology in the nematode phylum, a single-cell transcriptomic method we have developed, and sophisticated computational approaches for pathway comparisons. We will ask how variation is distributed across the specification and differentiation modules of a pathway using the nematode endoderm pathway as a model system. We further propose that the evolutionary change in the tissue specification pathways of early cell lineages is constrained by the properties of cell specification pathways. To test this hypothesis we will, for the first time, determine early developmental cell lineages from single-cell transcriptomic data. Finally, we will attempt to unify the molecular signatures of conserved stages in disparate phyla under a framework in which they can be systematically compared.
This research collectively represents the first time that developmental gene pathways are examined in an unbiased manner contributing to a theory of molecular variation that explains the evolutionary processes that underlie phenotypic novelty.
Max ERC Funding
1 500 000 €
Duration
Start date: 2012-10-01, End date: 2017-09-30
Project acronym EvoDevoQuorum
Project Evolution and Development of Bacterial Communication
Researcher (PI) Avigdor Eldar
Host Institution (HI) TEL AVIV UNIVERSITY
Call Details Starting Grant (StG), LS2, ERC-2011-StG_20101109
Summary Bacterial cooperation underlies many bacterial traits of practical interest. Many social traits of bacteria are regulated by inter-cellular signalling pathways, generally known as quorum sensing (QS). QS has been proposed as a novel target for anti-virulence treatment. To this end, there is a need to better understand the mechanisms of QS and their social and evolutionary impact.
While the basic schemes of a single quorum sensing pathway acting in homogeneous conditions are well understood, the systems-level function of QS regulatory networks can only be appreciated by considering the role that phenotypic and genetic variability play in shaping the network’s structure and function. Phenotypic variability in complex communities may arise from division of labour between cells and from environmental gradients, and can substantially impact the way cells secrete and interpret QS signals. Genetic variability in QS networks may lead to multiple social relations between cells of different genotypes, including cross-talk, interception, manipulation and quenching of signals. This will affect the population structure and performance.
The proposed project will study the function of QS signalling in heterogeneous communities. Phenotypic variability and its impact on QS function will be studied in a spatially inhomogeneous cooperating system. Genetic variability will be studied at the macro and micro-scales in a bacterial species showing rapid diversification of their QS networks. Finally, we will rationally design strains with superior ‘cheating’ strategies that can invade and eliminate a cooperative population.
Throughout this project, we will use a combination of experimental techniques from microbiology, socio-biology, genetics and microscopy together with mathematical analysis tools from systems biology, population genetics and game theory, to study bacterial cooperation and its dependence on the underlying communication network, social complexity and environmental variation.
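The game-theoretic vulnerability that makes rationally designed "cheating" strategies possible can be caricatured with a toy public-goods model. The sketch below is purely illustrative (the Wright-Fisher update, payoff values, and population size are assumptions, not the project's actual model): producers pay a private cost for a benefit shared by everyone, so a non-producing cheater free-rides and invades.

```python
import random

def step(pop, benefit=3.0, cost=1.0):
    """One Wright-Fisher generation of a public-goods game.

    'coop' cells produce a shared good at a private cost; the benefit
    received by every cell scales with the producer fraction, so
    'cheat' cells free-ride on the producers.
    """
    n = pop['coop'] + pop['cheat']
    f = pop['coop'] / n                  # producer fraction
    w_coop = 1.0 + benefit * f - cost    # producers pay the cost
    w_cheat = 1.0 + benefit * f          # cheaters do not
    total = pop['coop'] * w_coop + pop['cheat'] * w_cheat
    p_coop = pop['coop'] * w_coop / total
    # resample n offspring in proportion to payoff
    new_coop = sum(random.random() < p_coop for _ in range(n))
    return {'coop': new_coop, 'cheat': n - new_coop}

random.seed(1)
pop = {'coop': 990, 'cheat': 10}         # a rare cheater lineage
for _ in range(200):
    pop = step(pop)
print(pop)  # with these payoffs cheaters are expected to sweep
```

Because the cheater's payoff advantage here equals the production cost at every frequency, cooperation is invadable; understanding how QS network structure changes this calculus is exactly the kind of question the project addresses.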
Max ERC Funding
1 497 996 €
Duration
Start date: 2012-01-01, End date: 2016-12-31
Project acronym EVOEPIC
Project Evolutionary mechanisms of epigenomic and chromosomal aberrations in cancer
Researcher (PI) Amos Tanay
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Starting Grant (StG), LS2, ERC-2012-StG_20111109
Summary Our working hypothesis is that tumorigenesis is an evolutionary process that fundamentally couples a few major driving events (point mutations, rearrangements) with a complex flux of minor aberrations, many of which are epigenetic. We believe that these minor events are critical factors in the emergence of the cancer phenotype, and that understanding them is essential to the characterization of the disease. In particular, we hypothesize that a quantitative and principled evolutionary model for carcinogenesis is imperative for understanding the heterogeneity within tumor cell populations and predicting the effects of cancer therapies. We will therefore develop an interdisciplinary scheme that combines theoretical models of cancer evolution with in vitro evolutionary experiments and new methods for assaying the population heterogeneity of epigenomic organization. By developing techniques to interrogate DNA methylation and its interaction with other key epigenetic marks at the single-cell level, we will allow quantitative theoretical predictions to be scrutinized and refined. By combining models describing epigenetic aberrations with direct measurements of chromatin organization using Hi-C and 4C-seq, we shall revisit fundamental questions on the causative nature of epigenetic changes during carcinogenesis. Ultimately, we will apply both theoretical and experimental methodologies to assay and characterize the evolutionary histories of tumor cell populations from multiple mouse models and clinical patient samples.
Max ERC Funding
1 499 998 €
Duration
Start date: 2012-12-01, End date: 2017-11-30
Project acronym EVOLOME
Project Genetic and phenotypic precursors of antibiotic resistance in evolving bacterial populations: from single cell to population level analyses
Researcher (PI) Nathalie Balaban
Host Institution (HI) THE HEBREW UNIVERSITY OF JERUSALEM
Call Details Starting Grant (StG), LS8, ERC-2010-StG_20091118
Summary Soon after new antibiotics are introduced, bacterial strains resistant to their action emerge. Recently, non-specific factors that promote the later appearance of specific mechanisms of resistance have been found. Some of these so-called global factors (as opposed to specific resistance mechanisms) emerge as major players in shaping the rate of evolution of resistance. For example, a mutation in the mismatch repair system is a global genetic factor that increases the mutation rate and therefore leads to an increased probability to evolve resistance.
In addition to global genetic factors, it is becoming clear that global phenotypic factors play a crucial role in resistance evolution. For example, activation of stress responses can also result in an elevated mutation rate and accelerated evolution of drug resistance. A natural question which arises in this context is how sub-populations of phenotypic variants differ in their evolutionary potential, and how that, in turn, affects the rate at which an entire population adapts to antibiotic stress.
I propose a multidisciplinary approach to the systematic and quantitative study of the non-specific factors that affect the mode and tempo of evolution towards antibiotic resistance. Our preliminary results indicate that the presence of dormant bacteria that survive antibiotic treatment affects the rate of resistance evolution in bacterial populations. I will exploit the established expertise of my lab using microfluidic devices for single-cell analyses to track the emergence of resistance at the single-cell level, in real time, and to study the correlation between the phenotype of single bacteria and the probability of evolving resistance. My second approach will take advantage of the recent developments in experimental evolution and high-throughput sequencing and combine those with single-cell observations for the systematic search of E. coli genes that affect the rate of resistance evolution. We will study replicate populations of E. coli, founded by either laboratory strains or clinical isolates, as they evolve in parallel, under antibiotic stress. Evolved populations will be compared with ancestral populations in order to identify genes and phenotypes that have changed during the evolution of antibiotic resistance. Finally, in silico evolution that simulates the experimental conditions will be developed to analyze the contribution of global factors to resistance evolution.
The evolution of antibiotic resistance is not only a fascinating demonstration of the power of evolution but also represents one of the major health threats today. I anticipate that this multidisciplinary study of the global factors that influence the evolution of resistance, from the single cell to the population level, will shed light on the mechanisms used by bacteria to accelerate evolution in general, as well as provide clues as to how to prevent the emergence of antibiotic resistance.
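The intuition behind the preliminary result — that dormant survivors buy a population time in which resistance mutations can arise — can be caricatured in silico. The sketch below is a minimal assumed model (not the lab's actual simulation): cycles of regrowth and antibiotic pulses, where a sensitive cell survives a pulse only if it happens to be dormant.

```python
import random

def resistance_evolves(persister_frac, n0=1000, cycles=30, mut=1e-4):
    """Toy regrowth/antibiotic-pulse cycles; returns True if a
    resistant lineage is present at the end. Sensitive cells survive
    a pulse only with probability `persister_frac` (dormancy);
    resistant cells always survive. All parameters are illustrative.
    """
    sensitive, resistant = n0, 0
    for _ in range(cycles):
        total = sensitive + resistant
        if total == 0:
            return False                 # population wiped out
        # regrow to carrying capacity n0, mutating along the way
        growth = n0 / total
        sensitive = int(sensitive * growth)
        resistant = int(resistant * growth)
        mutants = sum(random.random() < mut for _ in range((sensitive)))
        sensitive -= mutants
        resistant += mutants
        # antibiotic pulse: only persisters among the sensitive survive
        sensitive = sum(random.random() < persister_frac
                        for _ in range(sensitive))
    return resistant > 0

random.seed(2)
trials = 200
p_none = sum(resistance_evolves(0.0) for _ in range(trials)) / trials
p_pers = sum(resistance_evolves(0.05) for _ in range(trials)) / trials
print(p_none, p_pers)  # persisters raise the chance resistance evolves
```

Without persisters the population gets only one regrowth phase in which to mutate before it is eliminated; with them, each extra cycle is another draw from the mutational lottery.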
Max ERC Funding
1 458 200 €
Duration
Start date: 2010-11-01, End date: 2015-10-31
Project acronym EXPANDERS
Project Expander Graphs in Pure and Applied Mathematics
Researcher (PI) Alexander Lubotzky
Host Institution (HI) THE HEBREW UNIVERSITY OF JERUSALEM
Call Details Advanced Grant (AdG), PE1, ERC-2008-AdG
Summary Expander graphs are finite graphs which play a fundamental role in many areas of computer science, such as communication networks and algorithms. Several areas of deep mathematics have been used in order to give explicit constructions of such graphs, e.g. Kazhdan's property (T) from the representation theory of semisimple Lie groups and the Ramanujan conjecture from the theory of automorphic forms. In recent years, computer science has started to pay its debt to mathematics: expander graphs are playing an increasing role in several areas of pure mathematics. The goal of the current research plan is to deepen these connections in both directions, with special emphasis on the more recent and surprising applications of expanders to group theory, the geometry of 3-manifolds and number theory.
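For concreteness, spectral expansion can be checked numerically on a small example (an editorial illustration, not part of the proposal). The Petersen graph is 3-regular, and every adjacency eigenvalue other than the degree satisfies the Ramanujan bound |λ| ≤ 2√(d−1):

```python
import numpy as np

# Petersen graph: outer 5-cycle (vertices 0..4), inner pentagram
# (vertices 5..9, i ~ i+2 mod 5), and spokes joining the two.
A = np.zeros((10, 10), dtype=int)
for i in range(5):
    for u, v in [(i, (i + 1) % 5),          # outer cycle
                 (5 + i, 5 + (i + 2) % 5),  # inner pentagram
                 (i, 5 + i)]:               # spoke
        A[u, v] = A[v, u] = 1

eig = np.sort(np.linalg.eigvalsh(A))[::-1]
d = eig[0]                    # top eigenvalue = degree = 3
lam = max(eig[1], -eig[-1])   # largest nontrivial |eigenvalue|
print(d, lam)                 # spectrum is {3, 1^5, (-2)^4}, so lam = 2
assert lam <= 2 * np.sqrt(d - 1) + 1e-9   # Ramanujan bound ~ 2.83
```

A small second-largest eigenvalue is exactly what makes random walks on such graphs mix fast, which is the property the computer-science applications exploit.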
Max ERC Funding
1 082 504 €
Duration
Start date: 2008-10-01, End date: 2014-09-30
Project acronym EXPRES
Project Chromatin and transcription in ES cells: from single cells to genome wide views
Researcher (PI) Eran Meshorer
Host Institution (HI) THE HEBREW UNIVERSITY OF JERUSALEM
Call Details Starting Grant (StG), LS2, ERC-2011-StG_20101109
Summary How embryonic stem cells (ESCs) maintain their dual capacity to self-renew and to differentiate into all cell types is one of the fundamental questions in biology. Although this question remains largely open, there is growing evidence suggesting that chromatin plasticity is a fundamental hallmark of ESCs, providing their necessary flexibility.
Previously we found that ESCs possess a relatively open chromatin conformation, giving rise to a permissive transcriptional program. Here I propose to investigate the mechanisms that support chromatin plasticity and pluripotency in ESCs.
Using a simple biochemical assay which I developed (DCAP: Differential Chromatin Associated Proteins), based on micrococcal nuclease (MNase) digestion combined with multi-dimensional protein identification technology (MudPIT), I seek to identify ESC-specific chromatin proteins. Selected proteins will be knocked-down (or out) and their ESC function will be evaluated.
In addition, I will conduct hypothesis-driven research using mutant ESCs and epigenetics-related drugs to search for potential mechanisms (e.g. histone modifications, DNA methylation) that may support chromatin plasticity in ESCs. Based on our intriguing preliminary data, I will also focus on the link between the nuclear lamina and ESC plasticity.
Thirdly, we will analyze non-polyadenylated transcription using genome-wide tiling arrays and RNA-seq. We will design custom microarrays containing the identified sequences, which will allow us to reveal, using ChIP-chip experiments, the mechanistic regulation of the non-polyadenylated transcripts. Finally, we will knock out, using zinc-finger nuclease technology, selected highly conserved candidates in search of their function.
Understanding chromatin regulation, plasticity and function will enable one to intelligently manipulate ESCs to transition between the pluripotent, multipotent and unipotent states and to expedite their use in the clinic.
Max ERC Funding
1 500 000 €
Duration
Start date: 2011-12-01, End date: 2016-11-30
Project acronym EXQFT
Project Exact Results in Quantum Field Theory
Researcher (PI) Zohar Komargodski
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Starting Grant (StG), PE2, ERC-2013-StG
Summary Quantum field theory (QFT) is a unified conceptual and mathematical framework that encompasses a veritable cornucopia of physical phenomena, including phase transitions, condensed matter systems, elementary particle physics, and (via the holographic principle) quantum gravity. QFT has become the standard language of modern theoretical physics.
Despite the fact that QFT is omnipresent in physics, we have virtually no tools to analyze from first principles many of the interesting systems that appear in nature. (For instance, Quantum Chromodynamics, non-Fermi liquids, and even boiling water.)
Our main goal in this proposal is to develop new tools that would allow us to make progress on this fundamental problem. To this end, we will employ two strategies.
First, we propose to study in detail systems that possess extra symmetries (and are hence simpler). For example, critical systems often admit the group of conformal transformations. Another example is given by theories with Bose-Fermi degeneracy (supersymmetric theories). We will explain how we think significant progress can be achieved in this area. Advances here will allow us to wield more analytic control over relatively simple QFTs and extract physical information from these models. Such information can be useful in many areas of physics and lead to new connections with mathematics. Second, we will study general properties of renormalization group flows. Renormalization group flows govern the dynamics of QFT and understanding their properties may lead to substantial developments. Very recent progress along these lines has already led to surprising new results about QFT and may have direct applications in several areas of physics. Much more can be achieved.
These two strategies are complementary and interwoven.
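As a textbook illustration of the kind of renormalization-group statement at stake (standard material, not a result of this proposal): in φ⁴ theory in d = 4 − ε dimensions, the one-loop flow of the coupling already exhibits a nontrivial (Wilson-Fisher) fixed point,

```latex
\beta(\lambda) \;=\; \mu\,\frac{d\lambda}{d\mu}
   \;=\; -\,\epsilon\,\lambda \;+\; \frac{3\,\lambda^{2}}{16\pi^{2}}
   \;+\; O(\lambda^{3}),
\qquad
\beta(\lambda_{*}) = 0
\;\Longrightarrow\;
\lambda_{*} \;=\; \frac{16\pi^{2}\,\epsilon}{3}\,.
```

Fixed points of such flows are scale-invariant (typically conformal) theories; the extra symmetry at these endpoints is precisely what the first strategy proposes to exploit.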
Max ERC Funding
1 158 692 €
Duration
Start date: 2013-09-01, End date: 2018-08-31
Project acronym EXTPRO
Project Quasi-Randomness in Extremal Combinatorics
Researcher (PI) Asaf Shapira
Host Institution (HI) TEL AVIV UNIVERSITY
Call Details Starting Grant (StG), PE1, ERC-2014-STG
Summary Combinatorics is an extremely fast-growing mathematical discipline. While it started as a collection of isolated problems that were tackled using ad-hoc arguments, it has since grown into a mature discipline that has both incorporated deep tools from other mathematical areas and found applications in areas such as Additive Number Theory, Theoretical Computer Science, Computational Biology and Information Theory.
The PI will work on a variety of problems in Extremal Combinatorics which is one of the most active subareas within Combinatorics with spectacular recent developments. A typical problem in this area asks to minimize (or maximize) a certain parameter attached to a discrete structure given several other constrains. One of the most powerful tools used in attacking problems in this area uses the so called Structure vs Randomness phenomenon. This roughly means that any {\em deterministic} object can be partitioned into smaller quasi-random objects, that is, objects that have properties we expect to find in truly random ones. The PI has already made significant contributions in this area and our goal in this proposal is to obtain further results of this caliber by tackling some of the hardest open problems at the forefront of current research. Some of these problems are related to the celebrated Hypergraph and Arithmetic Regularity Lemmas, to Super-saturation problems in Additive Combinatorics and Graph Theory, to problems in Ramsey Theory, as well as to applications of Extremal Combinatorics to problems in Theoretical Computer Science. Another major goal of this proposal is to develop new approaches and techniques for tackling problems in Extremal Combinatorics.
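As a concrete (hypothetical) illustration of the quasi-randomness idea above: one classical criterion of Chung, Graham and Wilson says a dense graph of edge density p is quasi-random iff its count of closed walks of length 4, tr(A^4), is close to the value p^4·n^4 attained by a truly random graph. A minimal numpy sketch, with illustrative sizes:

```python
import numpy as np

# Sample an Erdos-Renyi graph G(n, p); a random graph should look quasi-random.
rng = np.random.default_rng(1)
n, p = 400, 0.5
upper = np.triu(rng.random((n, n)) < p, k=1)  # independent edges above diagonal
A = (upper | upper.T).astype(float)           # symmetric 0/1 adjacency matrix

density = A.sum() / (n * (n - 1))             # empirical edge density
walks4 = np.trace(np.linalg.matrix_power(A, 4))  # closed walks of length 4
ratio = walks4 / (density ** 4 * n ** 4)      # ~1 for quasi-random graphs
```

For a genuinely random graph the ratio concentrates near 1; graphs with hidden structure (e.g. a dense planted clique) deviate noticeably, which is exactly why the 4-cycle count serves as a quasi-randomness certificate.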
The support by means of a 5-year research grant will enable the PI to further establish himself as a leading researcher in Extremal Combinatorics and to build a vibrant research group in Extremal Combinatorics.
Max ERC Funding
1 221 921 €
Duration
Start date: 2015-03-01, End date: 2021-02-28
Project acronym FACT
Project Factorizing the wave function of large quantum systems
Researcher (PI) Eberhard Gross
Host Institution (HI) THE HEBREW UNIVERSITY OF JERUSALEM
Call Details Advanced Grant (AdG), PE3, ERC-2017-ADG
Summary This proposal puts forth a novel strategy to tackle large quantum systems. A variety of highly sophisticated methods such as quantum Monte Carlo, configuration interaction, coupled cluster, tensor networks, Feynman diagrams, dynamical mean-field theory, density functional theory, and semi-classical techniques have been developed to deal with the enormous complexity of the many-particle Schrödinger equation. The goal of our proposal is not to add another method to these standard techniques but, instead, we develop a systematic way of combining them. The essential ingredient is a novel way of decomposing the wave function without approximation into factors that describe subsystems of the full quantum system. This so-called exact factorization is asymmetric. In the case of two subsystems, one factor is a wave function satisfying a regular Schrödinger equation, while the other factor is a conditional probability amplitude satisfying a more complicated Schrödinger-like equation with a non-local, non-linear and non-Hermitian “Hamiltonian”. Since each subsystem is necessarily smaller than the full system, the above standard techniques can be applied more efficiently and, most importantly, different standard techniques can be applied to different subsystems. The power of the exact factorization lies in its versatility. Here we apply the technique to five different scenarios: The first two deal with non-adiabatic effects in (i) molecules and (ii) solids. Here the natural subsystems are electrons and nuclei. 
The third scenario (iii) deals with nuclear motion in molecules attached to semi-infinite metallic leads, requiring three subsystems: the electrons, the nuclei in the leads (which ultimately reduce to a phonon bath), and the molecular nuclei, which may perform large-amplitude movements such as current-induced isomerization. The remaining scenarios address (iv) purely electronic correlations and (v) the interaction of matter with the quantized electromagnetic field, i.e., electrons, nuclei and photons.
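For concreteness, the two-subsystem exact factorization described above is commonly written as follows (r and R denote the coordinates of the two subsystems; the notation here is illustrative):

```latex
\Psi(r, R, t) \;=\; \chi(R, t)\,\Phi_R(r, t),
\qquad
\int \lvert \Phi_R(r, t) \rvert^{2}\, dr \;=\; 1
\quad \text{for every } R,\, t .
```

Here \chi is the marginal wave function satisfying a regular Schrödinger equation and \Phi_R is the conditional amplitude; the partial normalization condition makes the factorization unique up to a phase depending on R and t, which is what allows different standard techniques to be applied to each factor separately.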
Max ERC Funding
2 443 932 €
Duration
Start date: 2019-09-01, End date: 2024-08-31
Project acronym FADER
Project Flight Algorithms for Disaggregated Space Architectures
Researcher (PI) Pinchas Pini Gurfil
Host Institution (HI) TECHNION - ISRAEL INSTITUTE OF TECHNOLOGY
Call Details Starting Grant (StG), PE7, ERC-2011-StG_20101014
Summary Standard spacecraft designs comprise modules assembled in a single monolithic structure. When unexpected situations occur, the spacecraft are unable to adequately respond, and significant functional and financial losses are unavoidable. For instance, if the payload of a spacecraft fails, the whole system becomes unserviceable and substitution of the entire spacecraft is required. It would be much easier to replace only the payload module than to launch a completely new satellite. This idea gives rise to an emerging concept in space engineering termed disaggregated spacecraft. Disaggregated space architectures (DSA) consist of several physically-separated modules, interacting through wireless communication links to form a single virtual platform. Each module has one or more pre-determined functions: navigation, attitude control, power generation and payload operation. The free-flying modules, capable of resource sharing, do not have to operate in a tightly-controlled formation, but are rather required to remain in bounded relative position and attitude, termed cluster flying. DSA enables novel space system architectures, which are expected to be much more efficient, adaptable, robust and responsive. The main goal of the proposed research is to develop beyond-the-state-of-the-art technologies in order to enable operational flight of DSA, by (i) developing algorithms for semi-autonomous long-duration maintenance of a cluster and cluster network, capable of adding and removing spacecraft modules to/from the cluster and cluster network; (ii) finding methods so as to autonomously reconfigure the cluster to retain safety- and mission-critical functionality in the face of network degradation or component failures; (iii) designing semi-autonomous cluster scatter and re-gather maneuvers to rapidly evade a debris-like threat; and (iv) validating these algorithms and methods in the Distributed Space Systems Laboratory in which the PI serves as a Principal Investigator.
Max ERC Funding
1 500 000 €
Duration
Start date: 2011-10-01, End date: 2016-09-30
Project acronym FAFC
Project Foundations and Applications of Functional Cryptography
Researcher (PI) Gil SEGEV
Host Institution (HI) THE HEBREW UNIVERSITY OF JERUSALEM
Call Details Starting Grant (StG), PE6, ERC-2016-STG
Summary Modern cryptography has successfully followed an "all-or-nothing" design paradigm over the years. For example, the most fundamental task of data encryption requires that encrypted data be fully recoverable using the encryption key, but be completely useless without it. Nowadays, however, this paradigm is insufficient for a wide variety of evolving applications, and a more subtle approach is urgently needed. This has recently motivated the cryptography community to put forward a vision of "functional cryptography": designing cryptographic primitives that allow fine-grained access to sensitive data.
This proposal aims at making substantial progress towards realizing the premise of functional cryptography. By tackling challenging key problems in both the foundations and the applications of functional cryptography, I plan to direct the majority of our effort towards addressing the following three fundamental objectives, which span a broad and interdisciplinary flavor of research directions: (1) Obtain a better understanding of functional cryptography's building blocks, (2) develop functional cryptographic tools and schemes based on well-studied assumptions, and (3) increase the usability of functional cryptographic systems via algorithmic techniques.
Realizing the premise of functional cryptography is of utmost importance not only to the development of modern cryptography, but in fact to our entire technological development, where fine-grained access to sensitive data plays an instrumental role. Moreover, our objectives are tightly related to two of the most fundamental open problems in cryptography: Basing cryptography on widely-believed worst-case complexity assumptions, and basing public-key cryptography on private-key primitives. I strongly believe that meaningful progress towards achieving our objectives will shed new light on these key problems, and thus have a significant impact on our understanding of modern cryptography.
Max ERC Funding
1 307 188 €
Duration
Start date: 2017-02-01, End date: 2022-01-31
Project acronym FAITh
Project Fighting Anxiety with Importin-based Therapeutics
Researcher (PI) Michael FAINZILBER
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Proof of Concept (PoC), ERC-2017-PoC
Summary Anxiety and stress-related conditions represent a significant health burden in modern society. Anxiety disorders are currently treated with a variety of agents targeting synaptic mechanisms. These agents either directly affect neurotransmitter receptor systems or modulate neurotransmitter levels or availability, but their long-term use is limited by problematic side effects and suboptimal efficacy. The development of new anxiolytic drugs has been fraught with difficulty, hence there is a need for new targets and new avenues for therapeutic development.
Importin-dependent transport mechanisms link synapse to nucleus in a diversity of physiological contexts, rendering them potentially interesting targets for behavioural control. However, importins and related molecules have not been evaluated for roles in anxiolysis to date. We discovered the roles of importins in axonal injury signaling and in cell size sensing. During the course of our current ERC Advanced grant, and as part of one of the aims, we have conducted comprehensive phenotyping of importin mouse mutants to identify in vivo consequences of the deregulation of size control pathways. One importin mutant line presented a specific phenotype in anxiety tests, and follow-up analyses identified a new molecular pathway for anxiety regulation, as well as approved drugs affecting this pathway that can be repositioned for anxiety treatment.
In this PoC, we will (1) carry out IP protection on our initial identifications of anxiolytic drugs; (2) further validate the anxiolytic activities of these drugs and their closely related structural or functional analogs; and (3) devise an HTS-compatible assay for targeting the importin involved and conduct a pilot screen of ~200,000 compounds in this assay to identify new drug leads for anxiety treatment. As a final step, we will carry out (4) additional IP protection and pre-commercialisation tasks for maximizing the commercialisation potential of our discovery.
Max ERC Funding
150 000 €
Duration
Start date: 2017-11-01, End date: 2019-04-30
Project acronym FAST FILTERING
Project Fast Filtering for Computer Graphics, Vision and Computational Sciences
Researcher (PI) Raanan Fattal
Host Institution (HI) THE HEBREW UNIVERSITY OF JERUSALEM
Call Details Starting Grant (StG), PE6, ERC-2013-StG
Summary The world of digital signal processing, in particular computer graphics, vision and image processing, uses linear and non-linear, explicit and implicit filtering extensively to analyze, process and synthesize images. Given today's high-resolution sensors, these operations are often very time-consuming and are limited to devices with high CPU power.
Traditional linear translation-invariant (LTI) transformations, executed using convolution, require O(N^2) operations. This can be lowered to O(N log N) via the FFT over suitable domains. There are very few sets of filters for which optimal, linear-time procedures are known. The situation is more complicated in the newly-emerging domain of non-linear spatially-varying filters. Exact application of such a filter requires O(N^2) operations, and acceleration methods involve higher-dimensional spaces, introducing severe memory costs and truncation errors.
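The O(N^2) vs O(N log N) gap mentioned above is the convolution theorem at work: pointwise multiplication of spectra replaces direct convolution. A minimal numpy sketch (signal length and Gaussian kernel are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
signal = rng.standard_normal(4096)
kernel = np.exp(-0.5 * (np.arange(-32, 33) / 8.0) ** 2)  # Gaussian LTI filter
kernel /= kernel.sum()

# O(N^2) reference: direct convolution.
direct = np.convolve(signal, kernel, mode="full")

# O(N log N): zero-pad both to the full output length, multiply
# the spectra, and transform back (convolution theorem).
n = len(signal) + len(kernel) - 1
fast = np.fft.irfft(np.fft.rfft(signal, n) * np.fft.rfft(kernel, n), n)
```

The two results agree to floating-point precision; for filters such as the Gaussian, the proposal's point is that even the O(N log N) route can be beaten by linear-time multiscale schemes.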
In this research proposal we intend to derive fast, linear-time, procedures for different types of LTI filters by exploiting a deep connection between convolution, spatially-homogeneous elliptic equations and the multigrid method for solving such equations. Based on this circular connection we draw novel prospects for deriving new multiscale filtering procedures.
A second part of this research proposal is devoted to deriving efficient explicit and implicit non-linear spatially-varying edge-aware filters. One front consists of the derivation of novel multi-level image decomposition that mimics the action of inhomogeneous diffusion operators. The idea here is, once again, to bridge the gap with numerical analysis and use ideas from multiscale matrix preconditioning for the design of new biorthogonal second-generation wavelets.
Moreover, this proposal outlines a new multiscale preconditioning paradigm combining ideas from algebraic multigrid and combinatorial matrix preconditioning. This intermediate approach offers new ways for overcoming fundamental shortcomings in this domain.
Max ERC Funding
1 320 200 €
Duration
Start date: 2013-08-01, End date: 2018-07-31
Project acronym FAULT-ADAPTIVE
Project Fault-Adaptive Monitoring and Control of Complex Distributed Dynamical Systems
Researcher (PI) Marios Polycarpou
Host Institution (HI) UNIVERSITY OF CYPRUS
Call Details Advanced Grant (AdG), PE7, ERC-2011-ADG_20110209
Summary The emergence of networked embedded systems and sensor/actuator networks has facilitated the development of advanced monitoring and control applications, where a large amount of sensor data is collected and processed in real-time in order to activate the appropriate actuators and achieve the desired control objectives. However, in situations where a fault arises in some of the components (e.g., sensors, actuators, communication links), or an unexpected event occurs in the environment, this may lead to a serious degradation in performance or, even worse, to an overall system failure. There is a need to develop a systematic framework to enhance the reliability, fault-tolerance and sustainability of complex distributed dynamical systems through the use of fault-adaptive monitoring and control methods. The work proposed here will contribute to the development of such a framework with emphasis on applications related to critical infrastructure systems (e.g., power, water, telecommunications and transportation systems). It will provide an innovative approach based on the use of networked intelligent agent systems, where the state of the infrastructure is monitored and controlled by a network of sensors and actuators with cooperating agents for fault diagnosis and fault tolerant control. A hierarchical fault diagnosis architecture will be developed, with neighbouring fault diagnosis agents cooperating at a local level, while transmitting their information, as needed, to a regional monitoring agent, responsible for integrating in real-time local information into a large-scale “picture” of the health of the infrastructure. A key motivation is to exploit spatial and temporal correlations between measured variables using learning methods, and to develop the tools and design methodologies that will prevent relatively “small” faults or unexpected events from causing significant disruption or complete system failures in complex distributed dynamical systems.
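The local fault diagnosis agents described above typically build on residual-based detection: compare each measurement against a model prediction and raise an alarm when the residual exceeds a threshold. A minimal sketch of that building block (function name, data and threshold are hypothetical, not the project's method):

```python
def detect_fault(measured: float, predicted: float, threshold: float) -> bool:
    """Flag a fault when the residual |y - y_hat| exceeds the threshold."""
    residual = abs(measured - predicted)
    return residual > threshold

# A local agent monitors one sensor against its model prediction.
readings    = [1.00, 1.02, 0.99, 1.55, 1.01]  # sensor values
predictions = [1.00, 1.00, 1.00, 1.00, 1.00]  # model output
alarms = [detect_fault(y, yh, 0.3) for y, yh in zip(readings, predictions)]
# alarms -> [False, False, False, True, False]
```

In the hierarchical architecture, such local alarm streams would be forwarded to a regional agent, which fuses them across neighbouring sensors to separate genuine faults from noise.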
Max ERC Funding
2 035 200 €
Duration
Start date: 2012-04-01, End date: 2018-03-31
Project acronym FDP-MBH
Project Fundamental dynamical processes near massive black holes in galactic nuclei
Researcher (PI) Tal Alexander
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Starting Grant (StG), PE7, ERC-2007-StG
Summary I propose to combine analytical studies and simulations to explore fundamental open questions in the dynamics and statistical mechanics of stars near massive black holes. These directly affect key issues such as the rate of supply of single and binary stars to the black hole, the growth and evolution of single and binary massive black holes and the connections to the evolution of the host galaxy, capture of stars around the black hole, the rate and modes of gravitational wave emission from captured compact objects, stellar tidal heating and destruction, and the emergence of "exotic" stellar populations around massive black holes. These processes have immediate observational implications and relevance in view of the huge amounts of data on massive black holes and galactic nuclei coming from earth-bound and space-borne telescopes, from across the electromagnetic spectrum, from cosmic rays, and in the near future also from neutrinos and gravitational waves.
Max ERC Funding
880 000 €
Duration
Start date: 2008-09-01, End date: 2013-08-31
Project acronym FIELDGRADIENTS
Project Phase Transitions and Chemical Reactions in Electric Field Gradients
Researcher (PI) Yoav Tsori
Host Institution (HI) BEN-GURION UNIVERSITY OF THE NEGEV
Call Details Starting Grant (StG), PE3, ERC-2010-StG_20091028
Summary We will study phase transitions and chemical and biological reactions in liquid mixtures in electric field gradients. These new phase transitions are essential in statistical physics and thermodynamics. We will examine theoretically the complex and yet unexplored phase ordering dynamics in which droplets nucleate and move under the external nonuniform force. We will look in detail at the interfacial instabilities which develop when the field is increased. We will investigate how time-varying potentials produce electromagnetic waves and how their spatial decay in the bistable liquid leads to phase changes.
These transitions open a new and general way to control the spatio-temporal behaviour of chemical reactions by directly manipulating the solvents' concentrations. When two or more reagents are preferentially soluble in one of the mixture's components, field-induced phase separation leads to acceleration of the reaction. When the reagents are soluble in different solvents, field-induced demixing will lead to the reaction taking place at a slow rate and at a two-dimensional surface. Additionally, the electric field allows us to turn the reaction on or off. The numerical study and simulations will be complemented by experiments. We will study theoretically and experimentally biochemical reactions. We will find how actin-related structures are affected by field gradients. Using an electric field as a tool we will control the rate of actin polymerisation. We will investigate if an external field can damage cancer cells by disrupting their actin-related activity. The above phenomena will be studied in a microfluidics environment. We will elucidate the separation hydrodynamics occurring when thermodynamically miscible liquids flow in a channel and how electric fields can reversibly create and destroy optical interfaces, as is relevant in optofluidics. Chemical and biological reactions will be examined in the context of lab-on-a-chip.
Max ERC Funding
1 482 200 €
Duration
Start date: 2010-08-01, End date: 2015-07-31
Project acronym Fireworks
Project Celestial fireworks: revealing the physics of the time-variable sky
Researcher (PI) Avishay Gal-Yam
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Consolidator Grant (CoG), PE9, ERC-2016-COG
Summary Experimental time-domain astrophysics is on the verge of a new era as technological, computational, and operational progress combine to revolutionise the manner in which we study the time-variable sky. This proposal consolidates previous breakthrough work on wide-field surveys into a coherent program to advance our study of the variable sky on ever-decreasing time-scales: from days, through hours, to minutes. We will watch how stars explode in real time in order to study the complex physics of stellar death, build new tools to handle and analyse the uniquely new data sets we are collecting, and shed light on some of the most fundamental questions in modern astrophysics: from the origin of the elements, via the explosion mechanisms of supernovae, to the feedback processes that drive star formation and galaxy evolution.
Max ERC Funding
2 461 111 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym FLDcure
Project A potent Micro-RNA therapeutic for nonalcoholic fatty liver disease (NAFLD)
Researcher (PI) Hermona Soreq
Host Institution (HI) THE HEBREW UNIVERSITY OF JERUSALEM
Call Details Proof of Concept (PoC), PC1, ERC-2014-PoC
Summary Fatty liver disease (FLD) is a widespread condition that can often progress to nonalcoholic steatohepatitis, cirrhosis and liver cancer. At present, FLD affects a large proportion of the population, and prompt treatment would be of major health benefit to the general population. We have developed a specific therapeutic targeting a microRNA that we and others have shown to be involved in the pathogenesis of FLD. This therapeutic agent can dramatically reduce FLD in a mouse model. We would like to extend the pre-clinical studies in order to attract the interest of a pharmaceutical company that will license the technology and pursue clinical trials.
Max ERC Funding
149 800 €
Duration
Start date: 2015-01-01, End date: 2016-06-30
Project acronym FOC
Project Foundations of Cryptographic Hardness
Researcher (PI) Iftach Ilan Haitner
Host Institution (HI) TEL AVIV UNIVERSITY
Call Details Starting Grant (StG), PE6, ERC-2014-STG
Summary A fundamental research challenge in modern cryptography is understanding the necessary hardness assumptions required to build different cryptographic primitives. Attempts to answer this question have met with tremendous success in the last 20-30 years. Most notably, it was shown that many highly complicated primitives can be based on the mere existence of one-way functions (i.e., functions that are easy to compute and hard to invert), while other primitives cannot be based on such functions. This research has yielded fundamental tools and concepts such as randomness extractors and computational notions of entropy. Yet many of the most fundamental questions remain unanswered.
Our first goal is to answer the fundamental question of whether cryptography can be based on the assumption that P does not equal NP. Our second and third goals are to build more efficient symmetric-key cryptographic primitives from one-way functions, and to establish effective methods for security amplification of cryptographic primitives. Succeeding in the second and third goals is likely to have great bearing on the way we construct the most basic cryptographic primitives. A positive answer to the first question would be considered a dramatic result in the cryptography and computational complexity communities.
To address these goals, it is very useful to understand the relationship between different types and quantities of cryptographic hardness. Such understanding typically involves defining and manipulating different types of computational entropy, and comprehending the power of security reductions. We believe that this research will yield new concepts and techniques, with ramifications beyond the realm of foundational cryptography.
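The "easy to compute, hard to invert" asymmetry behind one-way functions can be illustrated with a toy sketch (added for illustration only; it is not part of the proposal, and the function names are hypothetical): multiplying two primes is fast, while the naive inverse, trial-division factoring, scales exponentially in the bit length of the product.

```python
# Toy illustration of the one-way asymmetry: the forward direction
# (multiplication) is cheap, while the naive inverse (trial division)
# costs O(sqrt(n)) operations, i.e. exponential in the bit length of n.

def forward(p, q):
    """Easy direction: multiply two primes."""
    return p * q

def invert(n):
    """Hard direction, done naively: find a nontrivial factorization."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return None  # n is prime


product = forward(1009, 2003)  # instant
factors = invert(product)      # already slow at cryptographic sizes
```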
Max ERC Funding
1 239 838 €
Duration
Start date: 2015-03-01, End date: 2021-02-28
Project acronym FORECASToneMONTH
Project Forecasting Surface Weather and Climate at One-Month Leads through Stratosphere-Troposphere Coupling
Researcher (PI) Chaim Israel Garfinkel
Host Institution (HI) THE HEBREW UNIVERSITY OF JERUSALEM
Call Details Starting Grant (StG), PE10, ERC-2015-STG
Summary Anomalies in surface temperatures, winds, and precipitation can significantly alter energy supply and demand, cause flooding, and cripple transportation networks. Better management of these impacts can be achieved by extending the duration of reliable predictions of the atmospheric circulation.
Polar stratospheric variability can impact surface weather for well over a month, and this proposed research presents a novel approach towards understanding the fundamentals of how this coupling occurs. Specifically, we are interested in: 1) how predictable are anomalies in the stratospheric circulation? 2) why do only some stratospheric events modify surface weather? and 3) what is the mechanism whereby stratospheric anomalies reach the surface? While this last question may appear academic, several studies indicate that stratosphere-troposphere coupling drives the midlatitude tropospheric response to climate change; therefore, a clearer understanding of the mechanisms will aid in the interpretation of the upcoming changes in the surface climate.
I propose a multi-pronged effort aimed at addressing these questions and improving monthly forecasting. First, carefully designed modelling experiments using a novel modelling framework will be used to clarify how, and under what conditions, stratospheric variability couples to tropospheric variability. Second, novel linkages between variability external to the stratospheric polar vortex and the stratospheric polar vortex will be pursued, thus improving our ability to forecast polar vortex variability itself. To these ends, my group will develop 1) an analytic model for Rossby wave propagation on the sphere, and 2) a simplified general circulation model, which captures the essential processes underlying stratosphere-troposphere coupling. By combining output from the new models, observational data, and output from comprehensive climate models, the connections between the stratosphere and surface climate will be elucidated.
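For orientation, the kind of analytic model for Rossby wave propagation mentioned above typically starts from the textbook barotropic dispersion relation on a $\beta$-plane (standard material, not quoted from the abstract): for a zonal background flow $U$,

```latex
\omega = Uk - \frac{\beta k}{k^2 + l^2}, \qquad K_s = \sqrt{\beta / U},
```

where $k$ and $l$ are the zonal and meridional wavenumbers and $K_s$ is the stationary total wavenumber (obtained by setting $\omega = 0$), which governs where planetary waves can propagate.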
Summary
Anomalies in surface temperatures, winds, and precipitation can significantly alter energy supply and demand, cause flooding, and cripple transportation networks. Better management of these impacts can be achieved by extending the duration of reliable predictions of the atmospheric circulation.
Polar stratospheric variability can impact surface weather for well over a month, and this proposed research presents a novel approach towards understanding the fundamentals of how this coupling occurs. Specifically, we are interested in: 1) how predictable are anomalies in the stratospheric circulation? 2) why do only some stratospheric events modify surface weather? and 3) what is the mechanism whereby stratospheric anomalies reach the surface? While this last question may appear academic, several studies indicate that stratosphere-troposphere coupling drives the midlatitude tropospheric response to climate change; therefore, a clearer understanding of the mechanisms will aid in the interpretation of the upcoming changes in the surface climate.
I propose a multi-pronged effort aimed at addressing these questions and improving monthly forecasting. First, carefully designed modelling experiments using a novel modelling framework will be used to clarify how, and under what conditions, stratospheric variability couples to tropospheric variability. Second, novel linkages between variability external to the stratospheric polar vortex and the stratospheric polar vortex will be pursued, thus improving our ability to forecast polar vortex variability itself. To these ends, my group will develop 1) an analytic model for Rossby wave propagation on the sphere, and 2) a simplified general circulation model, which captures the essential processes underlying stratosphere-troposphere coupling. By combining output from the new models, observational data, and output from comprehensive climate models, the connections between the stratosphere and surface climate will be elucidated.
Max ERC Funding
1 808 000 €
Duration
Start date: 2016-05-01, End date: 2021-04-30
Project acronym FORMAT
Project FORMAT: a novel medium FOr Revolutionizing stem cell MAnufacturing Technologies
Researcher (PI) Yaqub HANNA
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Proof of Concept (PoC), PC1, ERC-2015-PoC
Summary One of the greatest challenges facing society is treating patients afflicted with degenerative and age-related disorders, such as multiple sclerosis, Parkinson’s disease and diabetes. For all of these, stem cell therapy represents a novel treatment approach and a great hope for millions of patients worldwide. In my ERC-funded research project, we successfully generated Extracellular-signal-Regulated Kinase (ERK) signalling-independent human naïve Pluripotent Stem Cells (PSCs), a new category of human stem cells that retain features characteristic of earlier developmental stages, i.e. they are more “primitive” than typical/conventional human PSCs. We managed to do so by employing a novel medium that allows the acquisition of many apparent naïve features previously observed only in rodent pluripotent stem cells. Importantly, this new pluripotent configuration not only comes in different molecular flavours but also has different functional properties. In turn, the first goal of our PoC project is to establish the technical feasibility of our novel medium by carrying out a series of molecular and functional tests. Such tests would enable us to further improve the existing medium conditions and create a commercial-grade platform for enhanced expansion and derivation of human naïve induced PSCs. The second goal of the FORMAT project is to establish the commercialization potential of our novel medium as a means to maintain standardized cells that can in turn be used to replenish, regenerate and repair damaged human tissues.
Max ERC Funding
150 000 €
Duration
Start date: 2017-01-01, End date: 2018-06-30
Project acronym FQHE
Project Statistics of Fractionally Charged Quasi-Particles
Researcher (PI) Mordehai (Moty) Heiblum
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Advanced Grant (AdG), PE3, ERC-2008-AdG
Summary The discovery of the fractional quantum Hall effect created a revolution in solid state research by introducing a new state of matter resulting from strong electron interactions. The new state is characterized by excitations (quasi-particles) that carry fractional charge, which are expected to obey fractional statistics. While odd-denominator fractional states are expected to obey abelian statistics, the newly discovered 5/2 even-denominator fractional state is expected to obey non-abelian statistics. Moreover, a large number of emerging proposals predict that the latter state can be employed for topological quantum computing (Station Q was founded by Microsoft Corp. in order to pursue this goal). This proposal aims at studying the abelian and non-abelian fractional charges, and in particular at observing their peculiar statistics. While charges are preferably determined by measuring quantum shot noise, their statistics must be determined via interference experiments, where one particle goes around another. The experiments are very demanding since the even-denominator fractions turn out to be very fragile and thus can be observed only in the purest possible two-dimensional electron gas and at the lowest temperatures. While until very recently such high-quality samples were available from only a single grower (in the USA), we now have the capability to grow extremely pure samples with pronounced even-denominator states. As will be detailed in the proposal, we have all the necessary tools to study the charge and statistics of these fascinating excitations, owing to our experience in crystal growth, shot noise and interferometry measurements.
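For context, the shot-noise route to the quasiparticle charge rests on a standard relation from the literature (added here for orientation, not quoted from the abstract): in the weak-backscattering limit, the low-frequency spectral density of the backscattered current $I_B$ is

```latex
S = 2\, q\, I_B, \qquad q = \tfrac{e}{3} \ \text{at filling factor } \nu = \tfrac{1}{3},
```

so the ratio $S / 2 I_B$ directly yields the fractional charge $q$ of the tunnelling quasiparticle.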
Max ERC Funding
2 000 000 €
Duration
Start date: 2009-01-01, End date: 2013-12-31
Project acronym FRACTALSANDMETRICNT
Project Fractals, algebraic dynamics and metric number theory
Researcher (PI) Michael Hochman
Host Institution (HI) THE HEBREW UNIVERSITY OF JERUSALEM
Call Details Starting Grant (StG), PE1, ERC-2012-StG_20111012
Summary We propose to study the fractal geometry of invariant sets for endomorphisms of compact abelian groups, specifically a family of conjectures by Furstenberg on the dimensions of orbit closures under such dynamics, and on the size of sums and intersections of invariant sets. These conjectures are related to problems on expansion in integer bases, in Diophantine approximation, measure rigidity, analysis and equidistribution. The project focuses on the conjectures themselves and some related problems, e.g. Bernoulli convolutions, and on applications to equidistribution on tori. Our approach combines tools from ergodic theory, geometric measure theory and additive combinatorics, building on recent progress in these fields and recent partial results towards the main conjectures.
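A canonical example of the invariant sets involved (textbook material, added for orientation): the middle-thirds Cantor set $C$ is invariant under the map $T_3 x = 3x \bmod 1$ on the circle, and its self-similarity (two copies scaled by $1/3$) gives

```latex
\dim_H C = \frac{\log 2}{\log 3} \approx 0.6309 .
```

Furstenberg's conjectures concern, for instance, how small the intersections and how large the sums of such $\times 2$- and $\times 3$-invariant sets must be.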
Max ERC Funding
1 107 000 €
Duration
Start date: 2012-10-01, End date: 2018-09-30
Project acronym FRACTFRICT
Project Fracture and Friction: Rapid Dynamics of Material Failure
Researcher (PI) Jay Fineberg
Host Institution (HI) THE HEBREW UNIVERSITY OF JERUSALEM
Call Details Advanced Grant (AdG), PE3, ERC-2010-AdG_20100224
Summary FractFrict is a comprehensive study of the space-time dynamics that lead to the failure of both bulk materials and frictionally bound interfaces. In these systems, failure is precipitated by rapidly moving singular fields at the tips of propagating cracks or crack-like fronts that cause material damage at microscopic scales. These generate damage that is macroscopically reflected as characteristic large-scale modes of material failure. Thus, the structure of the fields that microscopically drive failure is critically important for an overall understanding of how macroscopic failure occurs.
The innovative real-time measurements proposed here will provide a fundamental understanding of the form of the singular fields, their modes of regularization and their relation to the resultant macroscopic modes of failure. These measurements will encompass different classes of bulk materials and material interfaces.
We aim:
[1] To establish a fundamental understanding of the dynamics of the near-tip singular fields, their regularization modes and how they couple to the macroscopic dynamics in both frictional motion and fracture.
[2] To determine the types of singular failure processes in different classes of materials and interfaces (e.g. the brittle-to-ductile transition in amorphous materials, the role of fast fracture processes in frictional motion).
[3] To establish local (microscopic) laws of friction/failure and how they evolve into their macroscopic counterparts.
[4] To identify the existence and origins of crack instabilities in bulk and interface failure.
The insights obtained in this research will enable us to manipulate and/or predict material failure modes. The results of this study will shed considerable new light on fundamental open questions in fields as diverse as material design, tribology and geophysics.
Max ERC Funding
2 265 399 €
Duration
Start date: 2010-12-01, End date: 2016-11-30
Project acronym FROMCHILDTOPARENT
Project From the Child's Genes to Parental Environment and Back to the Child: Gene-environment Correlations in Early Social Development
Researcher (PI) Ariel Knafo
Host Institution (HI) THE HEBREW UNIVERSITY OF JERUSALEM
Call Details Starting Grant (StG), SH4, ERC-2009-StG
Summary The role of children's behavior and temperament is increasingly acknowledged in family research. Gene-environment Correlation (rGE) processes may account for some child effects, as parents react to children's behavior, which is in part genetically influenced (evocative rGE). In addition, passive rGE, in which parenting and children's behavior are correlated through overlapping genetic influences on family members' behavior, may account in part for the parenting-child behavior relationships. The proposed project will be the first to directly address these issues with DNA information on family members and quality observational data on parent and child behaviors, following children through early development. Two separate longitudinal studies will investigate the paths from children's genes to their behavior, to the way parents react and modify their parenting towards the child, affecting child development: Study 1 will follow first-time parents from pregnancy through children's early childhood, decoupling parent effects and child effects. Study 2 will follow dizygotic twins and their parents through middle childhood, capitalizing on genetic differences between twins reared by the same parents. We will test the hypothesis that parents' characteristics, such as parenting style and parental attitudes, are associated with children's genetic tendencies. Both parenting and child behaviors will be monitored consecutively, to investigate the co-development of parents and children in an evocative rGE process. Child and parent candidate genes relevant to social behavior, notably those from the dopaminergic and serotonergic systems, will be linked to parents' behaviors. Pilot results show children's genes predict parenting, and an important task for the study will be to identify mediators of this effect, such as children's temperament. We will lay the ground for further research into the complexity of gene-environment correlations as children and parents co-develop.
Max ERC Funding
1 443 687 €
Duration
Start date: 2010-01-01, End date: 2015-12-31
Project acronym FSC
Project Fast and Sound Cryptography: From Theoretical Foundations to Practical Constructions
Researcher (PI) Alon Rosen
Host Institution (HI) INTERDISCIPLINARY CENTER (IDC) HERZLIYA
Call Details Starting Grant (StG), PE6, ERC-2012-StG_20111012
Summary "Much currently deployed cryptography is designed using more “art” than “science,” and most of the schemes used in practice lack rigorous justification for their security. While theoretically sound designs do exist, they tend to be quite a bit slower to run and hence are not realistic from a practical point of view. This gap is especially evident in “low-level” cryptographic primitives, which are the building blocks that ultimately process the largest quantities of data.
Recent years have witnessed dramatic progress in the understanding of highly-parallelizable (local) cryptography, and in the construction of schemes based on the mathematics of geometric objects called lattices. Besides being based on firm theoretical foundations, these schemes also allow for very efficient implementations, especially on modern microprocessors. Yet despite all this recent progress, there has not yet been a major effort specifically focused on bringing the efficiency of such constructions as close as possible to practicality; this project will do exactly that.
The main goal of the Fast and Sound Cryptography project is to develop new tools and techniques that would lead to practical and theoretically sound implementations of cryptographic primitives. We plan to draw ideas from both theory and practice, and expect their combination to generate new questions, conjectures, and insights. A considerable fraction of our efforts will be devoted to demonstrating the efficiency of our constructions. This will be achieved by a concrete setting of parameters, allowing for cryptanalysis and direct performance comparison to popular designs.
While our initial focus will be on low-level primitives, we expect our research to also have direct impact on the practical efficiency of higher-level cryptographic tasks. Indeed, many of the recent improvements in the efficiency of lattice-based public-key cryptography can be traced back to research on the efficiency of lattice-based hash functions."
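To make the lattice-based direction concrete for readers, the toy below sketches a learning-with-errors (LWE) style bit encryption, the general family of lattice-based schemes the summary alludes to. This is an editorial illustration under assumed toy parameters (n=8, q=97) that are far too small to be secure, and it is not the project's construction.

```python
import random

Q = 97  # toy modulus; real schemes use far larger, carefully chosen parameters

def keygen(n=8, m=20):
    """Toy LWE keys: secret s, public samples (A, b = A*s + e mod Q)."""
    s = [random.randrange(Q) for _ in range(n)]
    A = [[random.randrange(Q) for _ in range(n)] for _ in range(m)]
    e = [random.choice([-1, 0, 1]) for _ in range(m)]  # small noise terms
    b = [(sum(A[i][j] * s[j] for j in range(n)) + e[i]) % Q for i in range(m)]
    return s, (A, b)

def encrypt(pub, bit):
    """Encrypt one bit: sum a random subset of samples, offset by bit * Q//2."""
    A, b = pub
    rows = [i for i in range(len(A)) if random.random() < 0.5]
    u = [sum(A[i][j] for i in rows) % Q for j in range(len(A[0]))]
    v = (sum(b[i] for i in rows) + bit * (Q // 2)) % Q
    return u, v

def decrypt(s, ct):
    """v - <u, s> mod Q lands near 0 for bit 0 and near Q/2 for bit 1."""
    u, v = ct
    d = (v - sum(uj * sj for uj, sj in zip(u, s))) % Q
    return 1 if Q // 4 < d < 3 * Q // 4 else 0
```

Since at most 20 noise terms of magnitude 1 accumulate per ciphertext, the decision threshold Q//4 = 24 is never crossed and decryption in this toy is always correct; the security/efficiency trade-offs at realistic parameter sizes are exactly where the concrete cryptanalysis proposed above comes in.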
Max ERC Funding
1 498 214 €
Duration
Start date: 2012-10-01, End date: 2017-09-30
Project acronym FTHPC
Project Fault Tolerant High Performance Computing
Researcher (PI) Oded Schwartz
Host Institution (HI) THE HEBREW UNIVERSITY OF JERUSALEM
Call Details Consolidator Grant (CoG), PE6, ERC-2018-COG
Summary Supercomputers are strategically crucial for facilitating advances in science and technology: in climate change research, accelerated genome sequencing towards cancer treatments, cutting-edge physics, devising innovative engineering solutions, and many other compute-intensive problems. However, the future of supercomputing depends on our ability to cope with the ever increasing rate of faults (bit flips and component failures), resulting from the steadily increasing machine size and decreasing operating voltage. Indeed, hardware trends predict at least two faults per minute for next-generation (exascale) supercomputers.
The challenge of ascertaining fault tolerance for high-performance computing is not new, and has been the focus of extensive research for over two decades. However, most solutions are either (i) general purpose, requiring little to no algorithmic effort, but severely degrading performance (e.g., checkpoint-restart), or (ii) tailored to specific applications and very efficient, but requiring high expertise and significantly increasing programmers' workload. We seek the best of both worlds: high performance and general purpose fault resilience.
Efficient general purpose solutions (e.g., via error correcting codes) revolutionized memory and communication devices more than two decades ago, enabling programmers to effectively disregard the very likely memory and communication errors. The time has come for a similar paradigm shift in the computing regimen. I argue that exciting recent advances in error correcting codes, and in short probabilistically checkable proofs, make this goal feasible. Success along these lines will eliminate the bottleneck of required fault-tolerance expertise, and open exascale computing to all algorithm designers and programmers, for the benefit of the scientific, engineering, and industrial communities.
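The error-correcting-code paradigm invoked above can be seen in miniature in the textbook Hamming(7,4) code, which encodes 4 data bits into 7 and corrects any single bit flip. The sketch below is an editorial illustration of the general ECC idea, not the project's method.

```python
def hamming74_encode(d):
    """Encode 4 data bits as a 7-bit codeword (parity bits at positions 1, 2, 4)."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4  # covers codeword positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4  # covers codeword positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4  # covers codeword positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c):
    """Locate and correct at most one flipped bit, then return the 4 data bits."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3  # 1-based position of the flip, 0 if none
    if syndrome:
        c[syndrome - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]
```

Scaled-up relatives of this idea (e.g., SECDED memory codes) are what let programmers disregard bit flips in DRAM; the proposal asks whether computation itself can enjoy the same transparency.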
Max ERC Funding
1 824 467 €
Duration
Start date: 2019-06-01, End date: 2024-05-31
Project acronym FUNMANIA
Project Functional nano Materials for Neuronal Interfacing Applications
Researcher (PI) Yael Hanein
Host Institution (HI) TEL AVIV UNIVERSITY
Call Details Starting Grant (StG), PE7, ERC-2012-StG_20111012
Summary Recent advances in nanotechnologies provide an exciting new tool-box best suited for stimulating and monitoring neurons with very high accuracy and improved bio-compatibility. In this project we propose the development of an innovative nano-material based platform to interface with neurons in-vivo, with unprecedented resolution. In particular, we aim to form the building blocks for future sight-restoration devices. By doing so we will address one of the most challenging and important applications in the realm of in-vivo neuronal stimulation: a high-acuity artificial retina.
Existing technologies in the field of artificial retinas offer only very limited acuity, and a radically new approach is required to make the leap to high-resolution stimulation. In this project we propose the development of flexible, electrically conducting, optically addressable and vertically aligned carbon nanotube based electrodes as a novel platform for targeting neurons at high fidelity. The morphology and density of the aligned tubes will mimic those of the retinal photo-receptors to achieve record-high resolution.
The most challenging element of the project is the transduction from an optical signal to electrical activation at high resolution, placing this effort at the forefront of nano-science and nano-technology research. To meet this difficult challenge, vertically aligned carbon nanotubes will be conjugated with additional engineered materials, such as conducting polymers and quantum dots, to build a superior platform allowing unprecedented resolution and bio-compatibility. Ultimately, in this project we will focus on devising materials and processes that will become the building blocks of future devices, so that high-density retinal implants and consequent sight restoration become a reality in the conceivable future.
Max ERC Funding
1 499 560 €
Duration
Start date: 2012-10-01, End date: 2018-09-30
Project acronym GAME-DYNAMICS
Project Game Theory: Dynamic Approaches
Researcher (PI) Sergiu Hart
Host Institution (HI) THE HEBREW UNIVERSITY OF JERUSALEM
Call Details Advanced Grant (AdG), SH1, ERC-2009-AdG
Summary The general framework is that of game theory, with multiple participants (“players”) that interact repeatedly over time. The players may be people, corporations, nations, computers, even genes. While many of the standard concepts of game theory are static by their very nature (for example, strategic equilibria and cooperative solutions), it is of utmost importance, theoretically as well as in applications, to study dynamic processes and relate them to appropriate static solutions. This is a fundamental issue. On the one hand, the significance of a solution depends in particular on how easy it is to reach it. On the other hand, natural dynamics, that is, processes that to a certain degree reflect observed behaviors and actual institutions, are important to study and understand in their own right. We propose to work on three main areas. First, adaptive dynamics: the goal is to characterize those classes of dynamics for which convergence to Nash or correlated equilibria can be obtained, and those for which it cannot, and to find and study natural dynamics that are related to actual behavior and yield useful insights. Second, evolutionary dynamics: the goal is to investigate evolutionary and similar dynamics, with a particular emphasis on understanding the role that large populations may play, and on characterizing which equilibria are evolutionarily stable and which are not. Third, bargaining and cooperation: the goal is to develop a general research program that studies natural bargaining procedures that lead to cooperation and are based directly on the strategic form; some particular aims are to establish connections between the bargaining institutions and the resulting cooperative solutions, and to analyze relevant economic models.
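One canonical adaptive dynamic of the kind described, whose empirical play converges to the set of correlated equilibria, is Hart and Mas-Colell's regret matching: each round, a player chooses actions with probability proportional to positive cumulative regret. The sketch below is an illustrative toy for two-player bimatrix games (the function names are our own), not a reproduction of the PI's work.

```python
import random

def regret_matching(payoffs, rounds=5000):
    """Regret matching for a two-player game.
    payoffs[p][i][j] is player p's payoff when row plays i and column plays j.
    Returns each player's empirical action frequencies."""
    n_actions = len(payoffs[0])
    regrets = [[0.0] * n_actions for _ in range(2)]
    counts = [[0] * n_actions for _ in range(2)]

    def pick(reg):
        # Play proportionally to positive regret; uniformly if none is positive.
        pos = [max(r, 0.0) for r in reg]
        total = sum(pos)
        if total == 0:
            return random.randrange(n_actions)
        x, acc = random.random() * total, 0.0
        for a, w in enumerate(pos):
            acc += w
            if x < acc:
                return a
        return n_actions - 1

    for _ in range(rounds):
        i, j = pick(regrets[0]), pick(regrets[1])
        for p, mine in ((0, i), (1, j)):
            got = payoffs[p][i][j]
            for alt in range(n_actions):
                ai, aj = (alt, j) if p == 0 else (i, alt)
                # Regret: what p would have earned by always swapping to alt.
                regrets[p][alt] += payoffs[p][ai][aj] - got
            counts[p][mine] += 1
    return [[c / rounds for c in counts[p]] for p in range(2)]
```

On rock-paper-scissors, for instance, the empirical action frequencies drift toward the uniform (1/3, 1/3, 1/3) equilibrium, illustrating the convergence question the first research area addresses.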
Max ERC Funding
1 361 000 €
Duration
Start date: 2010-01-01, End date: 2015-12-31
Project acronym GAtransport
Project A direct, multi-faceted approach to investigate plant hormones spatial regulation: the case of gibberellins
Researcher (PI) Roy Weinstain
Host Institution (HI) TEL AVIV UNIVERSITY
Call Details Starting Grant (StG), LS3, ERC-2015-STG
Summary Plants evolved a unique molecular mechanism that spatially regulates auxin, forming finely tuned gradients and local maxima of auxin that inform and direct developmental patterning and adaptive growth processes. Recent findings call into question the uniqueness of polar auxin transport, in the sense that more plant hormones seem to be actively transported. Although still lacking many mechanistic details, as well as comprehensive functional connotations, these findings warrant a more thorough investigation into the prospect of a broader scope for plants' spatial-regulation capacity in the context of additional hormones. Critically, we lack an effective set of tools to directly investigate and dissect the particulars of plant hormone mobility at the molecular level. My long-term goal is to provide a molecular and mechanistic understanding of plant hormone dynamics that will augment our evolving model of how they are regulated and how they convey information. Here, I hypothesize that GA mobility in plants is controlled and directed by an active transport mechanism to form distinct distribution patterns that affect signaling. I will test my hypothesis with a multi-faceted and multi-disciplinary approach, combining: fluorescent labeling of key gibberellins to map their accumulation sites in whole plants and at the sub-cellular level; chemical-biology strategies that facilitate manipulation of the GA “origin point” in planta to map and quantify GA flow pathways; probe-based genetic screens and un-biased photo-affinity labeling to identify proteins affecting GA mobility; and genetic and molecular biology techniques to characterize identified proteins’ functions. I expect to offer an exceptional, detailed view into the inner workings of gibberellin dynamics in planta and into the mechanisms driving it. I further anticipate that the strategies developed here to specifically address gibberellins could be straightforwardly re-tailored to investigate additional plant hormones.
Max ERC Funding
1 500 000 €
Duration
Start date: 2016-02-01, End date: 2021-01-31
Project acronym GELANDERINDGEOMRGD
Project Independence of Group Elements and Geometric Rigidity
Researcher (PI) Tsachik Gelander
Host Institution (HI) THE HEBREW UNIVERSITY OF JERUSALEM
Call Details Starting Grant (StG), PE1, ERC-2007-StG
Summary The proposed research contains two main directions in group theory and geometry: Independence of Group Elements and Geometric Rigidity. The first consists of problems related to the existence of free subgroups, uniform and effective ways of producing such subgroups, and analogous questions for finite groups, where the analogs of independent elements are elements for which the Cayley graph has large girth or a non-small expansion constant. This line of research began almost a century ago and contains many important works, including works of Hausdorff, Banach and Tarski on paradoxical decompositions, works of Margulis, Sullivan and Drinfeld on the Banach-Ruziewicz problem, the classical Tits Alternative, the Margulis-Soifer result on maximal subgroups, the recent works of Eskin-Mozes-Oh and Bourgain-Gamburd, etc. Among the famous questions is Milnor's problem on exponential versus polynomial growth for f.p. groups, originally stated for f.g. groups but reformulated after Grigorchuk's counterexample. Related works of the PI include a joint work with Breuillard on the topological Tits alternative, where several well known conjectures were solved, e.g. the foliated version of Milnor's problem conjectured by Carriere, and on the uniform Tits alternative, which significantly improved Tits' and EMO's theorems; a joint work with Glasner on primitive groups, where in particular a conjecture of Higman and Neumann was solved; and a paper on the deformation varieties, where a conjecture of Margulis and Soifer and a conjecture of Goldman were proved. The second direction involves extensions of Margulis' and Mostow's rigidity theorems to actions of lattices in general topological groups on metric spaces, and extensions of Kazhdan's property (T) to group actions on Banach and metric spaces. This area is very active today. Related work of the PI includes his joint work with Karlsson and Margulis on generalized harmonic maps, and his joint work with Bader, Furman and Monod on actions on Banach spaces.
Max ERC Funding
750 000 €
Duration
Start date: 2008-07-01, End date: 2013-12-31
Project acronym Gendever
Project Genome, the Edited Version: DNA and RNA Editing of Mammalian Retroelements
Researcher (PI) Erez Levanon
Host Institution (HI) BAR ILAN UNIVERSITY
Call Details Starting Grant (StG), LS2, ERC-2012-StG_20111109
Summary It is generally thought that an organism contains exactly the same genomic information in all its cells, and that a genome remains unaltered throughout the organism’s life, with the exception of rare and random somatic mutations that might occur. This genomic information also serves as a template for exact RNA copies. However, endogenous and powerful means of creating inner genomic diversity are known to exist: (1) RNA editing, which alters one nucleotide into another (mainly A-to-I); (2) DNA editing, which changes the DNA’s content by shifting C into U; (3) active retroelements, which can insert copies of their sequences into new locations in a genome.
Recently, we and others have found that although considered extremely rare, all three mechanisms are active somatically or at least leave traces of their occurrence in the genome, and are linked together, as most editing events occur in retroelements. However, the magnitude and scope of these mechanisms, which can lead to huge diversity and complexity within an organism and even within a cell, are still a mystery. This explosion of genomic variety can have dramatic effect on diverse biological processes, such as brain complexity, cancer and evolution acceleration.
In Gendever, we aim to perform the first genome-wide mapping of editing and active retroelements in various genomes using a combination of computational and genomic approaches. Specifically, we will develop a strategy to detect RNA and DNA editing in retroelements, scan for editing events in various genomes, and build the first global editing atlas. In addition, we will exploit the close association between editing and retroelements to produce a genome-wide approach to detect active retroelements. Finally, we will screen for editing events and retrotranspositions in various biological conditions, in order to expose their involvement in many biological states and in evolution.
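The comparative principle behind editing detection can be sketched simply: A-to-I editing is read by sequencers as guanosine, so an edited site shows up as an A in genomic DNA but a G in a fraction of aligned RNA reads. The toy below (names and thresholds are hypothetical) illustrates the idea on pre-aligned strings; a real pipeline works on mapped sequencing reads and must control for SNPs and mapping errors.

```python
def find_editing_sites(dna, rna_reads, min_frac=0.5):
    """Flag genomic adenosines that appear as G in at least min_frac
    of the aligned RNA reads covering them ('-' marks no coverage)."""
    sites = []
    for pos, base in enumerate(dna):
        if base != 'A':
            continue  # A-to-I editing occurs only at adenosines
        cover = [r[pos] for r in rna_reads if r[pos] != '-']
        if not cover:
            continue
        frac_g = cover.count('G') / len(cover)
        if frac_g >= min_frac:
            sites.append((pos, frac_g))
    return sites
```

The symmetric C-to-U case (DNA editing) follows the same comparative logic with different base pairs; the genome-wide screens proposed above are, in essence, this comparison performed at scale with statistical safeguards.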
Max ERC Funding
1 499 249 €
Duration
Start date: 2012-10-01, End date: 2017-09-30
Project acronym GeneBodyMethylation
Project Resolving the Nuts and Bolts of Gene Body Methylation
Researcher (PI) Assaf Zemach
Host Institution (HI) TEL AVIV UNIVERSITY
Call Details Starting Grant (StG), LS2, ERC-2015-STG
Summary DNA methylation, the covalent binding of a methyl group (CH3) to a cytosine base, regulates genome activity and plays a fundamental developmental role in eukaryotes. Its epigenetic characteristics of regulating transcription without changing the genetic code, together with the ability to be transmitted through DNA replication, allow organisms to memorize cellular events for many generations. DNA methylation is mostly known for its role in transcriptional silencing; however, its functional output is more complex and is influenced in part by its DNA context. Recent genomic studies have found DNA methylation targeted inside the sequences of actively transcribed genes, hence the term gene body methylation. Despite being an evolutionarily conserved and robust methylation pathway targeted to thousands of genes in animal and plant genomes, the function of gene body methylation is still poorly understood at both the molecular and functional levels. In a chicken-and-egg conundrum, because we do not know what gene body methylation does, scientists have been unable to use its function to discover its regulators either. Gene body methylation is targeted to a very specific subset and subregions of genes; we therefore strongly believe that specific factors exist and are missing simply because no one has ever searched for them before. Hence, to make major breakthroughs in the field, our approach is to artificially generate gene-body-specific hypomethylated plants that, together with customized genetic and biochemical systems, will allow us to discover regulators and interpreters of gene body methylation. Using these unique genetic tools and novel molecular factors, we will be able to ultimately explore the particular biological roles of gene body methylation. These findings will fill the gap towards a full comprehension of the entire functional array of DNA methylation, and towards its more precise exploitation in yielding better crops and in treating human diseases.
Max ERC Funding
1 500 000 €
Duration
Start date: 2016-10-01, End date: 2021-09-30
Project acronym GeneREFORM
Project Genetically Encoded Multicolor Reporter Systems For Multiplexed MRI
Researcher (PI) Amnon Bar-Shir
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE LTD
Call Details Starting Grant (StG), PE5, ERC-2015-STG
Summary In order to fully understand the complexity of biological processes that are reflected by simultaneous occurrences of intra- and inter-cellular events, multiplexed imaging platforms are needed. Fluorescent reporter genes, with their “multicolor” imaging capabilities, have revolutionized science, and their developers have been awarded the Nobel Prize. Nevertheless, the reliance of these reporters on a light signal, which restricts their use in deep tissues and in large animals (and potentially in humans), calls for alternatives.
Reporter genes for MRI, although in their infancy, have shown several unique strengths, including the ability to study the same subject longitudinally with unlimited tissue penetration and to co-register information from reporter gene expression with high-resolution anatomical images. Inspired by the multicolor capabilities of optical reporter genes, this proposal aims to develop, optimize, and implement genetically engineered reporter systems for MRI with artificial “multicolor” characteristics. Capitalizing on (i) the Chemical Exchange Saturation Transfer (CEST)-MRI contrast mechanism, which allows the use of small bioorganic molecules as MRI sensors, (ii) the frequency-encoding, color-like features of CEST, and (iii) enzyme engineering procedures that allow the optimization of enzymatic activity for a desired substrate, a “multicolor” genetically encoded MRI reporter system is proposed.
By (a) synthesizing libraries of non-natural nucleosides (“reporter probes”) to generate artificially “colored” CEST contrast, and (b) performing directed evolution of deoxyribonucleoside kinase (dNK) enzymes (“reporter genes”) to phosphorylate those nucleosides, the “multicolor” genetically encoded MRI “reporter system” will be created. The orthogonality of the obtained substrate (CEST sensor) / enzyme (mutant dNK) pairs will allow their simultaneous use as a genetically encoded reporter system for in vivo “multicolor” monitoring of reporter gene expression with MRI.
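The frequency-encoding, color-like feature of CEST mentioned above can be illustrated with a toy simulation. This is a deliberately simplified sketch, not the project's physics: real Z-spectra follow the Bloch-McConnell equations, but modeling each saturation dip as a Lorentzian is a common approximation, and all offsets and amplitudes below are invented.

```python
# Illustrative sketch of CEST "frequency encoding": each exchangeable-proton
# pool attenuates the water signal only near its own chemical-shift offset,
# so distinct probes act like distinct "colors" in the Z-spectrum.

def lorentzian(w, center, amplitude, width):
    """Lorentzian dip of given amplitude and full width (ppm) at `center`."""
    return amplitude * (width / 2) ** 2 / ((w - center) ** 2 + (width / 2) ** 2)

def z_spectrum(offsets_ppm, pools):
    """pools: list of (center_ppm, amplitude, linewidth_ppm).
    Returns the normalized water signal S/S0 at each saturation offset."""
    spectrum = []
    for w in offsets_ppm:
        # Direct water saturation at 0 ppm plus one dip per CEST pool.
        dip = lorentzian(w, 0.0, 0.8, 1.0)
        for center, amp, width in pools:
            dip += lorentzian(w, center, amp, width)
        spectrum.append(max(0.0, 1.0 - dip))
    return spectrum

# Two hypothetical "colors": probes resonating at 2.0 ppm and 5.0 ppm.
offsets = [x * 0.1 for x in range(-80, 81)]
spectrum = z_spectrum(offsets, [(2.0, 0.3, 1.0), (5.0, 0.25, 1.0)])
```

Saturating at 2.0 ppm or 5.0 ppm selectively darkens the water signal for the corresponding probe, while the mirror offsets on the negative side stay near baseline; this asymmetry is what lets two probes be read out independently, like two fluorophores with different emission wavelengths.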
Max ERC Funding
1 478 284 €
Duration
Start date: 2016-05-01, End date: 2021-04-30
Project acronym GENEREGULATION
Project Deciphering the code of gene regulation using massively parallel assays of designed sequence libraries
Researcher (PI) Eran Segal
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Consolidator Grant (CoG), LS2, ERC-2013-CoG
Summary Many gene expression changes that are associated with disease states have in turn been linked to changes in the genes’ regulatory regions. However, without a ‘regulatory code’ that informs us how DNA sequences determine expression levels, we cannot predict which sequence changes will affect expression, by how much, and by what mechanism.
Here, we aim to arrive at a mechanistic and quantitative understanding of how expression levels are encoded in DNA sequence, using a combined experimental and computational approach. To this end, we will construct libraries of >50,000 sequences, fuse them to fluorescent reporters, and genomically integrate them into yeast or human cells. We will then develop methods for accurately measuring, in parallel, the expression of each fused sequence within a single experiment, and for measuring the DNA binding state of each sequence at single-cell resolution, resulting in a ~1000-fold increase in the scale at which we can study the effect of sequence on expression.
Notably, we will design our experimental system to be modular, allowing us to propose a highly ambitious yet realistic plan in which we will study the effect of sequence on (1) transcriptional and (2) post-transcriptional regulation; (3) unravel the effect of genetic variation across human individuals on expression; (4) quantify how cellular fitness depends on the expression level of individual endogenous genes; and (5) construct a predictive model of the effect of DNA sequence on expression.
Each of our libraries should provide novel insights into a different aspect of gene regulation, leading to new means by which we can interpret whole genome sequencing, which is rapidly being collected for many individuals. In particular, our unified model should allow us to predict expression changes among human individuals based only on their genotypic variation, greatly enhancing the ability to identify common or rare sequence variants that may affect molecular function or cause disease.
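The core quantitative step in a massively parallel reporter readout of this kind can be sketched in a few lines. This is a hedged illustration, not the project's method: the proposal uses fluorescent reporters, whereas the common sequencing-based variant below estimates each construct's expression as its normalized RNA/DNA count ratio; all construct names and counts are invented.

```python
# Hypothetical sketch of a massively parallel reporter readout: each
# designed sequence is linked to a reporter, and per-construct expression
# is estimated as normalized RNA counts over normalized DNA counts.

def expression_levels(rna_counts, dna_counts, pseudocount=1.0):
    """Return a per-sequence expression estimate (RNA/DNA ratio).
    A pseudocount avoids division by zero for dropout constructs."""
    rna_total = sum(rna_counts.values())
    dna_total = sum(dna_counts.values())
    levels = {}
    for seq in dna_counts:
        rna = (rna_counts.get(seq, 0) + pseudocount) / rna_total
        dna = (dna_counts[seq] + pseudocount) / dna_total
        levels[seq] = rna / dna
    return levels

# Toy library of two designed promoter variants, equally represented in DNA
# but transcribed at very different rates.
dna = {"promoter_v1": 100, "promoter_v2": 100}
rna = {"promoter_v1": 500, "promoter_v2": 50}
levels = expression_levels(rna, dna)
# promoter_v1 drives roughly 10x higher expression than promoter_v2 here.
```

Normalizing by the DNA counts corrects for unequal library representation, so differences in the ratio reflect regulatory activity of the designed sequence rather than cloning or integration biases.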
Max ERC Funding
2 000 000 €
Duration
Start date: 2014-03-01, End date: 2019-02-28