Project acronym STAND4HERITAGE
Project New STANDards for seismic assessment of built cultural HERITAGE
Researcher (PI) Paulo B. LOURENCO
Host Institution (HI) UNIVERSIDADE DO MINHO
Call Details Advanced Grant (AdG), PE8, ERC-2018-ADG
Summary STAND4HERITAGE ambitiously engages in introducing new standards for safeguarding built cultural heritage for the next generations, which is a major societal demand. Owing to the large diversity of heritage buildings, the accurate description of their structural behaviour is still an open issue, particularly when they are subjected to earthquake ground motions. Among the most frequently observed seismic damage mechanisms in these buildings, the out-of-plane collapse of masonry walls is acknowledged as the main cause of building loss and injuries to people. There are many unresolved challenges to effectively assessing the out-of-plane seismic behaviour of masonry structures. First, it is necessary to understand less well-known phenomena in masonry dynamics, which largely influence the out-of-plane behaviour and capacity of heritage buildings. A recent blind exercise to predict the capacity of a benchmark masonry structure to resist a dynamic excitation demonstrated that, although advanced simulation tools are available, leading international researchers are still unable to consistently provide a collapse estimate. STAND4HERITAGE will address the aspects required for the successful development of approaches for seismic response prediction of masonry structures, integrating the necessary stages for out-of-plane assessment. It will generate: novel integrated stochastic-based models to account for the seismic signal in the dynamic response and capacity; datasets of the dynamic response obtained from an extensive shake-table testing program; numerical approaches for the simulation of out-of-plane seismic behaviour; and an integrated analytical approach for the out-of-plane seismic assessment of heritage buildings. The STAND4HERITAGE objectives are in line with the UN 2030 agenda for sustainable cities and communities. The project will be founded on the experience of the PI in the topic and on the interdisciplinary expertise of his team in facing the challenges of providing optimal intervention solutions for heritage buildings.
Max ERC Funding
2 968 755 €
Duration
Start date: 2019-09-01, End date: 2024-08-31
Project acronym STANPAS
Project Statistical and Nonlinear Physics of Amorphous Solids
Researcher (PI) Itamar Procaccia
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Advanced Grant (AdG), PE3, ERC-2010-AdG_20100224
Summary I propose an extensive and ambitious program to greatly increase our understanding of the properties of amorphous solids, focusing mainly on the mechanical and magnetic properties of these fascinating materials, including their modes of failure via plastic flow, shear banding and fracture. Amorphous solids are important in many modern engineering applications, with structural glasses, metallic glasses and polymeric glasses as important examples. Our work combines a careful analysis of computer simulations of model glasses with analytic theory in which we introduce methods from statistical and nonlinear physics to materials science, both of which are subjects of expertise in our group. We challenge some present approaches that try to connect linear elasticity with some objects that carry plasticity; we claim that nonlinear elasticity is crucial, as its signature appears well before plastic failure. Similarly, we break away from current theories that assume that plastic events are spatially localized. We show that in athermal conditions the opposite is true, and we discover very interesting sub-extensive scaling phenomena characterized by a host of scaling exponents that need to be understood. The peculiarities of amorphous solids, in particular their memory of past deformation, call for the identification of new 'order parameters' that are sorely missing in present theories. Understanding the dependence on system size, temperature, external loading rates, etc. calls for introducing new approaches and methods from statistical and nonlinear physics. In the body of the proposal we present a number of preliminary results that point towards a radically new way of thinking that we propose to develop into a new theory over the next five years.
Max ERC Funding
1 792 858 €
Duration
Start date: 2011-04-01, End date: 2016-03-31
Project acronym STEMCELL2MAX
Project A novel solution to efficient Haematopoietic Stem Cell regeneration
Researcher (PI) Jose Henrique VEIGA FERNANDES
Host Institution (HI) INSTITUTO DE MEDICINA MOLECULAR JOAO LOBO ANTUNES
Call Details Proof of Concept (PoC), PC1, ERC-2013-PoC
Summary There is an urgent need for improved expansion protocols for haematopoietic stem cells (HSCs). Many studies have tried to stimulate HSC expansion by experimenting with cytokine cocktails in cell culture media. Although these combinations of growth factors are able to increase the number of HSCs, this comes at the cost of cell maturation: many of the cells in the final culture have lost their 'stemness' and thus their usefulness for treatment and R&D. Dr. Veiga Fernandes has identified crucial survival factors for HSC expansion that increase the stem cell yield 20-fold compared to the current state of the art. This dramatic increase presents significant commercial and societal opportunities. In this ERC PoC project, a team of technological and commercial experts will investigate the commercial feasibility of this potentially groundbreaking approach and design the optimal route-to-market, with the ultimate goal of attracting investors and other strategic partners.
Max ERC Funding
149 885 €
Duration
Start date: 2014-03-01, End date: 2015-02-28
Project acronym StemCellHabitat
Project Metabolic and Timed Control of Stem Cell Fate in the Developing Animal
Researcher (PI) Catarina DE CERTIMA FERNANDES HOMEM
Host Institution (HI) UNIVERSIDADE NOVA DE LISBOA
Call Details Starting Grant (StG), LS3, ERC-2017-STG
Summary Stem cell (SC) proliferation during development requires tight spatial and temporal regulation to ensure that the correct cell number and the right cell types are formed at the proper positions. Currently, very little is known about how SCs are regulated during development. Specifically, it is unclear how SC waves of proliferation are regulated and how the fate of their progeny changes during development. In addition, it has recently become evident that metabolism provides additional complexity in cell fate regulation, highlighting the need for integrating metabolic information across physiological levels.
This project will answer the question of how the combination of metabolic state and temporal cues (animal developmental stage) regulates SC fate. I will use Drosophila melanogaster, an animal complex enough to be similar to higher eukaryotes and yet simple enough to dissect the mechanistic details of cell regulation and its impact on the organism. Drosophila neural stem cells, the neuroblasts (NB), are a fantastic model of temporally and metabolically regulated cells. NB lineage fate changes with time, directing the generation of a stereotypical set of neurons, after which they disappear. I have previously found that metabolism is an important regulator of NB cell cycle exit, which occurs in response to an increase in levels of oxidative phosphorylation.
Using a multidisciplinary approach combining genetics, cell type/age sorting, multi-omics analysis, fixed and 3D-live NB imaging and metabolite dynamics, I propose an integrative approach to investigate how NBs are regulated in the developing animal. First I will dissect the mechanisms by which metabolism regulates NB fate. Second, I will investigate how metabolism contributes to NB unlimited proliferation and brain tumors. Finally, we will address how temporal transcription factors and hormones dynamically affect cell fate decisions during development.
Max ERC Funding
1 697 493 €
Duration
Start date: 2018-02-01, End date: 2023-01-31
Project acronym STEMREPAIR
Project Novel mesenchymal stem cell based therapies for articular cartilage repair
Researcher (PI) Daniel John Kelly
Host Institution (HI) THE PROVOST, FELLOWS, FOUNDATION SCHOLARS & THE OTHER MEMBERS OF BOARD OF THE COLLEGE OF THE HOLY & UNDIVIDED TRINITY OF QUEEN ELIZABETH NEAR DUBLIN
Call Details Starting Grant (StG), PE8, ERC-2010-StG_20091028
Summary Once damaged, articular cartilage has a limited reparative capacity, and lesions therefore often progress to arthritis. This has motivated the development of cell-based therapies for the repair of cartilage defects, such as autologous chondrocyte implantation (ACI). Such therapies are limited in two ways. Firstly, they do not result in the regeneration of hyaline cartilage and hence the repair is temporary. Secondly, widespread adoption in the clinical setting is impeded by practical issues such as the high fiscal cost and time required for culture expansion of chondrocytes. The applicant believes that both issues cannot currently be addressed by a single new therapy. Therefore, the proposed project will put forward separate solutions to both issues. The first theme of the project will determine whether freshly isolated (not culture-expanded) infrapatellar fat pad derived cells, embedded in a hydrogel containing microbead-encapsulated growth factors, can be used to engineer functional cartilage tissue. A component of this theme will involve magnetic microbead enrichment for cells with surface markers associated with highly chondrogenic cells. Theme 2 of the proposed project will explore an alternative therapy for cartilage defect repair. Specifically, the objective is to tissue engineer in vitro a functional tissue with a zonal structure mimicking that of normal articular cartilage using mesenchymal stem cells. It is hypothesised that such a zonal structure can be generated by controlling the oxygen tension and mechanical environment within the developing tissue. The final theme of the project will be to determine whether repairing high-load-bearing cartilage defects using either of these tissue engineering therapies results in significantly improved repair compared to ACI in a cartilage defect model.
Max ERC Funding
1 499 770 €
Duration
Start date: 2010-09-01, End date: 2015-08-31
Project acronym StroMaP
Project Stromal stress networks underlying phenotypic plasticity and tumor fitness
Researcher (PI) Ruth Shouval
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Starting Grant (StG), LS4, ERC-2017-STG
Summary The contribution of genetic and epigenetic changes to rewiring of cancer cells into their malignant state has been much studied. But tumors are more than cancer cells and the tumor microenvironment (TME) is a key player in tumor progression. We lack an overarching view of how, despite being genomically stable, the TME is heterogeneously reprogrammed across time and space to promote evolution of aggressive disease.
Recently I discovered that Heat-Shock Factor 1 (HSF1), a cytoprotective transcription factor (TF), is vital to this reprogramming, promoting malignancy in patients and mice upon activation in the stroma. Other stress TFs have also been implicated. This leads me to hypothesize that stress responses help tumors adapt and evolve into aggressive malignancies, by enabling heterogeneity and phenotypic diversity in the TME. This plasticity is achieved through cycles of massive transcriptional rewiring orchestrated by a network of stress TFs.
To test this hypothesis in a global way we will proceed in three aims. First we will define patterns of stress response activation in the TME by multiplexed immunofluorescence of patient tumors. Then, we will map the associated transcriptional landscape in patients by RNA-sequencing down to single cell resolution and interrogate it in the context of a novel theory of evolutionary tradeoffs so as to discover signatures that promote tumor aggressiveness. Next, we will identify actionable nodes for intervention and test them in cell co-cultures and mouse models.
The expected outcome of the proposed research is a detailed network of stress responses that can explain how the TME is rewired in tumors and how variable this rewiring is. This knowledge will provide new ways to target the TME in order to complement treatments focused on cancer cells. More generally, we address key aspects of stress responses, tissue plasticity, homeostasis and evolution that are expected to be valuable across diverse fields of biology.
Max ERC Funding
1 499 990 €
Duration
Start date: 2017-10-01, End date: 2022-09-30
Project acronym STRONG
Project Nanoscale magnetic and thermal imaging of strongly correlated electronic materials
Researcher (PI) Yonathan Anahory
Host Institution (HI) THE HEBREW UNIVERSITY OF JERUSALEM
Call Details Starting Grant (StG), PE3, ERC-2018-STG
Summary Strongly correlated electronic materials have phase diagrams that are intrinsically complex. Multiple, distinct, broken symmetry phases can occur simultaneously and the presence of these intertwined orders gives rise to spontaneously inhomogeneous electronic structures. Observing and characterizing these patterns is crucial to understanding the mechanisms that govern these electronic states. I propose to study these phases using a novel scanning SQUID microscope with single electron spin sensitivity and thermal sensitivity better than one millionth of a degree. The SQUID is mounted on a nanometric tip and has ~50 nm resolution. I will expand the experimental capabilities of this technique by improving the resolution to a few nm, and by enabling near field microwave microscopy.
Local magnetic probes are ideal for spatially resolving magnetic order and can also be used to probe local superconducting phase fluctuations, since such fluctuations generate local currents and thus local magnetic fields. However, the required resolution is of the order of a few nm, which is far beyond the capabilities of most local magnetic probes. Thermal microscopy provides information about dissipation mechanisms, which is relevant in high-Tc superconductors (HTSC) above Tc, where superconducting correlations are locally present, yet no existing technique can perform thermal microscopy at low temperatures. The SQUID-on-tip will allow us to address all the above-mentioned aspects. We propose to look at three types of systems: (1) observe local signatures of pair-density waves and other manifestations of broken time-reversal symmetry in HTSC; (2) characterize the unconventional superconducting phase at the LaAlO3/SrTiO3 interface; (3) study the inhomogeneous magnetic phases at the LaMnO3/SrTiO3 interface. These measurements will provide significant contributions to the understanding of phenomena in strongly correlated materials such as superconductivity and its relation to other electronic order.
Max ERC Funding
1 997 926 €
Duration
Start date: 2019-08-01, End date: 2024-07-31
Project acronym STRONGPCP
Project Strong Probabilistically Checkable Proofs
Researcher (PI) Irit Dveer Dinur
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Starting Grant (StG), PE6, ERC-2009-StG
Summary Probabilistically Checkable Proofs (PCPs) encapsulate the striking idea that verification of proofs becomes nearly trivial if one is willing to use randomness. The PCP theorem, proven in the early 1990s, is a cornerstone of modern computational complexity theory. It completely revises our notion of a proof, leading to an amazingly robust behavior: a PCP proof is guaranteed to have an abundance of errors if it attempts to prove a falsity. This stands in sharp contrast to our classical notion of a proof, whose correctness can collapse due to one wrong step. An important driver in the development of PCP theory has been the revolutionary effect it had on the field of approximation. Feige et al. [JACM, 1996] discovered that the PCP theorem is *equivalent* to the inapproximability of several classical optimization problems. Thus, PCP theory has resulted in a leap in our understanding of approximability and opened the gate to a flood of results. To date, virtually all inapproximability results are based on the PCP theorem, and while there is an impressive body of work on hardness of approximation, much work still lies ahead. The central goal of this proposal is to obtain stronger PCPs than currently known, leading towards optimal inapproximability results and novel notions of robustness in computation and in proofs. This study will build upon (i) new directions opened up by my novel proof of the PCP theorem [JACM, 2007]; and (ii) state-of-the-art PCP machinery involving techniques from algebra, functional and harmonic analysis, probability, combinatorics, and coding theory. The broader impact of this study spans a better understanding of the limits of approximation algorithms, saving time and resources for algorithm designers, and a new understanding of robustness in a variety of mathematical contexts, arising from the many connections between PCPs and stability questions in combinatorics, functional analysis, metric embeddings, probability, and more.
Max ERC Funding
1 639 584 €
Duration
Start date: 2009-09-01, End date: 2016-06-30
Project acronym Struct. vs. Individ
Project The ‘Declining Significance of Gender’ Reexamined: Cross-Country Comparison of Individual and Structural Aspects of Gender Inequality
Researcher (PI) Hadas Mandel Levy
Host Institution (HI) TEL AVIV UNIVERSITY
Call Details Consolidator Grant (CoG), SH3, ERC-2016-COG
Summary The comparative research on long-term trends largely neglects structural mechanisms of gender inequality, i.e. the gender bias in how jobs and activities are evaluated and rewarded. I argue that as more women become integrated into positions of power, the role of structural elements is likely to become stronger. However, because these elements are less visible and less amenable to empirical assessment, they are under-researched compared to individual aspects and are commonly assumed to be gender-neutral. The implication is that the importance of gender as a determinant of economic inequality in the labour market becomes insufficiently acknowledged, and thus difficult to track and eradicate.
My empirical objective is to track structural vs. individual processes of gender inequality over a period of 40 years, using the case of occupations. My aim is to uncover the countervailing processes of women’s (individual) upward occupational mobility versus women’s (collective) effect on occupational pay. I argue that the effects of structural aspects of gender inequality increase over time, but are concealed by women’s (individual) upward mobility.
I expect the dynamic of the two processes to vary between countries and also by class. I thus seek to examine the processes in four representative countries – Sweden, Germany, Spain and the United States – that differ in many of the institutional aspects that affect gender inequality, including the provision of welfare, gender ideology, wage structure, and political economy factors. Therefore, gender in/equality processes in these countries are expected to take different forms at both the structural and the individual level. That said, in all countries I expect gender equality processes to be more pronounced and rapid for advantaged women. At the structural level, however, the rapid upward occupational mobility of skilled and educated women may expose highly rewarded occupations to devaluation and pay reduction more than others.
Max ERC Funding
1 395 000 €
Duration
Start date: 2017-07-01, End date: 2022-06-30
Project acronym SUBLRN
Project Information-optimal machine learning
Researcher (PI) Elad Eliezer Hazan
Host Institution (HI) TECHNION - ISRAEL INSTITUTE OF TECHNOLOGY
Call Details Starting Grant (StG), PE6, ERC-2013-StG
Summary The statistical and computational theory of learning is one of the prime achievements of computer science and engineering. This is evident both in terms of mathematical elegance of capturing intuitive notions rigorously as well as in terms of practical applicability: machine learning has effectively reshaped the way we use information.
In this proposal we tackle the very basic notions of learning. Learning theory traditionally focuses on statistics and computation. We propose to add information to the characterization of learning; namely, the research question we address is: how much information is necessary to learn a certain concept efficiently?
The crucial difference from classical learning theory is that, traditionally, statistical complexity was measured in terms of the number of examples needed to learn a concept. Our question is more finely grained: what if we are allowed to inspect only parts of a given example? Can we reduce the amount of information necessary to successfully learn important concepts? This question is fundamental to understanding learning in general and to designing efficient learning algorithms in particular. We show how recent advancements in convex optimization for machine learning yield positive answers to some of the above questions: there exist cases in which much more efficient algorithms exist for learning practically important concepts. Our goal is to characterize learning from the viewpoint of the amount of information necessary to learn, and to design new algorithms that access less information than the current state of the art and are consequently significantly more efficient. New answers to these fundamental questions would be a breakthrough in our understanding of learning at large, with significant potential for impact on the field of machine learning and its applications.
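To make the "inspect only parts of a given example" setting concrete, the sketch below shows one hypothetical, purely illustrative instance: stochastic gradient descent for linear regression in which the learner reveals only k of the d attributes of each example and compensates by importance weighting. The problem sizes, step size and estimator are assumptions chosen for illustration, not the algorithms proposed in this project.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n, k = 20, 20000, 4            # dimension, number of examples, attributes observed per example
w_true = rng.normal(size=d)

def observe(x, k):
    """Reveal only k randomly chosen attributes and return an unbiased estimate of x."""
    idx = rng.choice(d, size=k, replace=False)
    x_hat = np.zeros(d)
    x_hat[idx] = x[idx] * d / k   # importance weighting so that E[x_hat] = x
    return x_hat

w, eta = np.zeros(d), 0.05
for t in range(n):
    x = rng.normal(size=d)
    y = w_true @ x + 0.1 * rng.normal()
    # Two independent partial observations keep the gradient estimate unbiased:
    # E[(w @ x1_hat - y) * x2_hat] = (w @ x - y) * x, the full-information gradient.
    x1_hat, x2_hat = observe(x, k), observe(x, k)
    grad = (w @ x1_hat - y) * x2_hat
    w -= eta / np.sqrt(t + 1) * grad

# The error should shrink even though only 4 of the 20 attributes were seen per observation.
print("relative error:", np.linalg.norm(w - w_true) / np.linalg.norm(w_true))
```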
Max ERC Funding
1 453 802 €
Duration
Start date: 2013-10-01, End date: 2018-09-30
Project acronym SULTENG
Project Protein engineering for the study of detoxification enzymes and hub proteins
Researcher (PI) Amir Aharoni
Host Institution (HI) BEN-GURION UNIVERSITY OF THE NEGEV
Call Details Starting Grant (StG), LS7, ERC-2007-StG
Summary Proteins that exhibit broad specificity play important roles in different biological processes. These proteins include enzymes that catalyse the chemical transformation of many different substrates and proteins that bind to multiple protein partners. We propose to develop and apply novel directed evolution and chemical genetic methodologies for the study of proteins that exhibit broad specificity, with a focus on cytosolic sulfotransferases (SULTs), which detoxify a broad range of xeno- and endobiotics, and proliferating cell nuclear antigen (PCNA), which binds to multiple protein partners to play a central role in DNA replication and repair. SULTs belong to a large family of detoxification enzymes that exhibit broad specificity and relatively poor catalytic efficiency. It is not clear how SULTs can detoxify a variety of different compounds and what constitutes the molecular basis for their broad specificity. Application of directed evolution methodologies will allow us to identify and isolate SULT mutants with improved catalytic efficiency and novel specificity. These mutants will be thoroughly characterised by applying a variety of biochemical and structural methodologies to provide new insights into the broad specificity, catalytic activity and biological functions of SULTs. In parallel, we propose to develop and apply directed evolution methodologies for the study of PCNA. PCNA is a homotrimeric hub protein that forms a DNA sliding clamp to mediate DNA replication and repair by recruiting a variety of essential proteins to the DNA template. Very little is known about how these multiple binding choices are regulated or about the importance of the different PCNA-protein interactions at different stages of replication. We propose to generate PCNA mutants with new binding activity and novel specificity, followed by thorough in-vitro and in-vivo characterisation, to study the roles of PCNA-protein interactions in DNA replication and repair.
Max ERC Funding
1 000 000 €
Duration
Start date: 2008-07-01, End date: 2013-06-30
Project acronym SUPERSTARS
Project Type Ia supernovae: from explosions to cosmology
Researcher (PI) Kate MAGUIRE
Host Institution (HI) THE PROVOST, FELLOWS, FOUNDATION SCHOLARS & THE OTHER MEMBERS OF BOARD OF THE COLLEGE OF THE HOLY & UNDIVIDED TRINITY OF QUEEN ELIZABETH NEAR DUBLIN
Call Details Starting Grant (StG), PE9, ERC-2017-STG
Summary Type Ia supernovae (SNe Ia) are the incredibly luminous deaths of white dwarfs in binaries. They play a vital role in chemical enrichment, galaxy feedback and stellar evolution, and were instrumental in the discovery of dark energy. However, what the progenitor systems of SNe Ia are, and how they explode, remains a mystery. My recent work has led to the controversial conclusion that there may be more than one way to produce SNe Ia. As SN Ia cosmology samples reach higher precision, understanding subtle differences in their properties becomes increasingly important. A surprising diversity in white-dwarf explosions has also been uncovered, with a much wider-than-expected range in luminosities, light-curve timescales and spectral properties. A key open question is ‘What explosion mechanisms result in normal SNe Ia compared to more exotic transients?’
My team will use novel early-time observations (within hours of explosion) of 100 SNe Ia in a volume-limited search (<75 Mpc). The targets will come from the ATLAS and Pan-STARRS surveys that will provide unprecedented sky coverage and cadence (>20000 square degrees, up to four times a night). These data will be combined with key progenitor diagnostics of each SN (companion interaction, circumstellar material, central density studies). The observed zoo of transients predicted to result from white-dwarf explosions (He-shell explosions, tidal-disruption events, violent mergers) will also be investigated, with the goal of constraining the mechanisms by which white dwarfs can explode. My access to ATLAS/Pan-STARRS and my previous experience puts me in a unique position to obtain ‘day-zero’ light curves, rapid spectroscopic follow-up, and late-time observations. The data will be analysed with detailed spectral modelling to unveil the progenitors and diversity of SNe Ia. This project is timely with the potential for significant breakthroughs to be made before the start of the next-generation ‘transient machine’, LSST in ~2021.
Max ERC Funding
1 876 496 €
Duration
Start date: 2018-06-01, End date: 2023-05-31
Project acronym Supramol
Project Towards Artificial Enzymes: Bio-inspired Oxidations in Photoactive Metal-Organic Frameworks
Researcher (PI) Wolfgang Schmitt
Host Institution (HI) THE PROVOST, FELLOWS, FOUNDATION SCHOLARS & THE OTHER MEMBERS OF BOARD OF THE COLLEGE OF THE HOLY & UNDIVIDED TRINITY OF QUEEN ELIZABETH NEAR DUBLIN
Call Details Consolidator Grant (CoG), PE5, ERC-2014-CoG
Summary Metal-organic frameworks (MOFs) are key compounds for energy storage and conversion, as their unprecedented surface areas make them promising materials for gas storage and catalysis. We believe that their modular construction principles allow the replication of key features of natural enzymes, thus demonstrating how cavity size, shape, charge and functional-group availability influence performance in catalytic reactions. This proposal addresses the question of how such novel, bio-inspired metallo-supramolecular systems can be prepared and exploited for sustainable energy applications. A scientific breakthrough that demonstrates the efficient conversion of light into chemical energy would be one of the greatest scientific achievements, with unprecedented impact on future generations. We focus on the following key aspects:
a) MOFs containing novel, catalytically active complexes with labile coordination sites will be synthesised using rigid organic ligands that allow us to control the topologies, cavity sizes and surface areas. We will incorporate photosensitizers to develop robust porous MOFs in which light-absorption initiates electron-transfer events that lead to the activation of a catalytic centre. In addition, photoactive molecules will serve as addressable ligands whereby reversible, photo-induced structural transformations impose changes to porosity and chemical attributes at the active sites.
b) Catalytic studies will focus on important oxidations of alkenes and alcohols. These reactions are relevant to H2-based energy concepts, as the anodic liberation of protons and electrons can be coupled to their cathodic recombination to produce H2. The studies will provide proof-of-concept for the development of photocatalytic systems for the highly endergonic H2O oxidation reaction, which will be explored using the most stable MOFs. Further, gas storage and magnetic properties, which may also be influenced by light irradiation, will be analysed.
Max ERC Funding
1 979 366 €
Duration
Start date: 2015-09-01, End date: 2020-08-31
Project acronym SUPREL
Project "Scaling Up Reinforcement Learning: Structure Learning, Skill Acquisition, and Reward Shaping"
Researcher (PI) Shie Mannor
Host Institution (HI) TECHNION - ISRAEL INSTITUTE OF TECHNOLOGY
Call Details Starting Grant (StG), PE6, ERC-2012-StG_20111012
Summary "Learning how to act optimally in high-dimensional stochastic dynamic environments is a fundamental problem in many areas of engineering and computer science. The basic setup is that of an agent who interacts with an environment trying to maximize some long term payoff while having access to observations of the state of the environment. A standard approach to solving this problem is the Reinforcement Learning (RL) paradigm in which an agent is trying to improve its policy by interacting with the environment or, more generally, by using different sources of information such as traces from an expert and interacting with a simulator. In spite of several success stories of the RL paradigm, a unified methodology for scaling-up RL has not emerged to date. The goal of this research proposal is to create a methodology for learning and acting in high-dimensional stochastic dynamic environments that would scale up to real-world applications well and that will be useful across domains and engineering disciplines.
We focus on three key aspects of learning and optimization in high-dimensional stochastic dynamic environments that are interrelated and essential to scaling up RL. First, we consider the problem of structure learning. This is the problem of how to identify the key features and underlying structures in the environment that are most useful for optimization and learning. Second, we consider the problem of learning, defining, and optimizing skills. Skills are sub-policies whose goal is more focused than solving the whole optimization problem and that can hence be more easily learned and optimized. Third, we consider changing the natural reward of the system to obtain desirable properties of the solution such as robustness, aversion to risk and smoothness of the control policy. In order to validate our approach we study two challenging real-world domains: a jet fighter flight simulator and a smart-grid short-term control problem."
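As a purely illustrative aside, the sketch below shows the agent-environment interaction loop described in the summary in its simplest tabular form: Q-learning on a toy one-dimensional corridor. The environment, reward and hyperparameters are made-up assumptions for demonstration; the project targets far higher-dimensional settings where such tabular methods do not scale.

```python
import random

# Toy 1-D corridor: states 0..4, reaching state 4 yields reward 1 and ends the episode.
N_STATES, ACTIONS = 5, (-1, +1)

def step(state, action):
    """Environment transition: move left/right, clipped to the corridor."""
    nxt = min(max(state + action, 0), N_STATES - 1)
    reward = 1.0 if nxt == N_STATES - 1 else 0.0
    return nxt, reward, nxt == N_STATES - 1

# Tabular Q-values: Q[state][action_index]
Q = [[0.0, 0.0] for _ in range(N_STATES)]
alpha, gamma, eps = 0.1, 0.95, 0.1   # learning rate, discount, exploration

for episode in range(500):
    s, done = 0, False
    while not done:
        # epsilon-greedy action selection from the current value estimates
        a_idx = random.randrange(2) if random.random() < eps else max((0, 1), key=lambda i: Q[s][i])
        s2, r, done = step(s, ACTIONS[a_idx])
        # Q-learning update: move the estimate toward reward + discounted best future value
        target = r + (0.0 if done else gamma * max(Q[s2]))
        Q[s][a_idx] += alpha * (target - Q[s][a_idx])
        s = s2

print("Learned Q-values:", Q)  # the 'move right' action should dominate in every state
```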
Max ERC Funding
1 500 000 €
Duration
Start date: 2013-01-01, End date: 2017-12-31
Project acronym SURFCOMP
Project Comparing and Analyzing Collections of Surfaces
Researcher (PI) Yaron Lipman
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Starting Grant (StG), PE6, ERC-2012-StG_20111012
Summary The proposed research program intends to cover all aspects of the problem of learning and analyzing collections of surfaces and apply the developed methods and algorithms to a wide range of scientific data.
The proposal has two parts:
In the first part of the proposal, we concentrate on developing the most basic operators for automatically comparing pairs of surfaces. Although this problem has received a lot of attention in recent years and significant progress has been made, there is still a great need for algorithms that are both efficient and tractable and that come with guarantees of convergence or accuracy. The main difficulty in most approaches so far is that they work in a huge and non-linear search space to compare surfaces; most algorithms resort to gradient descent from an initial guess, risking finding only a locally optimal solution. We offer a few research directions to tackle this problem based on the idea of identifying EFFICIENT search spaces that APPROXIMATE the desired optimal correspondence.
In the second part of the proposal we propose to make use of the methods developed in the first part to perform global analysis of, or learn, collections of surfaces. We put special emphasis on “real-world” applications and intend to validate our algorithms on significant collections, including data-sets such as biological anatomical data-sets and computer graphics benchmark collections of surfaces. We propose to formulate and construct geometric structures on these collections and investigate their domain-specific implications.
Max ERC Funding
1 113 744 €
Duration
Start date: 2012-09-01, End date: 2017-08-31
Project acronym SUSCAT
Project New Directions in Sustainable Catalysis by Metal Complexes
Researcher (PI) David Milstein
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Advanced Grant (AdG), PE5, ERC-2015-AdG
Summary The discovery of novel sustainable catalytic reactions is a major current goal. Based on recent discoveries in our group, we plan to develop unprecedented sustainable catalytic reactions with special emphasis on reactions catalyzed by complexes of earth-abundant metals.
We have recently discovered an intriguing reaction, namely the oxidation of organic compounds using water, with no added oxidant, evolving H2. This simple, selective reaction offers a conceptually new, environmentally benign approach to the oxidation of organic compounds, which we will explore.
We recently discovered a new mode of activation of multiple bonds by metal-ligand cooperation, including activation of CO2 and nitrile triple bonds, in which reversible C-C bond formation with the ligand is involved. Based on that, activation of nitriles has resulted in unprecedented C-C bond formation involving addition of simple aliphatic nitriles to various α,β-unsaturated carbonyl compounds. This mode of multiple bond activation may open a new approach to catalysis, “template catalysis”, which we plan to explore.
In addition, we will pursue the highly desirable catalytic activation of the kinetically very stable, potent greenhouse gas N2O for the so far elusive, efficient transfer of oxygen to organic compounds.
The use of CO2 in organic synthesis is an important and timely topic. Based on its activation by metal-ligand cooperation, new catalytic reactions of CO2 will be pursued, including unprecedented carbonylation of non-activated C-H bonds.
Most reactions catalyzed by metal complexes involve noble metals. The development of sustainable catalysis based on complexes of earth-abundant metals is of great interest. In all of the topics described above, catalysis by complexes of such metals will be emphasized. Moreover, based on recent results in our group, we plan to develop an unprecedented family of complexes of earth-abundant metals and to pursue novel sustainable catalysis based on it.
Max ERC Funding
2 497 975 €
Duration
Start date: 2016-07-01, End date: 2021-06-30
Project acronym SVIS
Project Supervised Verification of Infinite-State Systems
Researcher (PI) Sharon Shoham Buchbinder
Host Institution (HI) TEL AVIV UNIVERSITY
Call Details Starting Grant (StG), PE6, ERC-2017-STG
Summary Modern society relies more and more on computing for managing complex safety-critical tasks (e.g., in medicine, avionics, and the economy). Correctness of computerized systems is therefore crucial, as incorrect behaviors might lead to disastrous outcomes. Still, correctness is a big open problem without satisfactory theoretical or practical solutions, and this has dramatic effects on the security of our lives. The current practice in industry mainly employs testing, which detects bugs but cannot ensure their absence. In contrast, formal verification can provide formal correctness guarantees. Unfortunately, existing formal methods are still limited in their applicability to real-world systems, since most of them are either too automatic, hindered by the undecidability of verification, or too manual, relying on substantial human effort. This proposal will introduce a new methodology of supervised verification, based on which we will develop novel approaches that can be used to formally verify certain realistic systems. This will be achieved by dividing the verification process into tasks that are well suited for automation and tasks that are best done by a human supervisor, and by finding a suitable mode of interaction between the human and the machine. Supervised verification represents a conceptual leap, as it is centered around automatic procedures but complements them with human guidance; it therefore breaks the classical pattern of automated verification and creates new opportunities, both practical and theoretical. In addition to opening the way for developing tractable verification algorithms, it can be used to prove lower and upper bounds on the asymptotic complexity of verification by explicitly distilling the human's role. The approaches developed by this research will significantly improve system correctness and agility. At the same time, they will reduce the cost of testing, increase developers' productivity, and improve the overall understanding of computerized systems.
Max ERC Funding
1 499 528 €
Duration
Start date: 2018-04-01, End date: 2023-03-31
Project acronym SweetAim
Project Selective glycoimmuno-targeting for cancer therapy
Researcher (PI) VERED KARAVANI
Host Institution (HI) TEL AVIV UNIVERSITY
Call Details Starting Grant (StG), LS7, ERC-2016-STG
Summary Immunotherapy recently became an important alternative to conventional treatment regimens, yet cancer remains a universal leading cause of death. Thus, novel cancer theranostic approaches are still much desired. Although altered cell surface glycosylation is one of the hallmarks of cancer, targeting this ‘sweet aim’ for cancer therapy has been elusive, largely due to the poor immunogenicity of carbohydrates and the low affinity of antibodies against them. A red meat-derived carbohydrate antigen is a novel immunogenic moiety providing a key to unlock the theranostic potential of tumor-associated carbohydrate antigens. This foreign non-human sugar can be acquired only through the diet and subsequently appears on diverse cell surface glycoconjugates as ‘self’, accumulating mostly on carcinomas and resulting in a polyclonal xeno-autoantibody response. I have shown that such antibodies have both diagnostic and therapeutic potential, although basic understanding of their specificity and potency is scarce. The primary objective of this proposal is to design a novel personalized cancer therapeutic approach based on xeno-autoantibodies against the dietary sugar antigen. We propose an innovative interdisciplinary approach crossing the boundaries of cancer research, glycosciences, immunology and nanotechnology, with cutting-edge technologies, to design, engineer, screen and fully investigate potent targeting of ‘SweetAim’ moieties. Our discovery line is based on a two-arm platform to generate optimized antibodies for passive/active therapy, together with tumor cells refined through glyco-engineering/reprogramming to unveil novel theranostics, finally evaluated both in vitro and in vivo. I expect our groundbreaking achievements will lead to promising new clinical tools, particularly for cancer, but also for other chronic inflammation-mediated diseases. Importantly, it will establish fundamental new concepts regarding carbohydrate recognition and response by the immune system.
Max ERC Funding
1 479 995 €
Duration
Start date: 2016-12-01, End date: 2021-11-30
Project acronym SYMPAC
Project Synthetic metabolic pathways for carbon fixation
Researcher (PI) Ron Milo
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Starting Grant (StG), LS2, ERC-2010-StG_20091118
Summary Carbon fixation is the main pathway for storing energy and accumulating biomass in the living world. It is also the principal reason for humanity's utilization of land and water. Under human cultivation, carbon fixation significantly limits growth. Hence, increasing the rate of carbon fixation is of major importance for agricultural and energy sustainability.
Are there limits on the rate of such central metabolic pathways? Attempts to improve the rate of Rubisco, the key enzyme in the Calvin-Benson cycle, have achieved very limited success. In this proposal we try to overcome this bottleneck by systematically exploring the space of carbon fixation pathways that can be assembled from all ~4000 metabolic enzymes known in nature. We computationally compare all possible pathways based on kinetics, energetics and topology. Our initial analysis suggests a new family of synthetic carbon fixation pathways utilizing the most effective carboxylating enzyme, PEPC. We propose to experimentally test these cycles in the most genetically tractable context by constructing an E. coli strain that depends on carbon fixation as its sole carbon input. Energy will be supplied by compounds that cannot be used as a carbon source. Initially, we will devise an autotrophic E. coli strain that uses the Calvin-Benson cycle; in the next stage, we will implement the most promising synthetic cycles. Systematic in vivo comparison will guide future implementation in natural photosynthetic organisms.
At the basic science level, this proposal revisits and challenges our understanding of central carbon metabolism and growth. Concomitantly, it is an evolutionary experiment on integration of a biological novelty. It will serve as a model for significantly adapting a central metabolic pathway.
Max ERC Funding
1 498 792 €
Duration
Start date: 2011-01-01, End date: 2015-12-31
Project acronym SYMPLECTIC
Project Symplectic Measurements and Hamiltonian Dynamics
Researcher (PI) Yaron Ostrover
Host Institution (HI) TEL AVIV UNIVERSITY
Call Details Starting Grant (StG), PE1, ERC-2014-STG
Summary Symplectic geometry combines a broad spectrum of interrelated disciplines lying in the mainstream of modern mathematics. The past two decades have given rise to several exciting developments in this field, which introduced new mathematical tools and opened challenging new questions. Nowadays symplectic geometry reaches out to an amazingly wide range of areas, such as differential and algebraic geometry, complex analysis, dynamical systems, as well as quantum mechanics, and string theory. Moreover, symplectic geometry serves as a basis for Hamiltonian dynamics, a discipline providing efficient tools for modeling a variety of physical and technological processes, such as orbital motion of satellites (telecommunication and GPS navigation), and propagation of light in optical fibers (with significant applications to medicine).
The proposed research is composed of several innovative studies at the frontier of symplectic geometry and Hamiltonian dynamics, which are of highly significant interest in both fields. These studies have strong interactions with a variety of topics that lie at the heart of contemporary symplectic geometry, such as symplectic embedding questions, the geometry of Hofer’s metric, Lagrangian intersection problems, and the theory of symplectic capacities.
My research objectives are twofold. First, to solve the open research questions described below, which I consider to be pivotal in the field. Some of these questions have already been studied intensively, and progress toward solving them would be of considerable significance. Second, some of the studies in this proposal are interdisciplinary by nature, and use symplectic tools in order to address major open questions in other fields, such as the famous Mahler conjecture in convex geometry. My goal is to deepen the connections between symplectic geometry and these fields, thus creating a powerful framework that will allow the consideration of questions currently unattainable.
Max ERC Funding
1 221 921 €
Duration
Start date: 2015-03-01, End date: 2021-02-28
Project acronym SYMPTOPODYNQUANT
Project Symplectic topology and its interactions: from dynamics to quantization
Researcher (PI) Leonid Polterovich
Host Institution (HI) TEL AVIV UNIVERSITY
Call Details Advanced Grant (AdG), PE1, ERC-2013-ADG
Summary "The proposed research belongs to symplectic topology, a rapidly developing
field of mathematics which originally appeared as a geometric tool for problems of classical mechanics. Since the 1980ies, new powerful methods such as theory of pseudo-holomorphic curves, Morse-Floer theory on loop spaces, symplectic field theory and mirror symmetry changed the face of the field and put it at the crossroads of several mathematical disciplines. In this proposal I develop function theory on symplectic manifolds, a recently emerged subject providing new tools and an alternative intuition in the field. With these tools, I explore footprints of symplectic rigidity in quantum mechanics, a brand new playground for applications of ``hard"" symplectic methods. This enterprise should bring novel insights into both fields. Other proposed applications of function theory on symplectic manifolds include Hamiltonian dynamics and Lagrangian knots. Function theory on symplectic manifolds is fruitfully interacting with geometry and algebra of groups of symplectic and contact transformations, which form another objective of this proposal. I focus on distortion of cyclic subgroups, quasi-morphisms and restrictions on finitely generated subgroups including the symplectic and contact versions of the Zimmer program. In the contact case, this subject is making nowadays its very first steps and is essentially unexplored. The progress in this direction will shed new light on the structure of these transformation groups playing a fundamental role in geometry, topology and dynamics."
Max ERC Funding
1 787 200 €
Duration
Start date: 2013-10-01, End date: 2019-09-30
Project acronym SynapticMitochondria
Project Quality Control and Maintenance of Synaptic Mitochondria
Researcher (PI) Vanessa Alexandra Dos Santos Morais Epifânio
Host Institution (HI) INSTITUTO DE MEDICINA MOLECULAR JOAO LOBO ANTUNES
Call Details Starting Grant (StG), LS5, ERC-2015-STG
Summary Mitochondria at the synapse have a pivotal role in neurotransmitter release, but almost nothing is known about the composition or specific functions of synaptic mitochondria. Compared to mitochondria in other cells, synaptic mitochondria need to cope with an increased calcium load, more oxidative stress, and high demands for energy generation during synaptic activity. My hypothesis is that synaptic mitochondria have acquired specific mechanisms to manage local stress and that disruption of these mechanisms contributes to neurodegeneration.
How mitochondria sense their dysfunction is unclear. Even more intriguing is the question how they decide whether their failure should lead to removal of the organelle or dismissal of the complete neuron via cell death. We anticipate that these decisions are not only operational during disease, but might constitute a fundamental mechanism relevant for maintenance of synaptic activity and establishment of new synapses.
Recent studies have revealed several genes implicated in neurodegenerative disorders that are involved in mitochondrial maintenance. However, the function of these genes at the synapse, where the initial damage occurs, remains to be clarified. These genes provide excellent starting points to decipher the molecular mechanisms discussed above. Furthermore, I propose to use proteomic approaches to identify the protein fingerprint of synaptic mitochondria and to compare them to mitochondria from other tissues. I plan to identify key players of the proposed regulatory pathways involved in intrinsic mitochondria quality control. In a complementary approach, I will exploit our findings and use in vitro and in vivo experimental approaches to measure the mitochondrial function of synaptic versus non-synaptic mitochondria and the relevance of those changes for synaptic function. Our work will unravel the specific properties of synaptic mitochondria and provide much needed insight into their hypothesized predominant role in neurodegenerative disorders.
Max ERC Funding
1 300 000 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym SynChI
Project Striatal cholinergic cell assemblies in movement disorders
Researcher (PI) Joshua Avi Goldberg
Host Institution (HI) THE HEBREW UNIVERSITY OF JERUSALEM
Call Details Consolidator Grant (CoG), LS5, ERC-2014-CoG
Summary Pathological neuronal synchrony is the hallmark of many neurological disorders, including Parkinson’s disease (PD) and Huntington’s disease (HD), which further share deficits in cholinergic signaling. Moreover, recent findings have underscored the therapeutic relevance of the synchrony among striatal cholinergic interneurons (ChI) that orchestrate this signaling. They have shown that excessively synchronous ChI discharge induces di-synaptic release of dopamine, GABA and glutamate. Here, I propose to elucidate how ChI synchronization is generated under normal and pathological conditions and thereby identify novel therapeutic targets to treat PD and HD. This study has only very recently become feasible with the advent of powerful tools that I have mastered to explore ChI synchrony.
We will employ a combination of cutting-edge in vitro and in vivo techniques to simultaneously record a far larger population of pre-identified ChIs than is currently possible. We will express GCaMP6, a genetically encoded calcium indicator (GECI), exclusively in ChIs, and use multiphoton microscopy to image calcium transients from several ChIs simultaneously in conjunction with intracellular recording from individual ChIs in acute brain slices and in anesthetized mice. Additionally, we will use endoscopic GECI imaging in freely-moving classically conditioned mice. We will employ modern analyses that reveal low-dimensional structures in large neuronal datasets to quantify synchrony (1) during ongoing activity; (2) during optogenetic activation of afferents; and (3) in the freely-moving mice while presenting conditioned cues. Finally, we will study the origins of pathological synchrony in PD and HD mouse models and explore means to correct this condition. This comprehensive approach should explain the pathological ChI synchrony observed in PD; identify novel targets to treat PD and HD; and create a general methodology to study pathological synchrony in many other neurological disorders.
Max ERC Funding
2 000 000 €
Duration
Start date: 2015-05-01, End date: 2020-04-30
Project acronym SynProAtCell
Project Delivery and On-Demand Activation of Chemically Synthesized and Uniquely Modified Proteins in Living Cells
Researcher (PI) Ashraf BRIK
Host Institution (HI) TECHNION - ISRAEL INSTITUTE OF TECHNOLOGY
Call Details Advanced Grant (AdG), PE5, ERC-2018-ADG
Summary While advanced molecular biology approaches provide insight into the role of proteins in cellular processes, their ability to freely modify proteins and control their functions when desired is limited, hindering a detailed understanding of the cellular functions of numerous proteins. At the same time, chemical synthesis of proteins allows for unlimited protein design, enabling the preparation of unique protein analogues that are otherwise difficult or impossible to obtain. However, effective methods to introduce these designed proteins into cells are for the most part limited to simple systems. To monitor proteins' cellular functions and fates in real time, and in order to answer currently unanswerable fundamental questions about the cellular roles of proteins, the fields of protein synthesis and cellular protein manipulation must be bridged by significant advances in methods for protein delivery and real-time activation. Here, we propose to develop a general approach for enabling considerably more detailed in-cell study of uniquely modified proteins by preparing proteins having the following features: 1) traceless cell delivery unit(s), 2) an activation unit for on-demand activation of protein function in the cell, and 3) a fluorescence probe for monitoring the state and the fate of the protein.
We will adopt this approach to shed light on the processes of ubiquitination and deubiquitination, which are critical cellular signals for many biological processes. We will employ our approach to study: 1) the effect of inhibition of deubiquitinases in cancer; 2) the effect of phosphorylation on proteasomal degradation and on ubiquitin chain elongation; and 3) the effect of covalent attachment of a known ligase ligand to a target protein on its degradation. Moreover, this could trigger the development of new methods to modify desired proteins in the cell by selective chemistries and so rationally promote their degradation.
Max ERC Funding
2 500 000 €
Duration
Start date: 2019-10-01, End date: 2024-09-30
Project acronym SYNTECH
Project Synthesis Technologies for Reactive Systems Software Engineers
Researcher (PI) Shahar Maoz
Host Institution (HI) TEL AVIV UNIVERSITY
Call Details Starting Grant (StG), PE6, ERC-2014-STG
Summary The design and development of open reactive systems, which compute by reacting to ongoing stimuli from their environment, and include, for example, mobile applications running on smart phone devices, web-based applications, industrial robotic systems, embedded software running on chips inside cars and aircraft, etc., is a complex and challenging task. Despite advancement from low-level assembly languages to higher-level languages with powerful abstraction mechanisms, and the use of automated testing and formal verification, reactive systems software development is still a mostly manual and error-prone iterative activity of coding and debugging.
A fundamentally different alternative approach to reactive systems development is synthesis, the automatic creation of correct-by-construction software from its specification. Synthesis has the potential to transform the way open reactive systems software is developed, making the process more effective and productive, and making its results more reliable and usable. However, while important advancements have been recently made on the algorithmic aspects of synthesis, no work has yet taken advantage of these achievements to change software engineering practices from “program centric” to “specification centric”. No effective end-to-end means to use synthesis are available to engineers, and the potential revolutionary impact of synthesis on the engineering of reactive systems software is far from being fully explored.
The proposal targets four objectives: a new, rich specification language, tailored for synthesis and for use by software engineers; a set of new methods for specification centric development; tool implementations in ‘killer app’ application domains; and systematic evaluation with engineers.
The research aims to unleash and evaluate the potential of synthesis to revolutionize reactive systems software development and to open the way for new directions in software engineering research and practice.
Max ERC Funding
1 477 000 €
Duration
Start date: 2015-04-01, End date: 2020-09-30
Project acronym T_CELL(S)_DIFFER
Project Differentiation of pro-inflammatory T cell subsets in vivo
Researcher (PI) Bruno Miguel De Carvalho E Silva Santos
Host Institution (HI) INSTITUTO DE MEDICINA MOLECULAR JOAO LOBO ANTUNES
Call Details Starting Grant (StG), LS6, ERC-2010-StG_20091118
Summary Our understanding of T cell differentiation impacts on vaccine development and on the treatment of (auto)immune disorders. T cells are key players in inflammation, a crucial component of the immune response to pathogens that causes severe damage to the host when uncontrolled. The cytokines Interferon-γ (IFN-γ) and Interleukin-17 (IL-17) are critical mediators of the proinflammatory activity of the T cells usually designated as “T helper 1” (Th1) and Th17, respectively.
Here we propose to investigate the contribution of all T cell lineages – CD4+ and CD8+ cells, and NKT cells – to global Th1 or Th17 immune responses, using various tools including an IFN-γ/IL-17 double-reporter mouse. Importantly, we will study Th1/Th17 differentiation in vivo, in models of infection with Plasmodium berghei or Mycobacterium tuberculosis. We will analyse the individual and combined contributions of the distinct T cell subsets, their cellular interactions and potential interdependence in lymphoid organs and in target organs of infection.
We further envisage a molecular understanding of how innate (γδ and NKT) and adaptive (CD4+ and CD8+) T cell subsets acquire their respective capacities to produce IFN-γ or IL-17 in vivo. We will dissect (pre-/post-)transcriptional mechanisms of regulation of Ifng and Il17 expression in the various T cell subsets, ultimately at the single-cell level. We aim at characterizing networks of transcription factors and microRNAs that regulate Th1/Th17 differentiation either in all or in specific T cell subsets. We are particularly interested in addressing the constitutive expression of IFN-γ or IL-17 by innate T lymphocytes, which is set up in the thymus. We will define the molecular components of this “developmental pre-programming” of γδ and NKT cells, in comparison with the mechanisms of peripheral induction of CD4+ Th1/Th17 cell differentiation upon infection.
In contrast to the generalized focus on CD4+ T cells, this project will consider Th1/Th17 differentiation of all T cell lineages and their in vivo contributions to relevant models of infection. I believe this holistic view of organism-based immune parameters and their underlying molecular mechanisms, down to the single-cell level, will significantly advance our understanding of how the Immune System works.
Max ERC Funding
1 500 000 €
Duration
Start date: 2010-12-01, End date: 2015-06-30
Project acronym TARGETING_CANCER
Project Eradication of tumors by targeting dsRNA selectively to cancer cells and recruitment of the innate immune system
Researcher (PI) Alexander Levitzki
Host Institution (HI) THE HEBREW UNIVERSITY OF JERUSALEM
Call Details Advanced Grant (AdG), LS7, ERC-2009-AdG
Summary We have recently shown that EGFR over-expressing tumors can be eradicated by an EGFR-homing chemical vector carrying dsRNA. The vector is PolyInosine/Cytosine (PolyIC) bound to Polyethyleneimine-Polyethyleneglycol-EGF (PEI-PEG-EGF, PPE). We have shown that even tumors in which up to 50% of the cells do not express EGFR are eradicated, due to the strong tumor-localized bystander effects, which involve the innate immune system. Using this EGFR-homing vector we have been able to eradicate EGFR over-expressing tumors by either local or systemic application. Since the success of this strategy seems to be due to the strong bystander effects induced by the internalized PolyIC, it is likely that heterogeneous tumors, in which only a portion of the cells harbor the targeted receptor, will be eradicated too, as shown in our preliminary studies (PLoS Med, 2006). This strategy actually targets the innate immune system to the tumor. We propose to establish tumors in which decreasing portions of the cells over-express EGFR and to determine the lowest number of EGFR over-expressing cells that can yield tumor eradication by the lowest dose of PolyIC/PPE. The principle behind the success of the Trojan horse approach is that the targeting moiety, EGF, is tethered to the other components of the vector in such a way that it retains its native EGFR-binding properties and its ability to internalize with the receptor. The composition of the vector is such that the ligand EGF can be replaced by any other ligand, if the appropriate coupling conditions are used, retaining the ability of the ligand to bind to the target protein and internalize with it. We propose to replace EGF with a number of other ligands, such as a PSMA-binding ligand (targeting prostate cancer) and Her-2 affibodies. Although only a fraction of women who over-express Her-2 respond to Herceptin, it is likely that they will respond to PolyIC/PP-Her-2 affibody.
Max ERC Funding
2 054 340 €
Duration
Start date: 2010-02-01, End date: 2015-01-31
Project acronym TC2D
Project 2D nanomaterials-based composite films for more efficient thermal conduction
Researcher (PI) Valeria NICOLOSI
Host Institution (HI) THE PROVOST, FELLOWS, FOUNDATION SCHOLARS & THE OTHER MEMBERS OF BOARD OF THE COLLEGE OF THE HOLY & UNDIVIDED TRINITY OF QUEEN ELIZABETH NEAR DUBLIN
Call Details Proof of Concept (PoC), ERC-2016-PoC
Summary The aim of this proposal is to determine the economic and technical feasibility of using readily scalable technologies for the development of inexpensive and high-performance heat dissipation solutions for the high-end automobile industry, as well as other markets including household appliances, injection moulding, advanced aircraft and pharmaceutical manufacturing, ranging from the actual fabrication protocols down to a wide range of finished products.
The technology described here is focused on solving heat dissipation issues through the use of novel 2-dimensional (2D) nanomaterials. While graphene is the most well-known 2D system, hundreds of other inorganic layered materials exist. 2-dimensional materials have immediate and far-reaching potential in several high-impact technological applications, amongst which are heat harvesting and dissipation.
Our technology will offer a very cheap, scalable solution for using advanced 2D nanomaterials for enhanced heat transport. Moreover, our technology offers the advantage of being extremely versatile: 2D nanomaterial dispersions can be sprayed on their own directly onto surfaces, or they can be mixed into different matrices such as Polysil to obtain enhanced resistance to wear, abrasion, oxidation, etc. This will allow us to improve the performance of existing systems, as well as the performance of new designs. Our developed solutions will not need to be applied throughout the whole heat recovery system, but mainly at those critical parts that limit the system performance. This technology has the potential of becoming a feasible, easy and efficient solution for a range of manufacturing companies. It will constitute a huge economic return, not to mention the overall societal impact of having much more efficient ways to deal with energy consumption.
Max ERC Funding
129 774 €
Duration
Start date: 2016-12-01, End date: 2018-05-31
Project acronym TCCECJ
Project Theologies of conversion to Christianity in early modern East-Central European Judaism
Researcher (PI) Pawel Tadeusz Maciejko
Host Institution (HI) THE HEBREW UNIVERSITY OF JERUSALEM
Call Details Starting Grant (StG), SH6, ERC-2010-StG_20091209
Summary This project endeavors to recalibrate the accepted understanding of the Jewish-Christian interchange in early modern East-Central Europe in light of an analysis of Jewish theological elaborations of conversion to Christianity. From the mid-seventeenth century onwards, conversion to Christianity became one of the central intellectual (and not merely practical) concerns of Judaism. By reconstructing the theological conceptualizations of conversion (rather than, as other scholars have done, the biographies of converts), I shall challenge the prevailing scholarly paradigm of clear and impenetrable boundaries between Judaism and Christianity. The project seeks to discuss this issue systematically on the basis of a large body of previously unknown primary sources, thereby shedding significant new light on Jewish-Christian relations in Central Europe in the early modern period.
Max ERC Funding
1 045 200 €
Duration
Start date: 2011-02-01, End date: 2016-01-31
Project acronym TDRFSP
Project Time-Domain RF and Analog Signal Processing
Researcher (PI) Robert Staszewski
Host Institution (HI) UNIVERSITY COLLEGE DUBLIN, NATIONAL UNIVERSITY OF IRELAND, DUBLIN
Call Details Starting Grant (StG), PE7, ERC-2012-StG_20111012
Summary One of the most important developments in communication microelectronics over the last decade was the invention and popularization of “Digital RF”. It transforms the radio-frequency (RF) analog functionality of a wireless transceiver into digitally intensive implementations that operate in the time domain. These are best realized in mainstream nanometer-scale CMOS technologies and are easily integrated with digital processors. As a result, RF transceivers based on this new approach enjoy significant benefits, and transceivers based on this architecture now constitute the majority of the 1.5 billion mobile handsets produced annually.
The invention and development of “Digital RF” was pioneered over the last decade by the applicant at Texas Instruments in Dallas, Texas, USA. Despite yielding over 130 scientific papers, that industrial research has been focused mainly on the highest-volume segment of the wireless communications market: low-cost GSM/EDGE cellular phones and Bluetooth radios. Unfortunately, that low-cost, low-data-rate market segment has already reached saturation. The fastest-growing segments of wireless communications are now high-data-rate “smart phones”, ultra-low-power wireless sensor network devices, and antenna-array and millimeter-wave transceivers, where the original “Digital RF” approach could not be readily exploited.
The goal of this proposal is to revisit and exploit the fundamental theory of the time-domain operation of RF and analog circuits. In this way the broad area of wireless communications, as well as analog and mixed-signal electronics in general, can be transformed for ready realization in advanced CMOS technology. This is expected to revolutionize the entire research field to an even larger extent than the “Digital RF” breakthrough in low-cost, low-data-rate radios pioneered by the applicant over the last decade.
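Purely as an illustration of what operating in the time domain means in practice (a toy model, not the applicant's circuitry; the 15 ps resolution is an assumed inverter-delay value): a time-to-digital converter quantises a time interval, such as the phase offset of a carrier, into an integer tap count.

```python
# Toy model of time-domain quantisation via an ideal time-to-digital converter (TDC).
# The default resolution is a hypothetical inverter delay, chosen only for illustration.

def tdc_code(time_interval_s: float, resolution_s: float = 15e-12) -> int:
    """Quantise a time interval into an integer number of delay-line taps."""
    if time_interval_s < 0:
        raise ValueError("time interval must be non-negative")
    return int(time_interval_s // resolution_s)

if __name__ == "__main__":
    carrier_period_s = 1 / 2.4e9              # ~417 ps period of a 2.4 GHz carrier
    phase_offset_s = 0.3 * carrier_period_s   # a 0.3-UI phase offset to be digitised
    print(tdc_code(phase_offset_s))           # -> 8 taps at 15 ps resolution
```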
Max ERC Funding
1 497 000 €
Duration
Start date: 2012-09-01, End date: 2017-08-31
Project acronym TDSTC
Project Time Dependent String Theories and Cosmology
Researcher (PI) Nissan Itzhaki
Host Institution (HI) TEL AVIV UNIVERSITY
Call Details Starting Grant (StG), PE2, ERC-2007-StG
Summary Great progress has been made in recent years on two different fronts. Experimentally, we have improved our understanding of cosmology to the level that we now have a standard model of cosmology. Perhaps the most exciting aspect of this model is the realization that most of the energy in the universe consists of dark matter and dark energy, which are not well understood. On the theoretical front, string theory has been developed to the level that we now understand many non-perturbative aspects of the theory. This progress offers new ways of making contact between string theory and our four-dimensional real world in general and cosmology in particular. The objective of this proposal is to take advantage of this progress and to improve our understanding of string theory in time-dependent situations. We are hopeful that such progress could lead to a precise realization in string theory of the standard model of cosmology in general, and of dark matter, dark energy and inflation in particular.
Max ERC Funding
441 116 €
Duration
Start date: 2008-09-01, End date: 2012-08-31
Project acronym TEC_Pro
Project Molecular control of self-renewal and lineage specification in thymic epithelial cell progenitors in vivo.
Researcher (PI) Nuno Miguel De Oliveira Lages Alves
Host Institution (HI) INSTITUTO DE BIOLOGIA MOLECULAR E CELULAR-IBMC
Call Details Starting Grant (StG), LS6, ERC-2014-STG
Summary The development of vaccines for the treatment of infectious diseases, cancer and autoimmunity depends on our knowledge of T-cell differentiation. This proposal focuses on the thymus, the organ responsible for generating T cells that are responsive to pathogen-derived antigens yet tolerant to self. Within the thymus, thymic epithelial cells (TECs) provide key inductive microenvironments for the development and selection of T cells that arise from hematopoietic progenitors. As a result, defects in TEC differentiation cause syndromes that range from immunodeficiency to autoimmunity, which makes the study of TECs of fundamental, and clinical, importance for understanding immunity and tolerance induction. TECs are divided into two functionally distinct subtypes, cortical (cTECs) and medullary (mTECs), which derive from common bipotent TEC progenitors (TEPs). Yet the genetic and epigenetic details that control cTEC/mTEC lineage specification from TEPs remain unsettled.
My objectives are to identify TEC progenitors and their niches within the thymus, to define new molecular components involved in their self-renewal and lineage potential, and to elucidate the epigenetic codes that regulate the genetic programs underlying cTEC/mTEC fate decisions. We take a global approach to examining TEC differentiation, which integrates the study of molecular processes at the cellular level with the analysis of in vivo mouse models. Using advanced research tools that combine reporter mice, clonogenic assays, organotypic cultures, high-throughput RNAi screens and genome-wide epigenetic and transcriptomic profiling, we will dissect the principles that underlie the self-renewal and lineage differentiation of TEC progenitors in vivo. I believe this project has the potential to contribute to one of the great challenges of modern immunology – modulating thymic function through the induction of TEPs – and therefore represents a major advance in the health sciences.
Max ERC Funding
1 491 749 €
Duration
Start date: 2015-07-01, End date: 2020-06-30
Project acronym TechChild
Project Just because we can, should we? An anthropological perspective on the initiation of technology dependence to sustain a child’s life
Researcher (PI) Maria BRENNER
Host Institution (HI) THE PROVOST, FELLOWS, FOUNDATION SCHOLARS & THE OTHER MEMBERS OF BOARD OF THE COLLEGE OF THE HOLY & UNDIVIDED TRINITY OF QUEEN ELIZABETH NEAR DUBLIN
Call Details Starting Grant (StG), SH3, ERC-2018-STG
Summary There is an increasing number of children with complex healthcare needs who require continuous technological support to sustain their lives. This technology dependence is initiated in an environment of medical interventionism, with potential for discreet discrimination. A scarcity of empirical data on the influences and interactions at the point of initiation of technology dependence means that clinical, legal and ethical deliberations are driven more by opinion than by empirical evidence. An evidence-based theoretical construct is required to articulate and contextualise the levers and penalties of the initiation of this technology. TechChild asks: just because we can, should we? And how are the influences on the initiation of technology dependence understood in contrasting health, legal and socio-political systems? Serendipitous findings from my research indicate parental concern regarding an absence of transparency when technology dependence is initiated, with parents questioning patterns of family characteristics. TechChild will be a step change in how we understand the coexistence of humans with an increasing availability of technological augmentations. This is urgent in a society where this debate happens predominantly in the public domain, with limited opportunity for healthcare professionals to offer their perspective. TechChild will revolutionise how we conceive access to care and offers a research horizon that questions cultural relativism in a cyborg era. This ambitious scholarly project involves Paediatric Intensive Care Units at four international sites and uses a Bayesian framework to elicit the probability of factors likely to influence the initiation of technology dependence, leading to the development of a theory of technology initiation. This ground-breaking exploration will inform technology initiation across the lifespan, with implications for healthcare, bioethics, education, parenting, policy making, and law.
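A minimal sketch, offered only to illustrate what a Bayesian treatment of one such factor could look like (the factor, the prior and the counts are hypothetical and are not TechChild data; SciPy is assumed to be available): a Beta prior over the probability that a given factor is present when technology dependence is initiated, updated with observed case counts.

```python
# Hypothetical Beta-Binomial update: probability that a given factor (e.g. a particular
# family characteristic) is present in cases where technology dependence is initiated.
# Prior and counts are invented for illustration; they are not project findings.
from scipy import stats

prior_a, prior_b = 1.0, 1.0                # uniform Beta(1, 1) prior
cases_with_factor, cases_total = 12, 40    # hypothetical observed counts

posterior = stats.beta(prior_a + cases_with_factor,
                       prior_b + (cases_total - cases_with_factor))

print(f"posterior mean: {posterior.mean():.2f}")        # ~0.31
print(f"95% credible interval: {posterior.interval(0.95)}")
```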
Max ERC Funding
1 464 101 €
Duration
Start date: 2019-02-01, End date: 2024-01-31
Project acronym TechEvo
Project Technology Evolution in Regional Economies
Researcher (PI) Dieter Franz KOGLER
Host Institution (HI) UNIVERSITY COLLEGE DUBLIN, NATIONAL UNIVERSITY OF IRELAND, DUBLIN
Call Details Starting Grant (StG), SH2, ERC-2016-STG
Summary The creation and accumulation of knowledge are processes at the heart of technological change and economic growth. Attention has been directed at aggregate measures of knowledge production in regional and national contexts, but little consideration has been given to the properties of knowledge produced in specific places. How does the nature of knowledge that is produced vary over space, what conditions the scope of technologies generated in different locations, and how do these knowledge sets impact the performance of local firms and industries?
To date, the way in which specific regional knowledge capabilities influence the evolution of local technology trajectories, and thus shape geographies of economic prosperity, has not yet been considered systematically. The objective of the “Technology Evolution in Regional Economies” (TechEvo) project is to address these significant shortcomings. Focusing on the evolution of scientific and technical knowledge, as indicated by patent, trademark and scientific-literature records, the point of departure is the pan-European knowledge space for all 28 European Union member countries, plus Norway and Switzerland, over the period 1981-2015. The knowledge space, based on the co-occurrence matrix of 629 particular knowledge domains, maps the proximity of patent technology classes and enables the development of regional measures of knowledge specialization for all 1,369 NUTS3 regions. Set in an evolutionary framework, the investigation provides ground-breaking insights into how innovative entities and individual inventors are embedded in social and cognitive, local and non-local networks, and how regional technology trajectories are shaped through entry, exit and selection processes. TechEvo will provide a wealth of indicators, models and tools that will assist firms and policy makers in place-based investment decisions, and deliver a science and technology policy evaluation tool capable of assessing impact.
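As a rough illustration of the kind of computation involved (a sketch under assumed inputs, not TechEvo's actual pipeline; the column names and the specialisation threshold are our own choices): given a table of patents with a region code and a technology class, a co-occurrence-based proximity matrix and regional specialisation measures can be derived along the following lines.

```python
# Minimal sketch of a co-occurrence-based knowledge space; all names and thresholds
# are illustrative assumptions, not TechEvo's definitions.
import numpy as np
import pandas as pd

def knowledge_space(patents: pd.DataFrame) -> pd.DataFrame:
    """Proximity between technology classes, from co-specialisation across regions."""
    # Region x technology-class patent counts
    counts = patents.groupby(["region", "tech_class"]).size().unstack(fill_value=0)
    # Revealed technological advantage: regional share relative to the overall share
    regional_share = counts.div(counts.sum(axis=1), axis=0)
    overall_share = counts.sum(axis=0) / counts.values.sum()
    rta = regional_share.div(overall_share, axis=1)
    specialised = (rta >= 1).astype(int)   # 1 if the region is specialised in the class
    # Proximity: minimum pairwise conditional probability of co-specialisation
    S = specialised.to_numpy()
    co = S.T @ S
    totals = np.maximum(np.diag(co), 1).astype(float)   # guard against empty classes
    prox = co / np.maximum.outer(totals, totals)
    np.fill_diagonal(prox, 0.0)
    classes = specialised.columns
    return pd.DataFrame(prox, index=classes, columns=classes)

# Hypothetical usage:
# prox = knowledge_space(pd.DataFrame({"region": ["IE021", "IE021", "DE212"],
#                                      "tech_class": ["H01L", "G06F", "H01L"]}))
```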
Max ERC Funding
1 496 599 €
Duration
Start date: 2017-03-01, End date: 2022-02-28
Project acronym Temporal Coding
Project Do behaving animals extract information from precise spike timing? – The use of temporal codes
Researcher (PI) Moshe Parnas
Host Institution (HI) TEL AVIV UNIVERSITY
Call Details Starting Grant (StG), LS5, ERC-2015-STG
Summary Neural temporal codes have come to dominate our way of thinking about how information is coded in the brain. When precise spike timing is found to carry information, the neural code is defined as a temporal code. In spite of the importance of temporal codes, whether behaving animals actually use this type of coding is still an unresolved question. To date, studying temporal codes has been technically impossible due to the inability to manipulate spike timing in behaving animals. However, very recent developments in optogenetics have solved this problem. Despite these modern tools, this key question is very difficult to resolve in mammals, because the meaning of manipulating part of a neural circuit without knowledge of the neural activity of all the neurons involved in the coding is unclear.
The fly is an ideal model system in which to study temporal codes because its small number of neurons allows complete mapping of the neural activity of all the neurons involved. Since temporal codes are suggested to be involved in olfactory intensity coding, I will study this process. I will devise a multidisciplinary approach combining electrophysiology, two-photon imaging and behavior.
I aim to examine for the first time directly whether temporal coding is used by behaving animals and to unravel the circuits and mechanisms that underlie intensity coding. To do so, I will manipulate the temporal codes in behaving animals and examine whether the behavioral responses change accordingly. To guide this study I will generate three novel databases: i. the temporal activity of all neurons involved in Drosophila olfactory intensity coding; ii. the functional connectivity between the two brain regions involved in intensity coding; and iii. behavioral responses to different odors and intensities.
Thus, this research will use cutting-edge techniques to resolve a long-standing basic question in neuroscience: how does the brain actually code information?
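Purely as an illustration of the rate-code/temporal-code distinction discussed above (synthetic numbers, not project data): two spike trains with identical spike counts in a window are indistinguishable to a rate code, while a simple timing statistic separates them.

```python
# Synthetic example: same firing rate, different spike timing.
import numpy as np

window_s = 1.0
train_a = np.array([0.10, 0.30, 0.50, 0.70, 0.90])   # regularly spaced spikes
train_b = np.array([0.10, 0.12, 0.14, 0.80, 0.90])   # bursty spikes, same count

print(len(train_a) / window_s == len(train_b) / window_s)   # True: a rate code is blind

def isi_cv(spike_times: np.ndarray) -> float:
    """Coefficient of variation of inter-spike intervals (a crude temporal statistic)."""
    isi = np.diff(spike_times)
    return float(isi.std() / isi.mean())

print(round(isi_cv(train_a), 2), round(isi_cv(train_b), 2))  # 0.0 vs ~1.34
```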
Max ERC Funding
1 500 000 €
Duration
Start date: 2016-01-01, End date: 2020-12-31
Project acronym TENDONTOBONE
Project The mechanisms that underlie the development of a tendon-bone attachment unit
Researcher (PI) Elazar Zelzer
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Starting Grant (StG), LS3, ERC-2012-StG_20111109
Summary We walk, run and jump using the complex and ingenious musculoskeletal system. It is therefore puzzling that although each of its components has been extensively studied, research of the musculoskeleton as an integrated system and, in particular, of its assembly has been scarce. In recent years, studies conducted in my lab have demonstrated the centrality of cross regulation between musculoskeletal tissues in skeletogenesis. These works have provided me with the inspiration for a revolutionary hypothesis on the way tendons connect to bones, along with sufficient preliminary data on which to base it.
The critical component in the assembly of the musculoskeleton is the formation of an attachment unit, where a tendon is inserted into a bone. Rather than two tissues attaching to each other, my novel hypothesis suggests that the entire attachment unit originates from a single pool of progenitor cells which, following differentiation, diverges to form a tendon attached to cartilage.
With the support of the ERC scheme, I will uncover the previously uncharacterized cellular origin of the attachment unit and the genetic program underlying its development. The attachment unit is a compound tissue, as it is composed of chondrocytes at one end and of tenocytes at the other end. We will investigate the mechanisms that facilitate in situ differentiation of mesenchymal progenitor cells into two distinct cell fates, under one defined niche. In addition, I will identify the contribution of both mechanical stimuli and molecular signals to the development of the attachment unit.
The ultimate goal of this program is to provide a complete picture of attachment unit development, in order to promote understanding of musculoskeletal assembly. The acquired knowledge may provide the basis for new therapies for enthesopathies, through tissue engineering or repair.
Max ERC Funding
1 499 999 €
Duration
Start date: 2013-01-01, End date: 2017-12-31
Project acronym THE FALL
Project The Fall of 1200 BC: The role of migration and conflict in social crises at the end of the Bronze Age in South-eastern Europe
Researcher (PI) Barry MOLLOY
Host Institution (HI) UNIVERSITY COLLEGE DUBLIN, NATIONAL UNIVERSITY OF IRELAND, DUBLIN
Call Details Consolidator Grant (CoG), SH6, ERC-2017-COG
Summary This project explores changes in migration and conflict at the end of the Bronze Age (ca. 1300-1000 BC) and their relevance for understanding the collapse of Europe’s first urban civilisation in the Aegean and proto-urban groups of the Balkans. The objective is to uncover the human face of this turning point in European prehistory by directly tracing the movement of people and the spread of new social practices across cultural boundaries. Hotly debated ancient tales of migrations are tested for the first time using recent advances in genetic and isotopic methods that can measure human mobility. Combined with mortuary research, this will precisely define relations between personal mobility and status, gender, identity and health to explore social scenarios in which people moved between groups.
To better understand the context of mobility, the project also evaluates social networks through which cultural traditions moved within and between distinct societies. For this purpose, regionally particular ways for making and using objects are analysed to explore how practices were exchanged and how types of objects shaped, and were shaped by, their new contexts of use. Metalwork is chosen for this research because new forms came to be widely shared across the region during the crisis, and we can employ a novel suite of analytic methods that explore how this material exposes wider social changes.
As personal and cultural mobility took place in social landscapes, the changing strategies for controlling access and mobility in settlement organisation are next explored. The character and causes of conflicts arising through these diverse venues for interaction are identified and we assess if they were catalysts for, or consequences of, unstable social systems.
THE FALL uses new primary research to test how this interplay between local developments, cultural transmissions and movement of people shaped the processes and events leading to the collapse of these early complex societies.
Max ERC Funding
1 998 779 €
Duration
Start date: 2018-04-01, End date: 2023-03-31
Project acronym THE MR CHALLENGE
Project Expanding the horizons of magnetic resonance in sensitivity, imaging resolution, and availability
Researcher (PI) Aharon Blank
Host Institution (HI) TECHNION - ISRAEL INSTITUTE OF TECHNOLOGY
Call Details Starting Grant (StG), PE4, ERC-2007-StG
Summary We propose to develop and implement advanced magnetic resonance detection and micro-imaging techniques that will benefit many biophysical, chemical, physical, and medical applications. Magnetic resonance (MR) is one of the most profound observation methods in science. MR includes Nuclear Magnetic Resonance (NMR) and Electron Spin Resonance (ESR). It has a variety of applications ranging from chemical structure determination to medical imaging and quantum computing. From a scientific standpoint, MR was the main focus of at least seven Nobel prizes in physics, chemistry, and medicine. From an industrial standpoint, MR is a multibillion industry focused on a range of medical (MRI) and chemical applications (MR spectrometers). Despite the fact that magnetic resonance was discovered more than 60 years ago, there is still plenty of room for new methodologies and applications. This research will confront some of the most challenging issues that this field has yet to offer, which also hold the greatest potential benefits. This is what we call “The MR Challenge”. We will focus on three key MR issues: sensitivity, image resolution, and affordability. Our first goal is to substantially improve the sensitivity of MR spectroscopy and the resolution of MR micro-imaging. We will put most of our effort into ESR spectroscopy and into the detection of NMR information through an ESR signal (ENDOR). At ambient conditions our goal is to achieve a sensitivity of ~10^4 electron spins and a resolution of 1 micron; at low temperatures we will approach single-electron-spin sensitivity and image resolution as high as 10 nm. In terms of affordability, our goal is to introduce a small probe capable of acquiring NMR spectra from samples located outside the magnet (an “ex-situ” probe). We will also design and construct a new family of hand-held 3D NMR imaging probes. The new capabilities would be applied in the fields of single-cell imaging and biophysics, materials science, and medicine.
Max ERC Funding
1 250 000 €
Duration
Start date: 2008-08-01, End date: 2013-07-31
Project acronym TheoryDL
Project Practically Relevant Theory of Deep Learning
Researcher (PI) Shai Shalev-shwartz
Host Institution (HI) THE HEBREW UNIVERSITY OF JERUSALEM
Call Details Starting Grant (StG), PE6, ERC-2015-STG
Summary One of the most significant recent developments in applied machine learning has been the resurgence of “deep learning”, usually in the form of artificial neural networks. The empirical success of deep learning is stunning, and deep-learning-based systems have already led to breakthroughs in computer vision and speech recognition. In contrast, from the theoretical point of view, by and large we do not understand why deep learning is at all possible, since most state-of-the-art theoretical results show that deep learning is computationally hard.
Bridging this gap is a great challenge since it involves proficiency in several theoretical fields (algorithms, complexity, and statistics) and at the same time requires a good understanding of real-world practical problems and the ability to conduct applied research. We believe that a good theory must lead to better practical algorithms. It should also broaden the applicability of learning in general, and deep learning in particular, to new domains. Such a practically relevant theory may also lead to a fundamental paradigm shift in the way we currently analyze the complexity of algorithms.
Previous works by the PI and his colleagues and students have provided novel ways to analyze the computational complexity of learning algorithms and to understand the tradeoffs between data and computational time. In this proposal, in order to bridge the gap between theory and practice, I suggest a departure from worst-case analyses and the development of a more optimistic, data-dependent theory with “grey” components. Success will lead to a breakthrough in our understanding of learning at large, with significant potential for impact on the field of machine learning and its applications.
Max ERC Funding
1 342 500 €
Duration
Start date: 2016-02-01, End date: 2021-01-31
Project acronym ThermoQuantumImage
Project Thermal imaging of nano and atomic-scale dissipation in quantum states of matter
Researcher (PI) Elia ZELDOV
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Advanced Grant (AdG), PE3, ERC-2017-ADG
Summary Energy dissipation is a fundamental process governing the dynamics of physical, chemical and biological systems and is of major importance in condensed matter physics, where scattering, loss of quantum information, and even breakdown of topological protection are deeply linked to intricate details of how and where the dissipation occurs. But despite its vital importance, dissipation is currently not a readily measurable microscopic quantity. The aim of this proposal is to launch a new discipline of nanoscale dissipation imaging and spectroscopy and to apply it to the study of quantum systems and novel states of matter. The proposed scanning thermal microscopy will be revolutionary in three aspects: the first-ever cryogenic thermal imaging; an improvement of thermal sensitivity by five orders of magnitude over the state of the art; and imaging and spectroscopy of dissipation of single atomic defects. We will develop a superconducting quantum interference nano-thermometer on the apex of a sharp tip, which will provide non-contact, non-invasive, low-temperature scanning thermal microscopy with an unprecedented target sensitivity of 100 nK/√Hz at 4 K. These advances will enable hitherto impossible direct thermal imaging of the most elemental processes, such as phonon emission from a single atomic defect due to inelastic electron scattering, relaxation mechanisms in topological surface and edge states, and variation in dissipation in individual quantum dots due to single-electron changes in their occupation. We will utilize this trailblazing tool to uncover nanoscale processes that lead to energy dissipation in novel systems, including resonant quasi-bound edge states in graphene, helical surface states in topological insulators, and the chiral anomaly in Weyl semimetals, and to provide groundbreaking insight into nonlocal dissipation and transport properties in mesoscopic systems and in 2D topological states of matter, including quantum Hall, quantum anomalous Hall, and quantum spin Hall states.
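To make the quoted sensitivity concrete (our own back-of-the-envelope reading of the per-root-hertz unit, not a statement from the proposal): a spectral sensitivity of 100 nK/√Hz translates into a temperature resolution that scales with the square root of the measurement bandwidth, for example

```latex
% Illustrative reading of the nK/sqrt(Hz) unit; only the 100 nK/sqrt(Hz) target is
% quoted from the summary, the bandwidths are examples of our own choosing.
\[
\delta T \simeq s_T \sqrt{\Delta f}, \qquad s_T = 100\ \mathrm{nK}/\sqrt{\mathrm{Hz}}
\;\Rightarrow\;
\delta T \simeq 100\ \mathrm{nK} \ \text{at } \Delta f = 1\ \mathrm{Hz},
\quad
\delta T \simeq 3.2\ \mu\mathrm{K} \ \text{at } \Delta f = 1\ \mathrm{kHz}.
\]
```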
Max ERC Funding
3 075 000 €
Duration
Start date: 2018-06-01, End date: 2023-05-31
Project acronym TheSocialBusiness
Project The advantages and pitfalls of elicited online user engagement
Researcher (PI) Gal Oestreicher-Singer
Host Institution (HI) TEL AVIV UNIVERSITY
Call Details Starting Grant (StG), SH1, ERC-2017-STG
Summary The notion that websites benefit when their users are socially engaged—i.e., when they interact with content and with other users—has become so entrenched it is practically an axiom. Accordingly, websites in numerous domains invest heavily in ‘social computing’ features that encourage such engagement. In fact, many attempt to elicit engagement proactively, through the use of calls to action—prompts that ask users to carry out participatory actions such as rating or ‘liking’ content. Given the vast popularity of social computing, it is surprising how little we actually know about how user engagement affects websites and their users. From a business perspective, the direct value of user engagement is far from clear. From a societal perspective, it is unclear whether the increasing expectation for users to engage with firms may lead users to behave in ways that do not serve them. This research aims to provide a comprehensive understanding of user engagement, and specifically, engagement elicited by calls to action, from those two perspectives. I will use an empirical approach, relying on innovative lab and large-scale field experiments. The lab experiments leverage a specially-designed website. For the field experiments, we will collaborate with websites spanning several domains; we have already initiated a relationship with a leading website-development service provider that uses a freemium business model, and have been able to observe the actual behavior of its users. Our preliminary results are promising, supporting the idea that calls to action have strong effects on conversion and information revelation. Moving forward, I plan to fully characterize the nature of these effects in multiple product domains, and to isolate their underlying mechanisms. I am confident that this research program will transform our understanding of the economic and broader societal impact of the social computing phenomenon.
Max ERC Funding
1 487 500 €
Duration
Start date: 2018-06-01, End date: 2023-05-31
Project acronym ThforPV
Project New Thermodynamics for Frequency Conversion and Photovoltaics
Researcher (PI) Carmel Rotschild
Host Institution (HI) TECHNION - ISRAEL INSTITUTE OF TECHNOLOGY
Call Details Starting Grant (StG), PE2, ERC-2014-STG
Summary The Shockley-Queisser (SQ) limit caps the efficiency of single-junction photovoltaic (PV) cells and sets the maximum efficiency for Si PV at about 30%. This is because of two constraints: i. the energy a PV cell generates at each conversion event is set by its bandgap, irrespective of the photon’s energy, so energetic photons lose most of their energy to heat; ii. a PV cell cannot harness photons with energy lower than its bandgap. Therefore, splitting energetic photons, and fusing two photons that are each below the Si bandgap to generate one higher-energy photon that matches the PV cell, pushes the potential efficiency above the Shockley-Queisser limit. Nonlinear optics (NLO) offers efficient frequency conversion, yet it is inefficient at the intensity and coherence level of solar and thermal radiation.
Here I propose new thermodynamic concepts for frequency conversion of partially incoherent light, aiming to overcome the SQ limit for single-junction PVs. Specifically, I propose entropy-driven up-conversion of low-energy photons, such as those in thermal radiation, to emission that matches the Si PV cell. This concept is based on coupling “hot phonons” to near-IR emitters, while the bulk remains at low temperature. As preliminary results, we experimentally demonstrate entropy-driven ten-fold up-conversion of 10.6 µm excitation to 1 µm emission at an internal efficiency of 27% and a total efficiency of 10%. This is more efficient by orders of magnitude than any prior art, and opens the way for efficient up-conversion of thermal radiation.
We continue by applying similar thermodynamic ideas to harvesting the otherwise lost thermalization in single-junction PVs and present the concept of “optical refrigeration for ultra-efficient PV”, with theoretical efficiencies as high as 69%. We support the theory with experimental validation, showing an enhancement in photon energy of 107% and an orders-of-magnitude enhancement in the number of accessible photons for high-bandgap PV. This opens the way for disruptive innovation in photovoltaics.
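For orientation, the arithmetic behind the quoted ten-fold up-conversion and its match to Si follows directly from the photon-energy/wavelength relation (the check below is our own, not a figure from the proposal):

```latex
% Back-of-the-envelope check of the "ten-fold up-conversion" from 10.6 um to 1 um.
\[
E = \frac{hc}{\lambda}
\;\Rightarrow\;
\frac{E_{1\,\mu\mathrm{m}}}{E_{10.6\,\mu\mathrm{m}}}
= \frac{10.6\,\mu\mathrm{m}}{1\,\mu\mathrm{m}} \approx 10.6,
\qquad
E_{1\,\mu\mathrm{m}} = \frac{hc}{1\,\mu\mathrm{m}} \approx 1.24\ \mathrm{eV}
\gtrsim E_{g,\mathrm{Si}} \approx 1.12\ \mathrm{eV}.
\]
```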
Max ERC Funding
1 500 000 €
Duration
Start date: 2015-07-01, End date: 2020-06-30
Project acronym THUNDEEP
Project A Theory for Understanding, Designing, and Training Deep Learning Systems
Researcher (PI) Ohad SHAMIR
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Starting Grant (StG), PE6, ERC-2017-STG
Summary The rise of deep learning, in the form of artificial neural networks, has been the most dramatic and important development in machine learning over the past decade. Much more than a merely academic topic, deep learning is currently being widely adopted in industry, placed inside commercial products, and is expected to play a key role in anticipated technological leaps such as autonomous driving and general-purpose artificial intelligence. However, our scientific understanding of deep learning is woefully incomplete. Most methods to design and train these systems are based on rules-of-thumb and heuristics, and there is a drastic theory-practice gap in our understanding of why these systems work in practice. We believe this poses a significant risk to the long-term health of the field, as well as an obstacle to widening the applicability of deep learning beyond what has been achieved with current methods.
Our goal is to tackle head-on this important problem, and develop principled tools for understanding, designing, and training deep learning systems, based on rigorous theoretical results.
Our approach is to focus on three inter-related sources of performance losses in neural network learning: their optimization error (that is, how to train a given network in a computationally efficient manner); their estimation error (how to ensure that training a network on a finite training set will yield good performance on future examples); and their approximation error (how architectural choices of the networks affect the type of functions they can compute). For each of these problems, we show how recent advances allow us to effectively approach them, and describe concrete preliminary results and ideas, which will serve as starting points and indicate the feasibility of this challenging project.
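These three error sources correspond to the standard excess-risk decomposition of statistical learning theory (a textbook identity, not taken from the proposal): the gap between the trained network's population risk and the best achievable risk splits exactly into
\[
  L(\hat{h}) - L(h^{*}) =
  \underbrace{L(\hat{h}) - L(\hat{h}_{\mathrm{ERM}})}_{\text{optimization error}}
  + \underbrace{L(\hat{h}_{\mathrm{ERM}}) - \inf_{h \in \mathcal{H}} L(h)}_{\text{estimation error}}
  + \underbrace{\inf_{h \in \mathcal{H}} L(h) - L(h^{*})}_{\text{approximation error}},
\]
where L denotes population risk, \hat{h} the network returned by the training algorithm, \hat{h}_{\mathrm{ERM}} the best network on the training set within the architecture class \mathcal{H}, and h^{*} the best possible predictor. Each term matches one of the three questions above: how well the optimizer does within the class, how much is lost by learning from a finite sample, and how expressive the architecture class itself is.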
Max ERC Funding
1 442 360 €
Duration
Start date: 2018-09-01, End date: 2023-08-31
Project acronym ThymusTolerance
Project Delineation of molecular mechanisms underlying the establishment and breakdown of immunological tolerance in the thymus
Researcher (PI) Jakub ABRAMSON
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Consolidator Grant (CoG), LS6, ERC-2016-COG
Summary Central tolerance is shaped in the thymus, a primary lymphoid organ, where immature T lymphocytes are “educated” into mature cells, capable of recognizing foreign antigens, while tolerating the body’s own components. This process is driven mainly by two separate lineages of thymic epithelial cells (TECs), the cortical (cTEC) and the medullary (mTEC). While cTECs are critical at the early stages of T cell development, mTECs play a pivotal role in negative selection of self-reactive thymocytes and the generation of Foxp3+ regulatory T (Treg) cells. Crucial to the key role of mTECs in the screening of self-reactive T cell clones is their unique capacity to promiscuously express and present almost all self-antigens, including thousands of tissue-specific antigen (TSA) genes. Strikingly, the expression of most of this TSA repertoire in mTECs is regulated by a single transcriptional regulator called Aire. Indeed, Aire deficiency in mice and human patients results in multi-organ autoimmunity. Although there has been dramatic progress in our understanding of how thymic epithelial cells shape and govern the establishment of adaptive immunity and of immunological self-tolerance, there are still several outstanding questions with no comprehensive answers. Therefore, in the research proposed herein, we wish to provide more comprehensive answers to these still elusive, but very fundamental questions. Specifically, we will aim at: 1.) Delineation of molecular mechanisms controlling TEC development and thymus organogenesis; 2.) Delineation of molecular mechanisms underlying promiscuous gene expression in the thymus; 3.) Identification and characterization of molecular determinants responsible for the breakdown of thymus-dependent self-tolerance. To this end, we will build upon our recently published data, as well as unpublished preliminary data, and utilize several state-of-the-art and interdisciplinary approaches, which have become an integral part of our lab’s toolbox.
Max ERC Funding
2 220 000 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym TiDrugArchitectures
Project Highly Competent and Safe Titanium(IV) Therapeutic Frameworks that are Cancer Targeted based on Complex 1, 2, and 3D Chemical Architectures
Researcher (PI) Edit Yehudit Tshuva Goldberg
Host Institution (HI) THE HEBREW UNIVERSITY OF JERUSALEM
Call Details Consolidator Grant (CoG), PE5, ERC-2015-CoG
Summary This proposal aims to develop custom designed anticancer therapeutic frameworks that are effective, stable, safe, and tumor targeted, based on the biocompatible Ti(IV) metal. The Tshuva group has established that water-stable phenolato Ti(IV) complexes are especially effective as anticancer agents both in vitro and in vivo, with markedly reduced side effects. Optimal derivatives will be developed to combine activity, stability, and biological accessibility, by maintaining small steric bulk while incorporating strong binding donors and hydrophilicity. The mechanism of action will be investigated by chemical and biological methods, including analyzing bio-distribution, cellular pathways and targets, and interaction with bio-molecules. Specifically, the active metal centers will be linked to bioactive moieties through redox-sensitive S–S bonds to enable tumor targeting. Cell penetrating peptides will facilitate cellular penetration for redox-dependent release of the active species selectively in cancer cells; steroid moieties will direct selectivity to hormone-dependent cancer cell types. Since the combination of Ti(IV)- with Pt-based drugs has shown synergistic effects, multi-active entities will include two or more metal centers, possibly also linked to a transport unit. In addition to linear conjugates, polymeric and dendritic assemblies, exploiting the enhanced permeability of cancer cells, will be constructed with theoretically unlimited options for targeted delivery of multiple active sites. Most importantly, flexible well-defined redox-sensitive cages, as well as rigid pH-sensitive complex cages, constructed with customized 3D geometries, will enable specific targeting of any active compound or conjugate and selective dissociation only where desired. This study should yield superior anticancer drugs, while unraveling the mystery of their complex biochemistry, and will contribute to the development of novel chemical and medicinal research directions and applications.
Max ERC Funding
2 000 000 €
Duration
Start date: 2016-06-01, End date: 2021-05-31
Project acronym TIGITtherapy
Project TIGIT therapy for cancer treatment
Researcher (PI) Ofer MANDELBOIM
Host Institution (HI) THE HEBREW UNIVERSITY OF JERUSALEM
Call Details Proof of Concept (PoC), ERC-2016-PoC
Summary Treating tumors with immune-related therapies is one of the most exciting and promising advancements made in the past decade. Cancer immunotherapy drugs have captured nearly 50% of the overall oncology drugs market. TIGIT is an important checkpoint inhibitory receptor discovered by our group in 2009. It is constitutively expressed by various immune cells, and its expression is further increased on tumor infiltrating lymphocytes (TILs). TIGIT recognizes two main ligands, PVR and Nectin2, which are highly expressed on various tumors. Blockade of TIGIT on TILs, either alone or in combination with another checkpoint inhibitory receptor, PD-1, leads to increased T and Natural Killer (NK) cell activity in vitro and inhibited tumor growth in vivo. We developed 9 different anti-TIGIT mAbs during my BacNK ERC Advanced Grant and previously. In the PoC grant TIGITtherapy, which has already attracted interest from several biopharma companies, I propose testing which of the 9 anti-TIGIT mAbs and TIGIT-Ig are able to antagonize TIGIT activity.
Max ERC Funding
150 000 €
Duration
Start date: 2017-07-01, End date: 2018-12-31
Project acronym TIMP
Project Ultrahigh-speed nanometer-scale microscopy
Researcher (PI) Oren COHEN
Host Institution (HI) TECHNION - ISRAEL INSTITUTE OF TECHNOLOGY
Call Details Consolidator Grant (CoG), PE7, ERC-2018-COG
Summary Ultrahigh-speed microscopy at a Tera-scale frames-per-second frame rate is essential for various applications in science and technology. In particular, it is critical for observing ultrafast non-repetitive events, for which the pump-probe technique is inapplicable. The spatial resolution of such microscopes is, to date, limited to the micrometer scale.
I propose to develop such microscopes with nanometric resolution.
The Tera-scale frames-per-second microscopes with nanometric resolution will be based on a new approach for ultrahigh-speed imaging that we recently proposed: time-resolved imaging by multiplexed ptychography (TIMP). In TIMP, multiple frames of the object are recovered algorithmically from data measured in a single CCD exposure of a single-shot ptychographic microscope. The frame rate is determined by the light source (a burst of pulses), and it is largely uncoupled from the microscope spatial resolution, which can be sub-wavelength. Also important, TIMP yields movies of both the amplitude and phase dynamics of the imaged object. It is simple and versatile, and thus it can be implemented across the electromagnetic spectrum, as well as with other waves.
I aim to develop TIMP-based microscopes in the visible, extreme-UV, and X-ray spectral regions, with Tera-scale frames-per-second frame rates and nanometric resolution. We will utilize these unprecedented imaging capabilities in applications including exploring ultrafast phase transitions, ultrafast dynamics in nanostructures, and tracking the spatiotemporal dynamics during passive mode-locking build-up in lasers and Kerr micro-resonators.
This program, if successful, will bring the field of imaging into a new era, where ultrafast dynamics of non-repetitive transient complex-valued objects can be viewed at nanometric resolution.
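For orientation (simple arithmetic, not a claim made in the proposal), a Tera-scale frame rate implies picosecond spacing between consecutive frames, which is why the frame rate is set by a burst of ultrafast pulses within a single exposure rather than by camera readout:
\[
  \Delta t = \frac{1}{f} = \frac{1}{10^{12}\ \mathrm{s^{-1}}} = 1\ \mathrm{ps},
\]
so recording N frames within one CCD exposure requires a burst of N pulses separated by roughly a picosecond.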
Max ERC Funding
2 381 700 €
Duration
Start date: 2019-03-01, End date: 2024-02-29
Project acronym TMIHCV
Project Microfabrication-Based Rational Design of Transcriptional-Metabolic Intervention for the Treatment of Hepatitis C Virus (HCV) Infection
Researcher (PI) Yaakov Nahmias
Host Institution (HI) THE HEBREW UNIVERSITY OF JERUSALEM
Call Details Starting Grant (StG), LS9, ERC-2009-StG
Summary Hepatitis C Virus (HCV) infection affects over 3% of the world population and is the leading cause of chronic liver disease worldwide. Current treatments are effective in only 50% of the cases and are associated with significant side effects. Therefore, there is a pressing need for the development of alternative treatments. Recently, our group and others demonstrated that the HCV lifecycle is critically dependent on host lipid metabolism. In this context, we demonstrated that the grapefruit flavonoid naringenin blocks HCV production through PPARα and LXRα, transcriptional regulators of hepatic lipid metabolism. While these results are promising, our ability to rationally control metabolic pathways in infected cells is limited due to an incomplete understanding of the regulation of hepatic metabolism by its underlying transcriptional network. This project aims to develop a comprehensive model of hepatic metabolism by integrating metabolic fluxes with transcriptional regulation, enabling the rational design of transcriptional interventions that will minimize HCV replication and release. Our approach is to develop two microfabricated platforms that will enable high-throughput data acquisition and human-relevant screening. One component is the Transcriptional Activity Array (TAA), a microdevice for the high-throughput temporal acquisition of transcriptional activity data. The second is the Portal Circulation Platform (PCP), which integrates an intestinal absorption module with a liver metabolism compartment, enabling high-throughput, human-relevant screening of treatments as a substitute for animal experiments. This work will lead to the development of novel drug combinations for the treatment of HCV infection and impact the treatment of diabetes, obesity, and dyslipidemia.
Max ERC Funding
1 994 395 €
Duration
Start date: 2010-04-01, End date: 2015-03-31
Project acronym Tolerome
Project Evolution of antibiotic tolerance in the 'wild': A quantitative approach
Researcher (PI) Nathalie Balaban
Host Institution (HI) THE HEBREW UNIVERSITY OF JERUSALEM
Call Details Consolidator Grant (CoG), LS8, ERC-2015-CoG
Summary The bacterial ability to evolve strategies for evading antibiotic treatment is a fascinating example of an evolutionary process, as well as a major health threat. Despite efforts to understand treatment failure, we lack the means to prevent the evolution of resistance when a new drug is released to the market. Most efforts are directed towards understanding the mechanisms of antibiotic resistance. Whereas ‘resistance’ is due to mutations that enable microorganisms to grow even at high concentrations of the drug, ‘tolerance’ is the ability to sustain a transient treatment, for example by entering a mode of transient dormancy. The importance of tolerance in the clinic has not been investigated as thoroughly as resistance. The presence of tolerant bacteria is not detected in the clinic because of the inherent difficulty of tracking dormant bacteria that often constitute only a minute fraction of the bacterial population. I hypothesize that bacterial dormancy may evolve quickly in the host under antibiotic treatment. This hypothesis is strengthened by our recent results demonstrating the rapid evolution of dormancy leading to tolerance in vitro, and by the increasing number of cases of treatment failure in the clinic not explained by resistance. My goal is to develop a multidisciplinary approach to detect, quantify and characterize tolerant bacteria in the clinic. Using my background in quantitative single-cell analyses, I will develop microfluidic devices for the rapid detection of tolerant bacteria in the clinic, and systems-biology tools to isolate and analyze dormant sub-populations directly from clinical isolates. I will search for the genetic mutations leading to tolerance and thereby build what I term here the ‘tolerome’. The results will be analyzed in a mathematical framework of tolerance evolution. This approach should reveal the role of tolerance in the clinic and may lead to a paradigm shift in the way bacterial infections are characterized and treated.
Max ERC Funding
1 978 750 €
Duration
Start date: 2016-05-01, End date: 2021-04-30
Project acronym TOPCHARM
Project The LHC Battle for Naturalness on the Top Charm Front
Researcher (PI) Gilad Perez
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Consolidator Grant (CoG), PE2, ERC-2013-CoG
Summary Now that a Higgs-like particle has been discovered, naturalness becomes the most pressing and fundamental question within the reach of the Large Hadron Collider (LHC). The main contribution that destabilises the electroweak scale comes from a top-quark loop. The key to addressing the naturalness problem is thus identifying the top partners that stabilize the weak scale.
We consider the following two general possibilities in which the partners have escaped detection:
(I) the top partners are light but elusive, which calls for a theoretical explanation as well as for new, innovative experimental techniques for signal exhumation;
(II) the partners are relatively heavy, and the signal would consist of energetic (boosted) tops from the decay of their partners. I plan to carry out a comprehensive research program designed to attack these challenges, and I believe that I am uniquely prepared to do this. Regarding (I), as proven below, current searches have not considered the impact of non-trivial flavor physics, e.g. splitting between the first two generation partner masses, as well as the mixing between the top partners and other flavors. The consequences are: (i) significantly weaker mass bounds on some of the partners (e.g. the scharm, the supersymmetric partner of the charm quark); (ii) improved naturalness, as even the stops (or fermion partners) can be lighter; and (iii) modified Higgs rates in composite models.
Regarding (II), with collaborators I was the first to understand the difficulties of dealing with highly boosted top jets, and since then I have been intensively involved in developing theoretical as well as novel techniques to study them, including coauthoring two important papers with the CDF and ATLAS collaborations.
To uncover these new possibilities, expertise in collider phenomenology and flavor physics is required. I have a proven record in these frontiers and thus, given the required support, am well positioned to pursue this quest to save naturalness.
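For context (a textbook one-loop estimate, not a result from the proposal), the top-quark loop mentioned above gives the dominant contribution destabilising the Higgs mass parameter; with top Yukawa coupling y_t and an ultraviolet cutoff Λ it reads
\[
  \delta m_H^2 \simeq -\frac{3\,y_t^2}{8\pi^2}\,\Lambda^2 ,
\]
which is why partners that cancel this contribution, such as stops or fermionic top partners, are the natural first place to look when testing naturalness at the LHC.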
Max ERC Funding
1 434 154 €
Duration
Start date: 2014-07-01, End date: 2019-06-30
Project acronym TopFront
Project Expanding the Topological Frontier in Quantum Matter: from Concepts to Future Applications
Researcher (PI) Netanel Hanan Lindner
Host Institution (HI) TECHNION - ISRAEL INSTITUTE OF TECHNOLOGY
Call Details Starting Grant (StG), PE3, ERC-2014-STG
Summary Topological phases arise from a fascinating interplay between quantum mechanics and many-body physics. They exhibit an abundance of extraordinary properties, such as protected edge and surface modes, exotic particle statistics, and non-local correlations. These make them not only scientifically stimulating, but also appealing for ground-breaking future applications, such as quantum computing using non-Abelian systems. Their subtle nature often renders them hard to study theoretically, and even more so to detect and control experimentally. To date, only a small subset of them has been accessed in experiments. The purpose of this research program is to expand the scope of possible realizations of topological quantum matter, and to develop methods to detect, control and manipulate them. Two main research directions will be considered. The first will focus on utilizing defects to synthesize new non-Abelian systems. We will study the mathematical theory describing the defects, starting from microscopic considerations and aiming to achieve a unifying mathematical framework. New non-Abelian phases arising in networks of coupled defects will be explored. Protocols for controlling non-Abelian anyons and zero modes will be developed and optimized, aiming to minimize errors arising from imperfections in physical implementations. The second direction will explore the exciting possibility of inducing topological behaviour in non-equilibrium systems. Periodically driven systems, such as matter interacting with light, can exhibit anomalous topological phenomena with no analogue in static systems, which we intend to reveal and classify. We will study the unique many body physics arising from the interplay of topological Bloch-Floquet band structures, inter-particle interactions, and coupling to the environment. Finally, for both research directions we will consider possible experimental realizations in a variety of solid state and cold atom systems along with designated probes.
Max ERC Funding
1 500 000 €
Duration
Start date: 2015-09-01, End date: 2020-08-31
Project acronym Topo Ins Laser
Project Topological Insulator Laser
Researcher (PI) Mordechay SEGEV
Host Institution (HI) TECHNION - ISRAEL INSTITUTE OF TECHNOLOGY
Call Details Advanced Grant (AdG), PE2, ERC-2017-ADG
Summary Triggered by ideas from condensed-matter physics, a new frontier recently emerged: Photonic Topological Insulators (PTIs). These are photonic structures where the transport of light is topologically protected: light propagates in a unidirectional manner without reflection, even in the presence of corners, defects, or disorder. The first step toward PTIs was the electromagnetic analogue of the quantum Hall effect, employing magnetic fields in gyro-optic media. Bringing the concepts of topological insulators into photonics required fundamentally different effects, eluding researchers until, in 2013, we demonstrated the first PTI. That, along with experiments in silicon photonics and pioneering theory work, launched the field of Topological Photonics.
This proposal aims to explore the possibility of the “next big thing”, a fundamentally new concept, never suggested before in any context, with high potential impact on fundamentals and on lasers technology: we will explore the idea of the Topological Insulator Laser.
Topological Insulator Lasers are lasers where the lasing mode is topologically protected: light propagates around the cavity unaffected by disorder and defects. Based on our preliminary studies, we envision that by lasing in a topological mode, the interplay between the topology and gain will lead to a highly efficient laser, robust to defects and disorder, that lases in a single mode even at high gain values.
The road to achieve this goes against current knowledge: topological insulators are linear Hermitian closed systems, whereas the topological insulator laser is a non-Hermitian, highly nonlinear, open system.
Our study will be theoretical and experimental, starting at the fundamentals of topological transport in systems with gain, and we will take it all the way to experimentally demonstrate the concepts in several different platforms.
The idea of the Topological Insulator Laser is unique: success will mark a new milestone in optics and topological physics.
Max ERC Funding
1 864 000 €
Duration
Start date: 2018-06-01, End date: 2023-05-31
Project acronym TOPO-NW
Project VISUALIZATION OF TOPOLOGICAL STATES IN PRISTINE NANOWIRES
Researcher (PI) Haim Beidenkopf
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Starting Grant (StG), PE3, ERC-2015-STG
Summary Topological phases of matter have been at the center of intense scientific research. Over the past decade this has led to the discovery of dozens of topological materials with exotic boundary states. In three-dimensional topological phases, scanning tunneling microscopy (STM) has been instrumental in unveiling the unusual properties of these surface states. This success, however, did not encompass lower-dimensional topological systems. The main reason is surface contamination, which is disruptive both for STM and for the fragile electronic states. We propose to study topological states of matter in pristine epitaxial nanowires by combining growth, fabrication and STM, all in a single modular ultra-high-vacuum space. This platform will uniquely allow us to observe well-anticipated topological phenomena in one dimension, such as the Majorana end-modes in semiconducting nanowires. On a broader view, the nanowire configuration intertwines dimensionality and geometry with topology, giving rise to novel topological systems with high tunability. A vivid instance is given by topological crystalline insulator nanowires, in which the topological symmetry protection can be broken by a variety of perturbations. We will selectively tune the surface-state band structure and study the local response of massless and massive surface Dirac electrons. Tunability provides a higher degree of control. We will utilize this to realize topological nanowire-based electronic and spintronic devices, such as a Z2 pump and a spin-based Mach-Zehnder interferometer for Dirac electrons. The low dimensionality of the nanowire, alongside various singularities in the electronic spectra of different topological phases, enhances interaction effects, serving as a cradle for novel correlated topological states. This new paradigm of topological nanowires will allow us to elucidate deep notions in topological matter as well as to explore new concepts and novel states, thus providing ample experimental prospects.
Max ERC Funding
1 750 000 €
Duration
Start date: 2016-01-01, End date: 2020-12-31
Project acronym TORMCJ
Project Thermal, optical and redox processes in molecular conduction junctions
Researcher (PI) Abraham Nitzan
Host Institution (HI) TEL AVIV UNIVERSITY
Call Details Advanced Grant (AdG), PE4, ERC-2008-AdG
Summary Much of the current intense study of molecular conduction junctions is motivated by their possible technological applications; however, this research focuses on fundamental questions associated with the properties and operation of such systems. Junctions based on redox molecules often show non-linear conduction behavior as a function of the imposed bias. Optical interactions in molecular junctions pertain to junction characterization and control. Issues of heating and thermal stability require a proper definition of thermal states (effective temperature) and an understanding of heat production and thermal conduction in non-equilibrium junctions. This proposal focuses on theoretical problems pertaining to these phenomena, with the following goals: (a) develop theoretical methodologies for treating non-equilibrium molecular systems under the combined driving of electrical bias, thermal gradients and optical fields; (b) provide the theoretical tools needed for the understanding and interpretation of new and ongoing experimental efforts involving thermal, optical and redox (charging) phenomena in molecular junctions; and (c) use the acquired insight to suggest new methods for characterization, functionality, control and stability of molecular junctions.
Max ERC Funding
842 420 €
Duration
Start date: 2008-12-01, End date: 2014-05-31
Project acronym TRACTAR
Project Tracking and Targeting a T-DNA Vector for Precise Engineering of Plant Genomes
Researcher (PI) Avraham Albert Levy
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Advanced Grant (AdG), LS9, ERC-2010-AdG_20100317
Summary DNA introduced into a cell usually integrates, if at all, at random in the genome. In order for gene targeting to take place, a small vector must scan a huge genome that is packaged in chromatin, identify and bind the target, and engage in strand exchange. This formidable task is likely to be rate limiting. Our goal is to study the process of genome scanning by the vector, to track it from the time of transformation through genome integration, and to assist the vector in identifying the homologous target. Our tools are particle imaging and tracking, molecular analysis of integration events, and manipulation of the integration process through protein recognition chemistry. Two main approaches will be used to assist homologous integration: first, protein bridging (proteins that would bind both the target and the vector), and second, chromatin remodeling. We also propose to analyze the connection between chromatin structure and DNA integration. We will analyze how nucleosome positioning affects patterns of DNA integration. In addition, we will stimulate chromatin remodeling in an attempt to facilitate target invasion by the incoming vector. Parallel assays will be built upon fluorescence and genetic markers to correlate the mode of search with integration per se. The interdisciplinary use of biophysics, genetics, and computational tools opens the prospect of better understanding and manipulating the fundamental mechanisms involved in DNA mobility, plant transformation, and gene targeting.
Max ERC Funding
1 958 408 €
Duration
Start date: 2011-05-01, End date: 2016-04-30
Project acronym TRANSCRIPTION_REG
Project A combined experimental and computational approach for quantitative and mechanistic understanding of transcriptional regulation
Researcher (PI) Eran Segal
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Starting Grant (StG), LS2, ERC-2007-StG
Summary The complex functions of a living cell are carried out through the coordinated activity of many genes. Since transcription is a key step in establishing such coordinated activity, much effort has been devoted to its study, and tremendous progress has been made in identifying many of the transcription factors and regulatory DNA elements involved in the regulation of specific systems. However, very few attempts have been made to go beyond these phenomenological and qualitative descriptions. Consequently, we are far from a quantitative and predictive understanding of transcriptional regulation. Through this program, I aim to develop a mechanistic understanding of transcriptional regulation, and for the first time model the entire process. We wish to go much beyond identifying and qualitatively describing the involved components, and arrive at a quantitative understanding of how transcriptional programs are encoded in DNA sequences. To this end, my team and I will first work to mechanistically understand various building blocks of the transcriptional system, including: mechanisms of activation and repression; binding cooperativity; binding competition; the interplay between transcription factors and chromatin; architectural features of promoters that are important for their function; and the transcription functions “computed” by promoters. Since existing data are clearly insufficient for addressing such questions, I have opened an experimental lab and begun to assemble a multidisciplinary team of scientists whose expertise spans the experimental biology, computer science, physics, statistics, and mathematics disciplines, and who will work synergistically to generate the appropriate data, analyze it, and use it to construct and experimentally validate models for the above transcriptional building blocks. We will then integrate all the insights gained into unified and quantitative models that should significantly enhance our understanding of the mechanistic workings of transcriptional regulation.
Max ERC Funding
1 005 600 €
Duration
Start date: 2008-07-01, End date: 2013-06-30
Project acronym TRANSFORM OPTICS
Project Transformation optics: cloaking, perfect imaging and horizons
Researcher (PI) Ulf Leonhardt
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Advanced Grant (AdG), PE2, ERC-2012-ADG_20120216
Summary Transformation optics grew out of ideas for invisibility cloaking devices and exploits connections between electromagnetism in media and in geometries. Invisibility has turned from fiction into science since 2006, but is far from being practical yet. Advances in the theory of transformation optics are the key to bringing invisibility closer to practicality. Probably the most important practical application of connections between media and geometries is perfect imaging, the ability to optically transfer images with a resolution not limited by the wavelength. This is because imaging lies at the heart of photolithography, the key technology used for making electronic chips. On the other hand, probably the intellectually most important application of connections between media and geometries lies in the quantum physics of the event horizon, which, for the first time, could be studied in the laboratory. The objective of this proposal is to make significant breakthroughs in (1) moving cloaking from frontier research closer to practicality, (2) turning perfect imaging into a viable technology, and (3) demonstrating the quantum physics of the event horizon in the laboratory. This project is at the cutting edge of a global communal effort in the research of metamaterials. The overarching theme of the project is to make abstract and seemingly fantastic ideas practical, by combining ideas from geometry and general relativity with the latest advances in optical metamaterials and integrated and ultrafast photonics.
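For orientation (the standard diffraction-limit relation, not a statement from the proposal), conventional lenses cannot resolve features much smaller than
\[
  d \approx \frac{\lambda}{2\,\mathrm{NA}}, \qquad \mathrm{NA} = n \sin\theta ,
\]
roughly half the wavelength in air; "perfect imaging" refers to transferring images with resolution beyond this bound, which is what makes it directly relevant to photolithography.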
Max ERC Funding
2 495 399 €
Duration
Start date: 2013-03-01, End date: 2018-02-28
Project acronym TransgenerationalRNA
Project RNA-Mediated Inheritance of Acquired Traits
Researcher (PI) Oded Rechavi
Host Institution (HI) TEL AVIV UNIVERSITY
Call Details Starting Grant (StG), LS2, ERC-2013-StG
Summary Inheritance of acquired traits is a topic of long-standing interest and controversy. While some of the classic Lamarckian ideas have been dismissed, more recent observations suggest that certain characteristics acquired by an animal during its lifetime might be transmitted to the next generations. Recently I described, for the first time in animals, a biological context in which acquired traits are inherited via small RNA molecules, which ignore the boundary between the soma and the germ line (the “Weismann barrier”). Specifically, I showed that the nematode C. elegans inherits an acquired trait, antiviral resistance, through transgenerational transmission of antiviral small RNAs (viRNAs), which mediate RNA interference (RNAi) (Cell, 2011). viRNAs, which protect the worm from viral propagation, pass down to many ensuing generations in a non-Mendelian manner, in the absence of their DNA template, and thus defend RNAi-deficient progeny from viral propagation. Here I propose to define the rules that govern RNA-mediated transgenerational inheritance of acquired traits and to explore its contribution to the genetics of complex traits. My first efforts will be directed towards elucidating the mechanism behind transgenerational transmission of small RNAs; I have established a well-defined system for monitoring transgenerational silencing that should allow unveiling of its genetic and biochemical basis. Second, I will examine whether responses to relevant environmental stresses carry over to the next generations so that the progeny is better prepared to cope with similar conditions. Lastly, I will explore whether sensing of environmental cues by the nervous system drives the biogenesis of small RNAs that transfer transgenerationally and mediate inheritance of neuronally encoded traits. While the idea that RNA encodes an “inherited memory” sounds heretical at first, my preliminary efforts suggest that inherited small RNAs may indeed transmit information about ancestral acquired experiences.
Max ERC Funding
1 500 000 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym TranslationRegCode
Project Cracking the Translation Regulatory Code
Researcher (PI) Reut Gitit Shalgi
Host Institution (HI) TECHNION - ISRAEL INSTITUTE OF TECHNOLOGY
Call Details Starting Grant (StG), LS2, ERC-2015-STG
Summary Organisms across all kingdoms share several systems that are essential to life, one of the most central being protein synthesis. Living in a continuously changing environment, cells need to constantly respond to various environmental cues and change their protein landscape. In extreme cases, cells globally shut down protein synthesis and upregulate stress-protective proteins.
Mechanisms of translational repression or selective enhancement of stress-induced proteins have been characterized, but their effects were demonstrated on an individual mRNA basis. Which target mRNAs are translationally regulated in response to different environmental cues, and what are the cis-regulatory elements involved, largely remain open questions. Using ribosome footprint profiling, I recently discovered a novel mode of translational control in stress, underscoring the potential of new technologies to uncover novel regulatory mechanisms. But while transcription cis-regulatory elements have been thoroughly mapped in the past decade, and splicing regulatory elements are accumulating, the identification of translation cis-regulatory elements is lagging behind.
Here I propose to crack the mammalian translation regulatory code, and close this long-standing gap. I present a novel interdisciplinary framework to comprehensively identify translation cis-regulatory elements, and map their mRNA targets in a variety of cellular perturbations. Importantly, we plan to explore mechanisms underlying novel cis-regulatory elements, and create the first genome-wide functionally annotated translation regulatory code.
The translation regulatory code will map targets of existing mechanisms and shed light on newly identified pathways that play a role in stress-induced translational control. The proposed project is an imperative stepping stone to understanding translational regulation by cis-regulatory elements, opening new avenues in the functional genomics research of translational control.
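Since ribosome footprint profiling is central to the approach, a minimal sketch of the standard per-gene translation-efficiency readout it yields is given below; this is a generic illustration under assumed column names and a toy pseudocount, not the applicant's actual analysis pipeline.

```python
import numpy as np
import pandas as pd

def translation_efficiency(counts: pd.DataFrame, pseudocount: float = 1.0) -> pd.Series:
    """Per-gene translation efficiency (TE) as the log2 ratio of ribosome-footprint
    density to mRNA density, each normalized to reads per million.
    Expects columns 'footprint_reads' and 'mrna_reads', indexed by gene."""
    fp = counts["footprint_reads"] + pseudocount
    mr = counts["mrna_reads"] + pseudocount
    fp_rpm = fp / fp.sum() * 1e6   # footprint reads per million
    mr_rpm = mr / mr.sum() * 1e6   # mRNA reads per million
    return np.log2(fp_rpm / mr_rpm)

# Toy example with made-up counts for three genes
toy = pd.DataFrame(
    {"footprint_reads": [500, 20, 300], "mrna_reads": [100, 200, 300]},
    index=["geneA", "geneB", "geneC"],
)
print(translation_efficiency(toy))
```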
Max ERC Funding
1 587 500 €
Duration
Start date: 2016-03-01, End date: 2021-02-28
Project acronym TRANSRIGHTS
Project Gender citizenship and sexual rights in Europe: transgender lives from a transnational perspective
Researcher (PI) Sofia Isabel Da Costa D'aboim Inglez
Host Institution (HI) INSTITUTO DE CIENCIAS SOCIAIS
Call Details Consolidator Grant (CoG), SH2, ERC-2013-CoG
Summary "The TRANSRIGHTS project investigates transgender lives and the institutional apparatus that frames them. Rather than focusing exclusively on self displayed identities, four lines of inquiry will be developed. Firstly, gender politics and sexual rights are analyzed as the opposition between politics of equality and of difference is unable to provide answers for the inclusion of trans-people. Secondly, by comparing the lives of trans-people in five European countries – Portugal, France, United Kingdom, the Netherlands and Sweden – we wish to attain an overview of how institutional frameworks impact on these lives. Thirdly, our approach will take into account the immigration of trans-individuals to Europe, whether in search for recognition or as a way of survival often leading to sex work. Fourthly, by comparing different countries, different groups of transgender people, different forms of attaining inclusion or dealing with exclusion, different conceptions of gender citizenship and sexual rights, we wish not only to gain a deeper understanding of societal change and its impact on the lives of transgender individuals, but also to identify the gaps between policies and rights and the categories actually mobilized for self-identification. Such a task implies examining the voices of trans-people, the effect of policies on the materiality of lives as well as conceptualizations of selfhood that do not necessarily confine to the European context. Project outputs will contribute to the fields of gender, sexuality and citizenship by providing a grounded theoretical debate, discussing the gender categories of citizenship. Trans-people are a heterogeneous group that represents one of the most challenging boundaries for framing this debate within and beyond Europe. The voices of trans-people are essential to avoid an excessive reduction of lives to institutional categories, whether from the institutional apparatus, the LGBT movements or the social sciences."
Max ERC Funding
1 262 943 €
Duration
Start date: 2014-09-01, End date: 2019-08-31
Project acronym TRAPLAB
Project Lab Based Searches for Beyond Standard Model Physics Using Traps
Researcher (PI) Guy RON
Host Institution (HI) THE HEBREW UNIVERSITY OF JERUSALEM
Call Details Starting Grant (StG), PE2, ERC-2016-STG
Summary In this project I will measure a critical constant (beta-nu correlation) of the standard model to a precision of at least 0.1%, an order of magnitude improvement over the state of the art. The project will provide a platform for beyond standard-model (BSM) explorations, based on modern atom/ion trapping and a new accelerator facility.
High-precision measurements of beta decay correlations in trapped radioactive atoms and ions are among the most precise tools with which to search for BSM physics. The recently published US National Science Advisory Council 2015 Long Range Plan states: “Measurements of the decays of neutrons and nuclei provide the most precise and sensitive characterization of the charge-changing weak force of quarks and are a very sensitive probe of yet undiscovered new forces. In fact, weak decay measurements with an accuracy of 0.1% or better provide a unique probe of new physics at the TeV energy scale”. Ne and He isotopes are particularly attractive due to calculable SM values, high sensitivity to several manifestations of BSM physics, ease of production, and lifetimes in the useful range for such experiments.
This program combines a Magneto-Optical Trap (MOT) and an Electrostatic Ion Beam Trap (EIBT) to perform a high-precision, competitive measurement of correlations in the decay of such nuclei. The MOT program focuses on the neon isotopes, for which existing measurements are of insufficient quality and which have unique sensitivities to aspects of BSM physics. The EIBT program focuses on measurements using 6He (where a comparison with existing measurements is of great import) and the aforementioned neon isotopes, allowing a direct comparison between the two systems within the same facility (a unique worldwide capability). The combination of these methods will allow an extraction of the beta-nu coefficient to the 0.1% level, making this proposal a forerunner in the field and providing a significant advance over the current set of world data.
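For orientation, the beta-nu correlation targeted here is the coefficient a of the standard Jackson-Treiman-Wyld parametrization of the allowed beta-decay rate (quoted as general background, not as the project's own formalism):
\[
\frac{d^3 W}{dE_e\, d\Omega_e\, d\Omega_\nu} \;\propto\;
p_e\, E_e\, (E_0 - E_e)^2
\left[\, 1 \;+\; a\,\frac{\vec{p}_e \cdot \vec{p}_\nu}{E_e E_\nu} \;+\; b\,\frac{m_e}{E_e} \,\right],
\]
where deviations of a (or a non-zero Fierz term b) from Standard Model expectations would signal scalar or tensor contributions to the weak interaction.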
Max ERC Funding
1 297 813 €
Duration
Start date: 2016-12-01, End date: 2021-11-30
Project acronym TREND
Project Transparent and flexible electronics with embedded energy harvesting based on oxide nanowire devices
Researcher (PI) Pedro CANDIDO BARQUINHA
Host Institution (HI) NOVA ID FCT - ASSOCIACAO PARA A INOVACAO E DESENVOLVIMENTO DA FCT
Call Details Starting Grant (StG), PE8, ERC-2016-STG
Summary The Internet of Things is shaping the evolution of the information society, requiring an increasing number of objects with embedded electronics, sensors and connectivity. This spurs the need for systems where, in addition to performance and low cost, multifunctionality has to be assured. In this context, TREND aims to take transparent electronics to as-yet unexplored levels of integration, by combining on flexible substrates transparent and high-speed nanocircuits with energy harvesting capabilities, all based on multicomponent metal oxide nanowires (NWs). To this end, sustainable and recyclable materials such as ZnO, SnO2, TiO2 and Cu2O will be synthesized in different forms of heterostructured NWs, using low-temperature and low-cost solution processes. For precise positioning, NWs will be grown directly on flexible substrates using seed layers patterned by nanoimprint lithography. This will be crucial for integration in different nanotransistor structures, which will be combined into digital/analog nanocircuits following planar and 3D approaches. Energy will be provided by piezoelectric nanogenerators with innovative structures and materials. The final platform of nanocircuits and nanogenerators will make use of NW interconnects, bringing a new dimension to the systems-on-foil concept.
The research will be carried out at FCT-UNL, in a group pioneering transparent electronics. My PhD on oxide materials/devices and proven expertise in circuit integration, oxide nanostructure synthesis and nanofabrication/characterization tools will be a decisive contribution to the implementation of the proposal. TREND is an ambitious multidisciplinary project motivating advances in materials science, engineering, physics and chemistry, with impact extending from consumer electronics to health-monitoring wearable devices. By promoting new ideas for practical ends, it will contribute to placing Europe in the leading position in such strategic areas, where sustainability and innovation are key factors.
Max ERC Funding
1 500 000 €
Duration
Start date: 2017-01-01, End date: 2021-12-31
Project acronym TReX
Project Transient Relativistic eXplosions
Researcher (PI) Tsvi PIRAN
Host Institution (HI) THE HEBREW UNIVERSITY OF JERUSALEM
Call Details Advanced Grant (AdG), PE9, ERC-2015-AdG
Summary Recent and upcoming deep, large field-of-view surveys have given transient sources an ever-increasing role in 21st-century astronomy. We propose to explore three relativistic transients: compact binary mergers; stellar disruptions by massive black holes (TDEs); and gamma-ray bursts (GRBs). Mergers are the prime targets of advanced gravitational wave (GW) detectors. Their detection will open a new window on the Universe. However, localization based on electromagnetic (EM) counterparts, which we propose to study here, is essential for GW astronomy. TDEs provide a novel view of galactic centers’ massive black holes. However, TDE observations pose some puzzles, suggesting that a revision of the current tidal disruption theory is needed. New observations provide a wealth of data on GRBs, and this is the time to determine their inner workings and to obtain a clear model for the prompt emission mechanism – a long-standing puzzle. This project includes theoretical modeling of these events as well as phenomenology of the observations and even some data analysis and observations. Mergers, TDEs and GRBs are tightly interconnected and share similar physical mechanisms. The theory of merger radio flares and of TDE radio emission draws, for example, on GRB afterglow theory, and the interpretation of TDE high-energy emission is based on concepts borrowed from the prompt emission of GRBs. A coordinated theoretical study will reveal and utilize the commonalities of these phenomena and has strong potential to obtain far-reaching results beyond the current state of the art, with possible implications for other high-energy astrophysical phenomena. While this is a theoretical proposal, we address directly observational issues at all stages. Hence the proposal is closely related to observations – interpreting existing puzzling observations, predicting new ones, or suggesting strategies for how to obtain them.
Max ERC Funding
1 449 375 €
Duration
Start date: 2016-10-01, End date: 2021-09-30
Project acronym TRNAPROLIF
Project Control of translation efficiency in proliferating and differentiated mammalian cells
Researcher (PI) Yitzhak Pilpel
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Consolidator Grant (CoG), LS2, ERC-2013-CoG
Summary Translation maps the transcriptome onto the proteome. It is regulated at the initiation level and also by mRNA secondary structure, and – our current focus – by the interplay between the mRNA and tRNA pools. Although a lot is known about the mechanics of translation, its effects on physiology, particularly on proliferation and differentiation in mammals, present major open questions.
tRNAProlif offers an approach that consists of genome-wide measurements and analyses of the tRNA and mRNA pools, and the interplay between them, in proliferative and differentiated cells. The project will explain how changes in translation affect, and are affected by, these two states.
We rely on our preliminary results, which show a striking dichotomy in codon usage: genes involved in cellular proliferation have distinct codon usage compared to genes involved in differentiation and other multicellular processes. Further, the tRNA pool consists of two distinct sub-populations: tRNAs whose codons are enriched among the proliferation genes are induced in proliferation and cancer, whereas tRNAs whose codons are enriched in differentiation genes are repressed in proliferation and induced in differentiation. Towards understanding this “tRNA Switch”, we aim at:
Causality: We will determine whether the tRNA pool affects the proliferation/differentiation status of the cell. Conversely, we will determine the effects of the proliferation/differentiation status on the tRNA pool.
Regulation: We will establish the regulatory scheme that governs the “tRNA Switch”: we will determine the effects of transcriptional and post-transcriptional regulation on the tRNAs, depending on cell state. We will determine how the balance between the two tRNA sub-populations affects the proteome.
Evolution: We will conduct comparative genomics of regulation of tRNA availability and codon usage and its effect on physiology in multiple species.
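To make the codon-usage dichotomy described above concrete, a schematic sketch of how two gene sets can be compared by their pooled codon frequencies is given below; the gene sets, sequences and distance measure are illustrative placeholders, not the project's actual analysis.

```python
from collections import Counter
from itertools import product

CODONS = ["".join(c) for c in product("ACGT", repeat=3)]

def codon_frequencies(cds_sequences):
    """Relative codon frequencies pooled over a set of coding sequences."""
    counts = Counter()
    for seq in cds_sequences:
        seq = seq.upper()
        counts.update(seq[i:i + 3] for i in range(0, len(seq) - len(seq) % 3, 3))
    total = sum(counts[c] for c in CODONS) or 1
    return {c: counts[c] / total for c in CODONS}

def codon_usage_distance(gene_set_a, gene_set_b):
    """Euclidean distance between the codon-frequency vectors of two gene sets,
    e.g. proliferation- versus differentiation-associated genes."""
    fa, fb = codon_frequencies(gene_set_a), codon_frequencies(gene_set_b)
    return sum((fa[c] - fb[c]) ** 2 for c in CODONS) ** 0.5

# Toy example with placeholder coding sequences
proliferation_genes = ["ATGGCTGCTAAA", "ATGGCCGCCAAG"]
differentiation_genes = ["ATGGCAGCAAAA", "ATGGCGGCGAAG"]
print(codon_usage_distance(proliferation_genes, differentiation_genes))
```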
Max ERC Funding
1 540 000 €
Duration
Start date: 2014-04-01, End date: 2019-03-31
Project acronym TSGPs-of-CFSs
Project Role of Tumour Suppressor Gene Products of Common Fragile Sites in Human Diseases
Researcher (PI) Rami Aqeilan
Host Institution (HI) THE HEBREW UNIVERSITY OF JERUSALEM
Call Details Consolidator Grant (CoG), LS4, ERC-2015-CoG
Summary Common fragile sites (CFSs) are large chromosomal regions identified by conventional cytogenetics as sequences prone to breakage in cells subjected to replication stress. The interest in CFSs stems from their key role in DNA damage, resulting in chromosomal rearrangements. The instability of CFSs has been correlated with genome instability in precancerous lesions and during tumour progression. Two opposing views dominate the discussion regarding the role of CFSs. One school of thought suggests that genomic instability during cancer progression causes collateral damage to genes residing within CFSs, such as WWOX and FHIT. These genes are proposed to be unselected “passenger” mutations. The counter-argument is that deletions and other genomic alterations in CFSs occur early in cancer development. Cancer cells with deletions in genes that span CFSs are then selectively expanded due to loss of tumour suppressor functions such as protection of genome stability, coordination of the cell cycle or apoptosis.
Recent observations from my lab clearly suggest that gene products from CFSs play driver roles in cancer transformation. Moreover, we have evidence for the involvement of DNA damage and Wwox in pancreatic β-cells in the context of diabetes. Here, I propose to investigate the role of tumour suppressor gene products (TSGPs) of CFSs in human diseases. Three approaches will be taken to tackle this question. First, molecular functions of TSGPs of CFSs will be determined using state-of-the-art genetic tools in vitro. Second, novel transgenic mouse tools will be used to study CFSs and their associated TSGs in preneoplastic lesions and tumours in vivo, with confirmatory studies in human material. Third, we will examine the potential involvement of CFSs and their TSGPs in type-2 diabetes (T2D).
The expected outcome is a detailed molecular understanding of CFSs and their associated TSGPs in genomic instability as well as their roles in cancer and metabolic diseases.
Max ERC Funding
2 000 000 €
Duration
Start date: 2016-05-01, End date: 2021-04-30
Project acronym Tumor microbiome
Project The tumor microbial communities: Characterization, effects and translational opportunities
Researcher (PI) Ravid STRAUSSMAN
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Consolidator Grant (CoG), LS4, ERC-2018-COG
Summary The human body is host to a huge number of bacteria. While bacteria were first detected in human tumors over 150 years ago, our knowledge about their number, identity or effects in different human tumor types is still mostly rudimentary.
Over the last few years we have focused on studying the tumor microbiome. We developed multiple methods to characterize and visualize intra-tumor bacteria, taking special measures to distinguish true tumor bacteria from contamination. Profiling over 1,800 human tumors and normal adjacent tissues demonstrated that bacteria are prevalent in many tumor types and that each tumor type has a unique microbial signature.
This application aims to capitalize on our cutting-edge research tools, vast experience and comprehensive preliminary data to shed light on the uncharted field of the tumor microbiome. We plan to:
(1) Characterize the different microbial communities (bacteria, fungi, others) and their dynamics in human tumors. We strive to provide a near-complete picture of how bacteria are distributed in the tumor, their association with immune cells, and their dynamic changes with tumor development, metastasis and drug treatments.
(2) Study the effects that microbial communities may have on tumor biology, focusing primarily on tumor microbiome-mediated drug resistance and on tumor microbiome effects on tumor-associated macrophages.
(3) Exploit intra-tumor microbial communities for translational opportunities and novel therapeutics, in particular as an adjunct to current mainstay therapy or delivering innovative therapeutics via live bacteria.
Thus, this project will transform our understanding of the structure and function of intra-tumor microbial communities and will pave the way for us, as well as many others, to translate our findings into novel therapeutic options for cancer patients.
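Since separating genuine tumor-resident taxa from reagent and environmental contamination is central to this kind of profiling, a crude prevalence-based sketch of such a filter is shown below; the threshold, data layout and the heuristic itself are assumptions for illustration, not the lab's actual decontamination procedure.

```python
import pandas as pd

def flag_likely_contaminants(taxa_counts: pd.DataFrame,
                             is_control: pd.Series,
                             prevalence_margin: float = 0.10) -> pd.Series:
    """Flag taxa whose detection prevalence in negative controls is at least as
    high (within a margin) as in tumor samples - a crude contamination heuristic.
    taxa_counts: rows = samples, columns = taxa; is_control: boolean per sample."""
    detected = taxa_counts > 0
    prev_samples = detected[~is_control].mean()   # fraction of tumor samples with the taxon
    prev_controls = detected[is_control].mean()   # fraction of negative controls with the taxon
    return prev_controls >= prev_samples - prevalence_margin

# Toy example: two taxa across three tumor samples and two reagent controls
counts = pd.DataFrame(
    {"taxonA": [10, 8, 12, 0, 0], "taxonB": [3, 0, 2, 5, 4]},
    index=["T1", "T2", "T3", "C1", "C2"],
)
controls = pd.Series([False, False, False, True, True], index=counts.index)
print(flag_likely_contaminants(counts, controls))
```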
Max ERC Funding
2 000 000 €
Duration
Start date: 2019-05-01, End date: 2024-04-30
Project acronym TUNNEL
Project Tunneling Spectroscopy in van-der-Waals Device
Researcher (PI) Hadar Steinberg
Host Institution (HI) THE HEBREW UNIVERSITY OF JERUSALEM
Call Details Starting Grant (StG), PE3, ERC-2014-STG
Summary I will expand the experimental reach of tunneling spectroscopy to new materials and device geometries. The technique is ideal for tackling two challenges: (i) Probing Andreev bound states and Majorana states in graphene and topological insulators (TIs) coupled to superconductors, and (ii) realizing momentum-conserving tunneling.
I will utilize a breakthrough in device fabrication to stack layered van-der-Waals materials, such as graphene and hexagonal Boron Nitride (hBN), to form vertical structures. Ultrathin layers of mechanically deposited dielectrics will be used as tunnel-barriers. These can interface any smooth surface, expanding the range of possible device-based tunneling systems.
A tunnel junction has decisive advantages over STM in access to lower temperatures and hence higher energy resolution. Significantly, the effort to probe the energy spectra of graphene and TIs coupled to superconductors is often resolution-limited. I will develop artificial-vortex devices and Josephson devices where induced spectra are expected to reveal the Majorana mode, a quantum state of unusual statistics sought as a platform for fault-tolerant quantum computation.
Using the same technology, I will develop devices where tunneling takes place between extended states. The aim is to realize momentum-resolved tunneling for μeV-resolution measurement of dispersions in graphene, other 2D systems, and smooth interfaces. Momentum control will be achieved using density tuning of the Fermi surfaces or using a parallel magnetic field. The high-resolution spectra will reveal details of interaction effects, manifested as modifications to the single-electron picture.
Carriers can be injected into a system with full control over their direction and energy – a powerful experimental knob, useful for injecting carriers using one electrode and extracting them in another. Such a geometry is sensitive to relaxation effects and will allow studies of out-of-equilibrium systems at unprecedented resolution.
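As background on the parallel-field knob mentioned above (the standard relation for tunneling between parallel 2D systems, stated generically rather than for these specific devices), an in-plane field B applied across a barrier of thickness d shifts the in-plane momentum transferred in tunneling by
\[
\Delta k \;=\; \frac{e\,B_{\parallel}\,d}{\hbar},
\]
so sweeping the parallel field (or tuning the carrier densities) scans the relative offset of the two Fermi surfaces and enables momentum-resolved spectroscopy.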
Max ERC Funding
1 499 875 €
Duration
Start date: 2015-05-01, End date: 2020-04-30
Project acronym UB12
Project Ergodic Group Theory
Researcher (PI) Uri Bader
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Starting Grant (StG), PE1, ERC-2012-StG_20111012
Summary "The aim of the proposed research is gaining a better understanding of locally compact groups and their lattices. Our tools are mainly ergodic theoretical.
We propose a variety of novel ideas that open new horizons for research.
The first meta-idea is the adoption of tools from the semi-simple theory in order to apply them to general locally compact groups. In particular, we suggest a construction of a “Weyl group” and an abstract definition of rank for every locally compact group. We are able to construct a “Coxeter complex” and we foresee a construction of a “building-like” object.
A second set of ideas concerns the category of measure equivalences, which is a natural generalization of the notion of a lattice in a group. This category has long been known to be a measurable counterpart of the better-studied category of quasi-isometries, yet it lacks a good definition of self measure equivalences of an object, analogous to the group of quasi-isometries.
We suggest such a definition, and propose to study it, among a variety of related constructions.
A full implementation of our ideas requires a better understanding of locally compact groups.
Thus, an important aspect of the proposed research is that it leaves plenty of room for the study of specific examples and test cases.
Max ERC Funding
1 150 000 €
Duration
Start date: 2012-09-01, End date: 2018-08-31
Project acronym Ubl-Code
Project Revealing the ubiquitin and ubiquitin-like modification landscape in health and disease
Researcher (PI) Yifat Haya Merbl
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE LTD
Call Details Starting Grant (StG), LS2, ERC-2015-STG
Summary Post-translational modifications (PTMs) of proteins are a major tool that the cell uses to monitor events and initiate appropriate responses. While a protein is defined by its backbone amino acid sequence, its function is often determined by PTMs, which specify stability, activity, or cellular localization. Among PTMs, ubiquitin and ubiquitin-like (Ubl) modifications have been shown to regulate a variety of fundamental cellular processes such as cell division and differentiation. Aberrations in these pathways have been implicated in the pathogenesis of cancer. Over the past decade, high-throughput genomic and transcriptional analyses have profoundly broadened our understanding of the processes underlying cancer development and progression. Yet proteomic analyses, and the PTM landscape in cancer, remain relatively unexplored.
Our goal is to decipher the molecular mechanisms of Ubl regulation in cancer. We will utilize the PTM profiling technology that I developed and further develop it to allow for subsequent MS analysis. Together with cutting-edge genomic, imaging and proteomic technologies, we will analyze novel aspects of PTM regulation at the level of the enzymatic machinery, the substrates and the downstream cellular network. We will rely on ample in-vitro and in-vivo characterization of Ubl conjugation to: (a) elucidate the regulatory principles of substrate specificity and recognition; (b) understand signalling dynamics in the ubiquitin system; and (c) reveal how aberrations in these pathways may lead to diseases such as cancer. Identifying both the Ubl-modifying enzymes and the modified substrates will form the basis for deciphering the molecular pathways in which they operate in the cell and the principles of their dynamic regulation. Revealing the PTM regulatory code presents a unique opportunity for the development of novel therapeutics. More broadly, our approaches may provide a new paradigm for addressing other complex biological questions involving PTM regulation.
Max ERC Funding
1 500 000 €
Duration
Start date: 2016-05-01, End date: 2021-04-30
Project acronym uDAS
Project An Automatic Microfluidic Device Assembly System
Researcher (PI) Doron Gerber
Host Institution (HI) BAR ILAN UNIVERSITY
Call Details Proof of Concept (PoC), PC1, ERC-2014-PoC
Summary Microfluidic systems, in general, have proven important platforms for biomedical assays. These systems benefit from reduced requirements for expensive reagents, short analysis times, and portability. Although microfluidic systems are convenient platforms, their use in the life sciences is still limited mainly due to the high-level fabrication expertise required for construction.
Integrated microfluidics is one of the most sophisticated three-dimensional (multi-layer) solutions. It requires soft lithography (PDMS-based chips) for the production of high-complexity microfluidic systems (multiple serial or parallel processes). Integrated microfluidics in particular is almost non-existent in industry due to the low yield and uncontrolled production process.
My ERC project (MUDLOC-2012) aims to develop a microfluidic platform for multidimensional protein array analysis. It uses complex multilayer microfluidic devices that consist of 2 PDMS layers and a glass microarray. The integrated microfluidic system contains thousands of micromechanical valves of micrometer dimensions, controlling thousands of parallel reactions. Our research demands the production of hundreds of such devices.
We, as all others who produce integrated microfluidics, suffered from a frustratingly low yield (15%). In order to improve fabrication yield and to fabricate devices with increased density, we designed and manufactured a first-of-its-kind, semi-automatic Microfluidic Device Assembly System (µDAS) covering the full production process sequence. This has resulted in a direct increase in device complexity and yield (85%) over the last half year.
The 2nd-generation automated µDAS prototype will become a generic assembly tool for soft lithography. µDAS will enable a critical production standard and process control, which will pave the way for significant penetration of complex integrated microfluidics technology into both academia and industry.
Max ERC Funding
150 000 €
Duration
Start date: 2015-04-01, End date: 2016-09-30
Project acronym ULTRAFASTEUVPROBE
Project Ultrafast EUV probe for Molecular Reaction Dynamics
Researcher (PI) Daniel Strasser
Host Institution (HI) THE HEBREW UNIVERSITY OF JERUSALEM
Call Details Starting Grant (StG), PE4, ERC-2012-StG_20111012
Summary "This research is aimed at developing and validating a novel approach for time resolved imaging of structural dynamics, using single photon Coulomb explosion imaging (CEI) with ultrafast extreme UV (EUV) pulses to probe laser initiated ultrafast structural rearrangement and fragmentation dynamics. The emerging field of ultrafast EUV pulses attracts increasing amount of scientific attention, predominantly concentrated on understanding aspects of the generation process, as well as on measuring record breaking attosecond pulses at increasingly high photon energies and photon flux. I propose to direct the unique properties of ultrafast EUV pulses towards time resolved studies of molecular reaction dynamics that are inaccessible with conventional ultrafast laser systems. Time resolved single photon CEI will make possible the visualization of complex dynamics in polyatomic systems; specifically, how laser driven electronic excitation couples into nuclear motion in a wide range of molecular systems. In contrast to earlier attempts, in which CEI was driven with intense near-IR pulses that can alter the observed dynamics, the proposed single photon CEI will remove the masking intense field effects and provide a simple and general probe. A comprehensive experimental effort is proposed - to conduct a direct comparison of intense field CEI to the proposed single EUV photon approach. Successful implementation of this research will endow us with a new way to visualize and understand the underlying quantum mechanisms involved in chemical reactions. With this new technology I hope to be able to provide unique insight into molecular fragmentation and rearrangement dynamics during chemical reactions and to resolve long standing basic scientific questions, such as the concerted or sequential nature of double proton transfer in DNA base-pair models. Finally, the ""table top"" techniques developed in my lab will mature and become applicable to the emerging ultrafast EUV user facilities."
Max ERC Funding
1 499 000 €
Duration
Start date: 2012-11-01, End date: 2018-10-31
Project acronym ULTRANMR
Project Ultrafast Hyperpolarized NMR and MRI in Multiple Dimensions
Researcher (PI) Lucio Frydman
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Advanced Grant (AdG), PE4, ERC-2009-AdG
Summary Multidimensional nuclear magnetic resonance (nD NMR) plays a unique role in Science as a primary tool for the characterization of biomolecules, as part of drug-discovery processes, and in clinical imaging (MRI). Further progress in NMR is hampered by this spectroscopy s low sensitivity, arising from the weak interactions that it involves. The prospects of solving this problem by continuing with incremental bigger machines approaches are poor, given the high maturity reached by existing technologies. The present Project deals with this issue by departing from traditional concepts, and relying on two incipient but highly promising developments in the field. One of these pertains ex situ dynamic nuclear hyperpolarization, an approach capable of eliciting liquid state NMR signals that surpass those afforded by the highest-field spectrometers by factors e10,000. While capable of providing super-signals hyperpolarization has the drawback of involving irreversible changes in the physical state of the sample. This makes it incompatible with nD NMR technologies, requiring the collection of multiple scans identical to one another except for systematic delay variations. As second component in this high-risk/high-gain Project we propose merging hyperpolarization with "ultrafast" methods that we have recently developed for completing arbitrary nD NMR/MRI acquisitions within a single scan. The resulting synergy could increase sensitivity by orders of magnitude, while demanding negligibly small amounts of spectrometer/scanner time to complete nD acquisitions. This should provide an ideal starting point for the analysis of a variety of organic and structural biology problems, and provide new tools to explore in vivo metabolism focusing on cancer biomarkers.
Max ERC Funding
2 499 780 €
Duration
Start date: 2010-03-01, End date: 2015-02-28
Project acronym UltraTherMicroscope
Project Ultra-sensitive Thermal Nanoscale Microscope
Researcher (PI) Elia Zeldov
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Proof of Concept (PoC), PC1, ERC-2014-PoC
Summary The research and commercialization of nanotechnology-based products require continuous development of advanced nanoscale inspection tools that drive the high-resolution microscopy markets. The state-of-the-art (SoA) microscopy provides a broad range of physical, spectroscopic, and materials characterization means, however, one of its key deficient ingredients is nanoscale thermal imaging – an essential tool for failure analysis and characterization of local heating and energy loss sources in high-density electronic nanodevices under operational conditions. The goal of this project is to provide a proof of concept (PoC) for a ground-breaking nanoscale thermal sensor and high-resolution scanning imaging system reaching thermal sensitivity with up to three orders of magnitude improvement over the existing SoA. The PoC aims at determining the technological feasibility and establishing commercialization potential for high sensitivity and high bandwidth nanoscale thermal imaging of operating devices in microelectronics, quantum computing, and novel materials industries. The project comprises benchmark demonstrations, patenting, development of relevant business models, and networking actions for successful commercialization. The novel sensor and microscopy system will serve as cost efficient application in materials research, failure analysis, and process control. It is thus expected to contribute to the competitiveness of Europe’s important Key Enabling Technology sector in Materials Science and Nanoindustries.
Max ERC Funding
150 000 €
Duration
Start date: 2015-04-01, End date: 2016-09-30
Project acronym UncertainENV
Project The Power of Randomization in Uncertain Environments
Researcher (PI) Shiri Chechik
Host Institution (HI) TEL AVIV UNIVERSITY
Call Details Starting Grant (StG), PE6, ERC-2018-STG
Summary Much of the research on the foundations of graph algorithms is carried out under the assumption that the algorithm has full knowledge of the input data.
In spite of the theoretical appeal and simplicity of this setting, the assumption that the algorithm has full knowledge does not always hold.
Indeed, uncertainty and partial knowledge arise in many settings.
One example is where the data is very large, in which case even reading the entire data once is infeasible, and sampling is required.
Another example is where data changes occur over time (e.g., social networks where information is fluid).
A third example is where processing of the data is distributed over computation nodes, and each node has only local information.
Randomization is a powerful tool in the classic setting of graph algorithms with full knowledge and is often used to simplify the algorithm and to speed up its running time.
However, physical computers are deterministic machines, and obtaining true randomness can be a hard task to achieve.
Therefore, a central line of research is focused on the derandomization of algorithms that rely on randomness.
The challenge of derandomization also arises in settings where the algorithm has some degree of uncertainty.
In fact, in many cases of uncertainty the challenge and motivation of derandomization are even stronger.
Randomization by itself adds another layer of uncertainty, because different results may be attained in different runs of the algorithm.
In addition, in many cases of uncertainty, randomization often comes with additional assumptions on the model itself, and therefore weakens the guarantees of the algorithm.
In this proposal I will investigate the power of randomization in uncertain environments.
I will focus on two fundamental areas of graph algorithms with uncertainty.
The first area relates to dynamic algorithms and the second area concerns distributed graph algorithms.
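For orientation, the sketch below shows a classic example of the randomized graph algorithms discussed above: Karger's contraction algorithm, which estimates the global minimum cut by repeatedly contracting uniformly random edges. This is a textbook illustration in Python, not a method proposed in this project; the function name and the toy graph are illustrative only.

```python
import random

def karger_min_cut(edges, trials=100):
    """Estimate the global minimum cut of a connected undirected multigraph,
    given as a list of edges, by repeated random edge contraction."""
    best = float("inf")
    vertices = {u for e in edges for u in e}
    for _ in range(trials):
        parent = {v: v for v in vertices}

        def find(v):
            # Union-find lookup with path halving.
            while parent[v] != v:
                parent[v] = parent[parent[v]]
                v = parent[v]
            return v

        remaining = len(vertices)
        while remaining > 2:
            u, v = random.choice(edges)   # pick a uniformly random original edge
            ru, rv = find(u), find(v)
            if ru != rv:                  # ignore self-loops, contract otherwise
                parent[ru] = rv
                remaining -= 1
        # Edges whose endpoints lie in different super-vertices cross the cut.
        cut = sum(1 for u, v in edges if find(u) != find(v))
        best = min(best, cut)
    return best

if __name__ == "__main__":
    # Two triangles joined by a single bridge edge: the true minimum cut is 1.
    graph = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
    print(karger_min_cut(graph))  # prints 1 with very high probability
```

Each independent trial succeeds with probability at least 2/(n(n-1)), so repeating the contraction many times makes the failure probability negligible; this simplicity and speed is exactly what derandomization must try to match.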
Max ERC Funding
1 500 000 €
Duration
Start date: 2019-10-01, End date: 2024-09-30
Project acronym Universal Banking
Project Universal Banking, Corporate Control and Crises
Researcher (PI) Miguel Luis Sousa De Almeida Ferreira
Host Institution (HI) FACULDADE DE ECONOMIA DA UNIVERSIDADE NOVA DE LISBOA
Call Details Starting Grant (StG), SH1, ERC-2012-StG_20111124
Summary Financial intermediaries play a vital role in providing capital to corporations. The 2007-2009 financial crisis had dramatic consequences on the organization of the financial system that led to the rise of universal banking and financial conglomerates. Financial conglomerates have been common in Europe, but the recent developments have eroded the separation of commercial and investment banking elsewhere. Financial conglomerates act as lenders but also underwrite and trade securities, have equity stakes and sit on the board of corporations, and manage mutual and pension funds that invest in corporations. These forms of corporate control by financial conglomerates are distinct in their incentives and costs and therefore can have distinct effects on non-financial corporations. We will study the effect of control by financial conglomerates on corporation’s performance, investment, financing, and corporate governance policies. A particular relevant channel through which financial conglomerates can affect firm’s policies is the credit channel. Firms establish relationships with financial conglomerates that give easier access to credit and potentially at a lower cost due to economies of scale in information collection and monitoring. There may be, however, costs to firms with a close relationship with a financial conglomerate as firms may be locked up due to an information monopoly. We will study the effects of bank-firm relationships on the loan market. In particular, we will examine the importance of these relationships for explaining differences in the cost of bank distress across firms. The hypothesis is that strong ties with banks reduce firms’ ability to substitute relationship bank loans with other sources of external finance, and therefore firms with stronger relationships could experience greater costs during financial crises. We will contribute to the understanding the consequences of shocks to the financial health of banks for nonfinancial firms.
Max ERC Funding
1 174 000 €
Duration
Start date: 2013-03-01, End date: 2018-02-28
Project acronym UNMOVED
Project Simultaneous three dimensional multiphoton microscopy without mechanical depth scanning
Researcher (PI) Shy Shoham
Host Institution (HI) TECHNION - ISRAEL INSTITUTE OF TECHNOLOGY
Call Details Proof of Concept (PoC), PC1, ERC-2014-PoC
Summary Information in the nervous system is typically represented in activity patterns distributed across large populations of neurons in three dimensions, and there is a lot of current interest in the development of systems that can volumetrically image these patterns (or other distributed processes) in real-time. During our work towards imaging responses to artificial retinal stimulation we have developed a solution for remotely manipulating the illumination plane during temporal-focusing multiphoton imaging (a powerful bio-microscopy and photo-manipulation modality). This technique and its variations could provide unprecedented access into rapid 3D imaging at an exciting time for the neuroimaging field, where major initiatives for high-resolution brain activity mapping have been launched, and could potentially also be applied towards innovative approaches to micro-endoscopy and other applications.
Max ERC Funding
150 000 €
Duration
Start date: 2015-02-01, End date: 2016-07-31
Project acronym UP2DM
Project Up-scaling Production of 2-Dimensional Materials
Researcher (PI) Jonathan COLEMAN
Host Institution (HI) THE PROVOST, FELLOWS, FOUNDATION SCHOLARS & THE OTHER MEMBERS OF BOARD OF THE COLLEGE OF THE HOLY & UNDIVIDED TRINITY OF QUEEN ELIZABETH NEAR DUBLIN
Call Details Proof of Concept (PoC), PC1, ERC-2011-PoC
Summary "Layered materials represent a diverse and largely untapped source of 2-dimensional (2D) systems with exotic electronic properties and high specific surface areas that are important for sensing, catalysis and energy storage applications. While graphene is the most well-known layered material, transition metal dichalcogenides (TMDs), transition metal oxides (TMOs) and other 2D compounds are also important. The latter materials are of particular interest as topological insulators and thermoelectric materials. Current production methods for these materials make them uneconomical for most commercial applications.
The project will develop and explore commercialisation of a unique method developed by the PI for producing single atomic layer materials. It will also evaluate the potential opportunity to commercialise the materials and or devices made using these materials. The project is linked to an ERC Starting Grant awarded to Prof Jonathan N Coleman in TCD called Semiconducting and Metallic nanosheets: Two dimensional electronic and mechanical materials (SEMANTICS). The proposal will seek to up-scale the process which has been developed within SEMANTICS (and which has already generated one patent application), and engage the commercialisation professionals in CRANN and the Technology Transfer resources in TCD to bring this technology out to the market place."
Max ERC Funding
149 760 €
Duration
Start date: 2012-04-01, End date: 2013-07-31
Project acronym UreaCa
Project Deciphering the metabolic roles of the urea-cycle pathway in carcinogenesis for improving diagnosis and therapy
Researcher (PI) Ayelet EREZ
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Consolidator Grant (CoG), LS7, ERC-2018-COG
Summary Almost 100 years ago, Warburg described a metabolic change in energy flux that occurs during carcinogenesis. Since then, multiple studies have demonstrated how anabolic synthesis of macromolecules can be altered to support cancer cell progression. Yet, the potential effect of altered catabolic degradation of macromolecules on tumour carcinogenesis has been much less studied.
The urea cycle (UC) is the main catabolic pathway by which mammals excrete waste nitrogen. Although the complete UC pathway is liver-specific, most tissues express different combinations of UC enzymes according to the cellular needs. Surprisingly, we find that changes in the expression of UC components, causing UC dysregulation (UCD), are a global phenomenon in cancer, metabolically augmenting net nitrogen usage for the synthesis of macromolecules by reducing nitrogen waste. This metabolic alteration is associated with poor patient prognosis. Thus, we hypothesise that UCD provides a major metabolic advantage to multiple aspects of carcinogenesis and, as such, leads to specific, identifiable genomic and biochemical signatures, with implications for cancer diagnosis and therapy.
To pursue our hypothesis, we will incorporate state-of-the-art comparative genomic, peptidomic, metabolomic, and molecular approaches to explore this scientific “blind spot” of nitrogen metabolism in carcinogenesis. We will investigate how UCD causally affects carcinogenesis, by characterising tumour-specific functions of UC enzymes (Aim I), correlating tumour phenotypes with systemic biomarkers (Aim II), and testing the treatment efficacy of drug combinations targeting UCD in cancers (Aim III).
Our proposal, strengthened by my training as a physician scientist, harbours considerable potential for translational diagnostic and therapeutic utility of our findings, enabling us to i) identify new diagnostic biomarkers for monitoring cancer initiation and progression and ii) predict and enhance the therapeutic response.
Max ERC Funding
2 000 000 €
Duration
Start date: 2019-05-01, End date: 2024-04-30
Project acronym URSAT
Project Understanding Random Systems via Algebraic Topology
Researcher (PI) Robert Joseph Adler
Host Institution (HI) TECHNION - ISRAEL INSTITUTE OF TECHNOLOGY
Call Details Advanced Grant (AdG), PE1, ERC-2012-ADG_20120216
Summary Over the past decade there has been a significant expansion of activity in applying the techniques and theory of algebraic topology to real world problems. The expression `applied algebraic topology' is no longer an oxymoron! This expansion has generated new mathematical theory, new computational techniques, and even commercial startups. However, there is still an important component of this topological approach that has not been treated in any depth, and this is the inherently stochastic nature of the world. Consequently, there is an urgent need to complement recent developments, which have been primarily deterministic, with sophisticated stochastic modelling and analysis. The current proposal aims to attack this issue by applying algebraic topological thinking to random systems.
Over the past two years, the PI Adler and colleagues have organised workshops in Banff, Palo Alto and Chicago with tens of researchers from topology, probability, statistics, random networks, image analysis and other areas, with the aim of defining the important problems that `random algebraic topology' should address. These brain trusts have borne fruit in terms of setting some clearly defined goals, many of which help motivate the core of the current proposal, which is by far the most ambitious of a number of earlier and current projects.
These endeavours are expected to have -- and are to a considerable extent driven by -- applications to areas outside of mathematics, while at the same time having deep, intrinsic, mathematical interest. The multi-faceted aspect of the proposal, involving a number of areas within mathematics that do not usually appear together, is highly novel and requires the setting up of a large and coordinated team of researchers. This will include the PI, graduate students and postdoctoral fellows, and short- and medium-term visiting scholars from a variety of disciplines.
Max ERC Funding
1 904 000 €
Duration
Start date: 2013-03-01, End date: 2018-09-30
Project acronym USE
Project User behavior Simulation in built Environments
Researcher (PI) Yehuda Elchanan Kalay
Host Institution (HI) TECHNION - ISRAEL INSTITUTE OF TECHNOLOGY
Call Details Proof of Concept (PoC), PC1, ERC-2016-PoC
Summary The impact of a building design or renovation on the behavior of its intended occupants can be assessed only after the building has been built and occupied (or the renovation has been completed). This is a risky proposition, because design mistakes can cause severe consequences in terms of under-performing buildings, reduced productivity, absenteeism, and general user dissatisfaction. During our ERC-funded project, titled NextGenBim, we were able to develop and demonstrate a method to computationally simulate human behavior in built environments, which allows representing the intended occupants and their activities in a given building. In this PoC we aim to develop a software simulator that allows: (1) to visualize, analyze and evaluate trade-offs between several design options for a new building; and (2) to evaluate the systemic implication of proposed renovations on the operations of a building. The software will potentially impact millions of professionals (architects, civil engineers, internal designers, facility mangers, etc.), especially in the healthcare sector, where large budgets are spent to build and renovate facilities, and in the educational sector, which invests heavily in computer-aided design software to train the next generation of professionals.
Max ERC Funding
150 000 €
Duration
Start date: 2017-06-01, End date: 2018-11-30
Project acronym UVdynamicsProtection
Project Aligning pigmentation and repair: a holistic approach for UV protection dynamics
Researcher (PI) Karmit Levy
Host Institution (HI) TEL AVIV UNIVERSITY
Call Details Consolidator Grant (CoG), LS4, ERC-2016-COG
Summary The human body takes different measures in order to protect itself against the results of UV exposure and its accompanied hazards, such as skin cancer. Despite extensive studies regarding the molecular regulation of the two main UV protection mechanisms, namely, the DNA repair system and the pigmentation system, a comprehensive theory that simultaneously accounts for the two systems is still missing. Hence, the ground-breaking goal of this proposal is to elucidate, for the first time, the dynamic control used to schedule and synchronize the UV protection subsystems.
Since these two systems serve the same physiological purpose, but on different time scales (DNA repair takes minutes, while pigmentation lasts hours to days), I propose to take the novel approach of focusing on their timing as an opportunity to uncover their regulation. As a first step, we exposed human and mouse skin to UV and found that UV exposure at 48hr intervals resulted in higher skin pigmentation than did exposure at 24hr intervals, even after controlling for total UV dosage. Furthermore, we found that the expression level of the melanocyte central regulator, MITF, exhibits damped oscillatory behaviour during this 48hr interval. I therefore hypothesize that the dynamic behaviour of the central regulator dictates the UV–response timing of the two protection systems. In the proposed research, I will take a holistic approach and address this issue from three complementary perspectives: (1) transcriptional dynamics, (2) temporal effects on cellular output, and (3) DNA repair after UV. This will be achieved by utilizing and developing new experimental and analytical tools that will allow us to correlate the temporal behaviours of a wide set of molecular markers. Reaching our goals will provide a breakthrough in our understanding of skin protection from UV and the underlying mechanisms that control it. These findings may offer exciting new avenues for future skin cancer prevention.
Max ERC Funding
1 971 875 €
Duration
Start date: 2017-04-01, End date: 2022-03-31
Project acronym VARB
Project Variability and Robustness in Bio-molecular systems
Researcher (PI) Naama Barkai
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Advanced Grant (AdG), LS2, ERC-2008-AdG
Summary Cells process information using biochemical networks of interacting proteins and genes. We wish to understand the principles that guide the design of such networks. In particular, we are interested in the interplay between variability, inherent to biological systems, and the precision of cellular computing. To better understand this interplay, we will: (1) Characterize the extent of gene expression variability and define its genetic determinants, (2) Reveal how variability is buffered and (3) Describe instances where variability (or 'noise') is an integral part of cellular computation. The study will be conducted in the multidisciplinary atmosphere of our lab, by students trained in physics, computer science, chemistry and biology. Specific issues include: 1. Gene expression variability: we will focus on the influence of chromatin structure on gene expression variability, as suggested by our bioinformatics analysis. 2. Robustness and scaling in embryonic patterning: We will study the means by which fluctuations are buffered during the development of multicellular organisms. We will focus on the robustness of morphogen gradients to protein levels, and on the ability to maintain proportionate pattern in tissues of different size. 3. Noise-driven transitions in a fluctuating environment: Our preliminary results suggest that noise plays an integral part in phosphate homeostasis in S. cerevisiae. We will characterize the role of noise in this system and study its evolutionary implications. Together, our study will shed light on one we believe to be the fundamental challenge of biological information processing: ensuring a reliable and reproducible function in the highly variable biological environment. Our study will furthermore define novel multidisciplinary, system-level paradigms and approaches that will guide further studies of bio-molecular systems
Max ERC Funding
2 200 000 €
Duration
Start date: 2009-01-01, End date: 2013-10-31
Project acronym VASCFLAP
Project A new reconstructing technique using tissue engineering methods to create an engineered autologous vascularized tissue flap
Researcher (PI) Shulamit Levenberg
Host Institution (HI) TECHNION - ISRAEL INSTITUTE OF TECHNOLOGY
Call Details Proof of Concept (PoC), PC1, ERC-2014-PoC
Summary "Abdominal wall defects are often the consequence of severe trauma, cancer treatment and burns. These defects involve a significant loss of tissue, and often require surgical reconstruction where tissue is lifted from the patient's donor site and moved to his injured site with an intact blood supply (autologous muscle free flap). The current transfer surgery is complicated and involved with donor-site morbidity after tissue harvesting, and scant availability. We propose a robust engineered transplant performed by a novel reconstruction technique to overcome these disadvantages. The proposed transplant uses an alternative biomaterial implantation, offering the possibility to repair a full-thickness defect of the abdominal wall without the need to transfer tissue (skin+subcutis+fascia+muscle) from another site and minimal postoperative scarification (skin only). We name this technique ""an Engineered Autologous Vascularized Axial Flap"". The key idea of this approach is the use of a polymeric scaffold upon which human cells will be seeded. The engineered tissue cultured in vitro will contain also a network of blood vessels. Then, this engineered construct will be implanted around large blood vessels adjacent to the injured site. Once highly vascularized, it will be possible to transfer the implanted engineered vascularized construct as a flap for covering the defects. Once developed, this autologous cost-effective engineered tissue product may be used in reconstructive surgery of the abdominal wall and breast (thousands of cases in the EU alone) which improves the patients’ quality of life and reduces surgical costs and risks. Here we describe a plan to develop this product by identifying the most cost-effective niche where we can go to market in. We plan to complete a set of feasibility studies in large animal model using human cells (which could later be isolated from the patient = autologous cells) and proceed establishing our portfolio of intellectual property."
Max ERC Funding
147 500 €
Duration
Start date: 2015-04-01, End date: 2016-09-30
Project acronym VASNICHE
Project The vascular stem cell niche and the neurovascular unit
Researcher (PI) Eliahu Keshet
Host Institution (HI) THE HEBREW UNIVERSITY OF JERUSALEM
Call Details Advanced Grant (AdG), LS4, ERC-2012-ADG_20120314
Summary Recent evidence suggests that VEGF and the vasculature play multiple roles in organ homeostasis, functions extending far beyond their traditional roles in tissue perfusion. The proposed study represents a vascular-centred approach to the neurovascular unit thriving to gain further insights on the many ways by which blood vessels may affect proper brain functioning. Major focus is on the vascular stem cell niche, i.e. the contention that blood vessels are a key component of adult stem cell niches, including a niche securing proper function of neuronal stem cells (NSCs). Further insights on the niche are also critical for contemplated implementation of stem-cell based therapy. In this multidisciplinary study combining the fields of vascular biology, neurobiology, stem cell biology, and aging research, we harness unique transgenic methodologies to conditionally manipulate (via VEGF) the vasculature within the stem cell niches. We provide a first compelling proof that blood vessels at the niche indeed control stem cells properties and behaviour, evidenced by showing that mere expansion of the niche vasculature and independently of VEGF) increases dramatically adult hippocampal neurogenesis, a process known to be associated with improved cognitive performance. We will determine what aspects of stem cell biology are controlled by juxtaposed, directly contacting blood vessels and will identify signalling systems mediating the vascular/stem cell cross-talk.
Adult neurogenesis is known to rapidly decline with age and ways to sustain the process are highly desired. We hypothesize and, in fact, provide initial evidence that expanding and 'rejuvenating' the niche vasculature can override the natural age-dependent decline of adult neurogenesis. Proposed experiments will extend this exciting finding and thrive to uncover the underlying mechanisms.
Max ERC Funding
2 499 980 €
Duration
Start date: 2013-03-01, End date: 2018-02-28
Project acronym VDJtargeting
Project Engineering T cells and B cells for Immunotherapy using V(D)J recombination
Researcher (PI) Adi Barzel
Host Institution (HI) TEL AVIV UNIVERSITY
Call Details Starting Grant (StG), LS7, ERC-2017-STG
Summary T cell engineering has shown clinical success mainly in haematological cancers, but scaling up is challenging due to reliance on ex vivo manipulations. In addition, B cell engineering has not shown therapeutic efficacy to date. Here, we propose a novel immunotherapy approach, allowing safe and efficient engineering of B cells and T cells, both ex vivo and in vivo. We will use adeno associated vectors (AAV) to integrate chimeric antigen receptor (CAR) or T cell receptor (TCR) genes into loci coding TCR chains and to integrate antibody (Ab) genes into loci coding Ab chains. Previously, we used AAV facilitated gene targeting in vivo to ameliorate genetic diseases in mice. For lymphocytes we develop “VDJ targeting”: A promoterless receptor/Ab gene flanked by recognition signal sequences (RSS) will be inserted into the endogenous locus by the recombination activating gene (RAG) complex during V(D)J recombination. Only developing lymphocytes, expressing RAG, will incorporate the receptor/Ab gene, which will thus be expressed in potent naïve cells from the strong endogenous promoter. Targeted developing cells are subjected to negative selection, thus reducing risk of adverse autoimmunity. Lack of promoter reduces spurious expression and oncogenic risk upon rare off-target integration. Targeting endogenous loci may allow allelic exclusion. In B cells it may allow utilizing the endogenous constant region to express a B cell receptor and, upon activation, a secreted Ab. Activation may be accompanied by proliferation and affinity maturation, including somatic hypermutation and class switching, to allow a potent immune response, memory retention and diminished antigenic escape. Where controlled autoimmunity is desired, we will engineer B cells to inducibly secret an auto-Ab. We will demonstrate efficacy in cancer and autoimmune disease models implanted with lymphocytes that we engineered while ex vivo differentiated and in mice injected with vectors for in vivo VDJ targeting.
Max ERC Funding
1 496 875 €
Duration
Start date: 2017-10-01, End date: 2022-09-30
Project acronym VeCare
Project Selective retention of VEGF for cancer and retinopathies therapeutics
Researcher (PI) Cláudio Franco
Host Institution (HI) INSTITUTO DE MEDICINA MOLECULAR JOAO LOBO ANTUNES
Call Details Proof of Concept (PoC), ERC-2018-PoC
Summary Angiogenesis is the mechanism of blood vessel formation from pre-existing ones and is vital for nutrient and oxygen delivery to all cells in the organism. However, dysregulation of angiogenesis is detrimental for the organism. Excessive or abnormal angiogenesis is a hallmark of cancer and various retinopathies and favours tumour growth and metastasis, and vision loss, respectively. Excessive or abnormal angiogenesis is fuelled by hypoxia-driven expression of high levels of vascular endothelial growth factor (VEGF), the main pro-angiogenic molecule that stimulates the formation of new blood vessels.
Although VEGF-centric anti-angiogenic therapies do exist, their efficacy is limited by their specificity and numerous side effects, hampering greatly their potential clinical benefits. Therefore, there is an urgent need for improved anti-VEGF therapeutic strategies.
The aim of this PoC project is to identify a novel class of anti-VEGF drugs. We designed an innovative and unique screening method to identify an unexplored mechanism to inhibit VEGF function in vivo, which has the prospect of reduced toxicity and, thus, enhanced clinical efficacy.
This project will enable the creation of a start-up company to commercialize the newly identified class of drugs for subsequent clinical development to curb cancer and retinopathies. We aim at improving patient survival and well-being, and reducing the economic burden associated with these diseases.
Max ERC Funding
150 000 €
Duration
Start date: 2019-03-01, End date: 2020-08-31
Project acronym VERICOMP
Project Foundations of Verifiable Computing
Researcher (PI) Guy ROTHBLUM
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Consolidator Grant (CoG), PE6, ERC-2018-COG
Summary Proof systems allow a weak verifier to ascertain the correctness of complex computational statements. Efficiently-verifiable proof systems are fundamental objects in the study of computation, and have led to some of the deepest and most celebrated insights in cryptography and in complexity theory.
The vast and rich literature on proof systems focuses primarily on proving the correctness of intractable statements, e.g. ones that are NP-complete. While the verification can be efficient, the proofs themselves cannot be generated in polynomial time. This limits the applicability of such proof systems, both from a theoretical perspective and in their real-world impact. This proposal aims to obtain a comprehensive understanding of proof systems with polynomial-time proof generation, to explore their practical applicability, and to investigate their connections with foundational questions in cryptography and in complexity theory.
Our study will focus primarily on interactive proof systems for tractable computations. The proposed research aims to revolutionize our understanding of these foundational objects by providing a complete and tight characterization of the complexity of proving and verifying general statements, by achieving breakthroughs in the study of related proof system notions, such as cryptographic arguments, and by building a fine-grained “algorithmic” theory of proof systems for central polynomial-time computational problems.
Our research will leverage these advances towards diverse applications: from real-world security challenges, such as verifying the correctness of computations performed by the cloud and cryptographic “proofs of work”, to a complexity-theoretic understanding of the complexity of approximating problems in P and of solving them on random instances.
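As background for readers less familiar with verifiable computing, the sketch below illustrates the core idea of cheap verification with Freivalds' classical randomized check: a verifier accepts a claimed matrix product in roughly quadratic time instead of recomputing it. This is a textbook example, not a result or method of this proposal; the NumPy usage, matrix sizes and trial count are illustrative assumptions.

```python
import numpy as np

def freivalds_check(A, B, C, trials=20):
    """Probabilistically verify the claim A @ B == C.

    Each trial costs only matrix-vector products (O(n^2)) rather than the
    O(n^3) needed to recompute A @ B; an incorrect C survives all trials
    with probability at most 2**(-trials).
    """
    n = C.shape[0]
    for _ in range(trials):
        r = np.random.randint(0, 2, size=(n, 1))      # random 0/1 vector
        if not np.array_equal(A @ (B @ r), C @ r):    # cheap consistency test
            return False                              # claim rejected
    return True                                       # accept with high confidence

# Usage: an honest result passes; a single corrupted entry is almost surely caught.
A = np.random.randint(0, 10, (50, 50))
B = np.random.randint(0, 10, (50, 50))
C = A @ B
print(freivalds_check(A, B, C))        # True
C[0, 0] += 1
print(freivalds_check(A, B, C))        # False with overwhelming probability
```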
Max ERC Funding
1 882 460 €
Duration
Start date: 2019-01-01, End date: 2023-12-31
Project acronym VesselNet
Project Engineering Composite Tissues for Facial Reconstruction
Researcher (PI) Shulamit Levenberg
Host Institution (HI) TECHNION - ISRAEL INSTITUTE OF TECHNOLOGY
Call Details Consolidator Grant (CoG), LS7, ERC-2018-COG
Summary Facial reconstruction usually involves the use of autologous grafts or composite tissue allografts, which are highly complex tissues that pose significant challenges to tissue engineering experts. Tissue engineering of independent facial elements, e.g., bone, adipose, skin and muscle tissues, has been demonstrated. However, to date, no composite soft tissues composed of multiple facial layers have been created. Composite facial tissue engineering will require proper innervation and vascularization, essential to support generation of large thick implants. However, techniques for effective innervation of engineered tissues are currently insufficient and generation of well-vascularized large and thick engineered tissues is still one of the major obstacles limiting their translation to the clinic. Our goal is to engineer thick, composite, human-scale, facial tissues (muscle-adipose-dermis composite, and bone) of a personally adaptable shape, that will be vascularized in-vitro, and innervated upon transplantation. Our concept is to create in-vitro a functional vascular network (VesselNet), composed of both large and small vessels, within engineered constructs, which will allow for the generation of thick engineered tissues under continuous flow conditions. 3D bio-printing techniques will be applied to create the engineered tissues. These tissues will serve as a model to study mechanisms involved in vessel anastomosis, and tissue organization and stabilization. The applicability of the engineered composite soft and bone tissues will be evaluated in facial, breast and abdominal wall defect reconstruction models, and in an open fracture model. Such engineered large-scale composite tissues are expected to have a major impact on reconstructive surgery and will shed light on yet unknown tissue organization mechanisms.
Max ERC Funding
2 375 000 €
Duration
Start date: 2019-09-01, End date: 2024-08-31
Project acronym VINCULUM
Project Entailing Perpetuity: Family, Power, Identity. The Social Agency of a Corporate Body (Southern Europe, 14th-17th Centuries)
Researcher (PI) Maria de Lurdes Pereira ROSA
Host Institution (HI) UNIVERSIDADE NOVA DE LISBOA
Call Details Consolidator Grant (CoG), SH6, ERC-2018-COG
Summary Few legal phenomena have been so relevant to premodern southern European societies as entails, a specific strategy that evolved to protect family inheritances, thus enabling the reproduction of elite social status. The VINCULUM project aims to explain how entailment became possible, how it functioned, and why it lasted for so many centuries. The project rests on the innovative theoretical claim that entails, as corporate bodies, functioned as a key social agent, created and acting within societies for which non-personal legal subjects were normal. Building on the Portuguese-Iberian case, and on the extensive research already carried out by me and my team, I propose to study 'entailment' as a diverse but pivotal practice, one embedded in law, aristocratic discourse, and kinship-based organization, and to carry out a comprehensive analysis that explores this global nature. The research approach systematically breaks with traditional research frontiers: cases will extend from the 14th to the 17th century in both continental and Atlantic spaces, and include both comparative perspectives and the study of later social reconfigurations.
VINCULUM will be anchored in extended research in public archives and in unprecedented access to extensive private family archives, which have been opened to research by the ARQFAM program I have led since 2008. Data collection will allow for the construction of a large database, gathering all documents relating to each entail, under a theoretical model that seeks to reconstruct past information systems, thus testing a novel methodology developed in my previous research. The database will gather c. 7,000 entails, enabling systematic inquiries organized around the new conceptual definitions proposed by the project. The research will be strongly interdisciplinary, engaging with historical anthropology and archival science in order to construct a proper theoretical model for understanding this crucial legal and social phenomenon.
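Purely as an illustration of how such an entail-per-record database might be organized, here is a minimal Python sketch; every field name, identifier and sample value below is a hypothetical assumption for exposition, not the project's actual data model.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Document:
    archive: str      # holding archive (public archive or private family archive)
    reference: str    # archival call number
    date: str         # document date, e.g. "1475-06-01"
    doc_type: str     # e.g. "foundation charter", "lawsuit", "inventory"

@dataclass
class Entail:
    entail_id: str
    founder: str
    family: str
    foundation_year: int
    region: str                     # e.g. "continental" or "Atlantic"
    documents: List[Document] = field(default_factory=list)

# Usage: one record gathers every document relating to a single entail, so
# systematic queries (by family, period or region) become straightforward.
e = Entail("VNC-0001", "hypothetical founder", "hypothetical family", 1475, "continental")
e.documents.append(Document("private family archive", "cx. 3, doc. 12",
                            "1475-06-01", "foundation charter"))
entails = [e]
fifteenth_century = [x for x in entails if 1400 <= x.foundation_year < 1500]
```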
Max ERC Funding
1 591 450 €
Duration
Start date: 2019-06-01, End date: 2024-05-31
Project acronym Virocellsphere
Project Host-virus chemical arms race during algal bloom in the ocean at a single cell resolution
Researcher (PI) Asaf Vardi
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Consolidator Grant (CoG), LS8, ERC-2015-CoG
Summary Phytoplankton blooms are ephemeral events of exceptionally high primary productivity that regulate the flux of carbon across marine food webs. The cosmopolitan coccolithophore Emiliania huxleyi (Haptophyta) is a unicellular eukaryotic alga responsible for the largest oceanic algal blooms covering thousands of square kilometers. These annual blooms are frequently terminated by a specific large dsDNA E. huxleyi virus (EhV).
Despite the huge ecological importance of host-virus interactions, the ability to assess their ecological impact is limited by current approaches, which focus mainly on quantification of viral abundance and diversity. At the molecular level, a major challenge in the current understanding of host-virus interactions in the marine environment is the ability to decode the wealth of “omics” data and translate it into cellular mechanisms that mediate host susceptibility and resistance to viral infection.
In the current proposal we intend to provide novel functional insights into molecular mechanisms that regulate host-virus interactions at the single-cell level by unravelling phenotypic heterogeneity within infected populations. By using physiological markers and single-cell transcriptomics, we propose to discern between host subpopulations and define their different “metabolic states”, in order to map them into different modes of susceptibility and resistance. By using advanced metabolomic approaches, we also aim to define the infochemical microenvironment generated during viral infection and examine how it can shape host phenotypic plasticity. Mapping the transcriptomic and metabolic footprints of viral infection will provide a meaningful tool to assess the dynamics of active viral infection during natural E. huxleyi blooms. Our novel approaches will pave the way for unprecedented quantification of the “viral shunt” that drives nutrient fluxes in marine food webs, from a single-cell level to a population and eventually ecosystem levels.
Max ERC Funding
2 749 901 €
Duration
Start date: 2016-11-01, End date: 2021-10-31
Project acronym VisuLive
Project Quantitative Nanoscale Visualization of Macromolecular Complexes in Live Cells using Genetic Code Expansion and High-Resolution Imaging
Researcher (PI) Natalie Elia
Host Institution (HI) BEN-GURION UNIVERSITY OF THE NEGEV
Call Details Starting Grant (StG), LS9, ERC-2014-STG
Summary High-resolution fluorescence imaging, including super-resolution microscopy and high-speed live cell imaging, is used to obtain quantitative information on the structural organization and kinetics of cellular processes. The contribution of these high-resolution techniques to cell biology was recently demonstrated for dynamin- and ESCRT-driven membrane fission in cells. While they advance our knowledge of membrane fission, these techniques do not provide the quantitative information needed to formulate a mechanistic understanding of membrane fission in a physiological context, a shortcoming that stresses the need to increase the spatiotemporal resolution and improve the live cell capabilities of these techniques. Substituting the bulky fluorescent protein tags (such as GFP) currently used in live-cell applications with much smaller fluorescent dyes that possess superior photophysical characteristics will markedly improve these advanced imaging techniques. Genetic code expansion and bioorthogonal labeling offer, for the first time, a non-invasive way to specifically attach such fluorescent dyes to proteins in live cells. I, therefore, propose to develop an innovative approach to label cellular proteins with fluorescent dyes via genetic code expansion for quantitative high-resolution live cell imaging of cellular protein complexes. By applying this approach to three distinct high-resolution methodologies and by visualizing membrane fission in distinct cellular processes in live cells at millisecond rates and at nanoscale resolution, we aim to decipher the mechanistic principles of membrane fission in cells. As numerous cellular processes rely on membrane fission for their function, such an understanding will have a broad impact on cell biology. The implications of this study reach beyond the scope of membrane fission by offering a new approach to study cellular processes at close-to-real conditions in live cells and at nanoscale resolution.
Max ERC Funding
1 625 000 €
Duration
Start date: 2015-04-01, End date: 2020-03-31
Project acronym VOICES
Project Voices Of Individuals: Collectively Exploring Self-determination
Researcher (PI) Eilionóir Teresa Flynn
Host Institution (HI) NATIONAL UNIVERSITY OF IRELAND GALWAY
Call Details Starting Grant (StG), SH2, ERC-2014-STG
Summary The right to make one’s own decisions and to have these decisions respected by law is a basic human freedom which most adults take for granted. However, for many people with disabilities (especially people with intellectual, psycho-social and other cognitive disabilities) this fundamental right has been denied – informally, in the private sphere, and formally, in the public sphere through States’ laws and policies.
Since the entry into force of the UN Convention on the Rights of Persons with Disabilities, there is an emerging consensus in human rights discourse that all people, regardless of their decision-making skills, should enjoy ‘legal capacity’ on an equal basis—that is, the right to be recognised as a person before the law and the subsequent right to have one’s decisions legally recognised. To date most of the literature on how this right should be realised has been developed by non-disabled scholars without the direct input of people with disabilities themselves.
The VOICES project will take a radical approach to develop new law reform ideas based on this concept of ‘universal legal capacity.’ Its primary objective is to develop reform proposals based on the lived experience of disability. The project will support individuals who self-identify as disabled to develop personal narratives about their experiences in exercising, or being denied, legal capacity. Through a collaborative process, legal and social science scholars will then work with people with disabilities to develop their personal narratives to frame and ground concrete proposals for law reform in previously unexplored areas – including consent to sex, contractual capacity, criminal responsibility and consent to medical treatment. In this way, the legitimacy of people with disabilities’ perspectives on the options for law reform will be validated, and this will create a powerful argument for legal change.
Max ERC Funding
891 386 €
Duration
Start date: 2015-06-01, End date: 2018-11-30
Project acronym VSSC
Project Verifying and Synthesizing Software Compositions
Researcher (PI) Shmuel (Mooly) Sagiv
Host Institution (HI) TEL AVIV UNIVERSITY
Call Details Advanced Grant (AdG), PE6, ERC-2012-ADG_20120216
Summary One of the first things a programmer must commit to in developing any significant piece of software is the representation of the data. In applications where performance or memory consumption is important, this representation is often quite complex: the data may be indexed in multiple ways and use a variety of concrete, interlinked data structures. The current situation, in which programmers either directly write these data structures themselves or use a standard data structure library, leads to two problems:
1: The particular choice of data representation is based on an expectation of what the most common workloads will be; that is, the programmer has already made cost-benefit trade-offs based on the expected distribution of operations the program will perform on these data structures.
2: It is difficult for the programmer to check or even express the high-level consistency properties of complex structures, especially when these structures are shared. This also makes software verification in existing programming languages very hard.
We will investigate specification languages for describing and reasoning about program data at a much higher level. The hope is that this can reduce the inherent complexity of reasoning about programs. In tandem, we will check whether the high-level specifications can be semi-automatically mapped to efficient data representations.
A novel aspect of our approach allows the user to define global invariants and a restricted set of high level operations, and only then to synthesize a representation that both adheres to the invariants and is highly specialized to exactly the set of operations the user requires. In contrast, the classical approach in databases is to assume nothing about the queries that must be answered; the representation must support all possible operations.
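To make the contrast concrete, the minimal Python sketch below shows a declarative set-of-tuples specification next to a synthesized representation that keeps two hash indexes consistent as its global invariant and is specialized to exactly the two queries the user requires. It is an illustration under assumed class and method names, not the system this proposal will build.

```python
from collections import defaultdict

class HighLevelSpec:
    """Declarative view: a relation of (user, group) membership facts.
    Correctness is easy to state (it is just a set of tuples),
    but every query scans the whole relation."""
    def __init__(self):
        self.tuples = set()
    def add(self, user, group):
        self.tuples.add((user, group))
    def groups_of(self, user):
        return {g for (u, g) in self.tuples if u == user}
    def members_of(self, group):
        return {u for (u, g) in self.tuples if g == group}

class SynthesizedRepr:
    """Representation specialized to exactly the two queries above:
    two hash indexes, kept consistent by construction (the global invariant)."""
    def __init__(self):
        self.by_user = defaultdict(set)
        self.by_group = defaultdict(set)
    def add(self, user, group):
        self.by_user[user].add(group)     # invariant: both indexes always agree
        self.by_group[group].add(user)
    def groups_of(self, user):
        return self.by_user[user]
    def members_of(self, group):
        return self.by_group[group]
```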
Max ERC Funding
1 577 200 €
Duration
Start date: 2013-04-01, End date: 2018-03-31
Project acronym Watecco
Project Water column profiler for quantification of photosynthesis and biomass of phytoplankton in natural and man-made water bodies
Researcher (PI) Zvy Dubinsky
Host Institution (HI) BAR ILAN UNIVERSITY
Call Details Proof of Concept (PoC), PC1, ERC-2014-PoC
Summary Watecco is a unique, photoacoustics-based profiler, designed to determine the depth distribution of natural phytoplankton assemblages and of algal cells in mass cultures, and of their photosynthetic efficiency. It can report the stratification of the light-absorbing photosynthetic pigments and is unique in its sensitivity to document changes in the quantum yield of phototrophic plankters.
The profiler is of great importance in following the health and photosynthetic light utilization efficiency of natural and manmade water bodies, freshwater and marine.
Since increases in photosynthetic efficiency and the related proliferation of phytoplankton biomass in lakes, rivers and the sea occur mainly as a result of nutrient enrichment (anthropogenic eutrophication), as well as of water acidification related to global climate change, Watecco can provide an early warning of environmental pollution by urban or farm sewage, which compromises the safety of water supplies and coastal fisheries.
Furthermore, the capability of Watecco to provide fast, real time data on the efficiency of photobioreactors and algal mass cultures makes it a unique tool in the optimization of these devices for attaining high yields of their target products.
Commercialization activities of Watecco include market analysis, financial and business planning, legal workup, IPR protection and contracting with manufacturers and investors. The project also covers the finalization of the development of a submersible prototype, a user friendly interface, and its field testing and data validation.
This innovative technology will yield reliable and robust data for coastal authorities responsible for evaluation of the proliferation and collapse of algal populations, marine labs and nature reserves, water supply authorities, industries involved in the mass culture of algae for the production of biodiesel and fine chemicals, and installations based on high-rate-algal sewage treatment systems.
Max ERC Funding
149 775 €
Duration
Start date: 2014-11-01, End date: 2016-04-30
Project acronym watersplit
Project Producing hydrogen by water splitting
Researcher (PI) Ron NAAMAN
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Proof of Concept (PoC), ERC-2017-PoC
Summary This proposal describes a plan to prove the role of electrons’ spin in the photocatalyzed oxidation of water: the reaction of two water molecules to form two hydrogen molecules and a single oxygen molecule in its triplet ground state. We found that the catalyst performance is affected by its chiral symmetry, or lack thereof. It is well known that the electrochemical and photoelectrochemical splitting of water requires a high overpotential and that it can be changed by the choice of catalyst; however, the origin of the overpotential was not fully understood. In addition, the artificial water splitting reaction is commonly hampered by the production of hydrogen peroxide, which reduces the hydrogen yield and acts as a strong oxidation agent that can damage the electrochemical cell. The proposed studies will build on our new results, which suggest that the chirality of the photoanode material can reduce the overpotential for water splitting and enhance the reaction rate. We found that a spin-polarized current causes a spin correlation between the two OH intermediates in the reaction. This spin correlation could both suppress production of hydrogen peroxide and enhance the formation rate of oxygen molecules in the triplet state, improving water splitting efficiency.
This proposal describes a program in which we intend to build upon the findings of the research performed in the framework of our ERC project, to prove the concept of spin-controlled water splitting, and thereby to produce an efficient hydrogen-producing cell.
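For reference, the textbook electrochemistry behind the overpotential and the hydrogen peroxide side reaction mentioned above, written out in LaTeX (standard potentials at pH 0; these are literature values, not results of this project):

```latex
\begin{aligned}
&\text{anode (water oxidation):} && 2\,\mathrm{H_2O} \rightarrow \mathrm{O_2} + 4\,\mathrm{H^+} + 4e^- && E^\circ = +1.23\ \mathrm{V}\\
&\text{cathode (proton reduction):} && 4\,\mathrm{H^+} + 4e^- \rightarrow 2\,\mathrm{H_2} && E^\circ = 0.00\ \mathrm{V}\\
&\text{overall water splitting:} && 2\,\mathrm{H_2O} \rightarrow 2\,\mathrm{H_2} + \mathrm{O_2} && \Delta E^\circ = 1.23\ \mathrm{V}\\
&\text{competing two-electron path:} && 2\,\mathrm{H_2O} \rightarrow \mathrm{H_2O_2} + 2\,\mathrm{H^+} + 2e^- && E^\circ \approx +1.76\ \mathrm{V}\\
&\text{overpotential:} && \eta = E_{\mathrm{applied}} - 1.23\ \mathrm{V} &&
\end{aligned}
```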
Max ERC Funding
150 000 €
Duration
Start date: 2017-07-01, End date: 2018-12-31
Project acronym WAVEMEASUREMENT
Project Calibration of extreme wave measurement on the ocean surface
Researcher (PI) Frederic DIAS
Host Institution (HI) UNIVERSITY COLLEGE DUBLIN, NATIONAL UNIVERSITY OF IRELAND, DUBLIN
Call Details Proof of Concept (PoC), PC1, ERC-2013-PoC
Summary Our idea is to revisit the recording of time-series of sea surface elevations in extreme conditions. A better understanding of extreme waves is vital in harbour and coastal monitoring, coastal engineering, offshore design and operations, maritime traffic control, meteorological and climatological studies, and wave and wind energy studies. Within the framework of the ERC Advanced Grant MULTIWAVE, which started in April 2012, we brought new physical and mathematical insights into the dynamic shaping mechanisms and statistics of rogue waves, both in optics and in hydrodynamics. While in optics the measurement of such extreme events is accurate, the measurement of extreme waves in the ocean is quite delicate. Moreover, experiments can be repeated in optics, while in the ocean one has to wait for extreme waves. There are several techniques to measure waves in the ocean: the traditional in-situ techniques as well as remote sensing or optical techniques. They all have advantages and drawbacks, and none of them has ever been calibrated against extreme ocean waves. We believe that the in-situ technique, once improved, remains the best option for now. Together with the industrial subcontractor, we will optimize wave buoy sensor technology for extreme waves. Based on the fundamental know-how from MULTIWAVE, we are in a position to allow settings to be optimized in the initial design phase prior to ocean deployment and then to test the performance of the optimized device. We will select areas off the west coast of Ireland and times of the year when the probability of recording extreme waves, and rogue waves in particular, is the highest. If the recorded time-series are robust and accurate, the new wave buoy will be the first wave buoy that can be trusted for recording extreme waves. A market survey of wave-measuring devices currently available will be performed. We believe that the potential market is huge.
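As background on what recording extreme waves entails computationally, the short Python sketch below extracts individual wave heights from a surface-elevation record by zero up-crossing analysis and screens them against the common working criterion for rogue waves, H > 2Hs, with the significant wave height Hs estimated as four times the standard deviation of the elevation. This is a generic illustration, not the buoy's actual processing chain; the synthetic record and all parameters are assumptions.

```python
import numpy as np

def zero_upcross_heights(eta):
    """Split a surface-elevation record into individual waves at zero
    up-crossings and return each wave's crest-to-trough height."""
    up = np.where((eta[:-1] < 0) & (eta[1:] >= 0))[0]    # up-crossing indices
    return np.array([eta[a:b].max() - eta[a:b].min() for a, b in zip(up[:-1], up[1:])])

def rogue_candidates(eta):
    heights = zero_upcross_heights(eta)
    hs = 4.0 * np.std(eta)                    # significant wave height estimate (~4*sigma)
    return heights[heights > 2.0 * hs], hs    # common rogue-wave criterion: H > 2*Hs

# Usage on a synthetic 1-hour record sampled at 2 Hz: a broadband background
# sea state plus one injected extreme event.
rng = np.random.default_rng(0)
t = np.arange(0, 3600, 0.5)
eta = 0.2 * sum(np.cos(2 * np.pi * f * t + rng.uniform(0, 2 * np.pi))
                for f in np.linspace(0.05, 0.2, 30))
eta += 8.0 * np.exp(-((t - 1800.0) / 5.0) ** 2)          # injected "rogue" crest
rogues, hs = rogue_candidates(eta)
print(f"Hs = {hs:.2f} m, rogue-wave candidates (m): {rogues}")
```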
Max ERC Funding
139 140 €
Duration
Start date: 2014-09-01, End date: 2016-02-29
Project acronym WEAR
Project Behaviour Phenotyping using Inertial Sensors
Researcher (PI) Rui Manuel MARQUES FERNANDES DA COSTA
Host Institution (HI) FUNDACAO D. ANNA SOMMER CHAMPALIMAUD E DR. CARLOS MONTEZ CHAMPALIMAUD
Call Details Proof of Concept (PoC), ERC-2018-PoC
Summary Behavior is ultimately the observable output of the nervous system. Thus, to properly diagnose and monitor nervous system disorders it is crucial that we assess behavior comprehensively. However, everyday clinical practice still relies mostly on subjective methods to evaluate behavior. The ability to adequately analyse behaviour is also critical during drug development, from the preclinical (animal models) to the clinical stages. Results from our ERC-funded work revealed that the combination of inertial sensor data with an unsupervised algorithm provides an optimal method for easy-to-implement, unbiased behavior classification that adequately captures the outcome of neural circuit computations. In this PoC we propose to develop this technology into a product that provides a continuous, quantitative and comprehensive behavior assessment that is also versatile, covering the diverse spectrum from the pre-clinical to the clinical context. Specifically, we will 1) refine and validate inertial sensors in animal models, 2) enable their integration into a Body Area Network, and 3) evolve our unsupervised algorithm into stand-alone software that is versatile and easy to use. In addition to these technical aims, we propose to explore commercial opportunities and societal benefits, in particular in the medical and drug development sector. We will conduct a market analysis and develop a business case for this product, while expanding industry contacts for production and commercialization. Our proposal will lead to a ground-breaking technical solution for quantitative, automated behavioral assessment in animal models of disease and humans that will have an important societal impact through innovation in diagnosis, disease monitoring, and drug development.
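To illustrate the general pipeline such a product builds on (a generic sketch, not the project's actual unsupervised algorithm), the Python example below windows a raw accelerometer trace, extracts simple per-window features, and clusters the windows without labels; the feature set, window length, cluster count and the use of NumPy and scikit-learn are assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans

def window_features(acc, fs=100, win_s=2.0):
    """Slice a (n_samples, 3) accelerometer trace into fixed-length windows
    and compute simple per-window features: per-axis mean and std, plus the
    mean and std of the acceleration magnitude."""
    win = int(fs * win_s)
    feats = []
    for i in range(acc.shape[0] // win):
        w = acc[i * win:(i + 1) * win]
        mag = np.linalg.norm(w, axis=1)
        feats.append(np.concatenate([w.mean(axis=0), w.std(axis=0),
                                     [mag.mean(), mag.std()]]))
    return np.array(feats)

# Usage on synthetic data: two regimes ("rest" vs "active") are recovered
# without any labels by clustering the window features.
rng = np.random.default_rng(1)
acc = np.vstack([rng.normal(0.0, 0.05, (3000, 3)),    # quiet segment
                 rng.normal(0.0, 0.60, (3000, 3))])   # high-activity segment
X = window_features(acc)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(labels)   # windows from the two segments fall into different clusters
```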
Max ERC Funding
149 820 €
Duration
Start date: 2019-01-01, End date: 2020-06-30
Project acronym WOLBAKIAN
Project Functional genetics of Wolbachia proliferation and protection to viruses
Researcher (PI) Luis Manuel VALLA TEIXEIRA
Host Institution (HI) FUNDACAO CALOUSTE GULBENKIAN
Call Details Consolidator Grant (CoG), LS6, ERC-2017-COG
Summary Wolbachia are arguably the most prevalent intracellular bacteria in animals, infecting filarial nematodes and up to 66% of arthropod species. Wolbachia are maternally transmitted and can induce a large range of strong phenotypes on their hosts. However, very little is known on how they induce these phenotypes and how they interact with the host at the molecular level. One main difficulty with this system is that Wolbachia have been genetically intractable. We will study how Wolbachia confers protection to viruses, a phenomenon that is currently being applied to fight dengue and Zika viruses. We also aim at understanding how these endosymbiont titres are regulated, a crucial aspect of their biology. We will identify host and Wolbachia genes that regulate these processes by performing classical genetic screens in Drosophila and develop a new method to perform a forward genetic screen in Wolbachia. Our previous analysis of natural variants of Wolbachia will also be extended in order to identify alleles associated with differential growth and antiviral protection. We will characterize candidate Wolbachia genes, from the previous analysis and current results in the lab, by performing a new method to obtain loss-of-function mutants in target Wolbachia genes. We will also focus on putative effector proteins of Wolbachia with the purpose of identifying cellular location, induced phenotypes, and host interacting proteins. Drosophila genes will be characterized by classical genetic methods in this model organism. The identification and characterization of Wolbachia and host genes involved in antiviral protection and Wolbachia proliferation will provide key insights to these basic biological problems. Moreover, the knowledge generated and new Wolbachia variants may have an application in the fight against arboviruses transmitted by mosquitoes and human diseases caused by filarial nematodes.
Max ERC Funding
1 999 500 €
Duration
Start date: 2018-07-01, End date: 2023-06-30
Project acronym WORDS
Project Words and Waring type problems
Researcher (PI) Aner Shalev
Host Institution (HI) THE HEBREW UNIVERSITY OF JERUSALEM
Call Details Advanced Grant (AdG), PE1, ERC-2009-AdG
Summary Hilbert's solution to Waring's problem in Number Theory shows that every positive integer is a sum of g(n) nth powers. Surprising non-commutative analogues of this phenomenon were discovered recently in Group Theory, where powers are replaced by general words. Moreover, the study of group words occurs naturally in important contexts, such as the Burnside problems, Serre's problem on profinite groups, and finite simple group theory. We propose a systematic study of word maps on groups, their images and kernels, as well as related Waring type problems. These include a celebrated conjecture of Thompson, problems regarding covering numbers and mixing times of random walks, as well as probabilistic identities in finite and profinite groups. This is a highly challenging project in which we intend to utilize a wide spectrum of tools, including Representation Theory, Algebraic Geometry, Number Theory, computational group theory, as well as probabilistic methods and Lie methods. Moreover, we aim to establish new results on representations and character bounds, which would be very useful in various additional contexts. Apart from their intrinsic interest, the problems and conjectures we propose have exciting applications to other fields, and the project is likely to shed new light not just on group theory but also on combinatorics, probability and geometry.
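A toy instance of the non-commutative Waring phenomenon mentioned above, included purely for illustration: in the alternating group A5, the smallest non-abelian simple group, the image of the commutator word map w(x, y) = x^-1 y^-1 x y is the whole group. The brute-force Python check below confirms this well-known fact; it is an illustrative aside, not a result of the project.

```python
from itertools import permutations

def compose(p, q):                      # group product: apply q, then p
    return tuple(p[q[i]] for i in range(len(q)))

def inverse(p):
    inv = [0] * len(p)
    for i, v in enumerate(p):
        inv[v] = i
    return tuple(inv)

def is_even(p):                         # even permutations have sign +1
    sign = 1
    for i in range(len(p)):
        for j in range(i + 1, len(p)):
            if p[i] > p[j]:
                sign = -sign
    return sign == 1

# A5: the 60 even permutations of 5 points.
A5 = [p for p in permutations(range(5)) if is_even(p)]

# Image of the word map w(x, y) = x^-1 y^-1 x y over all 3600 pairs (x, y).
image = {compose(compose(inverse(x), inverse(y)), compose(x, y))
         for x in A5 for y in A5}

print(len(image) == len(A5))            # True: the word map is surjective on A5
```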
Max ERC Funding
1 197 800 €
Duration
Start date: 2010-01-01, End date: 2014-12-31
Project acronym WTBLDOHRNCE
Project Walking the tightrope between life and death: Oxygen homeostasis regulation in the nematode Caenorhabditis elegans
Researcher (PI) Einav Gross
Host Institution (HI) THE HEBREW UNIVERSITY OF JERUSALEM
Call Details Starting Grant (StG), LS4, ERC-2011-StG_20101109
Summary Oxygen (O2) is vital for the life of all aerobic animals. However, fine-tuned regulation of O2 levels is crucial, since both shortage (hypoxia) and excess (via the production of reactive oxygen species, ROS) may be harmful. Indeed, both hypoxia and ROS may underlie the pathophysiology of many diseases such as atherosclerosis and Alzheimer’s. To understand how this fine-tuned O2 regulation is achieved at both the molecular and organismal levels, my research proposal aims to explore the following integrated questions, using the nematode C. elegans as a model organism.
1) How do animals sense O2? What are the molecular sensors and how do they act together to fine-tune O2 responses?
2) How does O2 regulate food intake, and repress appetite in hypoxia?
3) How do animals survive and behaviorally adapt to hypoxia without HIF-1?
4) How does hydrogen sulfide (H2S) regulate O2 responses and aging?
5) How do animals protect against mRNA oxidation damage?
I have focused my research on the globins. GLB-5 is a C. elegans hexacoordinated globin that regulates foraging behavior in response to subtle changes in O2 concentration. Like neuroglobin and cytoglobin in our brain, GLB-5 is expressed in neurons. Recently I discovered that GLB-5 regulates the re-adaptation of animals to 21% O2 after hypoxia. To understand how GLB-5 regulates hypoxia-reoxygenation responses, I performed a mutagenesis screen, isolated four classes of GLB-5 suppressors, and mapped them, using single-nucleotide polymorphisms (SNPs), to an approximately 1 Mbp genomic interval. Using a novel non-PCR-based library preparation method and next-generation whole-genome sequencing, I have already sequenced four independent mutations and cloned one of the GLB-5 suppressors. In the future, I intend to clone more suppressor genes, and to use this methodology in other parts of my project. By doing so, I aim to understand O2 homeostasis regulation at all levels: from the molecular signaling network to the physiology and behavior of the whole animal.
Max ERC Funding
1 495 922 €
Duration
Start date: 2011-11-01, End date: 2017-10-31