Project acronym AcetyLys
Project Unravelling the role of lysine acetylation in the regulation of glycolysis in cancer cells through the development of synthetic biology-based tools
Researcher (PI) Eyal Arbely
Host Institution (HI) BEN-GURION UNIVERSITY OF THE NEGEV
Call Details Starting Grant (StG), LS9, ERC-2015-STG
Summary Synthetic biology is an emerging discipline that offers powerful tools to control and manipulate fundamental processes in living matter. We propose to develop and apply such tools to modify the genetic code of cultured mammalian cells and bacteria with the aim to study the role of lysine acetylation in the regulation of metabolism and in cancer development. Thousands of lysine acetylation sites were recently discovered on non-histone proteins, suggesting that acetylation is a widespread and evolutionarily conserved post-translational modification, similar in scope to phosphorylation and ubiquitination. Specifically, it has been found that most of the enzymes of metabolic processes—including glycolysis—are acetylated, implying that acetylation is a key regulator of cellular metabolism in general and of glycolysis in particular. The regulation of metabolic pathways is of particular importance to cancer research, as misregulation of metabolic pathways, especially upregulation of glycolysis, is common to most transformed cells and is now considered a new hallmark of cancer. These data raise an immediate question: what is the role of acetylation in the regulation of glycolysis and in the metabolic reprogramming of cancer cells? While current methods rely on mutational analyses, we will genetically encode the incorporation of acetylated lysine and directly measure the functional role of each acetylation site in cancerous and non-cancerous cell lines. Using this methodology, we will study the structural and functional implications of all the acetylation sites in glycolytic enzymes. We will also decipher the mechanism by which acetylation is regulated by deacetylases and answer a long-standing question: how do 18 deacetylases recognise their substrates among thousands of acetylated proteins? The developed methodologies can be applied to a wide range of protein families known to be acetylated, thereby making this study relevant to diverse research fields.
Max ERC Funding
1 499 375 €
Duration
Start date: 2016-07-01, End date: 2021-06-30
Project acronym BARCODE DIAGNOSTICS
Project Next-Generation Personalized Diagnostic Nanotechnologies for Predicting Response to Cancer Medicine
Researcher (PI) Avraham Dror Schroeder
Host Institution (HI) TECHNION - ISRAEL INSTITUTE OF TECHNOLOGY
Call Details Starting Grant (StG), LS7, ERC-2015-STG
Summary Cancer is the leading cause of death in the Western world and the second-leading cause of death worldwide. Despite advances in medical research, 30% of cancer patients are prescribed a medication the tumor does not respond to or, alternatively, drugs that induce adverse side effects patients cannot tolerate.
Nanotechnologies are becoming impactful therapeutic tools, granting tissue-targeting and cellular precision that cannot be attained using systems of larger scale.
In this proposal, I plan to expand far beyond the state-of-the-art and develop a conceptually new approach in which diagnostic nanoparticles are designed to retrieve drug-sensitivity information from malignant tissue inside the body. The ultimate goal of this program is to be able to predict, ahead of time, which treatment will be best for each cancer patient – an emerging field called personalized medicine. This interdisciplinary research program will expand our understanding and capabilities in nanotechnology, cancer biology and medicine.
To achieve this goal, I will engineer novel nanotechnologies that autonomously maneuver, target and diagnose the various cells that compose the tumor microenvironment and its disseminated metastases. Each nanometric system will contain a minuscule amount of a biologically active agent, and will serve as a nano-lab for testing the activity of the agents inside the tumor cells.
To distinguish one system from another, and to grant single-cell sensitivity in vivo, nanoparticles will be barcoded with unique DNA fragments.
We will enable nanoparticles' deep tissue penetration into primary tumors and metastatic microenvironments using enzyme-loaded particles, and study how different agents, including small-molecule drugs, proteins and RNA, interact with the malignant and stromal cells that compose the cancerous microenvironments. Finally, we will demonstrate the ability of barcoded nanoparticles to predict adverse, life-threatening side effects in a personalized manner.
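The barcoding idea above can be illustrated with a toy sketch: each nanoparticle formulation carries a unique DNA fragment, and sequencing the barcodes recovered from tumor cells reports which formulation reached which cells. This is purely illustrative; the barcode sequences and drug names below are hypothetical, not from the proposal.

```python
# Toy illustration (hypothetical barcodes and drugs): map DNA barcodes
# to the formulation they label, then tally sequencing reads to infer
# which formulation accumulated in the sampled cells.
from collections import Counter

BARCODES = {            # hypothetical barcode -> formulation mapping
    "ACGTACGT": "doxorubicin",
    "TTGGCCAA": "cisplatin",
    "GGAATTCC": "paclitaxel",
}

def tally_reads(reads):
    """Count sequencing reads per formulation, ignoring unknown barcodes."""
    counts = Counter()
    for read in reads:
        drug = BARCODES.get(read)
        if drug is not None:
            counts[drug] += 1
    return counts

reads = ["ACGTACGT", "ACGTACGT", "GGAATTCC", "NNNNNNNN"]
counts = tally_reads(reads)
print(counts)  # Counter({'doxorubicin': 2, 'paclitaxel': 1})
```

In a real experiment the readout would come from single-cell sequencing and the counts would be normalized per cell, but the mapping-and-tally logic is the core of barcode-based multiplexing.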
Max ERC Funding
1 499 250 €
Duration
Start date: 2016-04-01, End date: 2021-03-31
Project acronym BUNDLEFORCE
Project Unravelling the Mechanosensitivity of Actin Bundles in Filopodia
Researcher (PI) Antoine Guillaume Jegou
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), LS1, ERC-2015-STG
Summary Eukaryotic cells constantly convert signals between biochemical energy and mechanical work to accomplish, in a timely manner, many key functions such as migration, division or development. Filopodia are essential finger-like structures that emerge at the cell front to orient the cell in response to its chemical and mechanical environment. Yet, the molecular interactions that make filopodia mechanosensitive are not known. To tackle this challenge we propose unique biophysical in vitro and in vivo experiments of increasing complexity. Here we will focus on how the underlying actin filament bundle regulates filopodium growth and retraction cycles at the micrometer and second scales. These parallel actin filaments are mainly elongated at their barbed ends by formins and cross-linked by bundling proteins such as fascins.
We aim to:
1) Elucidate how formin and fascin functions are regulated by mechanics at the single-filament level. We will investigate how formin partners and competitors present in filopodia affect formin processivity, and how fascin affinity for the side of filaments is modified by filament tension and formin presence at the barbed end.
2) Reconstitute filopodium-like actin bundles in vitro to understand how actin bundle size and fate are regulated down to the molecular scale. Using a unique experimental setup that combines microfluidics and optical tweezers, we will uncover, for the first time, the mechanosensitive capabilities of actin bundles, both in tension and compression.
3) Decipher in vivo the mechanics of actin bundles in filopodia, using fascins and formins with integrated fluorescent tension sensors.
This framework spanning from in vitro single filament to in vivo meso-scale actin networks will bring unprecedented insights into the role of actin bundles in filopodia mechanosensitivity.
Max ERC Funding
1 499 190 €
Duration
Start date: 2016-03-01, End date: 2021-02-28
Project acronym CaNANObinoids
Project From Peripheralized to Cell- and Organelle-Targeted Medicine: The 3rd Generation of Cannabinoid-1 Receptor Antagonists for the Treatment of Chronic Kidney Disease
Researcher (PI) Yossef Tam
Host Institution (HI) THE HEBREW UNIVERSITY OF JERUSALEM
Call Details Starting Grant (StG), LS4, ERC-2015-STG
Summary Clinical experience with globally acting cannabinoid-1 receptor (CB1R) antagonists revealed the benefits of blocking CB1Rs for the treatment of obesity and diabetes. However, their use is hampered by increased CNS-mediated side effects. Recently, I have demonstrated that peripherally restricted CB1R antagonists have the potential to treat the metabolic syndrome without eliciting these adverse effects. While these results are promising and are currently being advanced toward the clinic, our ability to rationally design CB1R blockers that would target a diseased organ is limited.
The current proposal aims to develop and test cell- and organelle-specific CB1R antagonists. To establish this paradigm, I will focus on the kidney, since chronic kidney disease (CKD) is the leading cause of increased morbidity and mortality of patients with diabetes. Our first goal will be to characterize the obligatory role of the renal proximal tubular CB1R in the pathogenesis of diabetic renal complications. Next, we will attempt to link renal proximal CB1R with diabetic mitochondrial dysfunction. Finally, we will develop proximal tubular (cell-specific) and mitochondrial (organelle-specific) CB1R blockers and test their effectiveness in treating CKD. To that end, we will encapsulate CB1R blockers into biocompatible polymeric nanoparticles that will serve as targeted drug delivery systems, via their conjugation to targeting ligands.
The implications of this work are far-reaching, as it will (i) point to renal proximal tubular CB1R as a novel target for CKD; (ii) identify mitochondrial CB1R as a new player in the regulation of proximal tubular cell function; and (iii) eventually yield the drug of choice for treating diabetic CKD and its comorbidities. Moreover, this work will lead to the development of a novel organ-specific drug delivery system for CB1R blockers, which could then be exploited in other tissues affected by obesity, diabetes and the metabolic syndrome.
Max ERC Funding
1 500 000 €
Duration
Start date: 2016-04-01, End date: 2021-03-31
Project acronym CBTC
Project The Resurgence in Wage Inequality and Technological Change: A New Approach
Researcher (PI) Tali Kristal
Host Institution (HI) UNIVERSITY OF HAIFA
Call Details Starting Grant (StG), SH2, ERC-2015-STG
Summary Social-science explanations for rising wage inequality have reached a dead end. Most economists argue that computerization has been primarily responsible, while on the other side of the argument are sociologists and political scientists who stress the role of political forces in the evolution of wages. I would like to use my knowledge and experience to develop an original theory of the complex dynamics between technology and politics in order to solve two unsettled questions regarding the role of computerization in rising wage inequality: First, how can computerization, which diffused simultaneously in rich countries, explain the divergent inequality trends in Europe and the United States? Second, what are the mechanisms behind the well-documented positive correlation between computers and earnings?
To answer the first question, I develop a new institutional agenda stating that politics, broadly defined, mitigates the effects of technological change on wages by stimulating norms of fair pay and equity. To answer the second question, I propose a truly novel perspective that conceptualizes the earnings advantage that derives from computerization around access to and control of information on the production process. Capitalizing on this new perspective, I develop a new approach to measuring computerization to capture the form of workers’ interaction with computers at work, and build a research strategy for analysing the effect of computerization on wages across countries and workplaces, and over time.
This research project challenges the common understanding of technology’s role in producing economic inequality, and would thereby significantly impact all of the abovementioned disciplines, which are debating over the upswing in wage inequality, as well as public policy, which discusses what should be done to confront the resurgence of income inequality.
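The research strategy sketched above implies a wage equation in which institutions moderate the computer-use wage premium. The following is a stylized, hedged sketch of such an estimation on synthetic data; the variable names, functional form, and coefficients are illustrative assumptions, not the project's actual specification.

```python
# Stylized sketch (synthetic data, assumed functional form): log wages
# regressed on a computer-use measure, an institutional index, and their
# interaction, so that institutions can dampen the computer-use premium.
import numpy as np

rng = np.random.default_rng(0)
n = 500
computer_use = rng.uniform(0, 1, n)   # hypothetical intensity of computer use at work
institutions = rng.uniform(0, 1, n)   # hypothetical index of pay-setting institutions

# Data-generating process: a 0.5 computer-use premium, reduced by 0.3
# per unit of institutional strength, plus small noise.
log_wage = (2.0 + 0.5 * computer_use
            - 0.3 * computer_use * institutions
            + rng.normal(0, 0.05, n))

X = np.column_stack([np.ones(n), computer_use, institutions,
                     computer_use * institutions])
beta, *_ = np.linalg.lstsq(X, log_wage, rcond=None)
print(beta)  # estimates close to [2.0, 0.5, 0.0, -0.3]
```

A cross-country version of this design would interact the computer-use measure with country-level institutional indices and add country and year fixed effects; the toy example only shows the moderation logic.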
Max ERC Funding
1 495 091 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym CD4DNASP
Project Cell intrinsic control of CD4 T cell differentiation by cytosolic DNA sensing pathways
Researcher (PI) Lionel Jerome Apetoh
Host Institution (HI) INSTITUT NATIONAL DE LA SANTE ET DE LA RECHERCHE MEDICALE
Call Details Starting Grant (StG), LS6, ERC-2015-STG
Summary This proposal aims to investigate the role of cytosolic DNA sensing pathways in CD4 T cell differentiation.
Cellular host defense to pathogens relies on the detection of pathogen-associated molecular patterns, including deoxyribonucleic acid (DNA), which can be recognized by host myeloid cells through Toll-like receptor (TLR) 9 binding. Recent evidence, however, suggests that innate immune cells can also perceive cytoplasmic DNA of infectious or autologous origin through cytosolic DNA sensors triggering TLR9-independent signaling. Activation of cytosolic DNA sensor-dependent signaling pathways has been clearly shown to trigger innate immune responses to microbial and host DNA, but the contribution of cytosolic DNA sensors to the differentiation of CD4 T cells, an essential event for shaping adaptive immune responses, has not been documented. This proposal aims to fill this current knowledge gap.
We aim to decipher the molecular series of transcriptional events triggered by DNA in CD4 T cells that ultimately result in altered T cell differentiation. This aim will be addressed by combining in vitro and in vivo approaches such as advanced gene expression analysis of CD4 T cells and use of transgenic and gene-deficient mice. Structure-activity relationship and biophysical studies will also be performed to unravel novel immunomodulators able to affect CD4 T cell differentiation.
Max ERC Funding
1 500 000 €
Duration
Start date: 2016-08-01, End date: 2021-07-31
Project acronym CHROMTOPOLOGY
Project Understanding and manipulating the dynamics of chromosome topologies in transcriptional control
Researcher (PI) Thomas, Ivor Sexton
Host Institution (HI) CENTRE EUROPEEN DE RECHERCHE EN BIOLOGIE ET MEDECINE
Call Details Starting Grant (StG), LS2, ERC-2015-STG
Summary Transcriptional regulation of genes in eukaryotic cells requires a complex and highly regulated interplay of chromatin environment, epigenetic status of target sequences and several different transcription factors. Eukaryotic genomes are tightly packaged within nuclei, yet must be accessible for transcription, replication and repair. A striking correlation exists between chromatin topology and underlying gene activity. According to the textbook view, chromatin loops bring genes into direct contact with distal regulatory elements, such as enhancers. Moreover, we and others have shown that genomes are organized into discretely folded megabase-sized regions, denoted topologically associating domains (TADs), which seem to correlate well with transcription activity and histone modifications. However, it is unknown whether chromosome folding is a cause or consequence of underlying gene function.
To better understand the role of genome organization in transcription regulation, I will address the following questions:
(i) How are chromatin configurations altered during transcriptional changes accompanying development?
(ii) What are the real-time kinetics and cell-to-cell variabilities of chromatin interactions and TAD architectures?
(iii) Can chromatin loops be engineered de novo, and do they influence gene expression?
(iv) What genetic elements and trans-acting factors are required to organize TADs?
To address these fundamental questions, I will use a combination of novel technologies and approaches, such as Hi-C, CRISPR knock-ins, ANCHOR tagging of DNA loci, high- and super-resolution single-cell imaging, genome-wide screens and optogenetics, in order to both study and engineer chromatin architectures.
These studies will give groundbreaking insight into whether and how chromatin topology regulates transcription. Thus, I anticipate that the results of this project will have a major impact on the field and will lead to a new paradigm for metazoan transcription control.
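Hi-C data of the kind mentioned above are typically summarized as a contact matrix, from which TAD boundaries can be called with an insulation score: a sliding square that measures how much contact crosses each bin's diagonal. The following toy computation is a minimal sketch with an assumed window size and a synthetic matrix, not the project's actual pipeline.

```python
# Minimal insulation-score sketch (toy contact matrix, assumed window):
# low scores mark bins where few contacts cross the diagonal, i.e.
# candidate TAD boundaries.
import numpy as np

def insulation_score(contacts, window=2):
    """Mean contact frequency in a square of side `window` that
    straddles the diagonal at each bin; NaN where the square
    would fall off the matrix edge."""
    n = contacts.shape[0]
    scores = np.full(n, np.nan)
    for i in range(window, n - window):
        scores[i] = contacts[i - window:i, i + 1:i + 1 + window].mean()
    return scores

# Toy matrix: two 4-bin domains with strong intra-domain contacts (10)
# and weak inter-domain contacts (1).
m = np.ones((8, 8))
m[:4, :4] = 10.0
m[4:, 4:] = 10.0

s = insulation_score(m)
print(np.nanargmin(s))  # 3, i.e. the boundary between bins 3 and 4
```

Real pipelines (e.g. on normalized Hi-C matrices) add distance normalization and boundary-strength thresholds, but the sliding-square idea is the same.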
Max ERC Funding
1 500 000 €
Duration
Start date: 2016-06-01, End date: 2021-05-31
Project acronym CoBABATI
Project Cofactor Binding Antibodies – Basic Aspects and Therapeutic Innovations
Researcher (PI) Jordan Dimitrov
Host Institution (HI) INSTITUT NATIONAL DE LA SANTE ET DE LA RECHERCHE MEDICALE
Call Details Starting Grant (StG), LS7, ERC-2015-STG
Summary The immune repertoire of healthy individuals contains a fraction of antibodies (Abs) that are able to bind with high affinity various endogenous or exogenous low-molecular-weight compounds, including cofactors essential for aerobic life, such as riboflavin, heme and ATP. Despite identification of cofactor-binding Abs as a constituent of normal immune repertoires, their fundamental characteristics have not been systematically investigated. Thus, we do not know the origin, prevalence and physiopathological significance of cofactor-binding Abs. Moreover, the molecular mechanisms of interaction of cofactors with Abs are ill-defined. Different proteins use cofactors to extend the chemistry intrinsic to the amino acid sequence of their polypeptide chain(s). Thus, one can speculate that the alliance of Abs with low-molecular-weight compounds results in the emergence of untypical properties of Abs and offers a strategy for designing a new generation of therapeutic Abs. Moreover, cofactor-binding Abs may be used for delivery of cytotoxic compounds to particular sites in the body, or for scavenging pro-inflammatory compounds. The principal goal of the present proposal is to gain a basic understanding of the fraction of cofactor-binding Abs in immune repertoires and to use this knowledge for the rational design of novel classes of therapeutic Abs. In this project, we will address the following questions: 1) understand the origin and prevalence of cofactor-binding Abs in immune repertoires; 2) characterize the molecular mechanisms of interaction of cofactors with Abs; 3) understand the physiopathological roles of cofactor-binding Abs; and 4) use cofactor binding for the development of novel types of therapeutic Abs. A comprehensive understanding of various aspects of cofactor-binding Abs should lead to advances in fundamental understanding and in the development of innovative therapeutic and diagnostic tools.
Max ERC Funding
1 255 000 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym COHEGRAPH
Project Electron quantum optics in Graphene
Researcher (PI) Séverin Preden Roulleau
Host Institution (HI) COMMISSARIAT A L ENERGIE ATOMIQUE ET AUX ENERGIES ALTERNATIVES
Call Details Starting Grant (StG), PE3, ERC-2015-STG
Summary Quantum computing is based on the manipulation of quantum bits (qubits) to enhance the efficiency of information processing. In solid-state systems, two approaches have been explored:
• static qubits, coupled to quantum buses used for manipulation and information transmission,
• flying qubits which are mobile qubits propagating in quantum circuits for further manipulation.
Flying-qubit research led to the recent emergence of the field of electron quantum optics, where electrons play the role of photons in quantum-optics-like experiments. This has recently led to the development of electronic quantum interferometry as well as single-electron sources. So far, such experiments have only been successfully implemented in semiconductor heterostructures cooled to extremely low temperatures. Realizing electron quantum optics experiments in graphene, an inexpensive material showing a high degree of quantum coherence even at moderately low temperatures, would be strong evidence that quantum computing in graphene is within reach.
One of the most elementary building blocks necessary to perform electron quantum optics experiments is the electron beam splitter, the electronic analog of a beam splitter for light. However, the usual scheme for electron beam splitters in semiconductor heterostructures is not available in graphene because of its gapless band structure. I propose a breakthrough in this direction in which a pn junction plays the role of the electron beam splitter. This will lead to the following achievements, considered important steps towards quantum computing:
• electronic Mach-Zehnder interferometry used to study the quantum coherence properties of graphene,
• two-electron Aharonov-Bohm interferometry used to generate entangled states as an elementary quantum gate,
• the implementation of on-demand electronic sources in the GHz range for graphene flying qubits.
Max ERC Funding
1 500 000 €
Duration
Start date: 2016-05-01, End date: 2021-04-30
Project acronym CombiCompGeom
Project Combinatorial Aspects of Computational Geometry
Researcher (PI) Natan Rubin
Host Institution (HI) BEN-GURION UNIVERSITY OF THE NEGEV
Call Details Starting Grant (StG), PE6, ERC-2015-STG
Summary The project focuses on the interface between computational and combinatorial geometry.
Geometric problems emerge in a variety of computational fields that interact with the physical world.
The performance of geometric algorithms is determined by the description complexity of their underlying combinatorial structures. Hence, most theoretical challenges faced by computational geometry are of a distinctly combinatorial nature.
In the past two decades, computational geometry has been revolutionized by the powerful combination of random sampling techniques with the abstract machinery of geometric arrangements. These insights were used, in turn, to establish state-of-the-art results in combinatorial geometry. Nevertheless, a number of fundamental problems remained open and resisted numerous attempts to solve them.
Motivated by the recent breakthrough results, in which the PI played a central role, we propose two exciting lines of study with the potential to change the landscape of this field.
The first research direction concerns the complexity of Voronoi diagrams -- arguably the most common structures in computational geometry.
The second direction concerns combinatorial and algorithmic aspects of geometric intersection structures, including some fundamental open problems in geometric transversal theory. Many of these questions are motivated by geometric variants of general covering and packing problems, and all efficient approximation schemes for them must rely on the intrinsic properties of geometric graphs and hypergraphs.
Any progress in responding to these challenges will constitute a major breakthrough in both computational and combinatorial geometry.
Max ERC Funding
1 303 750 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym COSMO_SIMS
Project Astrophysics for the Dark Universe: Cosmological simulations in the context of dark matter and dark energy research
Researcher (PI) Oliver Jens Hahn
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE9, ERC-2015-STG
Summary The objective of this ambitious research proposal is to push forward the frontier of computational cosmology by significantly improving the precision of numerical models on par with the increasing richness and depth of surveys that aim to shed light on the nature of dark matter and dark energy.
Using new phase-space techniques for the simulation and analysis of dark matter, completely new insights into its dynamics are possible. They allow, for the first time, the accurate simulation of dark matter cosmologies with suppressed small-scale power without artificial fragmentation. Using such techniques, I will establish highly accurate predictions for the properties of dark matter and baryons on small scales and investigate the formation of the first galaxies in non-CDM cosmologies.
Baryonic effects on cosmological observables are a severe limiting factor in interpreting cosmological measurements. I will investigate their impact by identifying the relevant astrophysical processes in relation to the multi-wavelength properties of galaxy clusters and the galaxies they host. This will be enabled by a statistical set of zoom simulations where it is possible to study how these properties correlate with one another, with the assembly history, and how we can derive better models for unresolved baryonic processes in cosmological simulations and thus, ultimately, how we can improve the power of cosmological surveys.
Finally, I will develop a completely unified framework for precision cosmological initial conditions (ICs) that is scalable to both the largest simulations and the highest resolution zoom simulations. Bringing ICs into the ‘cloud’ will enable new statistical studies using zoom simulations and increase the reproducibility of simulations within the community.
My previous work in developing most of the underlying techniques puts me in an excellent position to lead a research group that is able to successfully approach such a wide-ranging and ambitious project.
Max ERC Funding
1 471 382 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym CoVeCe
Project Coinduction for Verification and Certification
Researcher (PI) Damien Gabriel Jacques Pous
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE6, ERC-2015-STG
Summary Software and hardware bugs cost companies and administrations hundreds of millions of euros every year. Formal methods like verification provide automatic means of finding some of these bugs. Certification, using proof assistants like Coq or Isabelle/HOL, makes it possible to guarantee the absence of bugs (up to a certain point).
These two kinds of tools are crucial in order to design safer programs and machines. Unfortunately, state-of-the-art tools are not yet satisfactory. Verification tools often face state-explosion problems and require more efficient algorithms; certification tools need more automation: they currently require too much time and expertise, even for basic tasks that could be handled easily through verification.
In recent work with Bonchi, we have shown that an extremely simple idea from concurrency theory could give rise to algorithms that are often exponentially faster than the algorithms currently used in verification tools.
My claim is that this idea could scale to richer models, revolutionising existing verification tools and providing algorithms for problems whose decidability is still open.
Moreover, the expected simplicity of those algorithms will make it possible to implement them inside certification tools such as Coq, to provide powerful automation techniques based on verification techniques. In the end, we will thus provide efficient and certified verification tools going beyond the state-of-the-art, but also the ability to use such tools inside the Coq proof assistant, to alleviate the cost of certification tasks.
Max ERC Funding
1 407 413 €
Duration
Start date: 2016-04-01, End date: 2021-03-31
Project acronym CRISPAIR
Project Study of the interplay between CRISPR interference and DNA repair pathways towards the development of novel CRISPR tools
Researcher (PI) David Bikard
Host Institution (HI) INSTITUT PASTEUR
Call Details Starting Grant (StG), LS1, ERC-2015-STG
Summary CRISPR-Cas loci are the adaptive immune system of archaea and bacteria. They can capture pieces of invading DNA and use this information to degrade target DNA through the action of RNA-guided nucleases. The consequences of DNA cleavage by Cas nucleases, i.e. how breaks are processed and whether they can be repaired, remain to be investigated. A better understanding of the interplay between DNA repair and CRISPR-Cas is critical both to shed light on the evolution and biology of these fascinating systems and for the development of biotechnological tools based on Cas nucleases. CRISPR systems have indeed become a popular tool to edit eukaryotic genomes. The strategies employed take advantage of different DNA repair pathways to introduce mutations upon DNA cleavage. In bacteria, however, the introduction of breaks by Cas nucleases in the chromosome has been described to kill the cell. Preliminary data indicate that this might not always be the case and that some DNA repair pathways could compete with CRISPR immunity, allowing cells to survive. Using a combination of bioinformatics and genetics approaches, we will investigate the interplay between CRISPR and DNA repair in bacteria, with a particular focus on the widely used CRISPR-Cas9 system. The knowledge gained from this study will then help us develop novel tools for bacterial genome engineering. In particular, we will introduce an NHEJ pathway in E. coli, making it possible to perform CRISPR knockout screens. Finally, using CRISPR libraries and multiplexed targeting, we will generate for the first time all combinations of pairwise gene knockouts in an organism, a task that for now remains elusive, even for large consortia and with the use of automation. This will enable us to decipher genome-scale genetic interaction networks, an important step for our understanding of bacteria as a system.
Max ERC Funding
1 499 763 €
Duration
Start date: 2016-03-01, End date: 2021-02-28
Project acronym DYNA-MIC
Project Deep non-invasive imaging via scattered-light acoustically-mediated computational microscopy
Researcher (PI) Ori Katz
Host Institution (HI) THE HEBREW UNIVERSITY OF JERUSALEM
Call Details Starting Grant (StG), PE7, ERC-2015-STG
Summary Optical microscopy, perhaps the most important tool in biomedical investigation and clinical diagnostics, is currently held back by the assumption that it is not possible to noninvasively image microscopic structures more than a fraction of a millimeter deep inside tissue. The governing paradigm is that high-resolution information carried by light is lost due to random scattering in complex samples such as tissue. While non-optical imaging techniques, employing non-ionizing radiation such as ultrasound, allow deeper investigations, they possess drastically inferior resolution and do not permit microscopic studies of cellular structures, crucial for accurate diagnosis of cancer and other diseases.
I propose a new kind of microscope, one that can peer deep inside visually opaque samples, combining the sub-micron resolution of light with the penetration depth of ultrasound. My novel approach is based on our discovery that information on microscopic structures is contained in random scattered-light patterns. It breaks current limits by exploiting the randomness of scattered light rather than struggling to fight it.
We will transform this concept into a breakthrough imaging platform by combining ultrasonic probing and modulation of light with advanced digital signal processing algorithms, extracting the hidden microscopic structure by two complementary approaches: 1) By exploiting the stochastic dynamics of scattered light using methods developed to surpass the diffraction limit in optical nanoscopy and for compressive sampling, harnessing nonlinear effects. 2) Through the analysis of intrinsic correlations in scattered light that persist deep inside scattering tissue.
This proposal is formed by bringing together novel insights on the physics of light in complex media, advanced microscopy techniques, and ultrasound-mediated imaging. It is made possible by the new ability to digitally process vast amounts of scattering data, and has the potential to impact many fields.
Max ERC Funding
1 500 000 €
Duration
Start date: 2016-04-01, End date: 2021-03-31
Project acronym DYNAMIQS
Project Relaxation dynamics in closed quantum systems
Researcher (PI) Marc Cheneau
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE2, ERC-2015-STG
Summary Statistical mechanics, a century-old theory, is probably one of the most powerful constructions of physics. It predicts that the equilibrium properties of any system composed of a large number of particles depend only on a handful of macroscopic parameters, no matter how the particles interact with each other. But the question of how many-body systems relax towards such equilibrium states remains largely unsolved. This problem is especially acute for quantum systems, which evolve in a much larger mathematical space than the classical space-time and obey non-local equations of motion. Despite the formidable complexity of quantum dynamics, recent theoretical advances have put forward a very simple picture: the dynamics of closed quantum many-body systems would be essentially local, meaning that it would take a finite time for correlations between two distant regions of space to reach their equilibrium value. This locality would be an emergent collective property, similar to spontaneous symmetry breaking, and have its origin in the propagation of quasiparticle excitations. The fact is, however, that only a few observations directly confirm this scenario. In particular, the role played by the dimensionality and the interaction range is largely unknown. The concept of this project is to take advantage of the great versatility offered by ultracold atom systems to investigate experimentally the relaxation dynamics in regimes well beyond the boundaries of our current knowledge. We will focus our attention on two-dimensional systems with both short- and long-range interactions, whereas all previous experiments were bound to one-dimensional systems. The realisation of the project will hinge on the construction of a new-generation quantum gas microscope experiment for strontium gases. Amongst the innovative techniques that we will implement is the electronic state hybridisation with Rydberg states, called Rydberg dressing.
Max ERC Funding
1 500 000 €
Duration
Start date: 2016-05-01, End date: 2021-04-30
Project acronym Emergent-BH
Project Emergent spacetime and maximally spinning black holes
Researcher (PI) Monica Guica
Host Institution (HI) COMMISSARIAT A L ENERGIE ATOMIQUE ET AUX ENERGIES ALTERNATIVES
Call Details Starting Grant (StG), PE2, ERC-2015-STG
Summary One of the greatest challenges of theoretical physics is to understand the fundamental nature of gravity and how it is reconciled with quantum mechanics. Black holes indicate that gravity is holographic, i.e. it is emergent, together with some of the spacetime dimensions, from a lower-dimensional field theory. The emergence mechanism has just started to be understood in certain special contexts, such as AdS/CFT. However, very little is known about it for the spacetime backgrounds relevant to the real world, due mainly to our lack of knowledge of the underlying field theories.
My goal is to uncover the fundamental nature of spacetime and gravity in our universe by: i) formulating and working out the properties of the relevant lower-dimensional field theories and ii) studying the mechanism by which spacetime and gravity emerge from them. I will address the first problem by concentrating on the near-horizon regions of maximally spinning black holes, for which the dual field theories greatly simplify and can be studied using a combination of conformal field theory and string theory methods. To study the emergence mechanism, I plan to adapt the tools that were successfully used to understand emergent gravity in anti-de Sitter (AdS) spacetimes - such as holographic quantum entanglement and conformal bootstrap - to non-AdS, more realistic spacetimes.
Max ERC Funding
1 495 476 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym FAT NKT
Project Targeting iNKT cell and adipocyte crosstalk for control of metabolism and body weight
Researcher (PI) Lydia Lynch
Host Institution (HI) THE PROVOST, FELLOWS, FOUNDATION SCHOLARS & THE OTHER MEMBERS OF BOARD OF THE COLLEGE OF THE HOLY & UNDIVIDED TRINITY OF QUEEN ELIZABETH NEAR DUBLIN
Call Details Starting Grant (StG), LS6, ERC-2015-STG
Summary Obesity has reached epidemic proportions globally. At least 2.8 million people die each year as a result of being overweight or obese, the biggest burden being obesity-related diseases. It is now clear that inflammation is an underlying cause of, or contributor to, many of these diseases, including type 2 diabetes, atherosclerosis, and cancer. Recognition that the immune system can regulate metabolic pathways has prompted a new way of thinking about diabetes and weight management. Despite much recent progress, most immunometabolic pathways, and how to target them, are currently unknown. One such pathway is the crosstalk between invariant natural killer T (iNKT) cells and neighboring adipocytes. iNKT cells are the innate lipid-sensing arm of the immune system. Since our discovery that mammalian adipose tissue is enriched for iNKT cells, we have identified a critical role for iNKT cells in regulating adipose inflammation and body weight. The goal of this project is to use a multi-disciplinary approach to identify key signals and molecules used by iNKT cells to induce metabolic control and weight loss in obesity. Using immunological assays and multi-photon intravital microscopy, cells and pathways that control the unique regulatory functions of adipose iNKT cells will be identified and characterised. Novel lipid antigens in adipose tissue will be identified using a biochemical approach, perhaps explaining iNKT cell conservation in adipose depots, and providing safe tools for iNKT cell manipulation in vivo. Finally, using proteomics and whole-body metabolic analysis in vivo, novel ‘weight-loss inducing’ factors produced by adipose iNKT cells will be identified. This ambitious and high-impact project has the potential to yield major insights into immunometabolic interactions at steady state and in obesity. The ability to activate or induce adipose iNKT cells holds remarkable potential as an entirely new therapeutic direction for treating obesity and type 2 diabetes.
Max ERC Funding
1 804 052 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym FluctEvol
Project Fluctuating selection, evolution, and plasticity in random environments
Researcher (PI) Luis-Miguel Chevin
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), LS8, ERC-2015-STG
Summary Temporal environmental variation in natural systems includes a large component of random fluctuations, the magnitude and predictability of which are modified under current climate change. The need for predicting eco-evolutionary impacts of plastic and evolutionary responses to changing environments is still hampered by a lack of strong experimental evidence. FluctEvol aims at shedding new light on population responses to stochastic environments, and facilitating their prediction, using a unique combination of approaches. First, theoretical models of evolution and demography under a randomly changing optimum phenotype will be designed and analysed, producing new quantitative predictions. Second, statistical methodologies will be developed, and employed in meta-analyses of long-term datasets from natural populations. And third, large-scale and automated experimental evolution in stochastic environments will be carried out with the micro-alga Dunaliella salina, an extremophile that thrives at high and variable salinities. We will manipulate the magnitude and predictability of fluctuations in salinity, and use high-throughput phenotyping and candidate-gene sequencing to analyse the evolution of plasticity for traits involved in salinity adaptation in this species: glycerol and carotene content. We will thus combine the benefits of experimental evolution in microbes (short generations, ample replication) with a priori knowledge of ecologically relevant adaptive traits, allowing for hypothesis-driven experiments. The success of this project in increasing our predictive power about eco-evolutionary dynamics is warranted by the experience of the PI, at the interface between theoretical and empirical approaches.
Our experiments will have relevance beyond academia, as we will modify through evolution the plasticity of traits (accumulation of energetic cell metabolites) that are direct targets for bioindustry, thus potentially overcoming current limitations in productivity.
Max ERC Funding
1 499 665 €
Duration
Start date: 2016-03-01, End date: 2021-02-28
Project acronym FORECASToneMONTH
Project Forecasting Surface Weather and Climate at One-Month Leads through Stratosphere-Troposphere Coupling
Researcher (PI) Chaim Israel Garfinkel
Host Institution (HI) THE HEBREW UNIVERSITY OF JERUSALEM
Call Details Starting Grant (StG), PE10, ERC-2015-STG
Summary Anomalies in surface temperatures, winds, and precipitation can significantly alter energy supply and demand, cause flooding, and cripple transportation networks. Better management of these impacts can be achieved by extending the duration of reliable predictions of the atmospheric circulation.
Polar stratospheric variability can impact surface weather for well over a month, and this proposed research presents a novel approach towards understanding the fundamentals of how this coupling occurs. Specifically, we are interested in: 1) how predictable are anomalies in the stratospheric circulation? 2) why do only some stratospheric events modify surface weather? and 3) what is the mechanism whereby stratospheric anomalies reach the surface? While this last question may appear academic, several studies indicate that stratosphere-troposphere coupling drives the midlatitude tropospheric response to climate change; therefore, a clearer understanding of the mechanisms will aid in the interpretation of the upcoming changes in the surface climate.
I propose a multi-pronged effort aimed at addressing these questions and improving monthly forecasting. First, carefully designed modelling experiments using a novel modelling framework will be used to clarify how, and under what conditions, stratospheric variability couples to tropospheric variability. Second, novel linkages between variability external to the stratospheric polar vortex and the stratospheric polar vortex will be pursued, thus improving our ability to forecast polar vortex variability itself. To these ends, my group will develop 1) an analytic model for Rossby wave propagation on the sphere, and 2) a simplified general circulation model, which captures the essential processes underlying stratosphere-troposphere coupling. By combining output from the new models, observational data, and output from comprehensive climate models, the connections between the stratosphere and surface climate will be elucidated.
Max ERC Funding
1 808 000 €
Duration
Start date: 2016-05-01, End date: 2021-04-30
Project acronym FOVEDIS
Project Formal specification and verification of distributed data structures
Researcher (PI) Constantin Enea
Host Institution (HI) UNIVERSITE PARIS DIDEROT - PARIS 7
Call Details Starting Grant (StG), PE6, ERC-2015-STG
Summary The future of computing technology relies on fast access, transformation, and exchange of data across large-scale networks such as the Internet. The design of software systems that support high-frequency parallel accesses to high-quantity data is a fundamental challenge. As more scalable alternatives to traditional relational databases, distributed data structures (DDSs) form the basis of a wide range of automated services, now and for the foreseeable future.
This proposal aims to improve our understanding of the theoretical foundations of DDSs. The design and the usage of DDSs are based on new principles, for which we currently lack rigorous engineering methodologies. Specifically, we lack design procedures based on precise specifications, and automated reasoning techniques for enhancing the reliability of the engineering process.
The targeted breakthrough of this proposal is developing automated formal methods for rigorous engineering of DDSs. A first objective is to define coherent formal specifications that provide precise requirements at design time and explicit guarantees during their usage. Then, we will investigate practical programming principles, compatible with these specifications, for building applications that use DDSs. Finally, we will develop efficient automated reasoning techniques for debugging or validating DDS implementations against their specifications. The principles underlying automated reasoning are also important for identifying best practices in the design of these complex systems to increase confidence in their correctness. The developed methodologies based on formal specifications will thus benefit both the conception and automated validation of DDS implementations and the applications that use them.
Max ERC Funding
1 300 000 €
Duration
Start date: 2016-05-01, End date: 2021-04-30