Project acronym 3D-REPAIR
Project Spatial organization of DNA repair within the nucleus
Researcher (PI) Evanthia Soutoglou
Host Institution (HI) THE UNIVERSITY OF SUSSEX
Country United Kingdom
Call Details Consolidator Grant (CoG), LS2, ERC-2015-CoG
Summary Faithful repair of double-stranded DNA breaks (DSBs) is essential, as they are at the origin of genome instability, chromosomal translocations and cancer. Cells repair DSBs through different pathways, which can be faithful or mutagenic, and the balance between them at a given locus must be tightly regulated to preserve genome integrity. Although much is known about DSB repair factors, how the choice between pathways is controlled within the nuclear environment is not understood. We have shown that nuclear architecture and non-random genome organization determine the frequency of chromosomal translocations and that pathway choice is dictated by the spatial organization of DNA in the nucleus. Nevertheless, what determines which pathway is activated in response to DSBs at specific genomic locations is not understood. Furthermore, the impact of 3D-genome folding on the kinetics and efficiency of DSB repair is completely unknown.
Here we aim to understand how nuclear compartmentalization, chromatin structure and genome organization affect the efficiency of detection, signaling and repair of DSBs. We will unravel what determines DNA repair specificity within distinct nuclear compartments using protein tethering, promiscuous biotinylation and quantitative proteomics. We will determine how DNA repair is orchestrated at different heterochromatin structures using a CRISPR/Cas9-based system that allows, for the first time, robust induction of DSBs at specific heterochromatin compartments. Finally, we will investigate the role of 3D-genome folding in the kinetics of DNA repair and pathway choice using single-nucleotide-resolution DSB mapping coupled to 3D topological maps.
This proposal has significant implications for understanding the mechanisms controlling DNA repair within the nuclear environment; it will reveal the regions of the genome that are susceptible to genomic instability and help us understand why certain mutations and translocations are recurrent in cancer.
Max ERC Funding
1 999 750 €
Duration
Start date: 2017-03-01, End date: 2022-02-28
Project acronym ALUNIF
Project Algorithms and Lower Bounds: A Unified Approach
Researcher (PI) Rahul Santhanam
Host Institution (HI) THE CHANCELLOR, MASTERS AND SCHOLARS OF THE UNIVERSITY OF OXFORD
Country United Kingdom
Call Details Consolidator Grant (CoG), PE6, ERC-2013-CoG
Summary One of the fundamental goals of theoretical computer science is to understand the possibilities and limits of efficient computation. This quest has two dimensions. The theory of algorithms focuses on finding efficient solutions to problems, while computational complexity theory aims to understand when and why problems are hard to solve. These two areas have different philosophies and use different sets of techniques. However, in recent years there have been indications of deep and mysterious connections between them.
In this project, we propose to explore and develop the connections between algorithmic analysis and complexity lower bounds in a systematic way. On the one hand, we plan to use complexity lower bound techniques as inspiration to design new and improved algorithms for Satisfiability and other NP-complete problems, as well as to analyze existing algorithms better. On the other hand, we plan to strengthen implications yielding circuit lower bounds from non-trivial algorithms for Satisfiability, and to derive new circuit lower bounds using these stronger implications.
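An illustrative statement of the kind of implication referred to here, in the spirit of Williams-style results connecting satisfiability algorithms to circuit lower bounds (an editorial illustration, not part of the abstract; the exact hypotheses in the literature are more refined than stated):

% A "non-trivial" Circuit-SAT algorithm, i.e. one beating exhaustive search by a
% super-polynomial factor, already forces a circuit lower bound for NEXP.
\text{Circuit-SAT on } n\text{-input, } \mathrm{poly}(n)\text{-size circuits solvable in time } 2^{n}/n^{\omega(1)}
\;\Longrightarrow\; \mathsf{NEXP} \not\subseteq \mathsf{P/poly}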
This project has potential for massive impact in both the areas of algorithms and computational complexity. Improved algorithms for Satisfiability could lead to improved SAT solvers, and the new analytical tools would lead to a better understanding of existing heuristics. Complexity lower bound questions are fundamental but notoriously difficult, and new lower bounds would open the way to unconditionally secure cryptographic protocols and derandomization of probabilistic algorithms. More broadly, this project aims to initiate greater dialogue between the two areas, with an exchange of ideas and techniques which leads to accelerated progress in both, as well as a deeper understanding of the nature of efficient computation.
Max ERC Funding
1 274 496 €
Duration
Start date: 2014-03-01, End date: 2019-02-28
Project acronym ALZSYN
Project Imaging synaptic contributors to dementia
Researcher (PI) Tara Spires-Jones
Host Institution (HI) THE UNIVERSITY OF EDINBURGH
Country United Kingdom
Call Details Consolidator Grant (CoG), LS5, ERC-2015-CoG
Summary Alzheimer's disease, the most common cause of dementia in older people, is a devastating condition that is becoming a public health crisis as our population ages. Despite great progress in Alzheimer's disease research in recent years, we have no disease-modifying drugs, and the past decade has seen a 99.6% failure rate in clinical trials attempting to treat the disease. This project aims to develop relevant therapeutic targets to restore brain function in Alzheimer's disease by integrating human and model studies of synapses. It is widely accepted in the field that alterations in amyloid beta initiate the disease process. However, the cascade leading from changes in amyloid to widespread tau pathology and neurodegeneration remains unclear. Synapse loss is the strongest pathological correlate of dementia in Alzheimer's, and mounting evidence suggests that synapse degeneration plays a key role in causing cognitive decline. Here I propose to test the hypothesis that the amyloid cascade begins at the synapse, leading to tau pathology, synapse dysfunction and loss, and ultimately to neural circuit collapse causing cognitive impairment. The team will use cutting-edge multiphoton and array tomography imaging techniques to test mechanisms downstream of amyloid beta at synapses, and to determine whether intervening in the cascade allows recovery of synapse structure and function. Importantly, I will combine studies in robust models of familial Alzheimer's disease with studies in postmortem human brain to confirm the relevance of our mechanistic studies to human disease. Finally, human stem cell-derived neurons will be used to test mechanisms and potential therapeutics in neurons expressing the human proteome. Together, these experiments are ground-breaking, since they have the potential to further our understanding of how synapses are lost in Alzheimer's disease and to identify targets for effective therapeutic intervention, which is a critical unmet need in today's health care system.
Max ERC Funding
2 000 000 €
Duration
Start date: 2016-11-01, End date: 2021-10-31
Project acronym BIGBAYES
Project Rich, Structured and Efficient Learning of Big Bayesian Models
Researcher (PI) Yee Whye Teh
Host Institution (HI) THE CHANCELLOR, MASTERS AND SCHOLARS OF THE UNIVERSITY OF OXFORD
Country United Kingdom
Call Details Consolidator Grant (CoG), PE6, ERC-2013-CoG
Summary As datasets grow ever larger in scale, complexity and variety, there is an increasing need for powerful machine learning and statistical techniques that are capable of learning from such data. Bayesian nonparametrics is a promising approach to data analysis that is increasingly popular in machine learning and statistics. Bayesian nonparametric models are highly flexible models with infinite-dimensional parameter spaces that can be used to directly parameterise and learn about functions, densities, conditional distributions, etc., and have been successfully applied to regression, survival analysis, language modelling, time series analysis, and visual scene analysis, among others. However, to successfully use Bayesian nonparametric models to analyse the high-dimensional and structured datasets now commonly encountered in the age of Big Data, we will have to overcome a number of challenges. Namely, we need to develop Bayesian nonparametric models that can learn rich representations from structured data, and we need computational methodologies that can scale effectively to the large and complex models of the future. We will ground our developments in relevant applications, particularly in natural language processing (learning distributed representations for language modelling and compositional semantics) and genetics (modelling genetic variations arising from population, genealogical and spatial structures).
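As a concrete illustration of the kind of infinite-dimensional prior referred to above (an editorial sketch in Python, not part of the abstract), the Chinese restaurant process is a standard Bayesian nonparametric prior over partitions in which the number of clusters is not fixed in advance:

import numpy as np

def crp_partition(n, alpha, seed=None):
    """Sample a partition of n items from a Chinese restaurant process with
    concentration alpha; the number of occupied tables (clusters) is unbounded
    a priori and grows with the data rather than being fixed in advance."""
    rng = np.random.default_rng(seed)
    assignments, counts = [], []
    for _ in range(n):
        weights = np.array(counts + [alpha], dtype=float)  # existing tables + one new table
        table = rng.choice(len(weights), p=weights / weights.sum())
        if table == len(counts):
            counts.append(1)        # customer opens a new table (new cluster)
        else:
            counts[table] += 1
        assignments.append(table)
    return assignments, counts

# Example: with 20 observations and alpha = 1.0 the number of clusters is random,
# typically growing roughly logarithmically in n.
print(crp_partition(20, alpha=1.0, seed=0)[1])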
Max ERC Funding
1 918 092 €
Duration
Start date: 2014-05-01, End date: 2019-04-30
Project acronym BPI
Project Bayesian Peer Influence: Group Beliefs, Polarisation and Segregation
Researcher (PI) Gilat Levy
Host Institution (HI) LONDON SCHOOL OF ECONOMICS AND POLITICAL SCIENCE
Country United Kingdom
Call Details Consolidator Grant (CoG), SH1, ERC-2015-CoG
Summary "The objective of this research agenda is to provide a new framework to model and analyze dynamics of group beliefs, in order to study phenomena such as group polarization, segregation and inter-group discrimination. We introduce a simple new heuristic, the Bayesian Peer Influence heuristic (BPI), which is based on rational foundations and captures how individuals are influenced by others' beliefs. We will explore the theoretical properties of this heuristic, and apply the model to analyze the implications of belief dynamics on social interactions.
Understanding the formation and evolution of beliefs in groups is an important aspect of many economic applications, such as labour market discrimination. The beliefs that different groups of people have about members of other groups should be central to any theory or empirical investigation of this topic. At the same time, economic models of segregation and discrimination typically do not focus on the evolution and dynamics of group beliefs that allow for such phenomena. There is therefore a need for new tools of analysis for incorporating the dynamics of group beliefs; this is particularly important in order to understand the full implications of policy interventions which often intend to ""educate the public''. The BPI fills this gap in the literature by offering a tractable and natural heuristic for group communication.
Our aim is to study the theoretical properties of the BPI, as well as its applications to the dynamics of group behavior. Our plan is to: (i) Analyze rational learning from others’ beliefs and characterise the BPI. (ii) Use the BPI to account for cognitive biases in information processing. (iii) Use the BPI to analyze the diffusion of beliefs in social networks. (iv) Apply the BPI to understand the relation between belief polarization, segregation in education and labour market discrimination. (v) Apply the BPI to understand the relation between belief polarization and political outcomes."
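A minimal sketch (Python) of what rational learning from others' beliefs can look like, assuming a binary state, a common prior, and peers' reports treated as conditionally independent signals; this is an editorial illustration only and not necessarily the exact BPI specification used in the project:

import math

def log_odds(p):
    """Log-odds of a probability."""
    return math.log(p / (1.0 - p))

def combine_peer_beliefs(prior, peer_beliefs):
    """Bayesian-style aggregation of peers' reported posteriors about a binary
    state: each report is converted into the log-likelihood ratio of the signal
    that would have produced it from the common prior, and the ratios are summed
    (this step relies on the conditional-independence assumption)."""
    lo = log_odds(prior) + sum(log_odds(b) - log_odds(prior) for b in peer_beliefs)
    return 1.0 / (1.0 + math.exp(-lo))

# Example: with a 0.5 prior, three peers reporting 0.7, 0.6 and 0.8 push the
# listener's belief well above any single report (roughly 0.93).
print(combine_peer_beliefs(0.5, [0.7, 0.6, 0.8]))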
Max ERC Funding
1 662 942 €
Duration
Start date: 2016-08-01, End date: 2022-01-31
Project acronym Critical
Project Behaviour near criticality
Researcher (PI) Martin Hairer
Host Institution (HI) THE UNIVERSITY OF WARWICK
Country United Kingdom
Call Details Consolidator Grant (CoG), PE1, ERC-2013-CoG
Summary "One of the main challenges of modern mathematical physics is to understand the behaviour of systems at or near criticality. In a number of cases, one can argue heuristically that this behaviour should be described by a nonlinear stochastic partial differential equation. Some examples of systems of interest are models of phase coexistence near the critical temperature, one-dimensional interface growth models, and models of absorption of a diffusing particle by random impurities. Unfortunately, the equations arising in all of these contexts are mathematically ill-posed. This is to the extent that they defeat not only ""standard"" stochastic PDE techniques (as developed by Da Prato / Zabczyk / Röckner / Walsh / Krylov / etc), but also more recent approaches based on Wick renormalisation of nonlinearities (Da Prato / Debussche / etc).
Over the past year or so, I have been developing a theory of regularity structures that makes it possible to give a rigorous mathematical interpretation to such equations, and therefore to build the mathematical objects conjectured to describe the abovementioned systems near criticality. The aim of the proposal is to study the convergence of a variety of concrete microscopic models to these limiting objects. The main fundamental mathematical tools to be developed in this endeavour are a discrete analogue of the theory of regularity structures, as well as a number of nonlinear invariance principles.
If successful, the project will yield unique insight into the large-scale behaviour of a number of physically relevant systems in regimes where both nonlinear effects and random fluctuations compete with equal strength.
Max ERC Funding
1 526 234 €
Duration
Start date: 2014-09-01, End date: 2019-08-31
Project acronym DEPP
Project Designing Effective Public Policies
Researcher (PI) Henrik Jacobsen Kleven
Host Institution (HI) LONDON SCHOOL OF ECONOMICS AND POLITICAL SCIENCE
Country United Kingdom
Call Details Consolidator Grant (CoG), SH1, ERC-2015-CoG
Summary This proposal outlines a number of projects in public economics, with links to other fields such as macro, real estate, labor, and gender economics. The projects revolve around several large administrative datasets from the UK and Denmark, and they advance approaches and methodologies that have recently been developed in public economics into new areas. There is a strong public policy focus running through the proposal, including tax policy, transfer policy, family policy, and indirectly monetary policy. The objective is to achieve a comprehensive understanding of how government interventions affect two key markets: the housing market and the labor market.
The project is divided into two themes. The first theme focuses on the housing market and is divided into three subprojects. The first project investigates the effects of mortgage interest rates on leverage and house prices, and it develops a new quasi-experimental method for estimating the elasticity of intertemporal substitution in consumption, a crucial parameter for many public policies. The second and third projects investigate housing market responses to different tax policies, focusing on how such responses are magnified by liquidity constraints and leverage.
The second theme focuses on the labor market and is divided into two subprojects. The first project studies secular changes in gender inequality and the underlying sources of those changes, focusing mainly on the effects of child rearing on gender inequality. The project explores the underlying mechanisms driving child-related inequality, including gender identity norms and family policies. The second project proposes a new way of estimating macro labor supply elasticities that integrates taxes and public expenditures, and it develops a theoretical framework to draw policy implications from those estimations.
Max ERC Funding
1 294 699 €
Duration
Start date: 2016-06-01, End date: 2021-05-31
Project acronym EDWEL
Project Empirical Demand and Welfare Analysis
Researcher (PI) Debopam Bhattacharya
Host Institution (HI) THE CHANCELLOR MASTERS AND SCHOLARS OF THE UNIVERSITY OF CAMBRIDGE
Country United Kingdom
Call Details Consolidator Grant (CoG), SH1, ERC-2015-CoG
Summary Measurement of consumer welfare is central to economic evaluations. It underlies the calculation of price indices, the formulation of tax policies, and environmental and industrial regulation. But existing measurement methods rely on restrictive assumptions about consumer preferences, leading to potentially incorrect conclusions regarding policy impacts. The proposed project aims to make fundamental contributions to empirical welfare analysis by developing nonparametric approaches, which would avoid such assumptions and thus produce reliable welfare estimates from micro-data. The emphasis will be on welfare evaluation of price/quality changes in the under-researched but common real-life setting of discrete choice, e.g., the impact of tuition subsidies for college entrants, fare hikes for passengers and access to new channels for TV viewers. The project will cover (i) discrete choice with multinomial/ordered/non-exclusive alternatives, (ii) random-coefficient choice models, (iii) settings where one's choice affects one's peers' utilities, and (iv) dynamic choice under uncertainty, such as durable purchase. Welfare analyses in situations (ii)-(iv) are previously unexplored problems and represent ambitious undertakings. Situation (i) has been analyzed only under strong, unsubstantiated assumptions, like quasilinear preferences and extreme-value errors. The key insight driving the project is that welfare calculations require less information than what is needed to identify the underlying preference parameters. The project will also develop methods to overcome common data problems like interval-reporting and endogeneity of income. The theoretical results will be complemented by software code in Stata/R which can be readily used by practitioners. Given the ubiquity of welfare analysis in economic applications and its use in non-academic settings such as merger analysis, damage calculations, etc., the project is likely to have a substantial impact both within and beyond academia.
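For context (an editorial illustration, not part of the abstract), the textbook welfare expression that relies on exactly the assumptions criticised above, quasilinear preferences and i.i.d. extreme-value errors, is the multinomial-logit "log-sum" formula for expected consumer surplus:

% Expected consumer surplus when alternative j yields utility U_j = V_j + eps_j,
% utility is quasilinear in income with marginal utility alpha, and the eps_j are
% i.i.d. type-I extreme-value errors (the standard Small-Rosen log-sum result).
\mathbb{E}[CS] \;=\; \frac{1}{\alpha}\,\ln\!\Big(\sum_{j=1}^{J} e^{V_j}\Big) \;+\; \text{const}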
Max ERC Funding
1 426 418 €
Duration
Start date: 2016-10-01, End date: 2021-09-30
Project acronym Endogenous_Info
Project Financial Decision Making with Endogenous Information Acquisition
Researcher (PI) Marcin Kacperczyk
Host Institution (HI) IMPERIAL COLLEGE OF SCIENCE TECHNOLOGY AND MEDICINE
Country United Kingdom
Call Details Consolidator Grant (CoG), SH1, ERC-2015-CoG
Summary Are financial markets informationally efficient? Are some economic agents more informed than others? What is the impact of such heterogeneity on asset prices? These questions, of great economic significance, have permeated academic and business circles over the last few decades. Despite significant progress on the theoretical and empirical fronts, relatively little is known about how information is endogenously acquired and processed in markets by agents with differential access to information and facing heterogeneous opportunities.
Using the novel setting of endogenous information acquisition with non-trivial heterogeneity, the project has two goals: (A) to lay out micro foundations for informed trading in investment and corporate settings using the contexts of illegal insider trading and household finance; (B) to investigate macro implications of heterogeneous information for global economic phenomena such as income inequality, organizational design, and market power.
Max ERC Funding
1 588 959 €
Duration
Start date: 2016-06-01, End date: 2020-05-31
Project acronym EVALVE
Project Biomechanics and signaling in models of congenital heart valve defects
Researcher (PI) Julien Jean-Louis Marie Vermot
Host Institution (HI) IMPERIAL COLLEGE OF SCIENCE TECHNOLOGY AND MEDICINE
Country United Kingdom
Call Details Consolidator Grant (CoG), LS4, ERC-2015-CoG
Summary Mechanical forces are fundamental to cardiovascular development and physiology. The interactions between mechanical forces and endothelial cells are mediated by mechanotransduction feedback loops. My lab is interested in understanding how hemodynamic forces modulate cardiovascular function and morphogenesis. Overall, our recent work is unraveling the biological links between mechanical forces, mechanotransduction and endothelial cell responses. The heart beats 2.6 billion times in a human lifetime and heart valves are amongst the most mechanically challenged structures of the body. The cardiac valves are made of endocardial cells (EdCs) and extracellular matrix components. Most valve diseases have their origins in embryogenesis, either as signs of abnormal developmental processes or the aberrant re-expression of fetal gene programs normally quiescent in adulthood.
This project is directed towards elucidating the biomechanical mechanism of mechanotransduction at the subcellular and molecular levels and addressing how EdCs integrate this information to form and maintain a functional cardiac valve. We will identify the mechanosensors at work in EdCs and their roles during cardiac valve development and repair. To do so, we will implement unique optical methodologies the lab has pioneered to characterize endocardial mechanotransduction: 1) Optical tweezing combined with mechanical stress reporters to test the mechanosensitivity of EdCs; 2) High-resolution live microscopy and mathematical modeling to quantify mechanical forces; 3) 3D cell lineage studies to understand how cells respond and organize during pathological valve development. We will also use high-throughput mRNA- and ChIP-sequencing to characterize the transcriptional network activated by forces.
When completed, this proposal will shed light on a critical but little-explored aspect of congenital valve defects and will be useful for identifying new targets for therapeutic interventions.
Max ERC Funding
2 000 000 €
Duration
Start date: 2016-12-01, End date: 2023-05-31