Project acronym ACOPS
Project Advanced Coherent Ultrafast Laser Pulse Stacking
Researcher (PI) Jens Limpert
Host Institution (HI) FRIEDRICH-SCHILLER-UNIVERSITAT JENA
Country Germany
Call Details Consolidator Grant (CoG), PE2, ERC-2013-CoG
Summary "An important driver of scientific progress has always been the envisioning of applications far beyond existing technological capabilities. Such thinking creates new challenges for physicists, driven by the groundbreaking nature of the anticipated application. In the case of laser physics, one of these applications is laser wake-field particle acceleration and possible future uses thereof, such as in collider experiments, or for medical applications such as cancer treatment. To accelerate electrons and positrons to TeV-energies, a laser architecture is required that allows for the combination of high efficiency, Petawatt peak powers, and Megawatt average powers. Developing such a laser system would be a challenging task that might take decades of aggressive research, development, and, most important, revolutionary approaches and innovative ideas.
The goal of the ACOPS project is to develop a compact, efficient, scalable, and cost-effective high-average and high-peak power ultra-short pulse laser concept.
The proposed approach to this goal relies on the spatially and temporally separated amplification of ultrashort laser pulses in waveguide structures, followed by coherent combination into a single train of pulses with increased average power and pulse energy. This combination can be realized through the coherent addition of the output beams of spatially separated amplifiers, combined with the pulse stacking of temporally separated pulses in passive enhancement cavities, employing a fast-switching element as cavity dumper.
Therefore, the three main tasks are the development of kW-class high-repetition-rate driving lasers, the investigation of non-steady state pulse enhancement in passive cavities, and the development of a suitable dumping element.
If successful, the proposed concept would undoubtedly provide a tool that would allow researchers to surpass the current limits in high-field physics and accelerator science."
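For readers unfamiliar with coherent combination, a minimal numerical sketch of why it boosts peak power: N identical fields added in phase yield N² times the single-beam intensity, whereas incoherent addition yields only N times. (The beam count and amplitudes below are illustrative, not taken from the ACOPS design.)

```python
import numpy as np

def combined_intensity(amplitude: float, phases) -> float:
    """Intensity |sum_k a * exp(i * phi_k)|^2 of coherently added fields."""
    field = sum(amplitude * np.exp(1j * p) for p in phases)
    return abs(field) ** 2

n = 4
# All four beams in phase: intensity scales as n^2, not n.
print(combined_intensity(1.0, [0.0] * n))  # 16.0
```

The same function with random phases would show the penalty of losing phase control, which is why active phase stabilisation is central to such schemes.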
Max ERC Funding
1 881 040 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym ALUNIF
Project Algorithms and Lower Bounds: A Unified Approach
Researcher (PI) Rahul Santhanam
Host Institution (HI) THE CHANCELLOR, MASTERS AND SCHOLARS OF THE UNIVERSITY OF OXFORD
Country United Kingdom
Call Details Consolidator Grant (CoG), PE6, ERC-2013-CoG
Summary One of the fundamental goals of theoretical computer science is to
understand the possibilities and limits of efficient computation. This
quest has two dimensions. The
theory of algorithms focuses on finding efficient solutions to
problems, while computational complexity theory aims to understand when
and why problems are hard to solve. These two areas have different
philosophies and use different sets of techniques. However, in recent
years there have been indications of deep and mysterious connections
between them.
In this project, we propose to explore and develop the connections between
algorithmic analysis and complexity lower bounds in a systematic way.
On the one hand, we plan to use complexity lower bound techniques as inspiration
to design new and improved algorithms for Satisfiability and other
NP-complete problems, as well as to analyze existing algorithms better.
On the other hand, we plan to strengthen implications yielding circuit
lower bounds from non-trivial algorithms for Satisfiability, and to derive
new circuit lower bounds using these stronger implications.
This project has potential for massive impact in both the areas of algorithms
and computational complexity. Improved algorithms for Satisfiability could lead
to improved SAT solvers, and the new analytical tools would lead to a better
understanding of existing heuristics. Complexity lower bound questions are
fundamental
but notoriously difficult, and new lower bounds would open the way to
unconditionally secure cryptographic protocols and derandomization of
probabilistic algorithms. More broadly, this project aims to initiate greater
dialogue between the two areas, with an exchange of ideas and techniques
which leads to accelerated progress in both, as well as a deeper understanding
of the nature of efficient computation.
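As context for the Satisfiability algorithms mentioned above, here is the trivial 2^n brute-force baseline that both improved SAT algorithms and circuit lower bounds are measured against. The clause encoding (signed 1-indexed literals) is our own illustrative convention, not from the project.

```python
from itertools import product

def is_satisfiable(n_vars, clauses):
    """clauses: list of clauses; literal +i means x_i, -i means NOT x_i (1-indexed)."""
    for assignment in product([False, True], repeat=n_vars):
        # A formula is satisfied when every clause has at least one true literal.
        if all(any(assignment[abs(l) - 1] == (l > 0) for l in clause)
               for clause in clauses):
            return True
    return False

# (x1 OR NOT x2) AND (x2 OR x3) AND (NOT x1 OR NOT x3)
print(is_satisfiable(3, [[1, -2], [2, 3], [-1, -3]]))  # True
```

Algorithms beating this 2^n exhaustive search, even slightly, are exactly the "non-trivial algorithms for Satisfiability" that the project connects to circuit lower bounds.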
Max ERC Funding
1 274 496 €
Duration
Start date: 2014-03-01, End date: 2019-02-28
Project acronym ARITHMUS
Project Peopling Europe: How data make a people
Researcher (PI) Evelyn Sharon Ruppert
Host Institution (HI) GOLDSMITHS' COLLEGE
Country United Kingdom
Call Details Consolidator Grant (CoG), SH3, ERC-2013-CoG
Summary Who are the people of Europe? This question is facing statisticians as they grapple with standardising national census methods so that their numbers can be assembled into a European population. Yet, by so doing—intentionally or otherwise—they also contribute to the making of a European people. This, at least, is the central thesis of ARITHMUS. While typically framed as a methodological or statistical problem, the project approaches this as a practical and political problem of assembling multiple national populations into a European population and people.
Why is this both an urgent political and practical problem? Politically, Europe is said to be unable to address itself to a constituted polity and people, which is crucial to European integration. Practically, its efforts to constitute a European population are also being challenged by digital technologies, which are being used to diversify census methods and are bringing into question the comparability of national population data. Consequently, over the next several years Eurostat and national statistical institutes are negotiating regulations for the 2020 census round towards ensuring 'Europe-wide comparability.'
ARITHMUS will follow this process and investigate the practices of statisticians as they juggle scientific independence, national autonomy and EU comparability to innovate census methods. It will then connect this practical work to political questions of the making and governing of a European people and polity. It will do so by going beyond state-of-the-art scholarship on methods, politics and science and technology studies. Five case studies involving discourse analysis and ethnographic methods will investigate the situated practices of EU and national statisticians as they remake census methods, arguably the most fundamental changes since modern censuses were launched over two centuries ago. At the same time it will attend to how these practices affect the constitution of who the people of Europe are.
Max ERC Funding
1 833 649 €
Duration
Start date: 2014-05-01, End date: 2019-04-30
Project acronym BIGBAYES
Project Rich, Structured and Efficient Learning of Big Bayesian Models
Researcher (PI) Yee Whye Teh
Host Institution (HI) THE CHANCELLOR, MASTERS AND SCHOLARS OF THE UNIVERSITY OF OXFORD
Country United Kingdom
Call Details Consolidator Grant (CoG), PE6, ERC-2013-CoG
Summary As datasets grow ever larger in scale, complexity and variety, there is an increasing need for powerful machine learning and statistical techniques that are capable of learning from such data. Bayesian nonparametrics is a promising approach to data analysis that is increasingly popular in machine learning and statistics. Bayesian nonparametric models are highly flexible models with infinite-dimensional parameter spaces that can be used to directly parameterise and learn about functions, densities, conditional distributions etc, and have been successfully applied to regression, survival analysis, language modelling, time series analysis, and visual scene analysis among others. However, to successfully use Bayesian nonparametric models to analyse the high-dimensional and structured datasets now commonly encountered in the age of Big Data, we will have to overcome a number of challenges. Namely, we need to develop Bayesian nonparametric models that can learn rich representations from structured data, and we need computational methodologies that can scale effectively to the large and complex models of the future. We will ground our developments in relevant applications, particularly to natural language processing (learning distributed representations for language modelling and compositional semantics) and genetics (modelling genetic variations arising from population, genealogical and spatial structures).
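To illustrate the "infinite-dimensional" flavour of Bayesian nonparametrics mentioned above, here is the textbook Chinese-restaurant-process sampler, which draws a random partition of n items under a Dirichlet process with concentration alpha. This is a standard construction, not code from the project itself.

```python
import random

def crp_partition(n: int, alpha: float, seed: int = 0):
    """Sample cluster labels for n items from a Chinese restaurant process."""
    rng = random.Random(seed)
    tables = []  # tables[k] = number of items already at table k
    labels = []
    for i in range(n):
        # Join table k with prob tables[k]/(i+alpha); open a new one with prob alpha/(i+alpha).
        weights = tables + [alpha]
        k = rng.choices(range(len(weights)), weights=weights)[0]
        if k == len(tables):
            tables.append(0)
        tables[k] += 1
        labels.append(k)
    return labels

print(crp_partition(10, alpha=1.0))
```

The number of clusters is not fixed in advance and grows (logarithmically in expectation) with n, which is what lets such models adapt their complexity to the data.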
Max ERC Funding
1 918 092 €
Duration
Start date: 2014-05-01, End date: 2019-04-30
Project acronym CANCEREVO
Project Deciphering and predicting the evolution of cancer cell populations
Researcher (PI) Marco Helmut GERLINGER
Host Institution (HI) THE INSTITUTE OF CANCER RESEARCH: ROYAL CANCER HOSPITAL
Country United Kingdom
Call Details Consolidator Grant (CoG), LS7, ERC-2018-CoG
Summary The fundamental evolutionary nature of cancer is well recognized but an understanding of the dynamic evolutionary changes occurring throughout a tumour’s lifetime and their clinical implications is in its infancy. Current approaches to reveal cancer evolution by sequencing of multiple biopsies remain of limited use in the clinic due to sample access problems in multi-metastatic disease. Circulating tumour DNA (ctDNA) is thought to comprehensively sample subclones across metastatic sites. However, available technologies either have high sensitivity but are restricted to the analysis of small gene panels or they allow sequencing of large target regions such as exomes but with too limited sensitivity to detect rare subclones. We developed a novel error-corrected sequencing technology that will be applied to perform deep exome sequencing on longitudinal ctDNA samples from highly heterogeneous metastatic gastro-oesophageal carcinomas. This will track the evolution of the entire cancer population over the lifetime of these tumours, from metastatic disease through drug therapy to end-stage disease and enable groundbreaking insights into cancer population evolution rules and mechanisms. Specifically, we will: 1. Define the genomic landscape and drivers of metastatic and end-stage disease. 2. Understand the rules of cancer evolutionary dynamics of entire cancer cell populations. 3. Predict cancer evolution and define the limits of predictability. 4. Rapidly identify drug resistance mechanisms to chemo- and immunotherapy based on signals of Darwinian selection such as parallel and convergent evolution. Our sequencing technology and analysis framework will also transform the way cancer evolution metrics can be accessed and interpreted in the clinic which will have major impacts, ranging from better biomarkers to predict cancer evolution to the identification of drug targets that drive disease progression and therapy resistance.
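A sketch of the core idea behind error-corrected (UMI-consensus) sequencing in general: reads sharing a unique molecular identifier (UMI) derive from one original DNA fragment, so a base seen in only a minority of those reads is a sequencing error rather than a true rare variant. This is the generic principle only; the project's actual technology is not detailed in the summary.

```python
from collections import Counter

def consensus(reads_by_umi, min_fraction=0.9):
    """Collapse reads per UMI to a consensus sequence; ambiguous positions -> 'N'."""
    out = {}
    for umi, reads in reads_by_umi.items():
        seq = []
        for bases in zip(*reads):  # iterate position by position
            base, count = Counter(bases).most_common(1)[0]
            seq.append(base if count / len(bases) >= min_fraction else "N")
        out[umi] = "".join(seq)
    return out

reads = {"UMI1": ["ACGT", "ACGT", "ACGA"]}  # last position disagrees across copies
print(consensus(reads))  # {'UMI1': 'ACGN'} -- the 2/3 majority falls below threshold
```

In a real pipeline the disagreeing position would be suppressed as likely error, which is what pushes the effective error rate low enough to call rare subclones in ctDNA.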
Max ERC Funding
2 000 000 €
Duration
Start date: 2019-03-01, End date: 2024-02-29
Project acronym DECISIONS
Project Choices and consumption: modelling long and short term decisions in a changing world
Researcher (PI) Stephane Hess
Host Institution (HI) UNIVERSITY OF LEEDS
Country United Kingdom
Call Details Consolidator Grant (CoG), SH3, ERC-2013-CoG
Summary Mathematical models of choice behaviour are used to understand consumer decisions and valuations and forecast choices across a range of topic areas, including transport and regional science. Their outputs form a key component in guidance underpinning government and industry decisions on changes to policy, infrastructure developments or the introduction of new services or products. Given the significant financial, environmental and societal implications of such decisions, model accuracy is crucial. Current models however, while powerful and flexible, still present a highly abstract representation of consumer decisions. This project aims to develop a new framework which realigns modelled behaviour with real world behaviour, jointly representing the choice of multiple options or products and the quantity of consumption for each of these. In contrast with existing work, these choices will be placed within a wider framework, incorporating links between long term decisions and day to day choices, accounting for the growing importance of virtual social networks and the role of joint decisions. The work will ensure consistency with economic theory and in particular deal with the formation and role of budgets and constraints. While many developments will take place within the random utility framework, the project will also operationalize alternative theories of behaviour, such as non-compensatory decision rules from mathematical psychology. To ensure the transition of methodological developments into practice, I will test the models and illustrate their advantages in a large scale application studying the relationship between long term decisions and short term energy consumption. I will ensure that the models can produce output suitable for economic analysis and will develop free estimation software. 
The research promises a step change in model flexibility and realism with impacts across a number of academic disciplines as well as real world benefits to society as a whole.
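The random utility framework the summary refers to is, in its simplest form, the multinomial logit model. A minimal sketch (the travel options and utility values below are made up for illustration):

```python
import math

def logit_probabilities(utilities):
    """Multinomial logit: P(choose i) = exp(V_i) / sum_j exp(V_j)."""
    m = max(utilities)                       # subtract the max for numerical stability
    exps = [math.exp(v - m) for v in utilities]
    total = sum(exps)
    return [e / total for e in exps]

# Deterministic utilities of three options, e.g. car, bus, train:
probs = logit_probabilities([1.2, 0.4, 0.9])
print(probs)  # probabilities sum to 1; the highest-utility option gets the largest share
```

The project's contribution lies in what this baseline abstracts away: quantity of consumption, budgets, long-term vs day-to-day linkages, and non-compensatory decision rules.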
Max ERC Funding
1 873 288 €
Duration
Start date: 2014-07-01, End date: 2020-06-30
Project acronym ENGAGES
Project Next generation algorithms for grabbing and exploiting symmetry
Researcher (PI) Pascal Schweitzer
Host Institution (HI) TECHNISCHE UNIVERSITAT KAISERSLAUTERN
Country Germany
Call Details Consolidator Grant (CoG), PE6, ERC-2018-CoG
Summary Symmetry is a phenomenon that appears in many different contexts.
Algorithmic symmetry detection and exploitation is the concept of finding intrinsic symmetries of a given object and then using these symmetries to our advantage. Application areas of algorithmic symmetry detection and exploitation range from convolutional neural networks in machine learning to computer graphics, chemical databases and beyond.
In contrast to this widespread use, our understanding of the theoretical foundation (namely the graph isomorphism problem) is incomplete and current algorithmic symmetry tools are inadequate for big data applications. Hence, EngageS addresses these key challenges in the field using a systematic approach to the theory and practice of symmetry detection. It thereby also fixes the existing lack of interplay between theory and practice, which is part of the problem.
EngageS' main aims are to tackle the classical and descriptive complexity of the graph isomorphism problem and to design the next generation of symmetry detection algorithms. As key ideas to resolve the complexity, EngageS offers three new approaches on how to prove lower bounds and a new method to settle the descriptive complexity.
EngageS will also develop practical symmetry detection algorithms for big data, exploiting parallelism and memory hierarchies of modern machines, and will introduce the concept of exploiting the absence of symmetry, along with a road map for doing so. Overall, EngageS will establish a comprehensive software library that will serve as a platform for integrated research on the algorithmic treatment of symmetry.
In summary, EngageS will develop fast, efficient and accessible symmetry detection tools that will be used to solve complex algorithmic problems in a range of fields including combinatorial algorithms, generation problems, and canonization.
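To make "symmetry detection" concrete: the symmetries of a graph are its automorphisms, the vertex permutations mapping the edge set onto itself. Production tools (e.g. nauty or Traces) find these with sophisticated pruning; the brute-force sketch below only shows what is being computed.

```python
from itertools import permutations

def automorphisms(n, edges):
    """All vertex permutations of {0..n-1} that preserve the (undirected) edge set."""
    edge_set = {frozenset(e) for e in edges}
    return [p for p in permutations(range(n))
            if {frozenset((p[u], p[v])) for u, v in edges} == edge_set]

# A path 0-1-2 has exactly two symmetries: the identity and the end-to-end flip.
print(automorphisms(3, [(0, 1), (1, 2)]))  # [(0, 1, 2), (2, 1, 0)]
```

Checking all n! permutations is hopeless beyond toy sizes, which is precisely why the complexity of graph isomorphism and scalable detection algorithms are the project's central questions.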
Max ERC Funding
1 999 094 €
Duration
Start date: 2019-03-01, End date: 2024-02-29
Project acronym ERCC
Project Efficient Resource Constrained Cryptography
Researcher (PI) Eike Kiltz
Host Institution (HI) RUHR-UNIVERSITAET BOCHUM
Country Germany
Call Details Consolidator Grant (CoG), PE6, ERC-2013-CoG
Summary "Traditionally, cryptographic protocols were run on servers or personal computers which have large and easily scalable computational resources. For these applications there exist a large variety of well-established cryptographic systems. Right now, we are in the midst of the shift toward ubiquitous computing on resource constrained devices (RCDs): small devices with severe constraints in terms of computing power, code size, and network capacities. RCDs are used virtually everywhere: smart phones, bank cards, electronic ID-cards, medical implants, cars, RFIDs as bar code replacement, etc. Due to their computational constraints, many current cryptographic security solutions are no longer applicable to RCDs. Existing solutions are often “ad-hoc” and do not come with a formal security treatment.
The central objective of the ERCC project is to initiate an overarching formal treatment of cryptographic solutions for RCDs, particularly focusing on efficiency. The main conceptual novelty is to follow the concept of provable security. We intend to design new cryptographic protocols that have a mathematical proof of security (assuming the hardness of some mathematical problem) and are still competitive with constructions currently used on RCDs. While we certainly cannot hope that all our new provably secure constructions will be superior to existing ad-hoc constructions, recent preliminary research
results give rise to optimism. Concretely, we will base our new protocols on hard problems in ideal and structured lattices, and we will study weaker (yet still realistic) security models for RCDs allowing for efficient instantiations."
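As background on the lattice assumptions mentioned above, here is a toy learning-with-errors (LWE) style encryption of a single bit. The parameters are far too small to be secure and the scheme is the textbook construction, shown only to illustrate the mechanics (noisy inner products), not the project's constructions.

```python
import random

q, n, m = 97, 8, 16
rng = random.Random(1)

s = [rng.randrange(q) for _ in range(n)]                       # secret key
A = [[rng.randrange(q) for _ in range(n)] for _ in range(m)]   # public matrix
e = [rng.choice([-1, 0, 1]) for _ in range(m)]                 # small noise
b = [(sum(A[i][j] * s[j] for j in range(n)) + e[i]) % q for i in range(m)]

def encrypt(bit):
    """Encrypt one bit as a random subset-sum of noisy samples, offset by bit*q/2."""
    subset = [i for i in range(m) if rng.random() < 0.5]
    u = [sum(A[i][j] for i in subset) % q for j in range(n)]
    v = (sum(b[i] for i in subset) + bit * (q // 2)) % q
    return u, v

def decrypt(u, v):
    """Recover the bit: the residual is near 0 for bit 0, near q/2 for bit 1."""
    d = (v - sum(u[j] * s[j] for j in range(n))) % q
    return 1 if q // 4 < d < 3 * q // 4 else 0

u, v = encrypt(1)
print(decrypt(u, v))  # 1 -- the accumulated noise (at most m) stays below q/4
```

The security of such schemes reduces provably to the hardness of LWE, which is exactly the "mathematical proof of security assuming the hardness of some mathematical problem" that the summary describes.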
Max ERC Funding
1 874 960 €
Duration
Start date: 2014-11-01, End date: 2019-10-31
Project acronym HONORLOGIC
Project The Cultural Logic of Honor and Social Interaction: A Cross-Cultural Comparison
Researcher (PI) Ayse USKUL
Host Institution (HI) UNIVERSITY OF KENT
Country United Kingdom
Call Details Consolidator Grant (CoG), SH3, ERC-2018-COG
Summary Understanding (un)willingness to coordinate with others, to compromise when faced with different choices, or to apologize for transgressions is crucial as these behaviors can act as strong facilitators or inhibitors of important interpersonal processes such as negotiations and coalition building. These behaviors play a major role when individuals from different cultural backgrounds work together to solve disputes or address joint challenges. Yet, we know little about what these behaviors mean in different cultural groups or how they are approached. With HONORLOGIC, I aim to initiate a step-change in our understanding of cultural variation in these important domains of social behavior by providing unique, multi-method, comparative and converging evidence from a wide range of cultural groups. I will answer the question “How do cultural groups that promote honor as a core cultural value approach coordinating with others, reaching compromise, and offering apologies?” by integrating insights from social/cultural psychology, behavioral economics, and anthropology. I will do this by collecting quantitative data using economic games, experiments, and surveys from Spain, Italy, Greece, Turkey, Cyprus, Lebanon, Egypt, and Tunisia, as cultural groups where honor has been shown to play a defining role in individuals’ social worlds. I will also run the proposed studies in the US, the UK, Japan, and Korea to provide a broader comparative perspective.
HONORLOGIC will produce transformative evidence for theories of social interaction and decision making in psychology, economics, and evolutionary science by (a) producing innovative theory and data with an interdisciplinary and multi-method approach, (b) increasing the diversity of the existing evidence pool, (c) testing established theoretical assumptions in new cultural groups, and (d) contributing to capacity building in under-researched cultural groups in psychological research.
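The summary's "economic games" refer to standardized interaction tasks with well-defined payoffs. As a minimal sketch of one such task (the ultimatum game, a common choice in this literature, though the summary does not name the specific games used), the payoff rule can be written down directly; the stake size and parameter names here are hypothetical.

```python
# Toy ultimatum-game payoff rule. A proposer offers a split of a fixed
# stake ("pie"); the responder either accepts it or rejects it, in
# which case both players receive nothing.
def ultimatum_payoffs(offer: int, min_acceptable: int, pie: int = 10):
    """Return (proposer payoff, responder payoff).

    The responder accepts iff offer >= min_acceptable, their privately
    stated acceptance threshold.
    """
    if offer >= min_acceptable:
        return pie - offer, offer
    return 0, 0
```

Cross-cultural work of the kind HONORLOGIC proposes compares distributions of offers and acceptance thresholds across groups, since both compromise (the offer) and willingness to punish at personal cost (the rejection) vary with cultural norms.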
Max ERC Funding
1 998 694 €
Duration
Start date: 2019-09-01, End date: 2024-08-31
Project acronym iHEAR
Project Gene therapy of inherited and acquired hearing loss
Researcher (PI) Axel Rainer Schambach
Host Institution (HI) MEDIZINISCHE HOCHSCHULE HANNOVER
Country Germany
Call Details Consolidator Grant (CoG), LS7, ERC-2018-COG
Summary To address the substantial financial and social burden caused by hearing loss in 360 million people worldwide, I aim to improve hearing via gene therapy to correct inherited and protect from acquired hearing loss. In vitro experiments will establish the best vector configurations for transfer of therapeutic genes and miRNAs into inner ear hair cells (HC) and spiral ganglion neurons (SGN). The efficiency of the best-performing vector designs will then be explored in vivo using fluorescent marker proteins. Cell-type-specific and inducible promoters as well as receptor-targeted vectors will be employed as a safety measure and to ensure transgene expression in HC and SGN target cells. Once efficient transduction of appropriate target cells and proper expression of therapeutic proteins are demonstrated, I will perform proof-of-concept studies in hearing loss models, including established mouse models, to correct (WP1) or protect from (WP2) impaired hearing. To ensure translatability of these findings, I will generate human induced pluripotent stem cells (iPSC) from patients with hearing loss (WP3), so that I can test optimized constructs in human otic cells. Moreover, I have access to a collection of well-characterized samples from over 600 hearing loss patients, including children with congenital hearing loss in whom many novel monogenic alterations were identified. These resources provide the unique opportunity to generate a novel toolbox for the treatment of hearing loss. In addition to lentiviral and adeno-associated viral (AAV) vector delivery of corrective or protective genes to treat hearing loss, I will apply state-of-the-art genome editing tools to model and correct mutations causative for hearing loss in cell lines, primary cells from murine models, human patients and patient-derived iPSC. This work will contribute to the development of clinically translatable approaches for precision medicine strategies to improve hearing loss treatment.
Max ERC Funding
1 999 500 €
Duration
Start date: 2019-05-01, End date: 2024-04-30