Project acronym ABACUS
Project Advancing Behavioral and Cognitive Understanding of Speech
Researcher (PI) Bart De Boer
Host Institution (HI) VRIJE UNIVERSITEIT BRUSSEL
Call Details Starting Grant (StG), SH4, ERC-2011-StG_20101124
Summary I intend to investigate what cognitive mechanisms give us combinatorial speech. Combinatorial speech is the ability to make new words using pre-existing speech sounds. Humans are the only apes that can do this, yet we do not know how our brains do it, nor how exactly we differ from other apes. Using new experimental techniques to study human behavior and new computational techniques to model human cognition, I will find out how we deal with combinatorial speech.
The experimental part will study individual and cultural learning. Experimental cultural learning is a new technique that simulates cultural evolution in the laboratory. Two types of cultural learning will be used: iterated learning, which simulates language transfer across generations, and social coordination, which simulates the emergence of norms in a language community. Using the two types of cultural learning together with individual learning experiments will help to zero in, from three angles, on how humans deal with combinatorial speech. In addition, it will make a methodological contribution by comparing the strengths and weaknesses of the three methods.
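For illustration, the iterated-learning paradigm can be caricatured in a few lines of code: a lexicon is passed down a chain of learners, each of which reproduces it with a mild bias toward reusing frequently heard sounds, and the sound inventory actually in use shrinks over generations. This is only a toy sketch under made-up assumptions (the sound inventory, word length and reuse bias below are arbitrary), not the experimental design or the models proposed here.

    import random
    from collections import Counter

    SOUNDS = list("abcdefghij")   # hypothetical sound inventory (illustrative assumption)
    N_WORDS, WORD_LEN, GENERATIONS = 20, 4, 30
    REUSE_BIAS = 0.3              # chance of replacing a sound with a frequently heard one

    def random_lexicon():
        """Initial lexicon: words assembled freely from the whole inventory."""
        return [''.join(random.choices(SOUNDS, k=WORD_LEN)) for _ in range(N_WORDS)]

    def learn(observed):
        """One transmission step: the learner reproduces each word, occasionally
        substituting a sound with one of the most frequent sounds it has heard."""
        freq = Counter(''.join(observed))
        top = [s for s, _ in freq.most_common(4)]
        return [''.join(random.choice(top) if random.random() < REUSE_BIAS else s
                        for s in word)
                for word in observed]

    lexicon = random_lexicon()
    for g in range(GENERATIONS):
        lexicon = learn(lexicon)
        print(f"generation {g:2d}: {len(set(''.join(lexicon)))} distinct sounds in use")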
The computer modeling part will formalize hypotheses about how our brains deal with combinatorial speech. Two models will be built: a high-level model that will establish the basic algorithms with which combinatorial speech is learned and reproduced, and a neural model that will establish in more detail how these algorithms are implemented in the brain. In addition, by increasing our understanding of how humans deal with speech, the models will help bridge the performance gap between human and computer speech recognition.
The project will advance science in four ways: it will provide insight into how our unique ability for using combinatorial speech works, it will tell us how this is implemented in the brain, it will extend the novel methodology of experimental cultural learning and it will create new computer models for dealing with human speech.
Max ERC Funding
1 276 620 €
Duration
Start date: 2012-02-01, End date: 2017-01-31
Project acronym ABEL
Project "Alpha-helical Barrels: Exploring, Understanding and Exploiting a New Class of Protein Structure"
Researcher (PI) Derek Neil Woolfson
Host Institution (HI) UNIVERSITY OF BRISTOL
Call Details Advanced Grant (AdG), LS9, ERC-2013-ADG
Summary "Recently through de novo peptide design, we have discovered and presented a new protein structure. This is an all-parallel, 6-helix bundle with a continuous central channel of 0.5 – 0.6 nm diameter. We posit that this is one of a broader class of protein structures that we call the alpha-helical barrels. Here, in three Work Packages, we propose to explore these structures and to develop protein functions within them. First, through a combination of computer-aided design, peptide synthesis and thorough biophysical characterization, we will examine the extents and limits of the alpha-helical-barrel structures. Whilst this is curiosity driven research, it also has practical consequences for the studies that will follow; that is, alpha-helical barrels made from increasing numbers of helices have channels or pores that increase in a predictable way. Second, we will use rational and empirical design approaches to engineer a range of functions within these cavities, including binding capabilities and enzyme-like activities. Finally, and taking the programme into another ambitious area, we will use the alpha-helical barrels to template other folds that are otherwise difficult to design and engineer, notably beta-barrels that insert into membranes to render ion-channel and sensor functions."
Max ERC Funding
2 467 844 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym ACB
Project The Analytic Conformal Bootstrap
Researcher (PI) Luis Fernando ALDAY
Host Institution (HI) THE CHANCELLOR, MASTERS AND SCHOLARS OF THE UNIVERSITY OF OXFORD
Call Details Advanced Grant (AdG), PE2, ERC-2017-ADG
Summary The aim of the present proposal is to establish a research team developing and exploiting innovative techniques to study conformal field theories (CFTs) analytically. Our approach does not rely on a Lagrangian description but on symmetries and consistency conditions. As such it applies to any CFT, offering a unified framework to study generic CFTs analytically. The initial implementation of this program has already led to striking new results and insights for both Lagrangian and non-Lagrangian CFTs.
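For concreteness, the paradigmatic consistency condition exploited by the conformal bootstrap is crossing symmetry of the four-point function of identical scalar operators; it takes the standard textbook form (quoted here only as background, not as a result of this proposal)

\[
\sum_{\mathcal{O}\in\phi\times\phi} \lambda_{\mathcal{O}}^{2}\, g_{\Delta_{\mathcal{O}},\ell_{\mathcal{O}}}(u,v)
= \left(\frac{u}{v}\right)^{\Delta_{\phi}}
\sum_{\mathcal{O}\in\phi\times\phi} \lambda_{\mathcal{O}}^{2}\, g_{\Delta_{\mathcal{O}},\ell_{\mathcal{O}}}(v,u),
\]

where u and v are the conformal cross-ratios, the lambdas are OPE coefficients and g are conformal blocks. The analytic bootstrap extracts CFT data from this equation in controlled limits (for example at large spin) rather than by purely numerical means.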
The overarching aims of my team will be: To develop an analytic bootstrap program for CFTs in general dimensions; to complement these techniques with more traditional methods and develop a systematic machinery to obtain analytic results for generic CFTs; and to use these results to gain new insights into the mathematical structure of the space of quantum field theories.
The proposal will bring together researchers from different areas. The objectives in brief are:
1) Develop an alternative to Feynman diagram computations for Lagrangian CFTs.
2) Develop a machinery to compute loops for QFT on AdS, with and without gravity.
3) Develop an analytic approach to non-perturbative N=4 SYM and other CFTs.
4) Determine the space of all CFTs.
5) Gain new insights into the mathematical structure of the space of quantum field theories.
The outputs of this proposal will include a new way of doing perturbative computations based on symmetries; a constructive derivation of the AdS/CFT duality; new analytic techniques to attack strongly coupled systems; and invaluable new lessons about the space of CFTs and QFTs.
Success in this research will lead to a completely new, unified way to view and solve CFTs, with a huge impact on several branches of physics and mathematics.
Max ERC Funding
2 171 483 €
Duration
Start date: 2018-12-01, End date: 2023-11-30
Project acronym ACCELERATES
Project Acceleration in Extreme Shocks: from the microphysics to laboratory and astrophysics scenarios
Researcher (PI) Luis Miguel De Oliveira E Silva
Host Institution (HI) INSTITUTO SUPERIOR TECNICO
Call Details Advanced Grant (AdG), PE2, ERC-2010-AdG_20100224
Summary What is the origin of cosmic rays? What are the dominant acceleration mechanisms in relativistic shocks? How do cosmic rays self-consistently influence the shock dynamics? How are relativistic collisionless shocks formed? These are longstanding scientific questions, closely tied to extreme plasma physics processes in which a close interplay between the micro-instabilities and the global dynamics is critical.
Relativistic shocks are closely connected with the propagation of intense streams of particles, pervasive in many astrophysical scenarios. It will also very soon become possible to excite such shocks in the laboratory with multi-PW lasers or intense relativistic particle beams.
Computational modeling is now established as a prominent research tool, as it enables the fully kinetic modeling of these systems for the first time. With the fast-paced developments in high-performance computing, the time is ripe for a focused research programme on simulation-based studies of relativistic shocks. This proposal therefore focuses on using self-consistent ab initio massively parallel simulations to study the physics of relativistic shocks, bridging the gap between the multidimensional microphysics of shock onset, formation, and propagation and the global system dynamics. Particular focus will be given to the shock acceleration mechanisms and the radiation signatures of the various physical processes, with the goal of solving some of the central questions in plasma/relativistic phenomena in astrophysics and in the laboratory, and of opening new avenues between theoretical/massive computational studies, laboratory experiments and astrophysical observations.
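The "fully kinetic modeling" referred to here is commonly done with particle-in-cell (PIC) codes. As a purely illustrative sketch of the kinetic, ab initio approach (one-dimensional, electrostatic, non-relativistic and serial, in contrast to the multi-dimensional, electromagnetic, massively parallel simulations the project actually requires), a minimal PIC loop for two counter-streaming particle beams might look like this:

    import numpy as np

    # Minimal 1D electrostatic particle-in-cell loop in normalized units
    # (plasma frequency = 1). Illustrative only; see the caveats above.
    NG, NP, STEPS, DT = 64, 20000, 400, 0.1
    L = 2 * np.pi / 0.2                 # box length, fits the unstable wavelengths
    dx = L / NG
    rho0, q_over_m = 1.0, -1.0          # electron density and charge-to-mass ratio

    rng = np.random.default_rng(0)
    x = rng.uniform(0, L, NP)                              # particle positions
    v = np.where(np.arange(NP) % 2 == 0, 1.0, -1.0)        # two counter-streaming beams
    v = v + 0.01 * rng.standard_normal(NP)                 # small velocity spread

    k = 2 * np.pi * np.fft.rfftfreq(NG, d=dx)              # wavenumbers for the Poisson solve

    for step in range(STEPS):
        # cloud-in-cell deposition of the electron density
        g = x / dx
        i0 = np.floor(g).astype(int) % NG
        w1 = g - np.floor(g)
        ne = np.zeros(NG)
        np.add.at(ne, i0, 1 - w1)
        np.add.at(ne, (i0 + 1) % NG, w1)
        ne *= rho0 * NG / NP                               # normalize so <ne> = rho0
        rho = rho0 - ne                                    # net charge (ion background - electrons)

        # field solve: d^2(phi)/dx^2 = -rho, E = -d(phi)/dx, done spectrally
        rho_k = np.fft.rfft(rho)
        phi_k = np.zeros_like(rho_k)
        phi_k[1:] = rho_k[1:] / k[1:] ** 2
        E = np.fft.irfft(-1j * k * phi_k, n=NG)

        # gather the field at particle positions and push (leapfrog)
        Ep = (1 - w1) * E[i0] + w1 * E[(i0 + 1) % NG]
        v += q_over_m * Ep * DT
        x = (x + v * DT) % L

    print("final electrostatic field energy:", 0.5 * np.sum(E ** 2) * dx)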
Max ERC Funding
1 588 800 €
Duration
Start date: 2011-06-01, End date: 2016-07-31
Project acronym ACCOPT
Project ACelerated COnvex OPTimization
Researcher (PI) Yurii NESTEROV
Host Institution (HI) UNIVERSITE CATHOLIQUE DE LOUVAIN
Call Details Advanced Grant (AdG), PE1, ERC-2017-ADG
Summary The amazing rate of progress in computer technologies and telecommunications presents many new challenges for Optimization Theory. New problems are usually very big in size, very special in structure, and possibly have distributed data support. This makes them unsolvable by standard optimization methods. In these situations, the old theoretical models, based on hidden black-box information, cannot work. New theoretical and algorithmic solutions are urgently needed. In this project we will concentrate on the development of fast optimization methods for problems of big and very big size. All the new methods will be endowed with provable efficiency guarantees for large classes of optimization problems arising in practical applications. Our main tool is the acceleration technique developed for the standard black-box methods as applied to smooth convex functions. However, we will have to adapt it to deal with different situations.
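For background, the acceleration technique in its textbook form is Nesterov's accelerated gradient method, which improves the O(1/k) convergence rate of plain gradient descent on L-smooth convex functions to O(1/k^2). A minimal sketch of the standard scheme follows; it is shown only as context, not as one of the new methods proposed here, and the quadratic test problem is an arbitrary example.

    import numpy as np

    def accelerated_gradient(grad, x0, L, iters=500):
        """Nesterov's accelerated gradient method for an L-smooth convex function.
        grad: callable returning the gradient; L: Lipschitz constant of the gradient."""
        x = y = np.asarray(x0, dtype=float)
        t = 1.0
        for _ in range(iters):
            x_next = y - grad(y) / L                         # gradient step at the extrapolated point
            t_next = (1 + np.sqrt(1 + 4 * t * t)) / 2
            y = x_next + ((t - 1) / t_next) * (x_next - x)   # momentum / extrapolation step
            x, t = x_next, t_next
        return x

    # usage on a toy quadratic f(x) = 0.5 x^T A x - b^T x (made-up example)
    A = np.diag([1.0, 10.0, 100.0])
    b = np.ones(3)
    x_opt = accelerated_gradient(lambda x: A @ x - b, np.zeros(3), L=100.0)
    print(x_opt)   # converges towards A^{-1} b = [1, 0.1, 0.01]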
The first line of development will be based on the smoothing technique as applied to non-smooth functions. We propose to substantially extend this approach to generate approximate solutions in relative scale. The second line of research will apply acceleration techniques to second-order methods for minimizing functions with sparse Hessians. Finally, we aim to develop fast gradient methods for huge-scale problems. The size of these problems is so big that even the usual vector operations are extremely expensive. Thus, we propose to develop new methods with sublinear iteration costs. In our approach, the main source for achieving improvements will be the proper use of problem structure.
Our overall aim is to be able to solve in a routine way many important problems which currently look unsolvable. Moreover, the theoretical development of Convex Optimization will reach a state in which there is no gap between theory and practice: the theoretically most efficient methods will definitely outperform any home-grown heuristics.
Max ERC Funding
2 090 038 €
Duration
Start date: 2018-09-01, End date: 2023-08-31
Project acronym ACETOGENS
Project Acetogenic bacteria: from basic physiology via gene regulation to application in industrial biotechnology
Researcher (PI) Volker MÜLLER
Host Institution (HI) JOHANN WOLFGANG GOETHE-UNIVERSITAT FRANKFURT AM MAIN
Call Details Advanced Grant (AdG), LS9, ERC-2016-ADG
Summary Demand for biofuels and other biologically derived commodities is growing worldwide as efforts increase to reduce reliance on fossil fuels and to limit climate change. Most commercial approaches rely on fermentations of organic matter, with the inherent problems of producing CO2 and competing with the human food supply. These problems are avoided if CO2 can be used as feedstock. Autotrophic organisms can fix CO2 into chemicals that are used as building blocks for the synthesis of cellular components (biomass). Acetate-forming bacteria (acetogens) require neither light nor oxygen for this, and they can be used in bioreactors to reduce CO2 with hydrogen gas, carbon monoxide or an organic substrate. Gas fermentation using these bacteria has already been realized on an industrial level in two pre-commercial 100,000 gal/yr demonstration facilities producing fuel ethanol from abundant waste gas resources (by LanzaTech). Acetogens can metabolise a wide variety of substrates that could be used for the production of biocommodities. However, their broad use to produce biofuels and platform chemicals from substrates other than gases, or together with gases, is hampered by our very limited knowledge about their metabolism and their ability to use different substrates simultaneously. Nearly nothing is known about the regulatory processes involved in substrate utilization or product formation, yet such knowledge is an absolute requirement for metabolic engineering approaches. The aim of this project is to provide this basic knowledge about metabolic routes in the acetogenic model strain Acetobacterium woodii and their regulation. We will unravel the function of “organelles” found in this bacterium, explore their potential as bio-nanoreactors for the production of biocommodities, and pave the way for the industrial use of A. woodii in energy (hydrogen) storage. Thus, this project creates cutting-edge opportunities for the development of biosustainable technologies in Europe.
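For reference, the overall stoichiometries of acetogenic acetate formation from the gaseous substrates mentioned above are the standard textbook reactions (background values, not project data):

\[
4\,\mathrm{H_2} + 2\,\mathrm{CO_2} \;\longrightarrow\; \mathrm{CH_3COOH} + 2\,\mathrm{H_2O},
\qquad
4\,\mathrm{CO} + 2\,\mathrm{H_2O} \;\longrightarrow\; \mathrm{CH_3COOH} + 2\,\mathrm{CO_2}.
\]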
Max ERC Funding
2 497 140 €
Duration
Start date: 2017-10-01, End date: 2022-09-30
Project acronym AcetyLys
Project Unravelling the role of lysine acetylation in the regulation of glycolysis in cancer cells through the development of synthetic biology-based tools
Researcher (PI) Eyal Arbely
Host Institution (HI) BEN-GURION UNIVERSITY OF THE NEGEV
Call Details Starting Grant (StG), LS9, ERC-2015-STG
Summary Synthetic biology is an emerging discipline that offers powerful tools to control and manipulate fundamental processes in living matter. We propose to develop and apply such tools to modify the genetic code of cultured mammalian cells and bacteria, with the aim to study the role of lysine acetylation in the regulation of metabolism and in cancer development. Thousands of lysine acetylation sites were recently discovered on non-histone proteins, suggesting that acetylation is a widespread and evolutionarily conserved post-translational modification, similar in scope to phosphorylation and ubiquitination. Specifically, it has been found that most of the enzymes of metabolic processes—including glycolysis—are acetylated, implying that acetylation is a key regulator of cellular metabolism in general and of glycolysis in particular. The regulation of metabolic pathways is of particular importance to cancer research, as misregulation of metabolic pathways, especially upregulation of glycolysis, is common to most transformed cells and is now considered a new hallmark of cancer. These data raise an immediate question: what is the role of acetylation in the regulation of glycolysis and in the metabolic reprogramming of cancer cells? While current methods rely on mutational analyses, we will genetically encode the incorporation of acetylated lysine and directly measure the functional role of each acetylation site in cancerous and non-cancerous cell lines. Using this methodology, we will study the structural and functional implications of all the acetylation sites in glycolytic enzymes. We will also decipher the mechanism by which acetylation is regulated by deacetylases and answer a long-standing question: how do 18 deacetylases recognise their substrates among thousands of acetylated proteins? The developed methodologies can be applied to a wide range of protein families known to be acetylated, thereby making this study relevant to diverse research fields.
Max ERC Funding
1 499 375 €
Duration
Start date: 2016-07-01, End date: 2021-06-30
Project acronym ACHILLES-HEEL
Project Crop resistance improvement by mining natural and induced variation in host accessibility factors
Researcher (PI) Sebastian Schornack
Host Institution (HI) THE CHANCELLOR MASTERS AND SCHOLARS OF THE UNIVERSITY OF CAMBRIDGE
Call Details Starting Grant (StG), LS9, ERC-2014-STG
Summary Increasing crop yield to feed the world is a grand challenge of the 21st century but it is hampered by diseases caused by filamentous plant pathogens. The arms race between pathogen and plant demands constant adjustment of crop germplasm to tackle emerging pathogen races with new virulence features. To date, most crop disease resistance has relied on specific resistance genes that are effective only against a subset of races. We cannot solely rely on classical resistance genes to keep ahead of the pathogens. There is an urgent need to develop approaches based on knowledge of the pathogen’s Achilles heel: core plant processes that are required for pathogen colonization.
Our hypothesis is that disease resistance based on manipulation of host accessibility processes has a higher probability of being durable, and is best identified using a broad host-range pathogen. I will employ the filamentous pathogen Phytophthora palmivora to mine plant alleles and unravel host processes providing microbial access in roots and leaves of monocot and dicot plants.
In Aim 1 I will utilize plant symbiosis mutants and allelic variation to elucidate general mechanisms of colonization by filamentous microbes. Importantly, allelic variation will be studied in economically relevant barley and wheat to allow immediate translation into breeding programs.
In Aim 2 I will perform a comparative study of microbial colonization in monocot and dicot roots and leaves. Transcriptional profiling of pathogen and plant will highlight common and contrasting principles and illustrate the impact of differential plant anatomies.
We will challenge our findings by testing beneficial fungi to assess commonalities and differences between mutualist and pathogen colonization. We will use genetics, cell biology and genomics to find suitable resistance alleles highly relevant to crop production and global food security. At the completion of the project, I expect to have a set of genes for resistance breeding.
Max ERC Funding
1 991 054 €
Duration
Start date: 2015-09-01, End date: 2021-08-31
Project acronym ACOPS
Project Advanced Coherent Ultrafast Laser Pulse Stacking
Researcher (PI) Jens Limpert
Host Institution (HI) FRIEDRICH-SCHILLER-UNIVERSITAT JENA
Call Details Consolidator Grant (CoG), PE2, ERC-2013-CoG
Summary "An important driver of scientific progress has always been the envisioning of applications far beyond existing technological capabilities. Such thinking creates new challenges for physicists, driven by the groundbreaking nature of the anticipated application. In the case of laser physics, one of these applications is laser wake-field particle acceleration and possible future uses thereof, such as in collider experiments, or for medical applications such as cancer treatment. To accelerate electrons and positrons to TeV-energies, a laser architecture is required that allows for the combination of high efficiency, Petawatt peak powers, and Megawatt average powers. Developing such a laser system would be a challenging task that might take decades of aggressive research, development, and, most important, revolutionary approaches and innovative ideas.
The goal of the ACOPS project is to develop a compact, efficient, scalable, and cost-effective high-average and high-peak power ultra-short pulse laser concept.
The proposed approach to this goal relies on the spatially and temporally separated amplification of ultrashort laser pulses in waveguide structures, followed by coherent combination into a single train of pulses with increased average power and pulse energy. This combination can be realized through the coherent addition of the output beams of spatially separated amplifiers, combined with the pulse stacking of temporally separated pulses in passive enhancement cavities, employing a fast-switching element as a cavity dumper.
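As a rough idealized scaling (an illustrative back-of-envelope relation, not a design figure from the proposal), coherently combining N parallel amplifier channels and temporally stacking M successive pulses would give

\[
E_{\mathrm{out}} \approx \eta\, N\, M\, E_{0}, \qquad
P_{\mathrm{avg,out}} \approx \eta\, N\, P_{\mathrm{avg},0}, \qquad
f_{\mathrm{rep,out}} = \frac{f_{\mathrm{rep},0}}{M},
\]

where E_0, P_avg,0 and f_rep,0 are the pulse energy, average power and repetition rate of a single channel and eta is the overall combination and stacking efficiency: spatial combination raises pulse energy and average power together, while temporal stacking trades repetition rate for pulse energy.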
Therefore, the three main tasks are the development of kW-class high-repetition-rate driving lasers, the investigation of non-steady state pulse enhancement in passive cavities, and the development of a suitable dumping element.
If successful, the proposed concept would undoubtedly provide a tool that would allow researchers to surpass the current limits in high-field physics and accelerator science.
Max ERC Funding
1 881 040 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym ACQDIV
Project Acquisition processes in maximally diverse languages: Min(d)ing the ambient language
Researcher (PI) Sabine Erika Stoll
Host Institution (HI) UNIVERSITAT ZURICH
Call Details Consolidator Grant (CoG), SH4, ERC-2013-CoG
Summary "Children learn any language that they grow up with, adapting to any of the ca. 7000 languages of the world, no matter how divergent or complex their structures are. What cognitive processes make this extreme flexibility possible? This is one of the most burning questions in cognitive science and the ACQDIV project aims at answering it by testing and refining the following leading hypothesis: Language acquisition is flexible and adaptive to any kind of language because it relies on a small set of universal cognitive processes that variably target different structures at different times during acquisition in every language. The project aims at establishing the precise set of processes and at determining the conditions of variation across maximally diverse languages. This project focuses on three processes: (i) distributional learning, (ii) generalization-based learning and (iii) interaction-based learning. To investigate these processes I will work with a sample of five clusters of languages including longitudinal data of two languages each. The clusters were determined by a clustering algorithm seeking the structurally most divergent languages in a typological database. The languages are: Cluster 1: Slavey and Cree, Cluster 2: Indonesian and Yucatec, Cluster 3: Inuktitut and Chintang, Cluster 4: Sesotho and Russian, Cluster 5: Japanese and Turkish. For all languages, corpora are available, except for Slavey where fieldwork is planned. The leading hypothesis will be tested against the acquisition of aspect and negation in each language of the sample and also against the two structures in each language that are most salient and challenging in them (e. g. complex morphology in Chintang). The acquisition processes also depend on statistical patterns in the input children receive. I will examine these patterns across the sample with respect to repetitiveness effects, applying data-mining methods and systematically comparing child-directed and child-surrounding speech."
Max ERC Funding
1 998 438 €
Duration
Start date: 2014-09-01, End date: 2019-08-31