Project acronym 2DNANOPTICA
Project Nano-optics on flatland: from quantum nanotechnology to nano-bio-photonics
Researcher (PI) Pablo Alonso-González
Host Institution (HI) UNIVERSIDAD DE OVIEDO
Call Details Starting Grant (StG), PE3, ERC-2016-STG
Summary Ubiquitous in nature, light-matter interactions are of fundamental importance in science and in all optical technologies. Understanding and controlling them has been a long-pursued objective of modern physics. So far, however, related experiments have relied on traditional optical schemes in which, owing to the classical diffraction limit, optical fields cannot be controlled at length scales below the wavelength of light. Importantly, this limitation prevents exploiting the extraordinary fundamental and scaling potential of nanoscience and nanotechnology. One solution for concentrating optical fields into sub-diffraction volumes is the excitation of surface polaritons: coupled excitations of photons and mobile/bound charges in metals/polar materials (plasmons/phonons). However, their initial promise has been hindered by strong optical losses and the lack of electrical control in metals, and by the difficulty of fabricating nanostructures of high optical quality in polar materials.
With the advent of two-dimensional (2D) materials and their extraordinary optical properties, the last 2-3 years have seen the demonstration, in the mid-infrared spectral range, of both low-loss, electrically tunable (active) plasmons in graphene and high-optical-quality phonons in monolayer and multilayer h-BN nanostructures, opening a very encouraging arena for scientifically ground-breaking discoveries in nano-optics. Inspired by these extraordinary prospects, this ERC project aims to draw on our knowledge and unique expertise in 2D nanoplasmonics, together with recent advances in nanophononics, to establish a technological platform (including coherent sources, waveguides, routers, and efficient detectors) that permits unprecedented active control and manipulation, at room temperature, of light and light-matter interactions on the nanoscale, thus laying the experimental foundations of a 2D nano-optics field.
Max ERC Funding
1 459 219 €
Duration
Start date: 2017-01-01, End date: 2021-12-31
Project acronym 2DTHERMS
Project Design of new thermoelectric devices based on layered and field modulated nanostructures of strongly correlated electron systems
Researcher (PI) Jose Francisco Rivadulla Fernandez
Host Institution (HI) UNIVERSIDAD DE SANTIAGO DE COMPOSTELA
Call Details Starting Grant (StG), PE3, ERC-2010-StG_20091028
Summary Design of new thermoelectric devices based on layered and field modulated nanostructures of strongly correlated electron systems
Max ERC Funding
1 427 190 €
Duration
Start date: 2010-11-01, End date: 2015-10-31
Project acronym 3DNANOMECH
Project Three-dimensional molecular resolution mapping of soft matter-liquid interfaces
Researcher (PI) Ricardo Garcia
Host Institution (HI) AGENCIA ESTATAL CONSEJO SUPERIOR DEINVESTIGACIONES CIENTIFICAS
Call Details Advanced Grant (AdG), PE4, ERC-2013-ADG
Summary Optical, electron and probe microscopes are enabling tools for discoveries and knowledge generation in nanoscale science and technology. High-resolution (nanoscale or molecular), noninvasive and label-free imaging of three-dimensional soft matter-liquid interfaces has not been achieved by any microscopy method.
Atomic force microscopy (AFM) is considered the second most relevant advance in materials science since 1960. Despite its impressive range of applications, the technique has a key limitation: it lacks three-dimensional depth, so what lies above or in the subsurface is not readily characterized.
3DNanoMech proposes to design, build and operate a high-speed, force-based method for the three-dimensional characterization of soft matter-liquid interfaces (3D AFM). The microscope will combine a detection method based on force perturbations, adaptive algorithms, high-speed piezo actuators and quantitatively oriented multifrequency approaches. The development of the microscope cannot be separated from its applications: imaging error-free DNA repair, and understanding the relationship between the nanomechanical properties and the malignancy of cancer cells. These problems encompass the different spatial (molecular, nano and mesoscopic) and time (milliseconds to seconds) scales of the instrument.
In short, 3DNanoMech aims to image, map and measure soft matter surfaces and interfaces in liquid with piconewton, millisecond and angstrom resolution. The long-term vision of 3DNanoMech is to replace models or computer animations of biomolecular-liquid interfaces with real-time, molecular-resolution maps of properties and processes.
Max ERC Funding
2 499 928 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym AMORE
Project A distributional MOdel of Reference to Entities
Researcher (PI) Gemma BOLEDA TORRENT
Host Institution (HI) UNIVERSIDAD POMPEU FABRA
Call Details Starting Grant (StG), SH4, ERC-2016-STG
Summary "When I asked my seven-year-old daughter ""Who is the boy in your class who was also new in school last year, like you?"", she instantly replied ""Daniel"", using the descriptive content in my utterance to identify an entity in the real world and refer to it. The ability to use language to refer to reality is crucial for humans, and yet it is very difficult to model. AMORE breaks new ground in Computational Linguistics, Linguistics, and Artificial Intelligence by developing a model of linguistic reference to entities implemented as a computational system that can learn its own representations from data.
This interdisciplinary project builds on two complementary semantic traditions: 1) Formal semantics, a symbolic approach that can delimit and track linguistic referents, but does not adequately match them with the descriptive content of linguistic expressions; 2) Distributional semantics, which can handle descriptive content but does not associate it with individuated referents. AMORE synthesizes the two approaches into a unified, scalable model of reference that operates with individuated referents and links them to referential expressions characterized by rich descriptive content. The model is a distributed (neural network) version of a formal semantic framework that is furthermore able to integrate perceptual (visual) and linguistic information about entities. We test it extensively in referential tasks that require matching noun phrases ("the Medicine student", "the white cat") with entity representations extracted from text and images.
AMORE advances our scientific understanding of language and its computational modeling, and contributes to the far-reaching debate between symbolic and distributed approaches to cognition with an integrative proposal. I am in a privileged position to carry out this integration, since I have contributed top research in both distributional and formal semantics.
"
Max ERC Funding
1 499 805 €
Duration
Start date: 2017-02-01, End date: 2022-01-31
Project acronym BACCO
Project Bias and Clustering Calculations Optimised: Maximising discovery with galaxy surveys
Researcher (PI) Raúl Esteban ANGULO de la Fuente
Host Institution (HI) FUNDACION CENTRO DE ESTUDIOS DE FISICA DEL COSMOS DE ARAGON
Call Details Starting Grant (StG), PE9, ERC-2016-STG
Summary A new generation of galaxy surveys will soon start measuring the spatial distribution of millions of galaxies over a broad range of redshifts, offering an imminent opportunity to discover new physics. A detailed comparison of these measurements with theoretical models of galaxy clustering may reveal a new fundamental particle, a breakdown of General Relativity, or a hint about the nature of cosmic acceleration. Despite large progress in the analytic treatment of structure formation in recent years, traditional clustering models still suffer from large uncertainties. This limits cosmological analyses to a very restricted range of scales and statistics, which will be one of the main obstacles to a comprehensive exploitation of future surveys.
Here I propose to develop a novel simulation-based approach to predict galaxy clustering. Combining recent advances in computational cosmology, from cosmological N-body calculations to physically motivated galaxy formation models, I will develop a unified framework to directly predict the positions and velocities of individual dark matter structures and galaxies as a function of cosmological and astrophysical parameters. In this formulation, galaxy clustering will be a prediction of a set of physical assumptions in a given cosmological setting. The new theoretical framework will be flexible, accurate and fast: it will provide predictions for any clustering statistic, down to scales 100 times smaller than in state-of-the-art perturbation-theory-based models, and in less than 1 minute of CPU time. These advances will enable major improvements in future cosmological constraints, significantly increasing the overall power of future surveys and maximising our potential to discover new physics.
Max ERC Funding
1 484 240 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym BePreSysE
Project Beyond Precision Cosmology: dealing with Systematic Errors
Researcher (PI) Licia VERDE
Host Institution (HI) UNIVERSITAT DE BARCELONA
Call Details Consolidator Grant (CoG), PE9, ERC-2016-COG
Summary Over the past 20 years cosmology has made the transition to a precision science: the standard cosmological model has been established and its parameters are now measured with unprecedented precision. But precision is not enough: accuracy is also crucial. Accuracy accounts for systematic errors, which can arise on both the observational and the theory/modelling side (and everywhere in between). While there is a well-defined and developed framework for treating statistical errors, there is no established approach for systematic errors. The next decade will see the era of large surveys; a large coordinated effort of the scientific community in the field is ongoing to map the cosmos, producing an exponentially growing amount of data. This will shrink the statistical errors, making the mitigation and control of systematics of the utmost importance. While there are isolated and targeted efforts to quantify systematic errors and propagate them all the way through to the final results, there is no well-established, self-consistent methodology. To go beyond precision cosmology and reap the benefits of the forthcoming observational programme, a systematic approach to systematics is needed. Systematics should be interpreted in the most general sense as shifts between the recovered measured values and the true values of physical quantities. I propose to develop a comprehensive approach to tackling systematic errors, with the goal of uncovering and quantifying otherwise unknown differences between the interpretation of a measurement and reality. This will require fully developing, combining and systematizing all approaches proposed so far (many pioneered by the PI), developing new ones to fill the gaps, studying and exploring their interplay, and finally testing and validating the procedure. Beyond Precision Cosmology: Dealing with Systematic Errors (BePreSysE) will develop a framework for dealing with systematics in forthcoming cosmological surveys which could, in principle, be applied beyond cosmology.
Max ERC Funding
1 835 220 €
Duration
Start date: 2017-06-01, End date: 2022-05-31
Project acronym BILITERACY
Project Bi-literacy: Learning to read in L1 and in L2
Researcher (PI) Manuel Francisco Carreiras Valiña
Host Institution (HI) BCBL BASQUE CENTER ON COGNITION BRAIN AND LANGUAGE
Call Details Advanced Grant (AdG), SH4, ERC-2011-ADG_20110406
Summary Learning to read is probably one of the most exciting discoveries of our lives. Using a longitudinal approach, the proposed research examines how the human brain responds to two major challenges: (a) the instantiation of a complex cognitive function for which there is no genetic blueprint (learning to read in a first language, L1), and (b) the accommodation to new statistical regularities when learning to read in a second language (L2). The aim of the present research project is to identify the neural substrates of the reading process and its constituent cognitive components, with specific attention to individual differences and reading disabilities, as well as to investigate the relationship between specific cognitive functions and the changes in neural activity that take place in the course of learning to read in L1 and in L2. The project will employ a longitudinal design. We will recruit children before they learn to read in L1 and in L2 and track reading development with both cognitive and neuroimaging measures over 24 months. The findings from this project will provide a deeper understanding of (a) how general neurocognitive factors and language-specific factors underlie individual differences (and reading disabilities) in reading acquisition in L1 and in L2; (b) how the neurocognitive circuitry changes and brain mechanisms synchronize while instantiating reading in L1 and in L2; (c) what the limitations and the extent of brain plasticity are in young readers. An interdisciplinary and multi-methodological approach is one of the keys to the success of the present project, along with strong theory-driven investigation. By combining both we will generate breakthroughs that advance our understanding of how literacy in L1 and in L2 is acquired and mastered.
The research proposed will also lay the foundations for more applied investigations of best practice in teaching reading in first and subsequent languages, and devising intervention methods for reading disabilities.
Max ERC Funding
2 487 000 €
Duration
Start date: 2012-05-01, End date: 2017-04-30
Project acronym BIO2CHEM-D
Project Biomass to chemicals: Catalysis design from first principles for a sustainable chemical industry
Researcher (PI) Nuria Lopez
Host Institution (HI) FUNDACIO PRIVADA INSTITUT CATALA D'INVESTIGACIO QUIMICA
Call Details Starting Grant (StG), PE4, ERC-2010-StG_20091028
Summary The use of renewable feedstocks by the chemical industry is fundamental, owing both to the depletion of fossil resources and to increasing environmental pressure. Biomass can act as a sustainable source of organic industrial chemicals; however, the establishment of a renewable chemical industry that is economically competitive with the present oil-based one requires the development of new processes to convert biomass-derived compounds into useful industrial materials following the principles of green chemistry. To achieve these goals, developments in several fields, including heterogeneous catalysis, are needed. One way to accelerate the discovery of new potentially active, selective and stable catalysts is the massive use of computational chemistry. Recent advances have demonstrated that Density Functional Theory, coupled to ab initio thermodynamics, transition state theory and microkinetic analysis, can provide a full view of catalytic phenomena.
The aim of the present project is thus to employ these well-tested computational techniques to develop a theoretical framework that can accelerate the identification of new catalysts for the conversion of biomass-derived target compounds into useful chemicals. Since, compared to petroleum-based materials, biomass-derived ones are multifunctionalized, the search for new catalytic materials and processes places strong requirements on the selectivity of the chemical transformations. The main challenges in the project relate to the high functionalization of the molecules, their liquid nature and the large number of potentially competing reaction paths. Achieving specificity and selectivity in the chemical transformations while keeping a reasonably flexible framework constitutes a major objective. The work will be divided into three main work packages: the first devoted to the properties of small molecules or fragments containing a single functional group; the second addressing competition in multiply functionalized molecules; and the third dedicated to the specific transformations of two molecules that have already been identified as potential platform generators. The goal is to identify suitable candidates that could be synthesized and tested in the Institute facilities.
Max ERC Funding
1 496 200 €
Duration
Start date: 2010-10-01, End date: 2015-09-30
Project acronym BIOCON
Project Biological origins of linguistic constraints
Researcher (PI) Juan Manuel Toro
Host Institution (HI) UNIVERSIDAD POMPEU FABRA
Call Details Starting Grant (StG), SH4, ERC-2012-StG_20111124
Summary The linguistic capacity to express and comprehend an unlimited number of ideas by combining a limited number of elements has only been observed in humans. Nevertheless, research has not fully identified the components of language that make it uniquely human and that allow infants to grasp the complexity of linguistic structure in an apparently effortless manner. Research on comparative cognition suggests that humans and other species share powerful learning mechanisms and basic perceptual abilities that we use for language processing. But humans display remarkable linguistic abilities that other animals do not possess. Understanding the interplay between general mechanisms shared across species and more specialized ones dedicated to the speech signal is at the heart of current debates in human language acquisition. This is a highly relevant issue for researchers in the fields of Psychology, Linguistics, Biology, Philosophy and Cognitive Neuroscience. By conducting experiments across several populations (human adults and infants) and species (human and nonhuman animals), and using a wide array of experimental techniques, the present proposal hopes to shed some light on the origins of shared biological constraints that guide more specialized mechanisms in the search for linguistic structure. More specifically, we hope to understand how general perceptual and cognitive mechanisms, likely present in other animals, constrain the way humans tackle the task of language acquisition. Our hypothesis is that differences between humans and other species are not the result of humans being able to process the increasingly complex structures that are the hallmark of language. Rather, differences might be due to humans and other animals focusing on different cues present in the signal to extract relevant information. This research will hint at what is uniquely human and what is shared across different animal species.
Max ERC Funding
1 305 973 €
Duration
Start date: 2013-01-01, End date: 2018-12-31
Project acronym BioInspired_SolarH2
Project Engineering Bio-Inspired Systems for the Conversion of Solar Energy to Hydrogen
Researcher (PI) Elisabet ROMERO MESA
Host Institution (HI) FUNDACIO PRIVADA INSTITUT CATALA D'INVESTIGACIO QUIMICA
Call Details Starting Grant (StG), PE3, ERC-2018-STG
Summary With this proposal, I aim to achieve the efficient conversion of solar energy to hydrogen. The overall objective is to engineer bio-inspired systems able to convert solar energy into a separation of charges and to construct devices by coupling these systems to catalysts in order to drive sustainable and effective water oxidation and hydrogen production.
The global energy crisis requires an urgent solution: we must replace fossil fuels with a renewable energy source, solar energy. However, the efficient and inexpensive conversion and storage of solar energy into fuel remains a fundamental challenge. Currently, solar-energy conversion devices suffer from energy losses mainly caused by disorder in the materials used. The solution to this problem is to learn from nature. In photosynthesis, the photosystem II reaction centre (PSII RC) is a pigment-protein complex able to overcome disorder and convert solar photons into a separation of charges with near 100% efficiency. Crucially, the generated charges have enough potential to drive water oxidation and hydrogen production.
Previously, I have investigated the charge separation process in the PSII RC with a collection of spectroscopic techniques, which allowed me to formulate the design principles of photosynthetic charge separation, where coherence plays a crucial role. Here I will put this knowledge into action to design efficient and robust chromophore-protein assemblies for the collection and conversion of solar energy, employ organic chemistry and synthetic biology tools to construct these well-defined and fully controllable assemblies, and apply a complete set of spectroscopic methods to investigate these engineered systems.
Following the approach Understand, Engineer, Implement, I will create a new generation of bio-inspired devices based on abundant and biodegradable materials that will drive the transformation of solar energy and water into hydrogen, an energy-rich molecule that can be stored and transported.
Max ERC Funding
1 500 000 €
Duration
Start date: 2019-04-01, End date: 2024-03-31