Project acronym 3D-BioMat
Project Deciphering biomineralization mechanisms through 3D explorations of mesoscale crystalline structure in calcareous biomaterials
Researcher (PI) VIRGINIE CHAMARD
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Country France
Call Details Consolidator Grant (CoG), PE3, ERC-2016-COG
Summary The fundamental 3D-BioMat project aims to provide a biomineralization model explaining the formation of the microscopic calcareous single crystals produced by living organisms. Although these crystals present a wide variety of shapes and are associated with various organic materials, the observation of a nanoscale granular structure common to almost all calcareous crystallizing organisms, together with an extended crystalline coherence, points to a generic biomineralization and assembly process. A key to building realistic biomineralization scenarios is to reveal the crystalline architecture at the mesoscale (i.e., over a few granules), which none of the existing nano-characterization tools can provide.
3D-BioMat builds on the PI's recognized expertise in the field of synchrotron coherent x-ray diffraction microscopy. It will extend the PI's disruptive, pioneering microscopy formalism into an innovative high-throughput approach able to give access to the 3D mesoscale image of the crystalline properties (crystalline coherence, crystal-plane tilts and strains) with the required flexibility, nanoscale resolution, and non-invasiveness.
This achievement will be used to reveal, in a timely manner, the generic features of the mesoscale crystalline structure through pioneering explorations of a vast variety of crystalline biominerals produced by the famous Pinctada margaritifera oyster shell, and thereby to build a realistic biomineralization scenario.
The inferred biomineralization pathways, including both physico-chemical pathways and biological controls, will ultimately be validated by comparing the mesoscale structures produced by biomimetic samples with the biogenic ones. Beyond deciphering one of the most intriguing questions of material nanosciences, 3D-BioMat may contribute to new climate models, pave the way for new routes in material synthesis and supply answers to the pearl-culture calcification problems.
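Coherent x-ray diffraction microscopy of the kind named above recovers an image from measured diffraction intensities by iterative phase retrieval. As background only (a textbook error-reduction scheme, not the PI's formalism), a minimal sketch that alternates between the measured Fourier magnitudes and a real-space support constraint:

```python
import numpy as np

def error_reduction(intensity, support, n_iter=200, seed=0):
    """Iterative phase retrieval (error-reduction) for a real, non-negative object.

    intensity : measured diffraction pattern, |FFT2(object)|^2
    support   : boolean mask marking where the object may be non-zero
    """
    rng = np.random.default_rng(seed)
    magnitude = np.sqrt(intensity)
    obj = rng.random(support.shape) * support  # random start inside the support
    for _ in range(n_iter):
        f = np.fft.fft2(obj)
        # Fourier-space constraint: keep the phases, impose measured magnitudes
        f = magnitude * np.exp(1j * np.angle(f))
        obj = np.fft.ifft2(f).real
        # Real-space constraints: non-negativity and support
        obj = np.clip(obj, 0.0, None) * support
    return obj

# Toy demo: recover a small square object from its diffraction intensities
true_obj = np.zeros((32, 32))
true_obj[12:20, 12:20] = 1.0
support = np.zeros((32, 32), dtype=bool)
support[10:22, 10:22] = True
intensity = np.abs(np.fft.fft2(true_obj)) ** 2
rec = error_reduction(intensity, support)
```

Error reduction can stagnate and leaves translation/twin-image ambiguities; practical Bragg-CDI reconstructions of the kind used for biominerals rely on more robust variants (e.g. Fienup's hybrid input-output) and, crucially, recover crystal-plane tilts and strains from the complex-valued phase, which this real-valued toy omits.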
Max ERC Funding
1 966 429 €
Duration
Start date: 2017-03-01, End date: 2022-02-28
Project acronym 3DICE
Project 3D Interstellar Chemo-physical Evolution
Researcher (PI) Valentine Wakelam
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Country France
Call Details Starting Grant (StG), PE9, ERC-2013-StG
Summary At the end of their lives, stars spread their inner material into the diffuse interstellar medium. This diffuse medium locally gets denser and forms dark clouds (also called dense or molecular clouds), whose innermost part is shielded from the external UV field by dust, allowing molecules to grow and become more complex. Gravitational collapse occurs inside these dense clouds, forming protostars and their surrounding disks, and eventually planetary systems like (or unlike) our solar system. The formation and evolution of molecules, minerals, ices and organics from the diffuse medium to planetary bodies, and their alteration or preservation throughout this cosmic chemical history, set the initial conditions for building planets, atmospheres and possibly the first bricks of life. The current view of interstellar chemistry is based on fragmentary studies of key steps of the observed sequence. The objective of this proposal is to follow the fractionation of the elements between the gas phase and the interstellar grains, from the most diffuse medium to protoplanetary disks, in order to constrain the chemical composition of the material in which planets are formed. The expected outcome of this project is a consistent and more accurate description of the chemical evolution of interstellar matter. To achieve this objective, I will improve our chemical model by adding new processes on grain surfaces relevant under diffuse-medium conditions. This upgraded gas-grain model will be coupled to 3D dynamical models of the formation of dense clouds from the diffuse medium and of protoplanetary disks from dense clouds. The computed chemical composition will also be used with 3D radiative transfer codes to study the chemical tracers of the physics of protoplanetary disk formation. The robustness of the model predictions will be assessed with sensitivity analyses. Finally, model results will be confronted with observations to address some of the current challenges.
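Gas-grain chemical models of this kind are, at bottom, systems of rate equations, dn_i/dt = formation terms minus destruction terms, integrated over time. A deliberately minimal single-species sketch (hypothetical, unitless rate constants; the C and O reservoirs are held fixed so the steady state is analytic, which a real depleting network would not allow):

```python
def relax_to_equilibrium(formation_rate, destruction_rate, dt, steps):
    """Forward-Euler integration of dn/dt = formation_rate - destruction_rate * n."""
    n = 0.0
    for _ in range(steps):
        n += dt * (formation_rate - destruction_rate * n)
    return n

# Toy 'CO' abundance balance: formation from fixed C and O reservoirs,
# destruction by photodissociation (all constants illustrative only)
k_form, n_c, n_o, k_photo = 1.0, 2.0, 1.0, 0.5
n_co = relax_to_equilibrium(k_form * n_c * n_o, k_photo, dt=0.01, steps=2000)
n_eq = k_form * n_c * n_o / k_photo  # analytic steady state = 4.0
```

Production codes solve thousands of coupled, stiff equations of this form (with grain-surface terms) using implicit solvers; the point here is only the structure that gets coupled to the 3D dynamical and radiative transfer models.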
Max ERC Funding
1 166 231 €
Duration
Start date: 2013-09-01, End date: 2018-08-31
Project acronym ACAP
Project Agency Costs and Asset Pricing
Researcher (PI) Thomas Mariotti
Host Institution (HI) FONDATION JEAN JACQUES LAFFONT, TOULOUSE SCIENCES ECONOMIQUES
Country France
Call Details Starting Grant (StG), SH1, ERC-2007-StG
Summary The main objective of this research project is to contribute to bridging the gap between the two main branches of financial theory, namely corporate finance and asset pricing. It is motivated by the conviction that these two aspects of financial activity should and can be analyzed within a unified framework. This research will borrow from these two approaches in order to construct theoretical models that allow one to analyze the design and issuance of financial securities, as well as the dynamics of their valuations. Unlike asset pricing, which takes as given the price of the fundamentals, the goal is to derive security price processes from a precise description of a firm's operations and internal frictions. Regarding the latter, and in line with traditional corporate finance theory, the analysis will emphasize the role of agency costs within the firm for the design of its securities. But the analysis will be pushed one step further by studying the impact of these agency costs on key financial variables such as stock and bond prices, leverage, book-to-market ratios, default risk, or the holding of liquidity by firms. One of the contributions of this research project is to show how these variables are interrelated when firms and investors agree upon optimal financial arrangements. The final objective is to derive a rich set of testable asset pricing implications that would eventually be brought to the data.
Max ERC Funding
1 000 000 €
Duration
Start date: 2008-11-01, End date: 2014-10-31
Project acronym ADAM
Project The Adaptive Auditory Mind
Researcher (PI) Shihab Shamma
Host Institution (HI) ECOLE NORMALE SUPERIEURE
Country France
Call Details Advanced Grant (AdG), SH4, ERC-2011-ADG_20110406
Summary Listening in realistic situations is an active process that engages perceptual and cognitive faculties, endowing speech with meaning, music with joy, and environmental sounds with emotion. Through hearing, humans and other animals navigate complex acoustic scenes, separate sound mixtures, and assess their behavioral relevance. These remarkable feats are currently beyond our understanding and exceed the capabilities of the most sophisticated audio engineering systems. The goal of the proposed research is to investigate experimentally a novel view of hearing, in which active hearing emerges from a deep interplay between adaptive sensory processes and goal-directed cognition. Specifically, we shall explore the postulate that versatile perception is mediated by rapid plasticity at the neuronal level. At the conjunction of sensory and cognitive processing, rapid plasticity pervades all levels of the auditory system, from the cochlea up to the auditory and prefrontal cortices. Exploiting fundamental statistical regularities of acoustics, it is what allows humans and other animals to deal so successfully with natural acoustic scenes where artificial systems fail. The project builds on the internationally recognized leadership of the PI in the fields of physiology and computational modeling, combined with the expertise of the Co-Investigator in psychophysics. Building on these highly complementary fields and several technical innovations, we hope to promote a novel view of auditory perception and cognition. We also aim to contribute significantly to translational research in the domain of signal processing for clinical hearing aids, given that many current limitations are not technological but rather conceptual. The project will finally result in the creation of laboratory facilities and an intellectual network unique in France and rare in all of Europe, combining cognitive, neural, and computational approaches to auditory neuroscience.
Max ERC Funding
3 199 078 €
Duration
Start date: 2012-10-01, End date: 2018-09-30
Project acronym ADEQUATE
Project Advanced optoelectronic Devices with Enhanced QUAntum efficiency at THz frEquencies
Researcher (PI) Carlo Sirtori
Host Institution (HI) UNIVERSITE PARIS DIDEROT - PARIS 7
Country France
Call Details Advanced Grant (AdG), PE3, ERC-2009-AdG
Summary The aim of this project is the realisation of efficient mid-infrared and THz optoelectronic emitters. This work is motivated by the fact that spontaneous emission in this frequency range is characterized by an extremely long lifetime compared to non-radiative processes, giving rise to devices with very low quantum efficiency. To this end we want to develop hybrid light-matter systems, already well known in quantum optics, within optoelectronic devices that will be driven by electrical injection. With this project we want to extend the field of optoelectronics by introducing some of the concepts of quantum optics, particularly light-matter strong coupling, into semiconductor devices. More precisely, this project aims at the implementation of novel optoelectronic emitters operating in the strong-coupling regime between an intersubband excitation of a two-dimensional electron gas and a microcavity photonic mode. The quasiparticles arising from this coupling are called intersubband polaritons. The major difficulties and challenges of this project do not lie in the observation of these quantum effects, but in their exploitation for a specific function, in particular an efficient electrical-to-optical conversion. To obtain efficient quantum emitters in the THz frequency range we will follow two different approaches: - In the first case we will try to exploit the additional characteristic time of the system introduced by the light-matter interaction in the strong (or ultra-strong) coupling regime. - The second approach will exploit the fact that, under certain conditions, intersubband polaritons have a bosonic character; as a consequence they can undergo stimulated scattering, giving rise to polariton lasers, as has been shown for excitonic polaritons.
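The polariton branches of the strong-coupling regime follow from the standard two-coupled-oscillator picture: diagonalizing a 2x2 Hamiltonian for the cavity mode and the intersubband excitation yields upper and lower branches whose splitting at zero detuning is the vacuum Rabi energy. A small generic sketch (textbook model with illustrative numbers, not the project's device parameters):

```python
import math

def polariton_branches(e_cavity, e_matter, rabi):
    """Eigen-energies of the coupled-mode matrix [[e_cavity, rabi/2], [rabi/2, e_matter]].

    Returns (lower, upper) polariton energies; any consistent energy unit works.
    """
    mean = 0.5 * (e_cavity + e_matter)
    half_split = 0.5 * math.sqrt((e_cavity - e_matter) ** 2 + rabi ** 2)
    return mean - half_split, mean + half_split

# At zero detuning the branch splitting equals the vacuum Rabi energy
lp, up = polariton_branches(100.0, 100.0, 20.0)  # illustrative meV-scale values
```

The "ultra-strong" regime mentioned in the summary is precisely where the Rabi energy becomes comparable to the bare transition energy, and this simple rotating-wave model stops being adequate.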
Max ERC Funding
1 761 000 €
Duration
Start date: 2010-05-01, End date: 2015-04-30
Project acronym AlgoQIP
Project Beyond Shannon: Algorithms for optimal information processing
Researcher (PI) Omar Fawzi
Host Institution (HI) ECOLE NORMALE SUPERIEURE DE LYON
Country France
Call Details Starting Grant (StG), PE6, ERC-2019-STG
Summary On the road towards quantum technologies capable of exploiting the revolutionary potential of quantum theory for information technology, a major bottleneck is the large overhead needed to correct errors caused by unwanted noise. Despite intense research activity and great progress in designing better error-correcting codes and fault-tolerant schemes, the fundamental limits of communication and computation over a noisy quantum medium are far from being understood. In fact, no satisfactory quantum analogue of Shannon's celebrated noisy coding theorem is known.
The objective of this project is to leverage tools from mathematical optimization in order to build an algorithmic theory of optimal information processing that would go beyond the statistical approach pioneered by Shannon. Our goal will be to establish efficient algorithms that determine optimal methods for achieving a given task, rather than only characterizing the best achievable rates in the asymptotic limit in terms of entropic expressions. This approach will address three limitations — that are particularly severe in the quantum context — faced by the statistical approach: the non-additivity of entropic expressions, the asymptotic nature of the theory and the independence assumption.
Our aim is to develop efficient algorithms that take as input a description of a noise model and output a near-optimal method for reliable communication under this model. For example, our algorithms will answer: how many logical qubits can be reliably stored using 100 physical qubits that undergo depolarizing noise with parameter 5%? We will also develop generic and efficient decoding algorithms for quantum error correcting codes. These algorithms will have direct applications to the development of quantum technologies. Moreover, we will establish methods to compute the relevant uncertainty of large structured systems and apply them to obtain tight and non-asymptotic security bounds for (quantum) cryptographic protocols.
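For the depolarizing-noise question above, a standard point of reference (the entropic kind of answer the project aims to go beyond, not its algorithmic output) is the hashing bound: the coherent information of the depolarizing channel gives an achievable rate, and hence a lower bound on the quantum capacity, of roughly 0.63 logical qubits per physical qubit at 5% noise:

```python
import math

def binary_entropy(p):
    """Binary entropy h2(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

def hashing_bound(p):
    """Hashing (coherent-information) rate of the depolarizing channel.

    p is the total error probability, with X, Y, Z errors each occurring
    with probability p/3. The result is an achievable rate per physical
    qubit; the true quantum capacity is not known in general.
    """
    return 1.0 - binary_entropy(p) - p * math.log2(3.0)

rate = hashing_bound(0.05)  # about 0.63 logical qubits per physical qubit
```

The non-additivity mentioned in the summary is exactly why this single-letter formula is only a lower bound, and why efficient algorithms for near-optimal, non-asymptotic answers are a genuinely open problem.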
Max ERC Funding
1 492 733 €
Duration
Start date: 2021-01-01, End date: 2025-12-31
Project acronym ALLEGRO
Project Active large-scale learning for visual recognition
Researcher (PI) Cordelia Schmid
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE EN INFORMATIQUE ET AUTOMATIQUE
Country France
Call Details Advanced Grant (AdG), PE6, ERC-2012-ADG_20120216
Summary A massive and ever growing amount of digital image and video content is available today, on sites such as Flickr and YouTube, in audiovisual archives such as those of BBC and INA, and in personal collections. In most cases, it comes with additional information, such as text, audio or other metadata, that forms a rather sparse and noisy, yet rich and diverse source of annotation, ideally suited to emerging weakly supervised and active machine learning technology. The ALLEGRO project will take visual recognition to the next level by using this largely untapped source of data to automatically learn visual models. The main research objective of our project is the development of new algorithms and computer software capable of autonomously exploring evolving data collections, selecting the relevant information, and determining the visual models most appropriate for different object, scene, and activity categories. An emphasis will be put on learning visual models from video, a particularly rich source of information, and on the representation of human activities, one of today's most challenging problems in computer vision. Although this project addresses fundamental research issues, it is expected to result in significant advances in high-impact applications that range from visual mining of the Web and automated annotation and organization of family photo and video albums to large-scale information retrieval in television archives.
Max ERC Funding
2 493 322 €
Duration
Start date: 2013-04-01, End date: 2019-03-31
Project acronym aLzINK
Project Alzheimer's disease and Zinc: the missing link ?
Researcher (PI) Christelle Sandrine Florence HUREAU-SABATER
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Country France
Call Details Starting Grant (StG), PE5, ERC-2014-STG
Summary Alzheimer's disease (AD) is one of the most serious diseases mankind is now facing, as its social and economic impacts are increasing rapidly. AD is very complex, and the amyloid-β (Aβ) peptide as well as metallic ions (mainly copper and zinc) have been linked to its aetiology. While the deleterious impact of Cu is widely acknowledged, the involvement of Zn is certain but remains to be elucidated.
The main objective of the present proposal, which is strongly anchored in the bio-inorganic chemistry field at the interface with spectroscopy and biochemistry, is to design, synthesize and study new drug candidates (ligands L) capable of (i) targeting Cu(II) bound to Aβ within the synaptic cleft, where Zn is co-localized, ultimately to achieve Zn-driven Cu(II) removal from Aβ, and (ii) disrupting the aberrant Cu(II)-Aβ interactions involved in ROS production and Aβ aggregation, two deleterious events in AD. The drug candidates will thus have high Cu(II)-over-Zn selectivity to preserve the crucial physiological role of Zn in the neurotransmission process. Zn is always underestimated (if not completely neglected) in current therapeutic approaches targeting Cu(II), despite the known interference of Zn with Cu(II) binding.
To reach this objective, it is absolutely necessary to first understand metal-ion trafficking in the presence of Aβ alone at a molecular level (i.e., without the drug candidates). This includes: (i) determination of the Zn binding site on Aβ and its impact on Aβ aggregation and cell toxicity, and (ii) determination of the mutual influence of Zn and Cu on their coordination to Aβ and its impact on Aβ aggregation, ROS production and cell toxicity.
Methods used will span from organic synthesis to studies of neuronal model cells, with a major contribution of a wide panel of spectroscopic techniques including NMR, EPR, mass spectrometry, fluorescence, UV-Vis, circular-dichroism, X-ray absorption spectroscopy...
Max ERC Funding
1 499 948 €
Duration
Start date: 2015-03-01, End date: 2021-02-28
Project acronym ARPEMA
Project Anionic redox processes: A transformational approach for advanced energy materials
Researcher (PI) Jean-Marie Tarascon
Host Institution (HI) COLLEGE DE FRANCE
Country France
Call Details Advanced Grant (AdG), PE5, ERC-2014-ADG
Summary Redox chemistry provides the fundamental basis for numerous energy-related electrochemical devices, among which Li-ion batteries (LIB) have become the premier energy storage technology for portable electronics and vehicle electrification. Throughout its history, LIB technology has relied on cationic redox reactions as the sole source of energy storage capacity. This is no longer true. In 2013 we demonstrated that Li-driven reversible formation of (O2)n peroxo-groups in new layered oxides led to extraordinary increases in energy storage capacity. This finding, which is receiving worldwide attention, represents a transformational approach for creating advanced energy materials for not only energy storage, but also water splitting applications as both involve peroxo species. However, as is often the case with new discoveries, the fundamental science at work needs to be rationalized and understood. Specifically, what are the mechanisms for ion and electron transport in these Li-driven anionic redox reactions?
To address these seminal questions and to widen the spectrum of materials (transition metal and anion) showing anionic redox chemistry, we propose a comprehensive research program that combines experimental and computational methods. The experimental methods include structural and electrochemical analyses (both ex-situ and in-situ), and computational modeling will be based on first-principles DFT for identifying the fundamental processes that enable anionic redox activity. The knowledge gained from these studies, in combination with our expertise in inorganic synthesis, will enable us to design a new generation of Li-ion battery materials that exhibit substantial increases (20-30%) in energy storage capacity, with additional impacts on the development of Na-ion batteries and the design of water splitting catalysts, with the potential to surpass current water splitting efficiencies via novel (O2)n-based electrocatalysts.
Max ERC Funding
2 249 196 €
Duration
Start date: 2015-10-01, End date: 2021-03-31
Project acronym ATMOFLEX
Project Turbulent Transport in the Atmosphere: Fluctuations and Extreme Events
Researcher (PI) Jeremie Bec
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Country France
Call Details Starting Grant (StG), PE3, ERC-2009-StG
Summary A major part of the physical and chemical processes occurring in the atmosphere involves the turbulent transport of tiny particles. Current studies and models use a formulation in terms of mean fields, where the strong variations in the dynamical and statistical properties of the particles are neglected and where the underlying fluctuations of the fluid flow velocity are oversimplified. Devising an accurate understanding of the influence of air turbulence and of the extreme fluctuations that it generates in the dispersed phase remains a challenging issue. This project aims at coordinating and integrating theoretical, numerical, experimental, and observational efforts to develop a new statistical understanding of the role of fluctuations in atmospheric transport processes. The proposed work will cover individual as well as collective behaviors and will provide a systematic and unified description of targeted specific processes involving suspended drops or particles: the dispersion of pollutants from a source, the growth by condensation and coagulation of droplets and ice crystals in clouds, the scavenging, settling and re-suspension of aerosols, and the radiative and climatic effects of particles. The proposed approach is based on tools borrowed from statistical physics and field theory, and from the theory of large deviations and of random dynamical systems, in order to design new observables that will be simultaneously tractable analytically in simplified models and relevant to the quantitative handling of such physical mechanisms. One of the outcomes will be to provide a new framework for improving and refining the methods used in meteorology and atmospheric sciences and to answer the long-standing question of the effects of suspended particles on climate.
Max ERC Funding
1 200 000 €
Duration
Start date: 2009-11-01, End date: 2014-10-31