Project acronym 0MSPIN
Project Spintronics based on relativistic phenomena in systems with zero magnetic moment
Researcher (PI) Tomáš Jungwirth
Host Institution (HI) FYZIKALNI USTAV AV CR V.V.I
Call Details Advanced Grant (AdG), PE3, ERC-2010-AdG_20100224
Summary The 0MSPIN project consists of an extensive integrated theoretical, experimental and device development programme of research opening a radical new approach to spintronics. Spintronics has the potential to supersede existing storage and memory applications, and to provide alternatives to current CMOS technology. The ferromagnetic metals used in all current spintronics applications may make it impractical to realise the full potential of spintronics: metals are unsuitable for transistor and information processing applications, for opto-electronics, or for high-density integration. The 0MSPIN project aims to remove this major road-block holding back the development of spintronics in a radical way: by removing the ferromagnetic component from key active parts of spintronic devices, or from the devices entirely. This approach is based on exploiting the combination of exchange and spin-orbit coupling phenomena in material systems with zero macroscopic moment. The goal of 0MSPIN is to provide a new paradigm by which spintronics can enter the realm of conventional semiconductors, in both fundamental condensed-matter research and information technologies. In the central part of the proposal, the research towards this goal is embedded within a materials science project whose aim is to introduce into physics and microelectronics an entirely new class of semiconductors. 0MSPIN seeks to exploit three classes of material systems: (1) antiferromagnetic bi-metallic 3d-5d alloys (e.g. Mn2Au); (2) antiferromagnetic I-II-V semiconductors (e.g. LiMnAs); (3) non-magnetic spin-orbit coupled semiconductors with injected spin-polarized currents (e.g. 2D III-V structures). Proof-of-concept devices operating at high temperatures will be fabricated to showcase new functionalities offered by zero-moment systems for sensing and memory applications, information processing, and opto-electronics technologies.
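For orientation (an editorial illustration, not text from the proposal): the archetypal spin-orbit term in the 2D III-V structures of class (3) is the Rashba Hamiltonian, with coupling strength α, Pauli matrices σ and in-plane wave vector k:

```latex
% Rashba spin-orbit coupling for a 2D electron gas (standard textbook form).
\begin{equation*}
  H_{\mathrm{R}} \;=\; \alpha\,(\boldsymbol{\sigma}\times\mathbf{k})\cdot\hat{\mathbf{z}}
  \;=\; \alpha\,(\sigma_x k_y - \sigma_y k_x)
\end{equation*}
```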
Max ERC Funding
1 938 000 €
Duration
Start date: 2011-06-01, End date: 2016-05-31
Project acronym 2D-CHEM
Project Two-Dimensional Chemistry towards New Graphene Derivatives
Researcher (PI) Michal Otyepka
Host Institution (HI) UNIVERZITA PALACKEHO V OLOMOUCI
Call Details Consolidator Grant (CoG), PE5, ERC-2015-CoG
Summary The suite of graphene’s unique properties and applications can be enormously enhanced by its functionalization. As non-covalently functionalized graphenes do not target all of graphene’s properties and may suffer from limited stability, covalent functionalization represents a promising way of controlling graphene’s properties. To date, only a few well-defined graphene derivatives have been introduced. Among them, fluorographene (FG) stands out as a prominent member because of its easy synthesis and high stability. Being a perfluorinated hydrocarbon, FG was believed to be as unreactive as its polymeric counterpart, perfluoropolyethylene (Teflon®). However, our recent experiments showed that FG is not chemically inert and can be used as a viable precursor for synthesizing graphene derivatives. This surprising behavior indicates that common textbook-grade knowledge cannot blindly be applied to the chemistry of 2D materials. Further, there might be specific rules behind the chemistry of 2D materials, forming a new chemical discipline we tentatively call 2D chemistry. The main aim of the project is to explore, identify and apply the rules of 2D chemistry, starting from FG. Using the knowledge of 2D chemistry so gained, we will attempt to control the chemistry of various 2D materials with the aim of preparing stable graphene derivatives with designed properties, e.g., a 1–3 eV band gap, fluorescent properties, sustainable magnetic ordering and dispersibility in polar media. The new graphene derivatives will be applied in sensing, imaging, magnetic delivery and catalysis, and new applications arising from synergistic phenomena are expected. We envisage that new applications will be opened up that benefit from the 2D scaffold and tailored properties of the synthesized derivatives. The derivatives will also be used for the synthesis of 3D hybrid materials by covalently linking the 2D sheets with other organic and inorganic molecules, nanomaterials or biomacromolecules.
Max ERC Funding
1 831 103 €
Duration
Start date: 2016-06-01, End date: 2021-05-31
Project acronym 3D-nanoMorph
Project Label-free 3D morphological nanoscopy for studying sub-cellular dynamics in live cancer cells with high spatio-temporal resolution
Researcher (PI) Krishna AGARWAL
Host Institution (HI) UNIVERSITETET I TROMSOE - NORGES ARKTISKE UNIVERSITET
Call Details Starting Grant (StG), PE7, ERC-2018-STG
Summary Label-free optical nanoscopy, free from photobleaching and photochemical toxicity of fluorescence labels and yielding 3D morphological resolution of <50 nm, is the future of live cell imaging. 3D-nanoMorph breaks the diffraction barrier and shifts the paradigm in label-free nanoscopy, providing isotropic 3D resolution of <50 nm. To achieve this, 3D-nanoMorph performs non-linear inverse scattering for the first time in nanoscopy and decodes scattering between sub-cellular structures (organelles).
3D-nanoMorph innovatively devises complementary roles for the light-measurement system and the computational nanoscopy algorithm. A novel illumination system and a novel light-collection system together enable measurement of only the most relevant intensity component and create a fresh perspective on label-free measurements. A new computational nanoscopy approach employs non-linear inverse scattering. Harnessing non-linear inverse scattering for resolution enhancement opens new possibilities in label-free 3D nanoscopy.
I will apply 3D-nanoMorph to study organelle degradation (autophagy) in live cancer cells over extended durations with high spatial and temporal resolution, presently limited by the lack of high-resolution label-free 3D morphological nanoscopy. Successful 3D mapping of the nanoscale biological process of autophagy will open new avenues for cancer treatment and showcase 3D-nanoMorph for wider applications.
My 14 years of cross-disciplinary expertise spanning inverse problems, electromagnetism, optical microscopy, integrated optics and live-cell nanoscopy pave the way for the successful implementation of 3D-nanoMorph.
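As a purely pedagogical aside (an editorial sketch, not the project's algorithm, which is non-linear and 3-D): the flavour of computational inversion can be conveyed by a linearised 1-D toy problem, recovering a profile x from measurements y = Ax by Landweber iteration. The operator and data below are synthetic assumptions:

```python
# Linearised toy inverse problem (synthetic operator and data): recover a
# 1-D "scattering" profile x from noisy measurements y = A @ x by
# Landweber iteration, i.e. gradient descent on ||A x - y||^2.
import numpy as np

rng = np.random.default_rng(1)
n = 64
A = rng.normal(size=(n, n)) / np.sqrt(n)  # hypothetical forward operator
x_true = np.zeros(n)
x_true[20:28] = 1.0                       # a small "organelle-like" feature
y = A @ x_true + 0.01 * rng.normal(size=n)

x = np.zeros(n)
step = 0.5 / np.linalg.norm(A, 2) ** 2    # safe step size (< 2 / ||A||^2)
for _ in range(500):
    x -= step * A.T @ (A @ x - y)         # Landweber update
print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```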
Max ERC Funding
1 499 999 €
Duration
Start date: 2019-07-01, End date: 2024-06-30
Project acronym ABACUS
Project Ab-initio adiabatic-connection curves for density-functional analysis and construction
Researcher (PI) Trygve Ulf Helgaker
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Advanced Grant (AdG), PE4, ERC-2010-AdG_20100224
Summary Quantum chemistry provides two approaches to molecular electronic-structure calculations: the systematically refinable but expensive many-body wave-function methods, and the inexpensive but not systematically refinable Kohn–Sham method of density-functional theory (DFT). The accuracy of Kohn–Sham calculations is determined by the quality of the exchange–correlation functional, from which the effects of exchange and correlation among the electrons are extracted using the density rather than the wave function. However, the exact exchange–correlation functional is unknown; instead, many approximate forms have been developed, by fitting to experimental data or by satisfying exact relations. Here, a new approach to density-functional analysis and construction is proposed: the Lieb variation principle, usually regarded as conceptually important but impracticable. By invoking the Lieb principle, it becomes possible to approach the development of approximate functionals in a novel manner, directly guided by the behaviour of the exact functional, accurately calculated for a wide variety of chemical systems. In particular, this principle will be used to calculate ab-initio adiabatic-connection curves, studying the exchange–correlation functional for a fixed density as the electronic interactions are turned on from zero to one. Pilot calculations have indicated the feasibility of this approach in simple cases. Here, a comprehensive set of adiabatic-connection curves will be generated and utilized for the calibration, construction, and analysis of density functionals, the objective being to produce improved functionals for Kohn–Sham calculations by modelling or fitting such curves. The ABACUS approach will be particularly important in cases where little experimental information is available, for example, for understanding and modelling the behaviour of the exchange–correlation functional in electromagnetic fields.
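For readers unfamiliar with the adiabatic connection, the standard relation these curves trace can be written as follows (an editorial illustration in common DFT notation; the proposal itself does not fix notation):

```latex
% Adiabatic connection: the density \rho is held fixed while the
% electron-electron interaction \hat{W} is scaled by \lambda from 0
% (the Kohn-Sham system) to 1 (the physical system); J[\rho] is the
% Hartree energy and \Psi_\lambda[\rho] the constrained minimiser.
\begin{equation*}
  E_{\mathrm{xc}}[\rho] \;=\; \int_{0}^{1} W_{\lambda}[\rho]\,\mathrm{d}\lambda ,
  \qquad
  W_{\lambda}[\rho] \;=\;
  \langle \Psi_{\lambda}[\rho] \,|\, \hat{W} \,|\, \Psi_{\lambda}[\rho] \rangle - J[\rho] .
\end{equation*}
```

Modelling or fitting the λ-dependence of W_λ[ρ], computed accurately via the Lieb principle, is what "calibration and construction of density functionals" amounts to here.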
Max ERC Funding
2 017 932 €
Duration
Start date: 2011-03-01, End date: 2016-02-29
Project acronym AgeConsolidate
Project The Missing Link of Episodic Memory Decline in Aging: The Role of Inefficient Systems Consolidation
Researcher (PI) Anders Martin FJELL
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Consolidator Grant (CoG), SH4, ERC-2016-COG
Summary Which brain mechanisms are responsible for the fate of the memories we make with age, whether they wither or stay, and in what form? Episodic memory function does decline with age. While this decline can have multiple causes, research has focused almost entirely on encoding and retrieval processes, largely ignoring a third critical process: consolidation. The objective of AgeConsolidate is to provide this missing link, by combining novel experimental cognitive paradigms with neuroimaging in a longitudinal large-scale attempt to directly test how age-related changes in consolidation processes in the brain impact episodic memory decline. The ambitious aims of the present proposal are two-fold:
(1) Use recent advances in memory consolidation theory to achieve an elaborate model of episodic memory deficits in aging
(2) Use aging as a model to uncover how structural and functional brain changes affect episodic memory consolidation in general
The novelty of the project lies in the synthesis of recent methodological advances and theoretical models for episodic memory consolidation to explain age-related decline, by employing a unique combination of a range of different techniques and approaches. This is ground-breaking, in that it aims at taking our understanding of the brain processes underlying episodic memory decline in aging to a new level, while at the same time advancing our theoretical understanding of how episodic memories are consolidated in the human brain. To obtain this outcome, I will test the main hypothesis of the project: Brain processes of episodic memory consolidation are less effective in older adults, and this can account for a significant portion of the episodic memory decline in aging. This will be answered by six secondary hypotheses, with 1-3 experiments or tasks designated to address each hypothesis, focusing on functional and structural MRI, positron emission tomography data and sleep experiments to target consolidation from different angles.
Max ERC Funding
1 999 482 €
Duration
Start date: 2017-05-01, End date: 2022-04-30
Project acronym AGENSI
Project A Genetic View into Past Sea Ice Variability in the Arctic
Researcher (PI) Stijn DE SCHEPPER
Host Institution (HI) NORCE NORWEGIAN RESEARCH CENTRE AS
Call Details Consolidator Grant (CoG), PE10, ERC-2018-COG
Summary Arctic sea ice decline is emblematic of the rapidly transforming Arctic climate. The ensuing local and global implications can be understood by studying past climate transitions, yet few methods are available to examine past Arctic sea ice cover, severely restricting our understanding of sea ice in the climate system. The decline in Arctic sea ice cover is a ‘canary in the coalmine’ for the state of our climate, and if greenhouse gas emissions remain unchecked, summer sea ice loss may pass a critical threshold that could drastically transform the Arctic. Because historical observations are limited, it is crucial to have reliable proxies for assessing natural sea ice variability, its stability, and its sensitivity to climate forcing on different time scales. Current proxies address aspects of sea ice variability, but are limited by a selective fossil record, preservation effects, regional applicability, or being only semi-quantitative. With such constraints on our knowledge about natural variations and drivers, major uncertainties about the future remain.
I propose to develop and apply a novel sea ice proxy that exploits genetic information stored in marine sediments, sedimentary ancient DNA (sedaDNA). This innovation uses the genetic signature of phytoplankton communities from surface waters and sea ice as it gets stored in sediments. This wealth of information has not been explored before for reconstructing sea ice conditions. Preliminary results from my cross-disciplinary team indicate that our unconventional approach can provide a detailed, qualitative account of past sea ice ecosystems and quantitative estimates of sea ice parameters. I will address fundamental questions about past Arctic sea ice variability on different timescales, information essential to provide a framework upon which to assess the ecological and socio-economic consequences of a changing Arctic. This new proxy is not limited to sea ice research and can transform the field of paleoceanography.
Max ERC Funding
2 615 858 €
Duration
Start date: 2019-08-01, End date: 2024-07-31
Project acronym AI4REASON
Project Artificial Intelligence for Large-Scale Computer-Assisted Reasoning
Researcher (PI) Josef Urban
Host Institution (HI) CESKE VYSOKE UCENI TECHNICKE V PRAZE
Call Details Consolidator Grant (CoG), PE6, ERC-2014-CoG
Summary The goal of the AI4REASON project is a breakthrough in what is considered a very hard problem in AI and automation of reasoning, namely the problem of automatically proving theorems in large and complex theories. Such complex formal theories arise in projects aimed at verification of today's advanced mathematics such as the Formal Proof of the Kepler Conjecture (Flyspeck), verification of software and hardware designs such as the seL4 operating system kernel, and verification of other advanced systems and technologies on which today's information society critically depends.
Designing an explicitly programmed solution to the problem appears extremely complex and unlikely to succeed. However, we have recently demonstrated that the performance of existing approaches can be multiplied by data-driven AI methods that learn reasoning guidance from large proof corpora. The breakthrough will be achieved by developing such novel AI methods. First, we will devise suitable Automated Reasoning and Machine Learning methods that learn reasoning knowledge and steer the reasoning processes at various levels of granularity. Second, we will combine them into autonomous self-improving AI systems that interleave deduction and learning in positive feedback loops. Third, we will develop approaches that aggregate reasoning knowledge across many formal, semi-formal and informal corpora and deploy the methods as strong automation services for the formal proof community.
The expected outcome is our ability to prove automatically at least 50% more theorems in high-assurance projects such as Flyspeck and seL4, bringing a major breakthrough in formal reasoning and verification. As an AI effort, the project offers a unique path to large-scale semantic AI. The formal corpora concentrate centuries of deep human thinking in a computer-understandable form on which deductive and inductive AI can be combined and co-evolved, providing new insights into how humans do mathematics and science.
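As a concrete, heavily simplified illustration of what "learning reasoning guidance from proof corpora" can mean, the sketch below ranks candidate premises for a conjecture with a text classifier. Everything in it (the toy corpus, the character n-gram features, the model choice) is an editorial assumption, not the project's actual method:

```python
# Minimal premise-selection sketch: learn from (conjecture, premise) pairs
# labelled 1 if the premise was used in a known proof, then rank candidate
# premises for a new conjecture by predicted usefulness.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Hypothetical toy proof corpus: (conjecture, premise, used_in_proof).
corpus = [
    ("(a+b)+c = a+(b+c)", "add_assoc", 1),
    ("(a+b)+c = a+(b+c)", "mul_comm",  0),
    ("a+0 = a",           "add_zero",  1),
    ("a+0 = a",           "mul_one",   0),
    ("a*1 = a",           "mul_one",   1),
    ("a*1 = a",           "add_zero",  0),
]

texts  = [c + " | " + p for c, p, _ in corpus]
labels = [y for _, _, y in corpus]

vec = TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4))
clf = LogisticRegression().fit(vec.fit_transform(texts), labels)

# Rank candidate premises for an unseen conjecture.
conjecture = "(x+y)+z = x+(y+z)"
candidates = ["add_assoc", "mul_comm", "add_zero", "mul_one"]
scores = clf.predict_proba(
    vec.transform([conjecture + " | " + p for p in candidates]))[:, 1]
for p, s in sorted(zip(candidates, scores), key=lambda t: -t[1]):
    print(f"{p}: {s:.2f}")
```

Real systems operate on corpora of millions of formal statements and couple such rankings to saturation-based provers; the sketch only conveys the learn-then-guide loop.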
Max ERC Funding
1 499 500 €
Duration
Start date: 2015-09-01, End date: 2020-08-31
Project acronym Amitochondriates
Project Life without mitochondrion
Researcher (PI) Vladimir HAMPL
Host Institution (HI) UNIVERZITA KARLOVA
Call Details Consolidator Grant (CoG), LS8, ERC-2017-COG
Summary Mitochondria are often referred to as the “power houses” of eukaryotic cells. All eukaryotes were thought to have mitochondria of some form until 2016, when the first eukaryote thriving without mitochondria was discovered by our laboratory – a flagellate, Monocercomonoides. Understanding the cellular functions of these cells, which represent a new functional type of eukaryote, and understanding the circumstances of the unique event of mitochondrial loss are the motivations for this proposal. The first objective focuses on cell physiology. We will perform a metabolomic study revealing the major metabolic pathways and concentrate further on elucidating its unique system of iron-sulphur cluster assembly. In the second objective, we will investigate in detail the unique case of mitochondrial loss. We will examine two additional potentially amitochondriate lineages by means of genomics and transcriptomics, conduct experiments simulating the moments of mitochondrial loss, and try to induce mitochondrial loss in vitro by knocking out or knocking down genes for mitochondrial biogenesis. We have chosen Giardia intestinalis and Entamoeba histolytica as models for the latter experiments, because their mitochondria are already reduced to minimalistic “mitosomes” and because some genetic tools are already available for them. Successful mitochondrial knock-outs would enable us to study mitochondrial loss in ‘real time’ and in vivo. In the third objective, we will focus on transforming Monocercomonoides into a tractable laboratory model by developing methods of axenic cultivation and genetic manipulation. This will open new possibilities in the study of this organism and create a cell culture representing an amitochondriate model for cell-biological studies, enabling the dissection of mitochondrial effects from those of other compartments. The team is composed of the PI's laboratory and eight invited experts, and we hope it has the ability to address these challenging questions.
Max ERC Funding
1 935 500 €
Duration
Start date: 2018-05-01, End date: 2023-04-30
Project acronym AN07AT
Project Understanding computational roles of new neurons generated in the adult hippocampus
Researcher (PI) Ayumu Tashiro
Host Institution (HI) NORGES TEKNISK-NATURVITENSKAPELIGE UNIVERSITET NTNU
Call Details Starting Grant (StG), LS4, ERC-2007-StG
Summary New neurons are continuously generated in certain regions of the adult mammalian brain. One of those regions is the dentate gyrus, a subregion of the hippocampus, which is essential for memory formation. Although these new neurons in the adult dentate gyrus are thought to have an important role in learning and memory, it is largely unclear how new neurons are involved in the information processing and storage underlying memory. Because new neurons constitute a minor portion of the intermingled local neuronal population, simple application of conventional techniques such as multi-unit extracellular recording and pharmacological lesions is not suitable for the functional analysis of new neurons. In this proposed research program, I will combine multi-unit recording and behavioral analysis with virus-mediated, cell-type-specific genetic manipulation of neuronal activity to investigate the computational roles of new neurons in learning and memory. Specifically, I will determine: 1) specific memory processes that require new neurons, 2) dynamic patterns of activity that new neurons express during memory-related behavior, and 3) the influence of new neurons on their downstream structures. Further, based on the information obtained from these three lines of study, we will establish a causal relationship between specific memory-related behavior and specific patterns of activity in new neurons. Solving these issues will collectively provide important insight into the computational roles performed by adult neurogenesis. The information on the function of new neurons in the normal brain could contribute to the future development of efficient therapeutic strategies for a variety of brain disorders.
Max ERC Funding
1 991 743 €
Duration
Start date: 2009-01-01, End date: 2013-12-31
Project acronym ANISOTROPIC UNIVERSE
Project The anisotropic universe -- a reality or fluke?
Researcher (PI) Hans Kristian Kamfjord Eriksen
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Starting Grant (StG), PE9, ERC-2010-StG_20091028
Summary "During the last decade, a strikingly successful cosmological concordance model has been established. With only six free parameters, nearly all observables, comprising millions of data points, may be fitted with outstanding precision. However, in this beautiful picture a few ""blemishes"" have turned up, apparently not consistent with the standard model: While the model predicts that the universe is isotropic (i.e., looks the same in all directions) and homogeneous (i.e., the statistical properties are the same everywhere), subtle hints of the contrary are now seen. For instance, peculiar preferred directions and correlations are observed in the cosmic microwave background; some studies considering nearby galaxies suggest the existence of anomalous large-scale cosmic flows; a study of distant quasars hints towards unexpected large-scale correlations. All of these reports are individually highly intriguing, and together they hint toward a more complicated and interesting universe than previously imagined -- but none of the reports can be considered decisive. One major obstacle in many cases has been the relatively poor data quality.
This is currently about to change, as the next generation of new and far more powerful experiments are coming online. Of special interest to me are Planck, an ESA-funded CMB satellite currently taking data; QUIET, a ground-based CMB polarization experiment located in Chile; and various large-scale structure (LSS) data sets, such as the SDSS and 2dF surveys, and in the future Euclid, a proposed galaxy survey satellite also funded by ESA. By combining the world s best data from both CMB and LSS measurements, I will in the proposed project attempt to settle this question: Is our universe really anisotropic? Or are these recent claims only the results of systematic errors or statistical flukes? If the claims turn out to hold against this tide of new and high-quality data, then cosmology as a whole may need to be re-written."
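To make the fluke-versus-reality question concrete, the toy sketch below (entirely synthetic: a hypothetical power spectrum, real-valued modes, no mask, noise, or beam model) illustrates the cosmic-variance scatter against which any claimed anomaly must be judged:

```python
# Toy cosmic-variance check (illustrative only): simulate one isotropic
# Gaussian realisation of a_lm for a hypothetical spectrum C_l, estimate
# C_l-hat per multipole, and compare the scatter with the prediction
# Var(C_l-hat) = 2 C_l^2 / (2l + 1).
import numpy as np

rng = np.random.default_rng(0)
ells = np.arange(2, 100)
C_true = 1000.0 / (ells * (ells + 1))     # hypothetical toy spectrum

C_hat = np.empty_like(C_true)
for i, l in enumerate(ells):
    # 2l+1 independent modes per multipole (real-valued toy model)
    a_lm = rng.normal(0.0, np.sqrt(C_true[i]), size=2 * l + 1)
    C_hat[i] = np.mean(a_lm**2)

chi2 = np.sum((C_hat - C_true) ** 2 / (2 * C_true**2 / (2 * ells + 1)))
print(f"chi2 = {chi2:.1f} for {ells.size} multipoles")  # expect roughly 98
```

A claimed anomaly is only interesting once its significance survives this irreducible per-realisation scatter, which is the core of the "reality or fluke" question.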
Max ERC Funding
1 500 000 €
Duration
Start date: 2011-01-01, End date: 2015-12-31