Project acronym 3D-In-Macro
Project Inequality in 3D – measurement and implications for macroeconomic theory
Researcher (PI) Andreas Fagereng
Host Institution (HI) STIFTELSEN HANDELSHOYSKOLEN BI
Country Norway
Call Details Starting Grant (StG), SH1, ERC-2019-STG
Summary This project will contribute toward a better understanding of inequality and its macroeconomic implications. We will study inequality and its dynamics along three dimensions: Consumption, Income and Wealth, “3D Inequality.” With novel microdata we can measure the entire economy, down to the individual household, along all three dimensions.
In macroeconomics, much theoretical progress has been made in understanding when distributions matter for aggregates. Newer heterogeneous agent models deliver strikingly different implications for monetary and fiscal policies than traditional representative-agent models do, and also allow us to study the distributional implications of different policies across households. In principle, this class of models can incorporate the potentially rich interactions between inequality and the macroeconomy: on the one hand, inequality shapes macroeconomic aggregates; on the other hand, macroeconomic shocks and policies affect inequality. However, absent precise micro-level facts it is difficult to establish which of the potential mechanisms highlighted by these models are the most important in reality.
Our empirical efforts will be disciplined by these recent developments in modelling macroeconomic phenomena with microeconomic heterogeneity. Our overarching motivation is to quantify the type of micro heterogeneity that matters for macroeconomic theory and thereby inform the development of current and future macroeconomic models. The novel insights we aim to provide could lead to substantial improvements in both fiscal and monetary policy tools. Furthermore, a better understanding of the forces behind growing inequality will inform the current debate on this issue and provide important lessons to policy makers who see economic inequality as a problem in itself.
Max ERC Funding
1 376 875 €
Duration
Start date: 2020-05-01, End date: 2025-04-30
Project acronym 3D-nanoMorph
Project Label-free 3D morphological nanoscopy for studying sub-cellular dynamics in live cancer cells with high spatio-temporal resolution
Researcher (PI) Krishna AGARWAL
Host Institution (HI) UNIVERSITETET I TROMSOE - NORGES ARKTISKE UNIVERSITET
Country Norway
Call Details Starting Grant (StG), PE7, ERC-2018-STG
Summary Label-free optical nanoscopy, free from photobleaching and photochemical toxicity of fluorescence labels and yielding 3D morphological resolution of <50 nm, is the future of live cell imaging. 3D-nanoMorph breaks the diffraction barrier and shifts the paradigm in label-free nanoscopy, providing isotropic 3D resolution of <50 nm. To achieve this, 3D-nanoMorph performs non-linear inverse scattering for the first time in nanoscopy and decodes scattering between sub-cellular structures (organelles).
3D-nanoMorph innovatively devises complementary roles for the light measurement system and the computational nanoscopy algorithm. A novel illumination system and a novel light collection system together enable measurement of only the most relevant intensity component and create a fresh perspective on label-free measurements. A new computational nanoscopy approach employs non-linear inverse scattering. Harnessing non-linear inverse scattering for resolution enhancement opens new possibilities in label-free 3D nanoscopy.
I will apply 3D-nanoMorph to study organelle degradation (autophagy) in live cancer cells over extended durations with high spatial and temporal resolution, a study presently limited by the lack of high-resolution label-free 3D morphological nanoscopy. Successful 3D mapping of the nanoscale biological process of autophagy will open new avenues for cancer treatment and showcase 3D-nanoMorph for wider applications.
My cross-disciplinary expertise of 14 years, spanning inverse problems, electromagnetism, optical microscopy, integrated optics and live-cell nanoscopy, paves the path for successful implementation of 3D-nanoMorph.
Max ERC Funding
1 499 999 €
Duration
Start date: 2019-07-01, End date: 2024-06-30
Project acronym ABACUS
Project Ab-initio adiabatic-connection curves for density-functional analysis and construction
Researcher (PI) Trygve Ulf Helgaker
Host Institution (HI) UNIVERSITETET I OSLO
Country Norway
Call Details Advanced Grant (AdG), PE4, ERC-2010-AdG_20100224
Summary Quantum chemistry provides two approaches to molecular electronic-structure calculations: the systematically refinable but expensive many-body wave-function methods and the inexpensive but not systematically refinable Kohn–Sham method of density-functional theory (DFT). The accuracy of Kohn–Sham calculations is determined by the quality of the exchange–correlation functional, from which the effects of exchange and correlation among the electrons are extracted using the density rather than the wave function. However, the exact exchange–correlation functional is unknown; instead, many approximate forms have been developed, by fitting to experimental data or by satisfying exact relations. Here, a new approach to density-functional analysis and construction is proposed: the Lieb variation principle, usually regarded as conceptually important but impracticable. By invoking the Lieb principle, it becomes possible to approach the development of approximate functionals in a novel manner, directly guided by the behaviour of the exact functional, accurately calculated for a wide variety of chemical systems. In particular, this principle will be used to calculate ab-initio adiabatic-connection curves, studying the exchange–correlation functional for a fixed density as the electronic interactions are turned on from zero to one. Pilot calculations have indicated the feasibility of this approach in simple cases; here, a comprehensive set of adiabatic-connection curves will be generated and utilized for the calibration, construction, and analysis of density functionals, the objective being to produce improved functionals for Kohn–Sham calculations by modelling or fitting such curves. The ABACUS approach will be particularly important in cases where little experimental information is available, for example for understanding and modelling the behaviour of the exchange–correlation functional in electromagnetic fields.
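The adiabatic connection described above has a standard compact statement (given here in its textbook form for orientation; it is not quoted from the proposal): at fixed density, the exchange–correlation energy is recovered by integrating the interaction-strength-dependent functional from the non-interacting Kohn–Sham system to the fully interacting physical system.

```latex
% Standard adiabatic-connection formula (textbook form; added for illustration,
% not quoted from the proposal). At fixed density \rho, the exchange--correlation
% energy is obtained by integrating W_\lambda, the exchange--correlation
% potential-energy contribution of the \lambda-interacting system, as the
% electron--electron interaction is scaled from \lambda = 0 (Kohn--Sham system)
% to \lambda = 1 (physical system):
\begin{equation}
  E_{\mathrm{xc}}[\rho] \;=\; \int_{0}^{1} W_{\lambda}[\rho]\,\mathrm{d}\lambda .
\end{equation}
```

The "adiabatic-connection curves" of the project are precisely plots of W_\lambda as a function of \lambda for a fixed density, whose shape diagnoses how well an approximate functional captures correlation.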
Max ERC Funding
2 017 932 €
Duration
Start date: 2011-03-01, End date: 2016-02-29
Project acronym AgeConsolidate
Project The Missing Link of Episodic Memory Decline in Aging: The Role of Inefficient Systems Consolidation
Researcher (PI) Anders Martin FJELL
Host Institution (HI) UNIVERSITETET I OSLO
Country Norway
Call Details Consolidator Grant (CoG), SH4, ERC-2016-COG
Summary Which brain mechanisms are responsible for the fate of the memories we make as we age: whether they wither or stay, and in what form? Episodic memory function does decline with age. While this decline can have multiple causes, research has focused almost entirely on encoding and retrieval processes, largely ignoring a third critical process: consolidation. The objective of AgeConsolidate is to provide this missing link by combining novel experimental cognitive paradigms with neuroimaging in a longitudinal, large-scale attempt to directly test how age-related changes in consolidation processes in the brain impact episodic memory decline. The ambitious aims of the present proposal are two-fold:
(1) Use recent advances in memory consolidation theory to achieve an elaborate model of episodic memory deficits in aging
(2) Use aging as a model to uncover how structural and functional brain changes affect episodic memory consolidation in general
The novelty of the project lies in the synthesis of recent methodological advances and theoretical models of episodic memory consolidation to explain age-related decline, employing a unique combination of a range of different techniques and approaches. This is ground-breaking in that it aims to take our understanding of the brain processes underlying episodic memory decline in aging to a new level, while at the same time advancing our theoretical understanding of how episodic memories are consolidated in the human brain. To obtain this outcome, I will test the main hypothesis of the project: brain processes of episodic memory consolidation are less effective in older adults, and this can account for a significant portion of the episodic memory decline in aging. This will be addressed through six secondary hypotheses, with 1–3 experiments or tasks dedicated to each hypothesis, focusing on functional and structural MRI, positron emission tomography data and sleep experiments to target consolidation from different angles.
Max ERC Funding
1 999 482 €
Duration
Start date: 2017-05-01, End date: 2022-04-30
Project acronym AGENSI
Project A Genetic View into Past Sea Ice Variability in the Arctic
Researcher (PI) Stijn DE SCHEPPER
Host Institution (HI) NORCE NORWEGIAN RESEARCH CENTRE AS
Country Norway
Call Details Consolidator Grant (CoG), PE10, ERC-2018-COG
Summary Arctic sea ice decline is the clearest expression of the rapidly transforming Arctic climate. The ensuing local and global implications can be understood by studying past climate transitions, yet few methods are available to examine past Arctic sea ice cover, severely restricting our understanding of sea ice in the climate system. The decline in Arctic sea ice cover is a ‘canary in the coal mine’ for the state of our climate, and if greenhouse gas emissions remain unchecked, summer sea ice loss may pass a critical threshold that could drastically transform the Arctic. Because historical observations are limited, it is crucial to have reliable proxies for assessing natural sea ice variability, its stability and its sensitivity to climate forcing on different time scales. Current proxies address aspects of sea ice variability, but are limited by a selective fossil record, preservation effects, restricted regional applicability, or being only semi-quantitative. With such constraints on our knowledge of natural variations and drivers, major uncertainties about the future remain.
I propose to develop and apply a novel sea ice proxy that exploits genetic information stored in marine sediments, sedimentary ancient DNA (sedaDNA). This innovation uses the genetic signature of phytoplankton communities from surface waters and sea ice as it gets stored in sediments. This wealth of information has not been explored before for reconstructing sea ice conditions. Preliminary results from my cross-disciplinary team indicate that our unconventional approach can provide a detailed, qualitative account of past sea ice ecosystems and quantitative estimates of sea ice parameters. I will address fundamental questions about past Arctic sea ice variability on different timescales, information essential to provide a framework upon which to assess the ecological and socio-economic consequences of a changing Arctic. This new proxy is not limited to sea ice research and can transform the field of paleoceanography.
Max ERC Funding
2 615 858 €
Duration
Start date: 2019-08-01, End date: 2024-07-31
Project acronym AN07AT
Project Understanding computational roles of new neurons generated in the adult hippocampus
Researcher (PI) Ayumu Tashiro
Host Institution (HI) NORGES TEKNISK-NATURVITENSKAPELIGE UNIVERSITET NTNU
Country Norway
Call Details Starting Grant (StG), LS4, ERC-2007-StG
Summary New neurons are continuously generated in certain regions of the adult mammalian brain. One of those regions is the dentate gyrus, a subregion of the hippocampus, which is essential for memory formation. Although these new neurons in the adult dentate gyrus are thought to play an important role in learning and memory, it is largely unclear how they are involved in the information processing and storage underlying memory. Because new neurons constitute a minor portion of the intermingled local neuronal population, simple application of conventional techniques such as multi-unit extracellular recording and pharmacological lesioning is not suitable for the functional analysis of new neurons. In this proposed research program, I will combine multi-unit recording and behavioral analysis with virus-mediated, cell-type-specific genetic manipulation of neuronal activity to investigate the computational roles of new neurons in learning and memory. Specifically, I will determine: 1) the specific memory processes that require new neurons, 2) the dynamic patterns of activity that new neurons express during memory-related behavior, and 3) the influence of new neurons on their downstream structures. Further, based on the information obtained from these three lines of study, we will establish a causal relationship between specific memory-related behavior and specific patterns of activity in new neurons. Solving these issues will together provide important insight into the computational roles performed by adult neurogenesis. Information on the function of new neurons in the normal brain could contribute to the future development of efficient therapeutic strategies for a variety of brain disorders.
Max ERC Funding
1 991 743 €
Duration
Start date: 2009-01-01, End date: 2013-12-31
Project acronym ANISOTROPIC UNIVERSE
Project The anisotropic universe -- a reality or fluke?
Researcher (PI) Hans Kristian Kamfjord Eriksen
Host Institution (HI) UNIVERSITETET I OSLO
Country Norway
Call Details Starting Grant (StG), PE9, ERC-2010-StG_20091028
Summary During the last decade, a strikingly successful cosmological concordance model has been established. With only six free parameters, nearly all observables, comprising millions of data points, may be fitted with outstanding precision. However, in this beautiful picture a few “blemishes” have turned up, apparently not consistent with the standard model: while the model predicts that the universe is isotropic (i.e., looks the same in all directions) and homogeneous (i.e., the statistical properties are the same everywhere), subtle hints of the contrary are now seen. For instance, peculiar preferred directions and correlations are observed in the cosmic microwave background; some studies considering nearby galaxies suggest the existence of anomalous large-scale cosmic flows; a study of distant quasars hints towards unexpected large-scale correlations. All of these reports are individually highly intriguing, and together they hint toward a more complicated and interesting universe than previously imagined -- but none of the reports can be considered decisive. One major obstacle in many cases has been the relatively poor data quality.
This is currently about to change, as the next generation of new and far more powerful experiments is coming online. Of special interest to me are Planck, an ESA-funded CMB satellite currently taking data; QUIET, a ground-based CMB polarization experiment located in Chile; and various large-scale structure (LSS) data sets, such as the SDSS and 2dF surveys, and in the future Euclid, a proposed galaxy survey satellite also funded by ESA. By combining the world's best data from both CMB and LSS measurements, I will in the proposed project attempt to settle this question: Is our universe really anisotropic? Or are these recent claims only the results of systematic errors or statistical flukes? If the claims turn out to hold against this tide of new and high-quality data, then cosmology as a whole may need to be re-written.
Max ERC Funding
1 500 000 €
Duration
Start date: 2011-01-01, End date: 2015-12-31
Project acronym APOCRYPHA
Project Storyworlds in Transition: Coptic Apocrypha in Changing Contexts in the Byzantine and Early Islamic Periods
Researcher (PI) Hugo Lundhaug
Host Institution (HI) UNIVERSITETET I OSLO
Country Norway
Call Details Consolidator Grant (CoG), SH5, ERC-2019-COG
Summary This project proposes the first systematic study of Coptic apocrypha covering the entire timespan of Coptic literary production, and it aims to do so with unprecedented methodological sophistication. Apocrypha are here defined as (1) texts and traditions that develop or expand upon characters and events of the biblical storyworld; and/or (2) texts that claim authorship by a character from that storyworld or a direct witness to it. A great number of such apocryphal texts and traditions have been preserved in Coptic manuscripts from the fourth to the twelfth centuries. Most of these texts are attributed to apostles or other important early Christian figures, and over time such materials were also increasingly embedded in pseudepigraphical frames, such as homilies attributed to later, but still early, heroes of the Church. The manuscripts in which this literature has been preserved were almost exclusively produced and used in Egyptian monasteries. Although the use of such apocrypha was at times controversial, the evidence clearly indicates the widespread use of this literature in Coptic monasteries over centuries, and this project will investigate the contents, development, and functions of apocrypha as they were copied, adapted, and used in changing socio-religious contexts over time. The period covered by the project saw drastic changes in the religious landscape of Egypt: from Christianity's dominant position in the fourth century, through the marginalization of Egyptian Christianity in relation to the imperial Chalcedonian Church after 451, to increasing marginalization in relation to Islam following the Arab conquest of Egypt in the mid-seventh century. The project will investigate how these changing contexts are reflected in the Coptic apocrypha that were copied and used in Egyptian monasteries, and what functions they had for their users throughout the period under investigation.
Max ERC Funding
1 998 626 €
Duration
Start date: 2020-08-01, End date: 2025-07-31
Project acronym ATLANTIS
Project Whales, waste and sea walnuts: incorporating human impacts on the marine ecosystem within life cycle impact assessment
Researcher (PI) Francesca VERONES
Host Institution (HI) NORGES TEKNISK-NATURVITENSKAPELIGE UNIVERSITET NTNU
Country Norway
Call Details Starting Grant (StG), SH2, ERC-2019-STG
Summary The marine ecosystem covers around 70% of the planet. Today, every single part of this vast ecosystem is affected by at least one anthropogenic driver of change. This fact should cause us to pause and think: such a huge space, yet habitat and species loss are occurring at an unprecedented rate. The marine ecosystem provides us with a wealth of services and has an economic value exceeding 20 trillion US Dollars. In addition, the marine ecosystem is considered crucial for our sustainable future and is often regarded as the “next economic frontier”.
However, despite its importance for humankind, the marine ecosystem is significantly underrepresented in sustainability research. We currently have no holistic approach to quantify the impacts caused by a large number of human pressures in the marine ecosystem.
A powerful tool for identifying such impacts is life cycle assessment (LCA). LCA is the best available tool to assess potential environmental impacts of products and processes in a comprehensive way. However, methods have never been properly developed for including marine impacts in LCA results.
I will contribute to closing this substantial research gap by developing novel models for quantifying impacts on ecosystem service losses (“whales”), as well as impacts of marine plastic debris (“waste”) and of marine invasive species (“sea walnuts”) within the LCA framework. These models will be developed based on impacts on species richness and ecosystem service potential. Including ecosystem services will be a paradigm extension and a substantial advancement for the LCA framework. All models will be tested in an overarching case study.
Currently we are unable to determine whether planned marine activities and processes are sustainable. By developing these models, we will be able to do so with a holistic perspective. This is of unprecedented importance if we want to manage this vital ecosystem in a sustainable way and preserve it for future generations.
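The modelling approach described above builds on the standard LCA impact-assessment step, in which each inventory flow is multiplied by a characterization factor (CF) and summed into an impact score (as codified in ISO 14040/14044). A minimal sketch of that step follows; the flow names and CF values are illustrative placeholders, not factors from this project or any published method.

```python
# Minimal sketch of the LCIA characterization step: an impact score is
# the sum over inventory flows of (flow amount x characterization factor).
# All names and numbers below are hypothetical, for illustration only.

def characterize(inventory, cf):
    """Aggregate an inventory (flow -> amount) into a single impact score.

    Flows without a characterization factor contribute zero, mirroring
    the current gap for marine impacts that the project aims to close.
    """
    return sum(amount * cf.get(flow, 0.0) for flow, amount in inventory.items())

# Hypothetical example: plastic debris released to the ocean (kg),
# with made-up marine-impact CFs (e.g. species-richness loss per kg).
inventory = {"PET_to_ocean_kg": 2.0, "HDPE_to_ocean_kg": 1.0}
cf_marine = {"PET_to_ocean_kg": 0.5, "HDPE_to_ocean_kg": 0.3}

score = characterize(inventory, cf_marine)  # 2.0*0.5 + 1.0*0.3 = 1.3
```

Developing new marine CFs (for plastic debris, invasive species, and ecosystem-service loss) is precisely what would populate a table like `cf_marine` here, making such impacts visible in LCA results.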
Max ERC Funding
1 500 000 €
Duration
Start date: 2020-04-01, End date: 2025-03-31
Project acronym ATRONICS
Project Creating building blocks for atomic-scale electronics
Researcher (PI) Dennis MEIER
Host Institution (HI) NORGES TEKNISK-NATURVITENSKAPELIGE UNIVERSITET NTNU
Country Norway
Call Details Consolidator Grant (CoG), PE3, ERC-2019-COG
Summary Interfaces in oxide materials offer amazing opportunities for fundamental and applied research, giving a new dimension to functional properties, such as magnetism, multiferroicity and superconductivity. Ferroelectric domain walls recently emerged as a new type of interface, where the dynamic characteristics of ferroelectricity introduce the element of spatial mobility, allowing for the real-time adjustment of position, density and orientation of the walls. This mobility adds an additional degree of flexibility that enables domain walls to take an active role in future devices and hold great potential as functional 2D systems for electronics.
Up to now, application concepts rely on injecting and deleting domain walls in micrometer-size devices to control electric conductivity. While this approach achieves a step beyond conventional interfaces by utilizing the wall mobility, it does not break the mould of classical device architectures. Completely new strategies are required to functionalize the versatile electronic properties and atomic-scale feature size of ferroelectric domain walls.
ATRONICS will establish a new conceptual approach for developing domain-wall-based technology. At the length scale of only a few atoms, we will use individual walls in improper ferroelectrics to emulate key electronic components such as diodes, transistors and logic gates. Crucially, as the functionality of the components is intrinsic to the domain walls, the walls themselves are the devices, instead of the previous approach of writing and erasing domain walls within a much larger classical device architecture. Beyond demonstrating individual devices, we will integrate multiple domain-wall devices, and develop quasi-2D circuitry and networks with a higher order of complexity than is currently achievable. ATRONICS will represent a major advancement in 2D functional materials for future technologies and play an essential role in the transition from nano- to atomic-scale electronics.
Max ERC Funding
1 845 338 €
Duration
Start date: 2020-09-01, End date: 2025-08-31