Project acronym AgeConsolidate
Project The Missing Link of Episodic Memory Decline in Aging: The Role of Inefficient Systems Consolidation
Researcher (PI) Anders Martin FJELL
Host Institution (HI) UNIVERSITETET I OSLO
Country Norway
Call Details Consolidator Grant (CoG), SH4, ERC-2016-COG
Summary Which brain mechanisms are responsible for the fate of the memories we make with age, whether they wither or stay, and in what form? Episodic memory function does decline with age. While this decline can have multiple causes, research has focused almost entirely on encoding and retrieval processes, largely ignoring a third critical process: consolidation. The objective of AgeConsolidate is to provide this missing link, by combining novel experimental cognitive paradigms with neuroimaging in a longitudinal large-scale attempt to directly test how age-related changes in consolidation processes in the brain impact episodic memory decline. The ambitious aims of the present proposal are two-fold:
(1) Use recent advances in memory consolidation theory to achieve an elaborate model of episodic memory deficits in aging
(2) Use aging as a model to uncover how structural and functional brain changes affect episodic memory consolidation in general
The novelty of the project lies in the synthesis of recent methodological advances and theoretical models for episodic memory consolidation to explain age-related decline, by employing a unique combination of a range of different techniques and approaches. This is ground-breaking, in that it aims at taking our understanding of the brain processes underlying episodic memory decline in aging to a new level, while at the same time advancing our theoretical understanding of how episodic memories are consolidated in the human brain. To obtain this outcome, I will test the main hypothesis of the project: Brain processes of episodic memory consolidation are less effective in older adults, and this can account for a significant portion of the episodic memory decline in aging. This will be answered by six secondary hypotheses, with 1-3 experiments or tasks designated to address each hypothesis, focusing on functional and structural MRI, positron emission tomography data and sleep experiments to target consolidation from different angles.
Max ERC Funding
1 999 482 €
Duration
Start date: 2017-05-01, End date: 2022-04-30
Project acronym AGENSI
Project A Genetic View into Past Sea Ice Variability in the Arctic
Researcher (PI) Stijn DE SCHEPPER
Host Institution (HI) NORCE NORWEGIAN RESEARCH CENTRE AS
Country Norway
Call Details Consolidator Grant (CoG), PE10, ERC-2018-COG
Summary Arctic sea ice decline is the clearest expression of the rapidly transforming Arctic climate. The ensuing local and global implications can be understood by studying past climate transitions, yet few methods are available to examine past Arctic sea ice cover, severely restricting our understanding of sea ice in the climate system. The decline in Arctic sea ice cover is a ‘canary in the coalmine’ for the state of our climate, and if greenhouse gas emissions remain unchecked, summer sea ice loss may pass a critical threshold that could drastically transform the Arctic. Because historical observations are limited, it is crucial to have reliable proxies for assessing natural sea ice variability, its stability and sensitivity to climate forcing on different time scales. Current proxies address aspects of sea ice variability, but are limited due to a selective fossil record, preservation effects, regional applicability, or being semi-quantitative. With such constraints on our knowledge about natural variations and drivers, major uncertainties about the future remain.
I propose to develop and apply a novel sea ice proxy that exploits genetic information stored in marine sediments, sedimentary ancient DNA (sedaDNA). This innovation uses the genetic signature of phytoplankton communities from surface waters and sea ice as it gets stored in sediments. This wealth of information has not been explored before for reconstructing sea ice conditions. Preliminary results from my cross-disciplinary team indicate that our unconventional approach can provide a detailed, qualitative account of past sea ice ecosystems and quantitative estimates of sea ice parameters. I will address fundamental questions about past Arctic sea ice variability on different timescales, information essential to provide a framework upon which to assess the ecological and socio-economic consequences of a changing Arctic. This new proxy is not limited to sea ice research and can transform the field of paleoceanography.
Max ERC Funding
2 615 858 €
Duration
Start date: 2019-08-01, End date: 2024-07-31
Project acronym APOCRYPHA
Project Storyworlds in Transition: Coptic Apocrypha in Changing Contexts in the Byzantine and Early Islamic Periods
Researcher (PI) Hugo Lundhaug
Host Institution (HI) UNIVERSITETET I OSLO
Country Norway
Call Details Consolidator Grant (CoG), SH5, ERC-2019-COG
Summary This project proposes the first systematic study of Coptic apocrypha covering the entire timespan of Coptic literary production, and it aims to do so with unprecedented methodological sophistication. Apocrypha are here defined as texts and traditions that (1) develop or expand upon characters and events of the biblical storyworld, and/or (2) contain a claim to authorship by a character from that storyworld or a direct witness to it. A great number of such apocryphal texts and traditions have been preserved in Coptic manuscripts from the fourth to the twelfth centuries. Most of these texts are attributed to apostles or other important early Christian figures, and over time such materials were also increasingly embedded in pseudepigraphical frames, such as in homilies attributed to later, but still early, heroes of the Church. The manuscripts in which this literature has been preserved were almost exclusively produced and used in Egyptian monasteries. Although the use of such apocrypha was at times controversial, the evidence clearly indicates the widespread use of such literature in Coptic monasteries over centuries, and this project will investigate the contents, development, and functions of apocrypha over time, as they were copied, adapted, and used in changing socio-religious contexts. The period covered by the project saw drastic changes in the religious landscape of Egypt, from its Christianity having a dominant position in the fourth century, through the marginalization of Egyptian Christianity in relation to the imperial Chalcedonian Church after 451, to a period of increasing marginalization in relation to Islam following the Arab conquest of Egypt in the mid-seventh century. The project will investigate how these changing contexts are reflected in the Coptic apocrypha that were copied and used in Egyptian monasteries, and what functions they had for their users throughout the period under investigation.
Max ERC Funding
1 998 626 €
Duration
Start date: 2020-08-01, End date: 2025-07-31
Project acronym ATRONICS
Project Creating building blocks for atomic-scale electronics
Researcher (PI) Dennis MEIER
Host Institution (HI) NORGES TEKNISK-NATURVITENSKAPELIGE UNIVERSITET NTNU
Country Norway
Call Details Consolidator Grant (CoG), PE3, ERC-2019-COG
Summary Interfaces in oxide materials offer amazing opportunities for fundamental and applied research, giving a new dimension to functional properties, such as magnetism, multiferroicity and superconductivity. Ferroelectric domain walls recently emerged as a new type of interface, where the dynamic characteristics of ferroelectricity introduce the element of spatial mobility, allowing for the real-time adjustment of position, density and orientation of the walls. This mobility adds an additional degree of flexibility that enables domain walls to take an active role in future devices and hold great potential as functional 2D systems for electronics.
Up to now, application concepts rely on injecting and deleting domain walls in micrometer-size devices to control electric conductivity. While this approach achieves a step beyond conventional interfaces by utilizing the wall mobility, it does not break the mould of classical device architectures. Completely new strategies are required to functionalize the versatile electronic properties and atomic-scale feature size of ferroelectric domain walls.
ATRONICS will establish a new conceptual approach for developing domain-wall-based technology. At the length scale of only a few atoms, we will use individual walls in improper ferroelectrics to emulate key electronic components such as diodes, transistors and logic gates. Crucially, as the functionality of the components is intrinsic to the domain walls, the walls themselves are the devices, instead of the previous approach of writing and erasing domain walls within a much larger classical device architecture. Beyond demonstrating individual devices, we will integrate multiple domain-wall devices, and develop quasi-2D circuitry and networks with a higher order of complexity than is currently achievable. ATRONICS will represent a major advancement in 2D functional materials for future technologies and play an essential role in the transition from nano- to atomic-scale electronics.
Max ERC Funding
1 845 338 €
Duration
Start date: 2020-09-01, End date: 2025-08-31
Project acronym BENEDICAMUS
Project Musical and Poetic Creativity for A Unique Moment in the Western Christian Liturgy, c.1000-1500
Researcher (PI) Catherine Anne Bradley
Host Institution (HI) UNIVERSITETET I OSLO
Country Norway
Call Details Consolidator Grant (CoG), SH5, ERC-2019-COG
Summary BENEDICAMUS pursues a transformative focus on creative practices surrounding a particular moment in the Western Christian liturgy: the exclamation Benedicamus Domino (“Let us Bless the Lord”), which sounded in song several times a day from c.1000 to 1500. This moment was granted special musical licence c.1000: singers of plainchant melodies could choose to reprise a favourite tune from the Church music for the day, re-texting it with the words Benedicamus Domino. In consequence, Benedicamus Domino enjoyed unprecedented longevity and significance as a focus of compositional interest, prompting some of the earliest experiments in multi-voiced polyphonic composition c.1100, as well as a lasting tradition of popular, devotional carols in the 1300s and 1400s. Histories of music have principally told the stories of particular composers, genres, institutions, or geographical centres. BENEDICAMUS undertakes the first longue durée study of musical and poetic responses to an exceptional liturgical moment, using this innovative perspective to work productively across established historiographical and disciplinary boundaries. Encompassing half a millennium of musical and ritual activity, hundreds of musical compositions, poetic texts, and manuscript sources, it offers pan-European perspectives on a chronologically and geographically diverse range of musical and poetic genres never before considered in conjunction. It develops new methods of music analysis to uncover traces of ad hoc or improvisatory performative practices that were not explicitly recorded in writing, forging interdisciplinary contexts for thinking about artistic creativity and experimentation in a time-period where these concepts have been little studied. BENEDICAMUS engages with the beginnings of musical and poetic genres and techniques that were crucial in shaping practices still current today, and reflects on music’s enduringly complex relationship with spirituality, ritual, and the sacred.
Max ERC Funding
1 990 329 €
Duration
Start date: 2020-09-01, End date: 2025-08-31
Project acronym Bits2Cosmology
Project Time-domain Gibbs sampling: From bits to inflationary gravitational waves
Researcher (PI) Hans Kristian ERIKSEN
Host Institution (HI) UNIVERSITETET I OSLO
Country Norway
Call Details Consolidator Grant (CoG), PE9, ERC-2017-COG
Summary The detection of primordial gravitational waves created during the Big Bang ranks among the greatest potential intellectual achievements in modern science. During the last few decades, the instrumental progress necessary to achieve this has been nothing short of breathtaking, and we today are able to measure the microwave sky with better than one-in-a-million precision. However, from the latest ultra-sensitive experiments such as BICEP2 and Planck, it is clear that instrumental sensitivity alone will not be sufficient to make a robust detection of gravitational waves. Contamination in the form of astrophysical radiation from the Milky Way, for instance thermal dust and synchrotron radiation, obscures the cosmological signal by orders of magnitude. Even more critical, though, are second-order interactions between this radiation and the instrument characterization itself that lead to a highly non-linear and complicated problem.
I propose a ground-breaking solution to this problem that allows for joint estimation of cosmological parameters, astrophysical components, and instrument specifications. The engine of this method is called Gibbs sampling, which I have already applied extremely successfully to basic CMB component separation. The new and critical step is to apply this method to raw time-ordered observations recorded directly by the instrument, as opposed to pre-processed frequency maps. While representing a ~100-fold increase in input data volume, this step is unavoidable in order to break through the current foreground-induced systematics floor. I will apply this method to the best currently available and future data sets (WMAP, Planck, SPIDER and LiteBIRD), and thereby derive the world's tightest constraint on the amplitude of inflationary gravitational waves. Additionally, the resulting ancillary science in the form of robust cosmological parameters and astrophysical component maps will represent the state-of-the-art in observational cosmology in years to come.
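Gibbs sampling, the engine named above, draws from a complicated joint distribution by repeatedly sampling each variable from its full conditional given the current values of the others. As a generic illustration of the technique (not the project's CMB pipeline), the sketch below samples a standard bivariate normal with correlation rho, whose full conditionals are themselves Gaussian; the function name and setup are hypothetical.

```python
import random

def gibbs_bivariate_normal(rho, n_samples=20000, burn_in=1000, seed=0):
    """Gibbs sampler for a standard bivariate normal with correlation rho.

    Each full conditional is itself Gaussian:
        x | y ~ N(rho * y, 1 - rho**2)
        y | x ~ N(rho * x, 1 - rho**2)
    Alternately drawing from the conditionals yields (after burn-in)
    samples from the joint distribution.
    """
    rng = random.Random(seed)
    sd = (1 - rho ** 2) ** 0.5  # conditional standard deviation
    x, y = 0.0, 0.0
    samples = []
    for i in range(n_samples + burn_in):
        x = rng.gauss(rho * y, sd)
        y = rng.gauss(rho * x, sd)
        if i >= burn_in:  # discard the warm-up portion of the chain
            samples.append((x, y))
    return samples

samples = gibbs_bivariate_normal(rho=0.8)
n = len(samples)
mean_x = sum(x for x, _ in samples) / n
corr = sum(x * y for x, y in samples) / n  # zero means, unit variances
```

In the project's setting the same cycling structure would run over cosmological parameters, foreground components, and instrument parameters rather than two scalar variables.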
Max ERC Funding
1 999 205 €
Duration
Start date: 2018-04-01, End date: 2023-03-31
Project acronym CARDYADS
Project Controlling Cardiomyocyte Dyadic Structure
Researcher (PI) William Edward Louch
Host Institution (HI) UNIVERSITETET I OSLO
Country Norway
Call Details Consolidator Grant (CoG), LS4, ERC-2014-CoG
Summary Contraction and relaxation of cardiac myocytes, and thus the whole heart, are critically dependent on dyads. These functional junctions between t-tubules, which are invaginations of the surface membrane, and the sarcoplasmic reticulum allow efficient control of calcium release into the cytosol, and also its removal. Dyads are formed gradually during development and break down during disease. However, the precise nature of dyadic structure is unclear, even in healthy adult cardiac myocytes, as are the triggers and consequences of altering dyadic integrity. In this proposal, my group will investigate the precise 3-dimensional arrangement of dyads and their proteins during development, adulthood, and heart failure by employing CLEM imaging (PALM and EM tomography). This will be accomplished by developing transgenic mice with fluorescent labels on four dyadic proteins (L-type calcium channel, ryanodine receptor, sodium-calcium exchanger, SERCA), and by imaging tissue from explanted normal and failing human hearts. The signals responsible for controlling dyadic formation, maintenance, and disruption will be determined by performing high-throughput sequencing to identify novel genes involved with these processes in several established model systems. Particular focus will be given to investigating left ventricular wall stress and stretch-dependent gene regulation as controllers of dyadic integrity. Candidate genes will be manipulated in cell models and transgenic animals to promote dyadic formation and maintenance, and reverse dyadic disruption in heart failure. The consequences of dyadic structure for function will be tested experimentally and with mathematical modeling to examine effects on cardiac myocyte calcium homeostasis and whole-heart function. The results of this project are anticipated to yield unprecedented insight into dyadic structure, regulation, and function, and to identify novel therapeutic targets for heart disease patients.
Max ERC Funding
2 000 000 €
Duration
Start date: 2015-07-01, End date: 2020-12-31
Project acronym CLIMSEC
Project Climate Variability and Security Threats
Researcher (PI) Halvard Buhaug
Host Institution (HI) INSTITUTT FOR FREDSFORSKNING
Country Norway
Call Details Consolidator Grant (CoG), SH2, ERC-2014-CoG
Summary Recent uprisings across the world have accentuated claims that food insecurity is an important trigger of political violence. Is the Arab Spring representative of a general climate-conflict pattern, where severe droughts and other climate anomalies are a key driving force? Research to date has failed to establish a robust relationship, and several notable theoretical and methodological shortcomings limit inference. CLIMSEC will address these research gaps. It asks: How does climate variability affect dynamics of political violence? This overarching research question will be addressed through the accomplishment of four key objectives: (1) Investigate how food security impacts of climate variability affect political violence; (2) Investigate how economic impacts of climate variability affect political violence; (3) Conduct short-term forecasts of political violence in response to food and economic shocks; and (4) Develop a comprehensive, testable theoretical model of security implications of climate variability. To achieve these objectives, CLIMSEC will advance the research frontier on theoretical as well as analytical accounts. Central in this endeavor is conceptual and empirical disaggregation. Instead of assuming states and calendar years as unitary and fixed entities, the project proposes causal processes that act at multiple temporal and spatial scales, involve various types of actors, and lead to very different forms of outcomes depending on the context. The empirical component will make innovative use of new geo-referenced data and methods; focus on a broad range of insecurity outcomes, including non-violent resistance; and combine rigorous statistical models with out-of-sample simulations and qualitative case studies for theorizing and validation of key findings.
Based at PRIO, the project will be led by Research Professor Halvard Buhaug, a leading scholar on climate change and security with a strong publication record and project management experience.
Max ERC Funding
1 996 945 €
Duration
Start date: 2015-09-01, End date: 2021-06-30
Project acronym CONSERVATION
Project The Economics and Politics of Conservation
Researcher (PI) Baard Gjul Harstad
Host Institution (HI) UNIVERSITETET I OSLO
Country Norway
Call Details Consolidator Grant (CoG), SH1, ERC-2015-CoG
Summary The UN’s approach to climate policy is to focus on national emission caps for greenhouse gases. Most of the economic theory on environmental agreements also studies such a demand-side approach, even though it is well known that this approach has several flaws, including carbon leakage and the incentive to free ride. Recent theory has suggested that a better approach may be to focus on the supply side of the equation, rather than the demand side. While this recent theory is promising, it is only indicative and has several shortcomings that must be analysed. The goal of this project is to investigate in depth how best to use conservation as an environmental policy tool. The project aims at integrating the theory of emissions and pollution with a model of extraction, and thus the supply of exhaustible resources, in a coherent and dynamic game-theoretic framework. I will apply this framework to analyse negotiations, agreements, and contracts on extraction levels, and how such policies can interact with, complement, or substitute for agreements focusing on consumption/emissions. It will also be important to develop and apply the tools of political economics to investigate which (second-best) agreement one may expect to be feasible as an equilibrium of the game. For highly asymmetric settings, where the possessors of the resource are few (such as for tropical forests), side transfers are necessary and contract theory will be the natural analytical tool when searching for the best agreement. However, standard contract theory itself also needs to be developed further once one recognizes that the “agent” in the principal-agent relationship is an organization or a government, rather than an individual.
Max ERC Funding
1 571 554 €
Duration
Start date: 2016-08-01, End date: 2021-07-31
Project acronym Cosmoglobe
Project Cosmoglobe -- mapping the universe from the Milky Way to the Big Bang
Researcher (PI) Ingunn Kathrine WEHUS
Host Institution (HI) UNIVERSITETET I OSLO
Country Norway
Call Details Consolidator Grant (CoG), PE9, ERC-2018-COG
Summary In the aftermath of the high-precision Planck and BICEP2 experiments, cosmology has undergone a critical transition. Before 2014, most breakthroughs came as direct results of improved detector technology and increased noise sensitivity. After 2014, the main source of uncertainty will be astrophysical foregrounds, typically in the form of dust or synchrotron emission from the Milky Way. Indeed, this holds as true for the study of reionization and the cosmic dawn as it does for the hunt for inflationary gravitational waves. To break through this obscuring veil, it is of utmost importance to optimally exploit every piece of available information, merging the world's best observational data with the world's most advanced theoretical models. A first step toward this ultimate goal was recently published as the Planck 2015 Astrophysical Baseline Model, an effort that I led and conducted.
Here I propose to build Cosmoglobe, a comprehensive model of the radio, microwave and sub-mm sky, covering 100 MHz to 10 THz in both intensity and polarization, extending existing models by three orders of magnitude in frequency and a factor of five in angular resolution. I will leverage a recent algorithmic breakthrough in multi-resolution component separation to jointly analyze some of the world's best data sets, including C-BASS, COMAP, PASIPHAE, Planck, SPIDER, WMAP and many more. This will result in the best cosmological (CMB, SZ, CIB etc.) and astrophysical (thermal and spinning dust, synchrotron and free-free emission etc.) component maps published to date. I will then use this model to derive the world's strongest limits on, and potentially detect, inflationary gravitational waves using SPIDER observations; forecast, optimize and analyze observations from the leading next-generation CMB experiments, including LiteBIRD and S4; and derive the first 3D large-scale structure maps from CO intensity mapping with COMAP, potentially opening up a new window on the cosmic dawn.
Max ERC Funding
1 999 382 €
Duration
Start date: 2019-06-01, End date: 2024-05-31