Project acronym 100 Archaic Genomes
Project Genome sequences from extinct hominins
Researcher (PI) Svante PÄÄBO
Host Institution (HI) MAX-PLANCK-GESELLSCHAFT ZUR FORDERUNG DER WISSENSCHAFTEN EV
Call Details Advanced Grant (AdG), LS2, ERC-2015-AdG
Summary Neandertals and Denisovans, an Asian group distantly related to Neandertals, are the closest evolutionary relatives of present-day humans. They are thus of direct relevance for understanding the origin of modern humans and how modern humans differ from their closest relatives. We will generate genome-wide data from a large number of Neandertal and Denisovan individuals from across their geographical and temporal range, as well as from other extinct hominin groups that we may discover. This will be possible by automating the highly sensitive approaches to ancient DNA extraction and DNA library construction that we have developed, so that they can be applied to many specimens from many sites in order to identify those that contain retrievable DNA. Whenever possible we will sequence whole genomes, and in other cases use DNA capture methods to generate high-quality data from representative parts of the genome. This will allow us to study the population history of Neandertals and Denisovans, elucidate how many times and where these extinct hominins contributed genes to present-day people, and determine the extent to which modern humans and archaic groups contributed genetically to Neandertals and Denisovans. By retrieving DNA from specimens that go back to the Middle Pleistocene, we will furthermore shed light on the early history and origins of Neandertals and Denisovans.
Max ERC Funding
2 350 000 €
Duration
Start date: 2016-11-01, End date: 2021-10-31
Project acronym 1st-principles-discs
Project A First Principles Approach to Accretion Discs
Researcher (PI) Martin Elias Pessah
Host Institution (HI) KOBENHAVNS UNIVERSITET
Call Details Starting Grant (StG), PE9, ERC-2012-StG_20111012
Summary Most celestial bodies, from planets to stars to black holes, gain mass during their lives by means of an accretion disc. Understanding the physical processes that determine the rate at which matter accretes and energy is radiated in these discs is vital for unraveling the formation, evolution, and fate of almost every type of object in the Universe. Despite the fact that magnetic fields have been known to be crucial in accretion discs since the early 90’s, the majority of astrophysical questions that depend on the details of how disc accretion proceeds are still being addressed using the “standard” accretion disc model (developed in the early 70’s), where magnetic fields do not play an explicit role. This has prevented us from fully exploring the astrophysical consequences and observational signatures of realistic accretion disc models, leading to a profound disconnect between observations (usually interpreted with the standard paradigm) and modern accretion disc theory and numerical simulations (where magnetic turbulence is crucial). The goal of this proposal is to use several complementary approaches in order to finally move beyond the standard paradigm. This program has two main objectives: 1) Develop the theoretical framework to incorporate magnetic fields, and the ensuing turbulence, into self-consistent accretion disc models, and investigate their observational implications. 2) Investigate transport and radiative processes in collisionless disc regions, where non-thermal radiation originates, by employing a kinetic particle description of the plasma. In order to achieve these goals, we will use, and build upon, state-of-the-art magnetohydrodynamic and particle-in-cell codes in conjunction with theoretical modeling.
This framework will make it possible to address fundamental questions on stellar and planet formation, binary systems with a compact object, and supermassive black hole feedback in a way that has no counterpart within the standard paradigm.
Max ERC Funding
1 793 697 €
Duration
Start date: 2013-02-01, End date: 2018-01-31
Project acronym 2-HIT
Project Genetic interaction networks: From C. elegans to human disease
Researcher (PI) Ben Lehner
Host Institution (HI) FUNDACIO CENTRE DE REGULACIO GENOMICA
Call Details Starting Grant (StG), LS2, ERC-2007-StG
Summary Most hereditary diseases in humans are genetically complex, resulting from combinations of mutations in multiple genes. However, synthetic interactions between genes are very difficult to identify in population studies because of a lack of statistical power, and we fundamentally do not understand how mutations interact to produce phenotypes. C. elegans is a unique animal in which genetic interactions can be rapidly identified in vivo using RNA interference, and we recently used this system to construct the first genetic interaction network for any animal, focused on signal transduction genes. The first objective of this proposal is to extend this work and map a comprehensive genetic interaction network for this model metazoan. This project will provide the first insights into the global properties of animal genetic interaction networks, and a comprehensive view of the functional relationships between genes in an animal. The second objective of the proposal is to use C. elegans to develop and experimentally validate integrated gene networks that connect genes to phenotypes and predict genetic interactions on a genome-wide scale. The methods that we develop and validate in C. elegans will then be applied to predict phenotypes and interactions for human genes. The final objective is to dissect the molecular mechanisms underlying genetic interactions, and to understand how these interactions evolve. The combined aim of these three objectives is to generate a framework for understanding and predicting how mutations interact to produce phenotypes, including in human disease.
Max ERC Funding
1 100 000 €
Duration
Start date: 2008-09-01, End date: 2014-04-30
Project acronym 3D-REPAIR
Project Spatial organization of DNA repair within the nucleus
Researcher (PI) Evanthia Soutoglou
Host Institution (HI) CENTRE EUROPEEN DE RECHERCHE EN BIOLOGIE ET MEDECINE
Call Details Consolidator Grant (CoG), LS2, ERC-2015-CoG
Summary Faithful repair of double-stranded DNA breaks (DSBs) is essential, as they are at the origin of genome instability, chromosomal translocations and cancer. Cells repair DSBs through different pathways, which can be faithful or mutagenic, and the balance between them at a given locus must be tightly regulated to preserve genome integrity. Although much is known about DSB repair factors, how the choice between pathways is controlled within the nuclear environment is not understood. We have shown that nuclear architecture and non-random genome organization determine the frequency of chromosomal translocations and that pathway choice is dictated by the spatial organization of DNA in the nucleus. Nevertheless, what determines which pathway is activated in response to DSBs at specific genomic locations is not understood. Furthermore, the impact of 3D-genome folding on the kinetics and efficiency of DSB repair is completely unknown.
Here we aim to understand how nuclear compartmentalization, chromatin structure and genome organization impact the efficiency of detection, signaling and repair of DSBs. We will unravel what determines DNA repair specificity within distinct nuclear compartments using protein tethering, promiscuous biotinylation and quantitative proteomics. We will determine how DNA repair is orchestrated at different heterochromatin structures using a CRISPR/Cas9-based system that allows, for the first time, robust induction of DSBs at specific heterochromatin compartments. Finally, we will investigate the role of 3D-genome folding in the kinetics of DNA repair and pathway choice using single-nucleotide-resolution DSB mapping coupled to 3D-topological maps.
This proposal has significant implications for understanding the mechanisms controlling DNA repair within the nuclear environment. It will reveal the regions of the genome that are susceptible to genomic instability and help us understand why certain mutations and translocations are recurrent in cancer.
Max ERC Funding
1 999 750 €
Duration
Start date: 2017-03-01, End date: 2022-02-28
Project acronym 3DEpi
Project Transgenerational epigenetic inheritance of chromatin states: the role of Polycomb and 3D chromosome architecture
Researcher (PI) Giacomo CAVALLI
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), LS2, ERC-2017-ADG
Summary Epigenetic inheritance entails transmission of phenotypic traits not encoded in the DNA sequence and, in the most extreme case, Transgenerational Epigenetic Inheritance (TEI) involves transmission of memory through multiple generations. Very little is known about the mechanisms governing TEI, and this is the subject of the present proposal. By transiently enhancing long-range chromatin interactions, we recently established isogenic Drosophila epilines that carry stable alternative epialleles, defined by differential levels of the Polycomb-dependent H3K27me3 mark. Furthermore, we extended our paradigm to natural phenotypes. These are ideal systems to study the role of Polycomb group (PcG) proteins and other components in regulating nuclear organization and epigenetic inheritance of chromatin states. The present project conjugates genetics, epigenomics, imaging and molecular biology to reach three critical aims.
Aim 1: Analysis of the molecular mechanisms regulating Polycomb-mediated TEI. We will identify the DNA, protein and RNA components that trigger and maintain transgenerational chromatin inheritance as well as their mechanisms of action.
Aim 2: Role of 3D genome organization in the regulation of TEI. We will analyze the developmental dynamics of TEI-inducing long-range chromatin interactions, identify chromatin components mediating 3D chromatin contacts and characterize their function in the TEI process.
Aim 3: Identification of a broader role of TEI during development. TEI might reflect a normal role of PcG components in the transmission of parental chromatin onto the next embryonic generation. We will explore this possibility by establishing other TEI paradigms and by relating TEI to the normal PcG function in these systems and in normal development.
This research program will unravel the biological significance and the molecular underpinnings of TEI and lead the way towards establishing this area of research into a consolidated scientific discipline.
Max ERC Funding
2 500 000 €
Duration
Start date: 2018-11-01, End date: 2023-10-31
Project acronym 3DICE
Project 3D Interstellar Chemo-physical Evolution
Researcher (PI) Valentine Wakelam
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE9, ERC-2013-StG
Summary At the end of their lives, stars spread their inner material into the diffuse interstellar medium. This diffuse medium gets locally denser and forms dark clouds (also called dense or molecular clouds), whose innermost part is shielded from the external UV field by the dust, allowing molecules to grow and become more complex. Gravitational collapse occurs inside these dense clouds, forming protostars and their surrounding disks, and eventually planetary systems like (or unlike) our solar system. The formation and evolution of molecules, minerals, ices and organics from the diffuse medium to planetary bodies, and their alteration or preservation throughout this cosmic chemical history, set the initial conditions for building planets, atmospheres and possibly the first bricks of life. The current view of interstellar chemistry is based on fragmentary studies of key observed steps in this sequence. The objective of this proposal is to follow the fractionation of the elements between the gas phase and the interstellar grains, from the most diffuse medium to protoplanetary disks, in order to constrain the chemical composition of the material in which planets are formed. The potential outcome of this project is a consistent and more accurate description of the chemical evolution of interstellar matter. To achieve this objective, I will improve our chemical model by adding new processes on grain surfaces relevant under diffuse-medium conditions. This upgraded gas-grain model will be coupled to 3D dynamical models of the formation of dense clouds from the diffuse medium and of protoplanetary disks from dense clouds. The computed chemical composition will also be used with 3D radiative transfer codes to study the chemical tracers of the physics of protoplanetary disk formation. The robustness of the model predictions will be studied with sensitivity analyses. Finally, model results will be confronted with observations to address some of the current challenges.
Max ERC Funding
1 166 231 €
Duration
Start date: 2013-09-01, End date: 2018-08-31
Project acronym 4C
Project 4C technology: uncovering the multi-dimensional structure of the genome
Researcher (PI) Wouter Leonard De Laat
Host Institution (HI) KONINKLIJKE NEDERLANDSE AKADEMIE VAN WETENSCHAPPEN - KNAW
Call Details Starting Grant (StG), LS2, ERC-2007-StG
Summary The architecture of DNA in the cell nucleus is an emerging epigenetic key contributor to genome function. We recently developed 4C technology, a high-throughput technique that combines state-of-the-art 3C technology with tailored microarrays to uniquely allow for an unbiased genome-wide search for DNA loci that interact in the nuclear space. Based on 4C technology, we were the first to provide a comprehensive overview of long-range DNA contacts of selected loci. The data showed that active and inactive chromatin domains contact many distinct regions within and between chromosomes, and that genes switch long-range DNA contacts in relation to their expression status. 4C technology not only allows investigating the three-dimensional structure of DNA in the nucleus, it also accurately reconstructs at least 10 megabases of the one-dimensional chromosome sequence map around the target sequence. Changes in this physical map as a result of genomic rearrangements are therefore identified by 4C technology. We recently demonstrated that 4C detects deletions, balanced inversions and translocations in patient samples at a resolution (~7kb) that allowed immediate sequencing of the breakpoints. Excitingly, 4C technology therefore offers the first high-resolution genomic approach that can identify both balanced and unbalanced genomic rearrangements. 4C is expected to become an important tool in clinical diagnosis and prognosis. Key objectives of this proposal are: 1. Explore the functional significance of DNA folding in the nucleus by systematically applying 4C technology to differentially expressed gene loci. 2. Adapt 4C technology such that it allows for massively parallel analysis of DNA interactions between regulatory elements and gene promoters. This method would greatly facilitate the identification of functionally relevant DNA elements in the genome. 3. Develop 4C technology into a clinical diagnostic tool for the accurate detection of balanced and unbalanced rearrangements.
Max ERC Funding
1 225 000 €
Duration
Start date: 2008-09-01, End date: 2013-08-31
Project acronym 4D-GenEx
Project Spatio-temporal Organization and Expression of the Genome
Researcher (PI) Antoine COULON
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), LS2, ERC-2017-STG
Summary This project investigates the two-way relationship between spatio-temporal genome organization and coordinated gene regulation, through an approach at the interface between physics, computer science and biology.
In the nucleus, preferred positions are observed from chromosomes to single genes, in relation to normal and pathological cellular states. Evidence indicates a complex spatio-temporal coupling between co-regulated genes: e.g. certain genes cluster spatially when responding to similar factors and transcriptional noise patterns suggest domain-wide mechanisms. Yet, no individual experiment allows probing transcriptional coordination in 4 dimensions (FISH, live locus tracking, Hi-C...). Interpreting such data also critically requires theory (stochastic processes, statistical physics…). A lack of appropriate experimental/analytical approaches is impairing our understanding of the 4D genome.
Our proposal combines cutting-edge single-molecule imaging, signal-theory data analysis and physical modeling to study how genes coordinate in space and time in a single nucleus. Our objectives are to understand (a) competition/recycling of shared resources between genes within subnuclear compartments, (b) how enhancers communicate with genes domain-wide, and (c) the role of local conformational dynamics and supercoiling in gene co-regulation. Our organizing hypothesis is that, by acting on their microenvironment, genes shape their co-expression with other genes.
Building upon my expertise, we will use dual-color MS2/PP7 RNA labeling to visualize for the first time transcription and motion of pairs of hormone-responsive genes in real time. With our innovative signal analysis tools, we will extract spatio-temporal signatures of underlying processes, which we will investigate with stochastic modeling and validate through experimental perturbations. We expect to uncover how the functional organization of the linear genome relates to its physical properties and dynamics in 4D.
Max ERC Funding
1 499 750 €
Duration
Start date: 2018-04-01, End date: 2023-03-31
Project acronym 4PI-SKY
Project 4 pi sky: Extreme Astrophysics with Revolutionary Radio Telescopes
Researcher (PI) Robert Philip Fender
Host Institution (HI) THE CHANCELLOR, MASTERS AND SCHOLARS OF THE UNIVERSITY OF OXFORD
Call Details Advanced Grant (AdG), PE9, ERC-2010-AdG_20100224
Summary Extreme astrophysical events such as relativistic flows, cataclysmic explosions and black hole accretion form one of the key areas for astrophysics in the 21st century. The extremes of physics experienced in these environments are beyond anything achievable in any laboratory on Earth, and provide a unique glimpse at the laws of physics operating in extraordinary regimes. All of these events are associated with transient radio emission, a tracer both of the acceleration of particles to relativistic energies, and coherent emitting regions with huge effective temperatures. By studying radio bursts from these phenomena we can pinpoint the sources of explosive events, understand the budget of kinetic feedback by explosive events in the ambient medium, and probe the physical state of the universe back to the epoch of reionisation, less than a billion years after the big bang. In seeking to push back the frontiers of extreme astrophysics, I will use a trio of revolutionary new radio telescopes, LOFAR, ASKAP and MeerKAT, pathfinders for the Square Kilometre Array, and all facilities in which I have a major role in the search for transients. I will build an infrastructure which transforms their combined operations for the discovery, classification and reporting of transient astrophysical events, over the whole sky, making them much more than the sum of their parts. This will include development of environments for the coordinated handling of extreme astrophysical events, in real time, via automated systems, as well as novel techniques for the detection of these events in a sea of noise. I will furthermore augment this program by buying in as a major partner to a rapid-response robotic optical telescope, and by cementing my relationship with an orbiting X-ray facility. This multiwavelength dimension will secure the astrophysical interpretation of our observational results and help to revolutionise high-energy astrophysics via a strong scientific exploitation program.
Max ERC Funding
2 999 847 €
Duration
Start date: 2011-07-01, End date: 2017-06-30
Project acronym A-BINGOS
Project Accreting binary populations in Nearby Galaxies: Observations and Simulations
Researcher (PI) Andreas Zezas
Host Institution (HI) IDRYMA TECHNOLOGIAS KAI EREVNAS
Call Details Consolidator Grant (CoG), PE9, ERC-2013-CoG
Summary "High-energy observations of our Galaxy offer a good, albeit not complete, picture of the X-ray source populations, in particular the accreting binary sources. The recent ability to study accreting binaries in nearby galaxies has shown that we would be short-sighted if we restricted ourselves to our Galaxy or to a few nearby ones. I propose an ambitious project that involves a comprehensive study of all the galaxies within 10 Mpc for which we can study in detail their X-ray sources and stellar populations. The study will combine data from a unique suite of observatories (Chandra, XMM-Newton, HST, Spitzer) with state-of-the-art theoretical modelling of binary systems. I propose a novel approach that links the accreting binary populations to their parent stellar populations and surpasses any current studies of X-ray binary populations, both in scale and in scope, by: (a) combining methods and results from several different areas of astrophysics (compact objects, binary systems, stellar populations, galaxy evolution); (b) using data from almost the whole electromagnetic spectrum (infrared to X-ray bands); (c) identifying and studying the different sub-populations of accreting binaries; and (d) performing direct comparison between observations and theoretical predictions, over a broad parameter space. The project: (a) will answer the long-standing question of the formation efficiency of accreting binaries in different environments; and (b) will constrain their evolutionary paths. As by-products the project will provide eagerly awaited input to the fields of gravitational-wave sources, γ-ray bursts, and X-ray emitting galaxies at cosmological distances and it will produce a heritage multi-wavelength dataset and library of models for future studies of galaxies and accreting binaries."
Max ERC Funding
1 242 000 €
Duration
Start date: 2014-04-01, End date: 2019-03-31
Project acronym AARTFAAC
Project Amsterdam-ASTRON Radio Transient Facility And Analysis Centre: Probing the Extremes of Astrophysics
Researcher (PI) Ralph Antoine Marie Joseph Wijers
Host Institution (HI) UNIVERSITEIT VAN AMSTERDAM
Call Details Advanced Grant (AdG), PE9, ERC-2009-AdG
Summary Some of the most extreme tests of physical law come from its manifestations in the behaviour of black holes and neutron stars, and as such these objects should be used as fundamental physics labs. Due to advances in both theoretical work and observational techniques, I have a major opportunity now to significantly push this agenda forward and get better answers to questions like: How are black holes born? How can energy be extracted from black holes? What is the origin of magnetic fields and cosmic rays in jets and shocks? Is their primary energy stream hadronic or magnetic? I propose to do this by exploiting the advent of wide-field radio astronomy: extreme objects are very rare and usually transient, so not only must one survey large areas of sky, but also must one do this often. I propose to form and shape a group that will use the LOFAR wide-field radio telescope to hunt for these extreme transients and systematically collect enough well-documented examples of the behaviour of each type of transient. Furthermore, I propose to expand LOFAR with a true 24/7 all-sky monitor to catch and study even the rarest of events. Next, I will use my experience in gamma-ray burst followup to conduct a vigorous multi-wavelength programme of study of these objects, to constrain their physics from as many angles as possible. This will eventually include results from multi-messenger astrophysics, in which we use neutrinos, gravity waves, and other non-electromagnetic messengers as extra diagnostics of the physics of these sources. Finally, I will build on my experience in modelling accretion phenomena and relativistic explosions to develop a theoretical framework for these phenomena and constrain the resulting models with the rich data sets we obtain.
Max ERC Funding
3 499 128 €
Duration
Start date: 2010-10-01, End date: 2016-09-30
Project acronym ACCOMPLI
Project Assembly and maintenance of a co-regulated chromosomal compartment
Researcher (PI) Peter Burkhard Becker
Host Institution (HI) LUDWIG-MAXIMILIANS-UNIVERSITAET MUENCHEN
Call Details Advanced Grant (AdG), LS2, ERC-2011-ADG_20110310
Summary "Eukaryotic nuclei are organised into functional compartments – local microenvironments that are enriched in certain molecules or biochemical activities and therefore specify localised functional outputs. Our study seeks to unveil fundamental principles of co-regulation of genes in a chromosomal compartment and the preconditions for homeostasis of such a compartment in the dynamic nuclear environment.
The dosage-compensated X chromosome of male Drosophila flies satisfies the criteria for a functional compartment. It is rendered structurally distinct from all other chromosomes by association of a regulatory ribonucleoprotein ‘Dosage Compensation Complex’ (DCC), enrichment of histone modifications and global decondensation. As a result, most genes on the X chromosome are co-ordinately activated. Autosomal genes inserted into the X acquire X-chromosomal features and are subject to the X-specific regulation.
We seek to uncover the molecular principles that initiate, establish and maintain the dosage-compensated chromosome. We will follow the kinetics of DCC assembly and the timing of association with different types of chromosomal targets in nuclei with high spatial resolution afforded by sub-wavelength microscopy and deep sequencing of DNA binding sites. We will characterise DCC sub-complexes with respect to their roles as kinetic assembly intermediates or as representations of local, functional heterogeneity. We will evaluate the roles of a novel DCC-associated ubiquitin ligase activity for homeostasis.
Crucial to the recruitment of the DCC and its distribution to target genes are non-coding roX RNAs that are transcribed from the X. We will determine the secondary structure ‘signatures’ of roX RNAs in vitro and determine the binding sites of the protein subunits in vivo. By biochemical and cellular reconstitution we will test the hypothesis that roX-encoded RNA aptamers orchestrate the assembly of the DCC and contribute to the exquisite targeting of the complex."
Max ERC Funding
2 482 770 €
Duration
Start date: 2012-02-01, End date: 2017-01-31
Project acronym ACTIVATION OF XCI
Project Molecular mechanisms controlling X chromosome inactivation
Researcher (PI) Joost Henk Gribnau
Host Institution (HI) ERASMUS UNIVERSITAIR MEDISCH CENTRUM ROTTERDAM
Call Details Starting Grant (StG), LS2, ERC-2010-StG_20091118
Summary In mammals, gene dosage of X-chromosomal genes is equalized between sexes by random inactivation of either one of the two X chromosomes in female cells. In the initial phase of X chromosome inactivation (XCI), a counting and initiation process determines the number of X chromosomes per nucleus, and elects the future inactive X chromosome (Xi). Xist is an X-encoded gene that plays a crucial role in the XCI process. At the start of XCI Xist expression is up-regulated and Xist RNA accumulates on the future Xi thereby initiating silencing in cis. Recent work performed in my laboratory indicates that the counting and initiation process is directed by a stochastic mechanism, in which each X chromosome has an independent probability to be inactivated. We also found that this probability is determined by the X:ploidy ratio. These results indicated the presence of at least one X-linked activator of XCI. With a BAC screen we recently identified X-encoded RNF12 to be a dose-dependent activator of XCI. Expression of RNF12 correlates with Xist expression, and a heterozygous deletion of Rnf12 results in a marked loss of XCI in female cells. The presence of a small proportion of cells that still initiate XCI, in Rnf12+/- cells, also indicated that more XCI-activators are involved in XCI. Here, we propose to investigate the molecular mechanism by which RNF12 activates XCI in mouse and human, and to search for additional XCI-activators. We will also attempt to establish the role of different inhibitors of XCI, including CTCF and the pluripotency factors OCT4, SOX2 and NANOG. We anticipate that these studies will significantly advance our understanding of XCI mechanisms, which is highly relevant for a better insight in the manifestation of X-linked diseases that are affected by XCI.
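As an illustrative aside (not part of the proposal), the stochastic counting model described above — each X chromosome carrying an independent probability of being inactivated, with that probability set by factors such as the X:ploidy ratio — can be sketched in a few lines. The function name and the probability value are hypothetical choices for the sketch, not values from the project:

```python
import random

def simulate_xci(n_cells, p_inactivate, n_x=2, seed=0):
    """Toy stochastic XCI model: each of n_x X chromosomes in a cell is
    independently inactivated with probability p_inactivate. Returns the
    fraction of cells ending up with exactly one inactive X (Xi), the
    normal outcome in diploid female cells."""
    rng = random.Random(seed)
    one_xi = 0
    for _ in range(n_cells):
        n_inactivated = sum(rng.random() < p_inactivate for _ in range(n_x))
        if n_inactivated == 1:
            one_xi += 1
    return one_xi / n_cells

# With an assumed per-chromosome probability of 0.5, independence predicts
# that about half of the cells inactivate exactly one X in a single round
# (2 * 0.5 * 0.5 = 0.5); the remainder would need further rounds or selection.
frac = simulate_xci(100_000, 0.5)
```

Under independence, lowering the per-chromosome probability (as a heterozygous Rnf12 deletion would, if RNF12 acts as a dose-dependent activator) directly lowers the fraction of cells that initiate XCI, which is the qualitative pattern the summary describes.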
Max ERC Funding
1 500 000 €
Duration
Start date: 2011-04-01, End date: 2016-03-31
Project acronym ADULT
Project Analysis of the Dark Universe through Lensing Tomography
Researcher (PI) Hendrik Hoekstra
Host Institution (HI) UNIVERSITEIT LEIDEN
Call Details Starting Grant (StG), PE9, ERC-2011-StG_20101014
Summary The discoveries that the expansion of the universe is accelerating due to an unknown “dark energy” and that most of the matter is invisible highlight our lack of understanding of the major constituents of the universe. These surprising findings set the stage for research in cosmology at the start of the 21st century. The objective of this proposal is to advance observational constraints to a level where we can distinguish between physical mechanisms that aim to explain the properties of dark energy and the observed distribution of dark matter throughout the universe. We use a relatively new technique called weak gravitational lensing: the accurate measurement of correlations in the orientations of distant galaxies enables us to map the dark matter distribution directly and to extract the cosmological information that is encoded by the large-scale structure.
To study the dark universe we will analyse data from a new state-of-the-art imaging survey: the Kilo-Degree Survey (KiDS) will cover 1500 square degrees in 9 filters. The combination of its large survey area and the availability of exquisite photometric redshifts for the sources makes KiDS the first project that can place interesting constraints on the dark energy equation of state using lensing data alone. Combined with complementary results from Planck, our measurements will provide one of the best views of the dark side of the universe before much larger space-based projects commence.
To reach the desired accuracy we need to carefully measure the shapes of distant background galaxies. We also need to account for any intrinsic alignments that arise due to tidal interactions, rather than through lensing. Reducing these observational and physical biases to negligible levels is a necessary step to ensure the success of KiDS and an important part of our preparation for more challenging projects such as the European-led space mission Euclid.
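As an illustrative aside (not part of the proposal), the core weak-lensing measurement — averaging the ellipticities of many background galaxies so that random intrinsic shapes cancel and a small coherent shear survives — can be sketched as follows. The numbers (injected shear, intrinsic shape scatter, sample size) are assumed values for the sketch:

```python
import numpy as np

def mean_shear(e1, e2):
    """Estimate the coherent shear from observed galaxy ellipticity
    components. Intrinsic orientations are random, so averaging over
    many galaxies suppresses the intrinsic part by ~1/sqrt(N) and
    leaves the lensing-induced shear."""
    return float(np.mean(e1)), float(np.mean(e2))

rng = np.random.default_rng(42)
n = 200_000                                  # assumed galaxy count
g1_true = 0.02                               # assumed injected shear
sigma_e = 0.25                               # assumed intrinsic scatter
e1 = rng.normal(0.0, sigma_e, n) + g1_true   # intrinsic shape + shear
e2 = rng.normal(0.0, sigma_e, n)             # no shear in this component
g1_est, g2_est = mean_shear(e1, e2)
```

The statistical error on the mean is sigma_e / sqrt(n), here about 0.0006, so a 2% shear is recovered clearly; this scaling is why lensing surveys need both large area and careful control of the per-galaxy shape biases and intrinsic alignments the summary emphasises.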
Max ERC Funding
1 316 880 €
Duration
Start date: 2012-01-01, End date: 2016-12-31
Project acronym AEROSOL
Project Astrochemistry of old stars: direct probing of unique chemical laboratories
Researcher (PI) Leen Katrien Els Decin
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Consolidator Grant (CoG), PE9, ERC-2014-CoG
Summary The gas and dust in the interstellar medium (ISM) drive the chemical evolution of galaxies, the formation of stars and planets, and the synthesis of complex prebiotic molecules. The prime birth places for this interstellar material are the winds of evolved (super)giant stars. These winds are unique chemical laboratories, in which a large variety of gas and dust species radially expand away from the star.
Recent progress on the observations of these winds has been impressive thanks to Herschel and ALMA. The next challenge is to unravel the wealth of chemical information contained in these data. This is an ambitious task since (1) a plethora of physical and chemical processes interact in a complex way, (2) laboratory data to interpret these interactions are lacking, and (3) theoretical tools to analyse the data do not meet current needs.
To boost the knowledge of the physics and chemistry characterizing these winds, I propose a world-leading multi-disciplinary project combining (1) high-quality data, (2) novel theoretical wind models, and (3) targeted laboratory experiments. The aim is to pinpoint the dominant chemical pathways, unravel the transition from gas-phase to dust species, elucidate the role of clumps on the overall wind structure, and study the reciprocal effect between various dynamical and chemical phenomena.
Now is the right time for this ambitious project thanks to the availability of (1) high-quality multi-wavelength data, including ALMA and Herschel data of the PI, (2) supercomputers enabling a homogeneous analysis of the data using sophisticated theoretical wind models, and (3) novel laboratory equipment to measure the gas-phase reaction rates of key species.
This project will have far-reaching impact on (1) the field of evolved stars, (2) the understanding of the chemical lifecycle of the ISM, (3) chemical studies of dynamically more complex systems, such as exoplanets, protostars, supernovae etc., and (4) it will guide new instrument development.
Max ERC Funding
2 605 897 €
Duration
Start date: 2016-01-01, End date: 2020-12-31
Project acronym AGELESS
Project Comparative genomics / ‘wildlife’ transcriptomics uncovers the mechanisms of halted ageing in mammals
Researcher (PI) Emma Teeling
Host Institution (HI) UNIVERSITY COLLEGE DUBLIN, NATIONAL UNIVERSITY OF IRELAND, DUBLIN
Call Details Starting Grant (StG), LS2, ERC-2012-StG_20111109
Summary "Ageing is the gradual and irreversible breakdown of living systems associated with the advancement of time, which leads to an increase in vulnerability and eventual mortality. Despite recent advances in ageing research, the intrinsic complexity of the ageing process has prevented a full understanding of this process; therefore, ageing remains a grand challenge in contemporary biology. In AGELESS, we will tackle this challenge by uncovering the molecular mechanisms of halted ageing in a unique model system, the bats. Bats are the longest-lived mammals relative to their body size, and defy the ‘rate-of-living’ theories as they use twice as much energy as other species of comparable size, but live far longer. This suggests that bats have some underlying mechanisms that may explain their exceptional longevity. In AGELESS, we will identify the molecular mechanisms that enable mammals to achieve extraordinary longevity, using state-of-the-art comparative genomic methodologies focused on bats. We will identify, using population transcriptomics and telomere/mtDNA genomics, the molecular changes that occur in an ageing wild population of bats to uncover how bats ‘age’ so slowly compared with other mammals. In silico whole genome analyses, field based ageing transcriptomic data, mtDNA and telomeric studies will be integrated and analysed using a networks approach, to ascertain how these systems interact to halt ageing. For the first time, we will be able to utilize the diversity seen within nature to identify key molecular targets and regions that regulate and control ageing in mammals. AGELESS will provide a deeper understanding of the causal mechanisms of ageing, potentially uncovering the crucial molecular pathways that can be modified to halt, alleviate and perhaps even reverse this process in man."
Max ERC Funding
1 499 768 €
Duration
Start date: 2013-01-01, End date: 2017-12-31
Project acronym Agglomerates
Project Infinite Protein Self-Assembly in Health and Disease
Researcher (PI) Emmanuel Doram LEVY
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Consolidator Grant (CoG), LS2, ERC-2018-COG
Summary Understanding how proteins respond to mutations is of paramount importance to biology and disease. While protein stability and misfolding have been instrumental in rationalizing the impact of mutations, we recently discovered that an alternative route is also frequent, where mutations at the surface of symmetric proteins trigger novel self-interactions that lead to infinite self-assembly. This mechanism can be involved in disease, as in sickle-cell anemia, but may also serve in adaptation. Importantly, it differs fundamentally from aggregation, because misfolding does not drive it. Thus, we term it “agglomeration”. The ease with which agglomeration can occur, even by single point mutations, shifts the paradigm of how quickly new protein assemblies can emerge, both in health and disease. This prompts us to determine the basic principles of protein agglomeration and explore its implications in cell physiology and human disease.
We propose an interdisciplinary research program bridging atomic and cellular scales to explore agglomeration in three aims: (i) Map the landscape of protein agglomeration in response to mutation in endogenous yeast proteins; (ii) Characterize how yeast physiology impacts agglomeration by changes in gene expression or cell state, and, conversely, how protein agglomerates impact yeast fitness. (iii) Analyze agglomeration in relation to human disease via two approaches. First, by predicting single nucleotide polymorphisms that trigger agglomeration, prioritizing them using knowledge from Aims 1 & 2, and characterizing them experimentally. Second, by providing a proof-of-concept that agglomeration can be exploited in drug design, whereby drugs induce agglomerate formation, much as mutations do.
Overall, through this research, we aim to establish agglomeration as a paradigm for protein assembly, with implications for our understanding of evolution, physiology, and disease.
Max ERC Funding
2 574 819 €
Duration
Start date: 2019-04-01, End date: 2024-03-31
Project acronym AGNES
Project ACTIVE AGEING – RESILIENCE AND EXTERNAL SUPPORT AS MODIFIERS OF THE DISABLEMENT OUTCOME
Researcher (PI) Taina Tuulikki RANTANEN
Host Institution (HI) JYVASKYLAN YLIOPISTO
Call Details Advanced Grant (AdG), SH3, ERC-2015-AdG
Summary The goals are 1. To develop a scale assessing the diversity of active ageing with four dimensions that are ability (what people can do), activity (what people do do), ambition (what are the valued activities that people want to do), and autonomy (how satisfied people are with the opportunity to do valued activities); 2. To examine health and physical and psychological functioning as the determinants, and the social and built environment, resilience and personal skills as modifiers, of active ageing; 3. To develop a multicomponent sustainable intervention aiming to promote active ageing (methods: counselling, information technology, help from volunteers); 4. To test the feasibility and effectiveness of the intervention; and 5. To study cohort effects on the phenotypes on the pathway to active ageing.
“If You Can Measure It, You Can Change It.” Active ageing assessment needs conceptual progress, which I propose to make. A quantifiable scale will be developed that captures the diversity of active ageing, stemming from the WHO definition of active ageing as the process of optimizing opportunities for health and participation in society for all people in line with their needs, goals and capacities as they age. I will collect cross-sectional data (N=1000, ages 75, 80 and 85 years) and model the pathway to active ageing with state-of-the-art statistical methods. By doing this I will create novel knowledge on preconditions for active ageing. The collected cohort data will be compared to pre-existing cohort data collected 25 years ago to obtain knowledge about changes over time in the functioning of older people. A randomized controlled trial (N=200) will be conducted to assess the effectiveness of the envisioned intervention promoting active ageing through participation. The project will regenerate ageing research by launching a novel scale, by training young scientists, by creating new concepts and developing theory, and by producing evidence for active ageing promotion.
Max ERC Funding
2 044 364 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym AIDA
Project An Illumination of the Dark Ages: modeling reionization and interpreting observations
Researcher (PI) Andrei Albert Mesinger
Host Institution (HI) SCUOLA NORMALE SUPERIORE
Call Details Starting Grant (StG), PE9, ERC-2014-STG
Summary Understanding the dawn of the first galaxies and how their light permeated the early Universe is at the very frontier of modern astrophysical cosmology. Generous resources, including ambitious observational programs, are being devoted to studying these epochs of Cosmic Dawn (CD) and Reionization (EoR). In order to interpret these observations, we propose to build on our widely-used, semi-numeric simulation tool, 21cmFAST, and apply it to observations. Using sub-grid, semi-analytic models, we will incorporate additional physical processes governing the evolution of sources and sinks of ionizing photons. The resulting state-of-the-art simulations will be well poised to interpret topical observations of quasar spectra and the cosmic 21cm signal. They would be both physically-motivated and fast, allowing us to rapidly explore astrophysical parameter space. We will statistically quantify the resulting degeneracies and constraints, providing a robust answer to the question, "What can we learn from EoR/CD observations?" As an end goal, these investigations will help us understand when the first generations of galaxies formed, how they drove the EoR, and what are the associated large-scale observational signatures.
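The rapid parameter-space exploration that a fast forward model enables can be caricatured with a toy two-parameter grid posterior; the function names, the grid approach, and the Gaussian toy likelihood below are invented for illustration and are not part of 21cmFAST or the AIDA pipeline:

```python
import math
import itertools

def grid_posterior(loglike, grid_a, grid_b):
    """Evaluate a log-likelihood over a 2-D parameter grid and normalize
    to a discrete posterior. A fast forward model is what makes this kind
    of brute-force exploration (or its MCMC alternatives) affordable."""
    logw = {(a, b): loglike(a, b) for a, b in itertools.product(grid_a, grid_b)}
    m = max(logw.values())                       # subtract max for numerical stability
    w = {k: math.exp(v - m) for k, v in logw.items()}
    z = sum(w.values())
    return {k: v / z for k, v in w.items()}

def marginal(post, axis):
    """Marginalize the discrete posterior onto one parameter (axis 0 or 1);
    broad marginals are one way degeneracies between parameters show up."""
    out = {}
    for k, p in post.items():
        out[k[axis]] = out.get(k[axis], 0.0) + p
    return out
```

For example, with a toy log-likelihood `lambda a, b: -((a - 1) ** 2 + (b - 2) ** 2)`, the highest-probability grid cell is the one nearest (1, 2) whenever those values lie on the grid.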
Max ERC Funding
1 468 750 €
Duration
Start date: 2015-05-01, End date: 2021-01-31
Project acronym ALERT
Project ALERT - The Apertif-LOFAR Exploration of the Radio Transient Sky
Researcher (PI) Albert Van Leeuwen
Host Institution (HI) STICHTING ASTRON, NETHERLANDS INSTITUTE FOR RADIO ASTRONOMY
Call Details Consolidator Grant (CoG), PE9, ERC-2013-CoG
Summary In our largely unchanging radio Universe, a highly dynamic component was recently discovered: flashes of bright radio emission that last only milliseconds but appear all over the sky. Some of these radio bursts can be traced to intermittently pulsating neutron stars. Other bursts, however, apparently originate far outside our Galaxy. Due to great observational challenges, the evolution of the neutron stars is not understood, while more importantly, the nature of the extragalactic bursts remains an outright mystery.
My overall aim is to understand the physics that drives both kinds of brief and luminous bursts.
My primary goal is to identify the highly compact astrophysical explosions powering the extragalactic bursts. My previous surveys are the state of the art in fast-transient detection; I will now increase by a factor of 10 this exploration volume. In real-time I will provide arcsec positions, 10,000-fold more accurate than currently possible, to localize such extragalactic bursts for the first time and understand their origin.
My secondary goal is to unravel the unexplained evolution of intermittently pulsating neutron stars (building on e.g., my recent papers in Science, 2013), by doubling their number and modeling their population.
To achieve these goals, I will carry out a highly innovative survey: the Apertif-LOFAR Exploration of the Radio Transient Sky. ALERT is over an order of magnitude more sensitive than all current state-of-the-art fast-transient surveys.
Through its novel, extremely wide field-of-view, Westerbork/Apertif will detect many tens of extragalactic bursts. Through real-time triggers to LOFAR I will next provide the precise localisation that is essential for radio, optical and high-energy follow-up to, for the first time, shed light on the physics and objects driving these bursts – evaporating primordial black holes; explosions in host galaxies; or, the unknown?
Max ERC Funding
1 999 823 €
Duration
Start date: 2014-12-01, End date: 2019-11-30
Project acronym AlgoFinance
Project Algorithmic Finance: Inquiring into the Reshaping of Financial Markets
Researcher (PI) Christian BORCH
Host Institution (HI) COPENHAGEN BUSINESS SCHOOL
Call Details Consolidator Grant (CoG), SH3, ERC-2016-COG
Summary Present-day financial markets are turning algorithmic, as market orders are increasingly being executed by fully automated computer algorithms, without any direct human intervention. Although algorithmic finance seems to fundamentally reshape the central dynamics in financial markets, and even though it prompts core sociological questions, it has not yet received any systematic attention. In a pioneering contribution to economic sociology and social studies of finance, ALGOFINANCE aims to understand how and with what consequences the turn to algorithms is changing financial markets. The overall concept and central contributions of ALGOFINANCE are the following: (1) on an intra-firm level, the project examines how the shift to algorithmic finance reshapes the ways in which trading firms operate, and does so by systematically and empirically investigating the reconfiguration of organizational structures and employee subjectivity; (2) on an inter-algorithmic level, it offers a ground-breaking methodology (agent-based modelling informed by qualitative data) to grasp how trading algorithms interact with one another in a fully digital space; and (3) on the level of market sociality, it proposes a novel theorization of how intra-firm and inter-algorithmic dynamics can be conceived of as introducing a particular form of sociality that is characteristic to algorithmic finance: a form of sociality-as-association heuristically analyzed as imitation. None of these three levels have received systematic attention in the state-of-the-art literature. Addressing them will significantly advance the understanding of present-day algorithmic finance in economic sociology. By contributing novel empirical, methodological, and theoretical understandings of the functioning and consequences of algorithms, ALGOFINANCE will pave the way for other research into digital sociology and the broader algorithmization of society.
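The inter-algorithmic level described above (agent-based modelling of interacting trading algorithms) can be illustrated with a deliberately minimal sketch of sociality-as-association through imitation; the agents, parameters and update rule here are illustrative assumptions, not the ALGOFINANCE model:

```python
import random

def simulate_imitation(n_agents=100, steps=50, p_imitate=0.3, seed=1):
    """Toy agent-based model of imitation among trading agents: each agent
    holds a position (+1 long / -1 short); at every step one randomly
    chosen agent either copies the position of an observed peer (with
    probability p_imitate) or repositions at random. Returns the net
    market sentiment (mean position) after each step."""
    rng = random.Random(seed)
    positions = [rng.choice([-1, 1]) for _ in range(n_agents)]
    sentiment = []
    for _ in range(steps):
        i = rng.randrange(n_agents)
        if rng.random() < p_imitate:
            j = rng.randrange(n_agents)
            positions[i] = positions[j]      # imitation: association with a peer
        else:
            positions[i] = rng.choice([-1, 1])
        sentiment.append(sum(positions) / n_agents)
    return sentiment
```

Raising `p_imitate` toward 1 makes positions increasingly correlated, a herding-like effect; a qualitative-data-informed model of the kind the project proposes would replace these random encounters with empirically grounded interaction rules.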
Max ERC Funding
1 590 036 €
Duration
Start date: 2017-05-01, End date: 2021-04-30
Project acronym ALLEGRO
Project unrAvelLing sLow modE travelinG and tRaffic: with innOvative data to a new transportation and traffic theory for pedestrians and bicycles
Researcher (PI) Serge Hoogendoorn
Host Institution (HI) TECHNISCHE UNIVERSITEIT DELFT
Call Details Advanced Grant (AdG), SH3, ERC-2014-ADG
Summary A major challenge in contemporary traffic and transportation theory is having a comprehensive understanding of pedestrian and cyclist behaviour. This is notoriously hard to observe, since sensors providing abundant and detailed information about key variables characterising this behaviour have not been available until very recently. The behaviour is also far more complex than that of the much better understood fast mode. This is due to the many degrees of freedom in decision-making, the interactions among slow traffic participants that are more involved and far less guided by traffic rules and regulations than those between car-drivers, and the many fascinating but complex phenomena in slow traffic flows (self-organised patterns, turbulence, spontaneous phase transitions, herding, etc.) that are very hard to predict accurately.
With slow traffic modes gaining ground in terms of mode share in many cities, lack of empirical insights, behavioural theories, predictively valid analytical and simulation models, and tools to support planning, design, management and control is posing a major societal problem as well: examples of major accidents due to bad planning, organisation and management of events are manifold, as are locations where safety of slow modes is a serious issue due to interactions with fast modes.
This programme is geared towards establishing a comprehensive theory of slow mode traffic behaviour, considering the different behavioural levels relevant for understanding, reproducing and predicting slow mode traffic flows in cities. The levels deal with walking and cycling operations, activity scheduling and travel behaviour, and knowledge representation and learning. Major scientific breakthroughs are expected at each of these levels, in terms of theory and modelling, by using innovative (big) data collection and experimentation, analysis and fusion techniques, including social media data analytics, using augmented reality, and remote and crowd sensing.
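At the walking-operations level, microscopic pedestrian models are often written as force-based update rules; the following one-dimensional, Helbing-style social-force step is purely illustrative (the parameter values and the model itself are assumptions, not part of the ALLEGRO proposal):

```python
import math

def social_force_step(x, v, goal, others, dt=0.1, tau=0.5, v0=1.3,
                      strength=2.0, scale=0.3):
    """One Euler step of a minimal 1-D social-force pedestrian model:
    the walker relaxes toward a desired speed v0 in the direction of the
    goal, while nearby pedestrians exert exponentially decaying repulsion.
    x, v: current position and velocity; others: neighbours' positions."""
    desired_dir = 1.0 if goal > x else -1.0
    accel = (v0 * desired_dir - v) / tau          # driving (relaxation) term
    for y in others:                              # repulsion from each neighbour
        d = x - y
        if d != 0.0:
            accel += strength * math.exp(-abs(d) / scale) * (1.0 if d > 0 else -1.0)
    v_new = v + accel * dt
    x_new = x + v_new * dt
    return x_new, v_new
```

A free walker starting at rest accelerates toward the goal, while a walker just behind another pedestrian is slowed by the repulsion term; self-organised phenomena like lane formation emerge only in the full 2-D, many-agent version of such models.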
Max ERC Funding
2 458 700 €
Duration
Start date: 2015-11-01, End date: 2020-10-31
Project acronym Allelic Regulation
Project Revealing Allele-level Regulation and Dynamics using Single-cell Gene Expression Analyses
Researcher (PI) Thore Rickard Hakan Sandberg
Host Institution (HI) KAROLINSKA INSTITUTET
Call Details Consolidator Grant (CoG), LS2, ERC-2014-CoG
Summary As diploid organisms inherit one gene copy from each parent, a gene can be expressed from both alleles (biallelic) or from only one allele (monoallelic). Although transcription from both alleles is detected for most genes in cell population experiments, little is known about allele-specific expression in single cells and its phenotypic consequences. To answer fundamental questions about allelic transcription heterogeneity in single cells, this research program will focus on single-cell transcriptome analyses with allelic-origin resolution. To this end, we will investigate both clonally stable and dynamic random monoallelic expression across a large number of cell types, including cells from embryonic and adult stages. This research program will be accomplished with the novel single-cell RNA-seq method developed within my lab to obtain quantitative, genome-wide gene expression measurements. To distinguish between mitotically stable and dynamic patterns of allelic expression, we will analyze large numbers of clonally related cells per cell type, from both primary cultures (in vitro) and using transgenic models to obtain clonally related cells in vivo.
The biological significance of the research program is first an understanding of allelic transcription, including the nature and extent of random monoallelic expression across in vivo tissues and cell types. These novel insights into allelic transcription will be important for an improved understanding of how variable phenotypes (e.g. incomplete penetrance and variable expressivity) can arise in genetically identical individuals. Additionally, the single-cell transcriptome analyses of clonally related cells in vivo will provide unique insights into the clonality of gene expression per se.
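The allele-resolved analysis the summary describes can be illustrated with a minimal classifier that calls a gene's state in one cell from maternal/paternal read counts, then labels a clone as mitotically stable or dynamic; the read-count thresholds and labels are illustrative assumptions, not the lab's published pipeline:

```python
def classify_allelic_state(maternal_reads, paternal_reads,
                           min_reads=5, mono_fraction=0.98):
    """Call one gene's expression state in one cell from allele-resolved
    read counts: 'maternal' / 'paternal' (monoallelic), 'biallelic', or
    'no_call' when coverage is too low to decide."""
    total = maternal_reads + paternal_reads
    if total < min_reads:
        return "no_call"
    frac = maternal_reads / total
    if frac >= mono_fraction:
        return "maternal"
    if frac <= 1.0 - mono_fraction:
        return "paternal"
    return "biallelic"

def clone_pattern(states):
    """Label the calls from a set of clonally related cells: mitotically
    stable monoallelic expression means every informative cell agrees on
    the expressed allele; otherwise monoallelic calls are dynamic."""
    called = [s for s in states if s != "no_call"]
    if not called:
        return "no_call"
    if all(s == "maternal" for s in called) or all(s == "paternal" for s in called):
        return "clonally_stable_monoallelic"
    if any(s in ("maternal", "paternal") for s in called):
        return "dynamic_monoallelic"
    return "biallelic"
```

In practice such calls must also account for allelic dropout and transcriptional bursting, which is why large numbers of clonally related cells per type are needed.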
Max ERC Funding
1 923 060 €
Duration
Start date: 2015-07-01, End date: 2020-06-30
Project acronym ANISOTROPIC UNIVERSE
Project The anisotropic universe -- a reality or fluke?
Researcher (PI) Hans Kristian Kamfjord Eriksen
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Starting Grant (StG), PE9, ERC-2010-StG_20091028
Summary During the last decade, a strikingly successful cosmological concordance model has been established. With only six free parameters, nearly all observables, comprising millions of data points, may be fitted with outstanding precision. However, in this beautiful picture a few "blemishes" have turned up, apparently not consistent with the standard model: While the model predicts that the universe is isotropic (i.e., looks the same in all directions) and homogeneous (i.e., the statistical properties are the same everywhere), subtle hints of the contrary are now seen. For instance, peculiar preferred directions and correlations are observed in the cosmic microwave background; some studies considering nearby galaxies suggest the existence of anomalous large-scale cosmic flows; a study of distant quasars hints towards unexpected large-scale correlations. All of these reports are individually highly intriguing, and together they hint toward a more complicated and interesting universe than previously imagined -- but none of the reports can be considered decisive. One major obstacle in many cases has been the relatively poor data quality.
This is currently about to change, as the next generation of new and far more powerful experiments are coming online. Of special interest to me are Planck, an ESA-funded CMB satellite currently taking data; QUIET, a ground-based CMB polarization experiment located in Chile; and various large-scale structure (LSS) data sets, such as the SDSS and 2dF surveys, and in the future Euclid, a proposed galaxy survey satellite also funded by ESA. By combining the world's best data from both CMB and LSS measurements, I will in the proposed project attempt to settle this question: Is our universe really anisotropic? Or are these recent claims only the results of systematic errors or statistical flukes? If the claims turn out to hold against this tide of new and high-quality data, then cosmology as a whole may need to be re-written.
Max ERC Funding
1 500 000 €
Duration
Start date: 2011-01-01, End date: 2015-12-31
Project acronym AnoPath
Project Genetics of mosquito resistance to pathogens
Researcher (PI) Kenneth Du Souchet Vernick
Host Institution (HI) INSTITUT PASTEUR
Call Details Advanced Grant (AdG), LS2, ERC-2012-ADG_20120314
Summary Malaria parasite infection in humans has been called “the strongest known force for evolutionary selection in the recent history of the human genome”, and I hypothesize that a similar statement may apply to the mosquito vector, which is the definitive host of the malaria parasite. We previously discovered efficient malaria-resistance mechanisms in natural populations of the African malaria vector, Anopheles gambiae. Aim 1 of the proposed project will implement a novel genetic mapping design to systematically survey the mosquito population for common and rare genetic variants of strong effect against the human malaria parasite, Plasmodium falciparum. A product of the mapping design will be living mosquito families carrying the resistance loci. Aim 2 will use the segregating families to functionally dissect the underlying molecular mechanisms controlled by the loci, including determination of the pathogen specificity spectra of the host-defense traits. Aim 3 targets arbovirus transmission: Anopheles mosquitoes transmit human malaria but not arboviruses such as dengue and chikungunya, even though Anopheles and the mosquitoes that do vector these arboviruses bite the same people and are exposed to the same pathogens, often in malaria-arbovirus co-infections. We will use deep sequencing to detect processing, by the mosquitoes' RNAi pathway, of the arbovirus dsRNA intermediates of replication. The results will reveal important new information about differences in the efficiency and quality of the RNAi response between mosquitoes, which is likely to underlie at least part of the host specificity of arbovirus transmission. The three Aims will make significant contributions to understanding malaria and arbovirus transmission, both major global public health problems; they will aid the development of a next generation of vector surveillance and control tools, and will produce a definitive description of the major genetic factors influencing host-pathogen interactions in mosquito immunity.
Max ERC Funding
2 307 800 €
Duration
Start date: 2013-03-01, End date: 2018-02-28
Project acronym ANOREP
Project Targeting the reproductive biology of the malaria mosquito Anopheles gambiae: from laboratory studies to field applications
Researcher (PI) Flaminia Catteruccia
Host Institution (HI) UNIVERSITA DEGLI STUDI DI PERUGIA
Call Details Starting Grant (StG), LS2, ERC-2010-StG_20091118
Summary Anopheles gambiae mosquitoes are the major vectors of malaria, a disease with devastating consequences for
human health. Novel methods for controlling the natural vector populations are urgently needed, given the
evolution of insecticide resistance in mosquitoes and the lack of novel insecticides. Understanding the
processes at the basis of mosquito biology may help to roll back malaria. In this proposal, we will target
mosquito reproduction, a major determinant of the An. gambiae vectorial capacity. This will be achieved at
two levels: (i) fundamental research, to provide a deeper knowledge of the processes regulating reproduction
in this species, and (ii) applied research, to identify novel targets and to develop innovative approaches for
the control of natural populations. We will focus our analysis on three major players of mosquito
reproduction: male accessory glands (MAGs), sperm, and spermatheca, in both laboratory and field settings.
We will then translate this information into the identification of inhibitors of mosquito fertility. The
experimental activities will be divided across three objectives. In Objective 1, we will unravel the role of the
MAGs in shaping mosquito fertility and behaviour, by performing a combination of transcriptional and
functional studies that will reveal the multifaceted activities of these tissues. In Objective 2 we will instead
focus on the identification of the male and female factors responsible for sperm viability and function.
Results obtained in both objectives will be validated in field mosquitoes. In Objective 3, we will perform
screens aimed at the identification of inhibitors of mosquito reproductive success. This study will reveal as
yet unknown molecular mechanisms underlying reproductive success in mosquitoes, considerably increasing
our knowledge beyond the state-of-the-art and critically contributing with innovative tools and ideas to the
fight against malaria.
Max ERC Funding
1 500 000 €
Duration
Start date: 2011-01-01, End date: 2015-12-31
Project acronym ANTHROPOID
Project Great ape organoids to reconstruct uniquely human development
Researcher (PI) Jarrett CAMP
Host Institution (HI) INSTITUT FUR MOLEKULARE UND KLINISCHE OPHTHALMOLOGIE BASEL
Call Details Starting Grant (StG), LS2, ERC-2018-STG
Summary Humans diverged from our closest living relatives, chimpanzees and other great apes, 6-10 million years ago. Since this divergence, our ancestors acquired genetic changes that enhanced cognition, altered metabolism, and endowed our species with an adaptive capacity to colonize the entire planet and reshape the biosphere. Through genome comparisons between modern humans, Neandertals, chimpanzees and other apes we have identified genetic changes that likely contribute to innovations in human metabolic and cognitive physiology. However, it has been difficult to assess the functional effects of these genetic changes due to the lack of cell culture systems that recapitulate great ape organ complexity. Human and chimpanzee pluripotent stem cells (PSCs) can self-organize into three-dimensional (3D) tissues that recapitulate the morphology, function, and genetic programs controlling organ development. Our vision is to use organoids to study the changes that set modern humans apart from our closest evolutionary relatives as well as from all other organisms on the planet. In ANTHROPOID we will generate a great ape developmental cell atlas using cortex, liver, and small intestine organoids. We will use single-cell transcriptomics and chromatin accessibility to identify cell type-specific features of transcriptome divergence at cellular resolution. We will dissect enhancer evolution using single-cell genomic screens and ancestralize human cells to resurrect pre-human cellular phenotypes. ANTHROPOID utilizes quantitative and state-of-the-art methods to explore exciting high-risk questions at multiple branches of the modern human lineage. This project is a groundbreaking starting point from which to replay evolution and tackle the ancient question of what makes us uniquely human.
Max ERC Funding
1 500 000 €
Duration
Start date: 2019-06-01, End date: 2024-05-31
Project acronym ANXIETY & COGNITION
Project How anxiety transforms human cognition: an Affective Neuroscience perspective
Researcher (PI) Gilles Roger Charles Pourtois
Host Institution (HI) UNIVERSITEIT GENT
Call Details Starting Grant (StG), SH3, ERC-2007-StG
Summary Anxiety, a state of apprehension or fear, may provoke cognitive or behavioural disorders and eventually lead to serious medical illnesses. The high prevalence of anxiety disorders in our society sharply contrasts with the lack of clear factual knowledge about the corresponding brain mechanisms at the origin of this profound change in the appraisal of the environment. Little is known about how the psychopathological state of anxiety ultimately turns into a medical condition. The core of this proposal is to gain insight into the neural underpinnings of anxiety and of anxiety-related disorders using modern human brain-imaging techniques such as scalp EEG and fMRI. I propose to elucidate how anxiety transforms and shapes human cognition, and what the neural correlates and time-course of this modulatory effect are. The primary innovation of this project is the systematic use of scalp EEG and fMRI in human participants to better understand the neural mechanisms by which anxiety profoundly influences specific cognitive functions, in particular selective attention and decision-making. The goal of this proposal is to precisely determine the exact timing (using scalp EEG) and the location, size and extent (using fMRI) of anxiety-related modulations of selective attention and decision-making in the human brain. I propose to focus on these two specific processes because they are likely to reveal selective effects of anxiety on human cognition and can thus serve as powerful models for understanding how anxiety operates in the human brain. Another important aspect of this project is that I envision helping to bridge the gap in Health Psychology between fundamental research and clinical practice by proposing alternative rehabilitation strategies for human adult subjects affected by anxiety-related disorders, strategies that could directly exploit the neuroscientific discoveries generated in this project.
Max ERC Funding
812 986 €
Duration
Start date: 2008-11-01, End date: 2013-10-31
Project acronym AORVM
Project The Effects of Aging on Object Representation in Visual Working Memory
Researcher (PI) James Robert Brockmole
Host Institution (HI) THE UNIVERSITY OF EDINBURGH
Call Details Starting Grant (StG), SH3, ERC-2007-StG
Summary One’s ability to remember visual material such as objects, faces, and spatial locations over a short period of time declines with age. The proposed research will examine whether these deficits are explained by a reduction in visual working memory (VWM) capacity, or by an impairment in one’s ability to maintain, or ‘bind’, appropriate associations among pieces of related information. In this project successful binding is operationally defined as the proper recall or recognition of objects that are defined by the conjunction of multiple visual features. While tests of long-term memory have demonstrated that, despite preserved memory for isolated features, older adults have more difficulty remembering conjunctions of features, no research has yet investigated analogous age-related binding deficits in VWM. This is a critical oversight because, given the current state of the science, it is unknown whether these deficits are specific to the long-term memory system, or whether they originate in VWM. The project interweaves three strands of research that each investigate whether older adults have more difficulty creating, maintaining, and updating bound multi-feature object representations than younger adults. This theoretical program of enquiry will provide insight into the cognitive architecture of VWM and how this system changes with age, and its outcomes will have wide-ranging multi-disciplinary applications in applied theory and intervention techniques that may reduce the adverse consequences of aging on memory.
Max ERC Funding
500 000 €
Duration
Start date: 2008-09-01, End date: 2011-08-31
Project acronym ArcheoDyn
Project Globular clusters as living fossils of the past of galaxies
Researcher (PI) Petrus VAN DE VEN
Host Institution (HI) UNIVERSITAT WIEN
Call Details Consolidator Grant (CoG), PE9, ERC-2016-COG
Summary Globular clusters (GCs) are enigmatic objects that hide a wealth of information. They are the living fossils of the history of their native galaxies and the record keepers of the violent events that made them change their domicile. This proposal aims to mine GCs as living fossils of galaxy evolution to address fundamental questions in astrophysics: (1) Do satellite galaxies merge as predicted by the hierarchical build-up of galaxies? (2) Which are the seeds of supermassive black holes in the centres of galaxies? (3) How did star formation originate in the earliest phases of galaxy formation? To answer these questions, novel population-dependent dynamical modelling techniques are required, whose development the PI has led over the past years. This uniquely positions him to take full advantage of the emerging wealth of chemical and kinematical data on GCs.
Following the tidal disruption of satellite galaxies, their dense GCs, and maybe even their nuclei, are left as the most visible remnants in the main galaxy. The hierarchical build-up of their new host galaxy can thus be unearthed by recovering the GCs’ orbits. However, it is currently unclear which of the GCs are accretion survivors. In fact, the existence of a central intermediate-mass black hole (IMBH) or of multiple stellar populations in GCs might tell which ones are accreted. At the same time, detection of IMBHs is important as they are predicted seeds for supermassive black holes in galaxies, while the multiple stellar populations in GCs are vital witnesses to the extreme modes of star formation in the early Universe. However, for every putative dynamical IMBH detection so far there is a corresponding non-detection, and the origin of multiple stellar populations in GCs still lacks any uncontrived explanation. The synergy of novel techniques and exquisite data proposed here promises a breakthrough in this emerging field of dynamical archeology with GCs as living fossils of the past of galaxies.
Max ERC Funding
1 999 250 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym ARITHMUS
Project Peopling Europe: How data make a people
Researcher (PI) Evelyn Sharon Ruppert
Host Institution (HI) GOLDSMITHS' COLLEGE
Call Details Consolidator Grant (CoG), SH3, ERC-2013-CoG
Summary Who are the people of Europe? This question is facing statisticians as they grapple with standardising national census methods so that their numbers can be assembled into a European population. Yet, by so doing—intentionally or otherwise—they also contribute to the making of a European people. This, at least, is the central thesis of ARITHMUS. While typically framed as a methodological or statistical problem, the project approaches this as a practical and political problem of assembling multiple national populations into a European population and people.
Why is this both an urgent political and practical problem? Politically, Europe is said to be unable to address itself to a constituted polity and people, which is crucial to European integration. Practically, its efforts to constitute a European population are also being challenged by digital technologies, which are being used to diversify census methods and which bring into question the comparability of national population data. Consequently, over the next several years Eurostat and national statistical institutes are negotiating regulations for the 2020 census round towards ensuring 'Europe-wide comparability.'
ARITHMUS will follow this process and investigate the practices of statisticians as they juggle scientific independence, national autonomy and EU comparability to innovate census methods. It will then connect this practical work to political questions of the making and governing of a European people and polity. It will do so by going beyond state-of-the-art scholarship on methods, politics, and science and technology studies. Five case studies involving discourse analysis and ethnographic methods will investigate the situated practices of EU and national statisticians as they remake census methods, arguably the most fundamental changes since modern censuses were launched over two centuries ago. At the same time it will attend to how these practices affect the constitution of who the people of Europe are.
Max ERC Funding
1 833 649 €
Duration
Start date: 2014-05-01, End date: 2019-04-30
Project acronym ARTHUS
Project Advances in Research on Theories of the Dark Universe - Inhomogeneity Effects in Relativistic Cosmology
Researcher (PI) Thomas BUCHERT
Host Institution (HI) UNIVERSITE LYON 1 CLAUDE BERNARD
Call Details Advanced Grant (AdG), PE9, ERC-2016-ADG
Summary The project ARTHUS aims at determining the physical origin of Dark Energy: in addition to the energy sources of the standard model of cosmology, effective terms arise through spatially averaging inhomogeneous cosmological models in General Relativity. It has been demonstrated that these additional terms can play the role of Dark Energy on large scales (but they can also mimic Dark Matter on the scales of mass accumulations). The underlying rationale is that fluctuations in the Universe generically couple to spatially averaged intrinsic properties of space, such as its averaged scalar curvature, thus changing the global evolution of the effective (spatially averaged) cosmological model. At present, we understand these so-called backreaction effects only qualitatively. The project ARTHUS is directed towards a conclusive quantitative evaluation of these effects by developing generic and non-perturbative relativistic models of structure formation, by statistically measuring the key variables of the models in observations and in simulation data, and by reinterpreting observational results in light of the new models. It is to be emphasized that there is no doubt about the existence of backreaction effects; the question is whether they are even capable of getting rid of the dark sources (as some models discussed in the literature suggest), or whether their impact is substantially smaller. The project thus addresses an essential issue of current cosmological research: to find pertinent answers concerning the quantitative impact of inhomogeneity effects, a necessary step, recognized worldwide, toward high-precision cosmology. If the project objectives are attained, the results will have a far-reaching impact on theoretical and observational cosmology, on the interpretation of astronomical experiments such as Planck and Euclid, as well as on a wide spectrum of particle physics theories and experiments.
Max ERC Funding
2 091 000 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym ASSESS
Project Episodic Mass Loss in the Most Massive Stars: Key to Understanding the Explosive Early Universe
Researcher (PI) Alceste BONANOS
Host Institution (HI) NATIONAL OBSERVATORY OF ATHENS
Call Details Consolidator Grant (CoG), PE9, ERC-2017-COG
Summary Massive stars dominate their surroundings during their short lifetimes, while their explosive deaths impact the chemical evolution and spatial cohesion of their hosts. After birth, their evolution is largely dictated by their ability to remove layers of hydrogen from their envelopes. Multiple lines of evidence point to violent, episodic mass-loss events being responsible for removing a large part of the massive stellar envelope, especially in low-metallicity galaxies. Episodic mass loss, however, is neither understood theoretically nor accounted for in state-of-the-art models of stellar evolution, which has far-reaching consequences for many areas of astronomy. We aim to determine whether episodic mass loss is a dominant process in the evolution of the most massive stars by conducting the first extensive, multi-wavelength survey of evolved massive stars in the nearby Universe. The project hinges on the fact that mass-losing stars form dust and are bright in the mid-infrared. We plan to (i) derive physical parameters of a large sample of dusty, evolved targets and estimate the amount of ejected mass, (ii) constrain evolutionary models, (iii) quantify the duration and frequency of episodic mass loss as a function of metallicity. The approach involves applying machine-learning algorithms to existing multi-band and time-series photometry of luminous sources in ~25 nearby galaxies. Dusty, luminous evolved massive stars will thus be automatically classified and follow-up spectroscopy will be obtained for selected targets. Atmospheric and SED modeling will yield parameters and estimates of time-dependent mass loss for ~1000 luminous stars. The emerging trend for the ubiquity of episodic mass loss, if confirmed, will be key to understanding the explosive early Universe and will have profound consequences for low-metallicity stars, reionization, and the chemical evolution of galaxies.
Max ERC Funding
1 128 750 €
Duration
Start date: 2018-09-01, End date: 2023-08-31
Project acronym ASSHURED
Project Analysing South-South Humanitarian Responses to Displacement from Syria: Views from Lebanon, Jordan and Turkey
Researcher (PI) Elena FIDDIAN-QASMIYEH
Host Institution (HI) UNIVERSITY COLLEGE LONDON
Call Details Starting Grant (StG), SH3, ERC-2016-STG
Summary Since 2012, over 4 million people have fled Syria in ‘the most dramatic humanitarian crisis that we have ever faced’ (UNHCR). By November 2015 there were 1,078,338 refugees from Syria in Lebanon, 630,776 in Jordan and 2,181,293 in Turkey. Humanitarian agencies and donor states from both the global North and the global South have funded and implemented aid programmes, and yet commentators have argued that civil society groups from the global South are the most significant actors supporting refugees in Lebanon, Jordan and Turkey. Whilst these responses are highly significant, major gaps in knowledge remain regarding the motivations, nature and implications of Southern-led responses to conflict-induced displacement. This project draws on multi-sited ethnographic and participatory research with refugees from Syria and their aid providers in Lebanon, Jordan and Turkey to critically examine why, how and with what effect actors from the South have responded to the displacement of refugees from Syria. The main research aims are:
1. identifying diverse models of Southern-led responses to conflict-induced displacement,
2. examining the (un)official motivations, nature and implications of Southern-led responses,
3. examining refugees’ experiences and perceptions of Southern-led responses,
4. exploring diverse Southern and Northern actors’ perceptions of Southern-led responses,
5. tracing the implications of Southern-led initiatives for humanitarian theory and practice.
Based on a critical theoretical framework inspired by post-colonial and feminist approaches, the project contributes to theories of humanitarianism and debates regarding donor-recipient relations and refugees’ agency in displacement situations. It will also inform the development of policies to most appropriately address refugees’ needs and rights. This highly topical and innovative project thus has far-reaching implications for refugees and local communities, academics, policy-makers and practitioners.
Max ERC Funding
1 498 069 €
Duration
Start date: 2017-07-01, End date: 2022-06-30
Project acronym ASTERISK
Project ASTERoseismic Investigations with SONG and Kepler
Researcher (PI) Jørgen Christensen-Dalsgaard
Host Institution (HI) AARHUS UNIVERSITET
Call Details Advanced Grant (AdG), PE9, ERC-2010-AdG_20100224
Summary The project aims at a breakthrough in our understanding of stellar evolution, by combining advanced observations of stellar oscillations with state-of-the-art modelling of stars. This will largely be based on very extensive and precise data on stellar oscillations from the NASA Kepler mission launched in March 2009, but additional high-quality data will also be included. In particular, my group is developing the global SONG network for observations of stellar oscillations. These observational efforts will be supplemented by sophisticated modelling of stellar evolution, and by the development of asteroseismic tools to use the observations to probe stellar interiors. This will lead to a far more reliable determination of stellar ages, and hence ages of other astrophysical objects; it will compare the properties of the Sun with other stars and hence provide an understanding of the life history of the Sun; it will investigate the physical processes that control stellar properties, both at the level of the thermodynamical properties of stellar plasmas and the hydrodynamical instabilities that play a central role in stellar evolution; and it will characterize central stars in extra-solar planetary systems, determining the size and age of the star and hence constraining the evolution of the planetary systems. The Kepler data will be analysed in a large international collaboration coordinated by our group. The SONG network, which will become partially operational during the present project, will yield even more detailed information about the conditions in the interior of stars, allowing tests of subtle but central aspects of the physics of stellar interiors. The project involves the organization of a central data archive for asteroseismic data, at the Royal Library, Copenhagen.
Max ERC Funding
2 498 149 €
Duration
Start date: 2011-04-01, End date: 2016-03-31
Project acronym Asterochronometry
Project Galactic archeology with high temporal resolution
Researcher (PI) Andrea MIGLIO
Host Institution (HI) THE UNIVERSITY OF BIRMINGHAM
Call Details Consolidator Grant (CoG), PE9, ERC-2017-COG
Summary The Milky Way is a complex system, with dynamical and chemical substructures, where several competing processes such as mergers, internal secular evolution, gas accretion and gas flows take place. To study in detail how such a giant spiral galaxy was formed and evolved, we need to reconstruct the sequence of its main formation events with high (~10%) temporal resolution.
Asterochronometry will determine accurate, precise ages for tens of thousands of stars in the Galaxy. We will take an approach distinguished by a number of key aspects, including developing novel star-dating methods that fully utilise the potential of individual pulsation modes, coupled with a careful appraisal of systematic uncertainties on age deriving from our limited understanding of stellar physics.
We will then capitalise on opportunities provided by the timely availability of astrometric, spectroscopic, and asteroseismic data to build and data-mine chrono-chemo-dynamical maps of regions of the Milky Way probed by the space missions CoRoT, Kepler, K2, and TESS. We will quantify, by comparison with predictions of chemodynamical models, the relative importance of various processes which play a role in shaping the Galaxy, for example mergers and dynamical processes. We will use chrono-chemical tagging to look for evidence of aggregates, and precise and accurate ages to reconstruct the early star formation history of the Milky Way’s main constituents.
The Asterochronometry project will also provide stringent observational tests of stellar structure and answer some of the long-standing open questions in stellar modelling (e.g. efficiency of transport processes, mass loss on the giant branch, the occurrence of products of coalescence / mass exchange). These tests will improve our ability to determine stellar ages and chemical yields, with wide impact e.g. on the characterisation and ensemble studies of exoplanets, on evolutionary population synthesis, integrated colours and thus ages of galaxies.
Max ERC Funding
1 958 863 €
Duration
Start date: 2018-04-01, End date: 2023-03-31
Project acronym ASTRODYN
Project Astrophysical Dynamos
Researcher (PI) Axel Brandenburg
Host Institution (HI) KUNGLIGA TEKNISKA HOEGSKOLAN
Call Details Advanced Grant (AdG), PE9, ERC-2008-AdG
Summary Magnetic fields in stars, planets, accretion discs, and galaxies are believed to be the result of a dynamo process converting kinetic energy into magnetic energy. This work focuses on the solar dynamo, but dynamos in other astrophysical systems will also be addressed. In particular, direct high-resolution three-dimensional simulations are used to understand particular aspects of the solar dynamo and ultimately to simulate the solar dynamo as a whole. Phenomenological approaches will be avoided in favor of obtaining rigorous results. A major problem is catastrophic quenching, i.e. the decline of dynamo effects in inverse proportion to the magnetic Reynolds number, which is huge. Tremendous advances have been made in the last few years since the cause of catastrophic quenching in dynamos has been understood in terms of magnetic helicity evolution. The numerical tools are now in place to allow for magnetic helicity fluxes via coronal mass ejections, thus alleviating catastrophic quenching. This work employs simulations in spherical shells, augmented by Cartesian simulations in special cases. The roles of the near-surface shear layer, the tachocline, as well as pumping in the bulk of the convection zone are to be clarified. The Pencil Code will be used for most applications. The code is third order in time and sixth order in space and is used for solving the hydromagnetic equations. It is a public domain code developed by roughly 20 scientists worldwide and maintained under a central versioning system at Nordita. Automatic nightly tests of currently 30 applications ensure the integrity of the code. It is used for a wide range of applications and may include the effects of radiation, self-gravity, dust, chemistry, variable ionization, and cosmic rays, in addition to those of magnetohydrodynamics.
The code with its infrastructure offers a good opportunity for individuals within a broad group of people to develop new tools that may automatically be useful to others.
Max ERC Funding
2 220 000 €
Duration
Start date: 2009-02-01, End date: 2014-01-31
Project acronym ASTROFLOW
Project The influence of stellar outflows on exoplanetary mass loss
Researcher (PI) Aline VIDOTTO
Host Institution (HI) THE PROVOST, FELLOWS, FOUNDATION SCHOLARS & THE OTHER MEMBERS OF BOARD OF THE COLLEGE OF THE HOLY & UNDIVIDED TRINITY OF QUEEN ELIZABETH NEAR DUBLIN
Call Details Consolidator Grant (CoG), PE9, ERC-2018-COG
Summary ASTROFLOW aims to make ground-breaking progress in our physical understanding of exoplanetary mass loss, by quantifying the influence of stellar outflows on atmospheric escape of close-in exoplanets. Escape plays a key role in planetary evolution, population, and potential to develop life. Stellar irradiation and outflows affect planetary mass loss: irradiation heats planetary atmospheres, which inflate and are more likely to escape; outflows cause pressure confinement around otherwise freely escaping atmospheres. This external pressure can increase, reduce or even suppress escape rates; its effects on exoplanetary mass loss remain largely unexplored due to the complexity of such interactions. I will fill this knowledge gap by developing a novel modelling framework of atmospheric escape that will, for the first time, consider the effects of realistic stellar outflows on exoplanetary mass loss. My expertise in stellar wind theory and 3D magnetohydrodynamic simulations is crucial for producing the next-generation models of planetary escape. My framework will consist of state-of-the-art, time-dependent, 3D simulations of stellar outflows (Method 1), which will be coupled to novel 3D simulations of atmospheric escape (Method 2). My models will account for the major underlying physical processes of mass loss. With this, I will determine the response of planetary mass loss to realistic stellar particle, magnetic and radiation environments and will characterise the physical conditions of the escaping material. I will compute how its extinction varies during transit and compare synthetic line profiles to atmospheric escape observations from, e.g., Hubble and our NASA cubesat CUTE. Strong synergy with upcoming observations (JWST, TESS, SPIRou, CARMENES) also exists. Determining the lifetime of planetary atmospheres is essential to understanding populations of exoplanets.
ASTROFLOW’s work will be the foundation for future research of how exoplanets evolve under mass-loss processes.
Max ERC Funding
1 999 956 €
Duration
Start date: 2019-09-01, End date: 2024-08-31
Project acronym ASTROLAB
Project Cold Collisions and the Pathways Toward Life in Interstellar Space
Researcher (PI) Holger Kreckel
Host Institution (HI) MAX-PLANCK-GESELLSCHAFT ZUR FORDERUNG DER WISSENSCHAFTEN EV
Call Details Starting Grant (StG), PE9, ERC-2012-StG_20111012
Summary Modern telescopes like Herschel and ALMA open up a new window into molecular astrophysics to investigate a surprisingly rich chemistry that operates even at low densities and low temperatures. Observations with these instruments have the potential of unraveling key questions of astrobiology, like the accumulation of water and pre-biotic organic molecules on (exo)planets from asteroids and comets. Hand-in-hand with the heightened observational activities comes a strong demand for a thorough understanding of the molecular formation mechanisms. The vast majority of interstellar molecules are formed in ion-neutral reactions that remain efficient even at low temperatures. Unfortunately, the unusual nature of these processes under terrestrial conditions makes their laboratory study extremely difficult.
To address these issues, I propose to build a versatile merged beams setup for laboratory studies of ion-neutral collisions at the Cryogenic Storage Ring (CSR), the most ambitious of the next-generation storage devices under development worldwide. With this experimental setup, I will make use of a low-temperature and low-density environment that is ideal to simulate the conditions prevailing in interstellar space. The cryogenic surroundings, in combination with laser-generated ground-state atom beams, will allow me to perform precise energy-resolved rate coefficient measurements for reactions between cold molecular ions (like, e.g., H2+, H3+, HCO+, CH2+, CH3+, etc.) and neutral atoms (H, D, C or O) in order to shed light on long-standing problems of astrochemistry and the formation of organic molecules in space.
With the large variability of the collision energy (corresponding to 40-40000 K), I will be able to provide data that are crucial for the interpretation of molecular observations in a variety of objects, ranging from cold molecular clouds to warm layers in protoplanetary disks.
Max ERC Funding
1 486 800 €
Duration
Start date: 2012-09-01, End date: 2017-11-30
Project acronym ASYFAIR
Project Fair and Consistent Border Controls? A Critical, Multi-methodological and Interdisciplinary Study of Asylum Adjudication in Europe
Researcher (PI) Nicholas Mark Gill
Host Institution (HI) THE UNIVERSITY OF EXETER
Call Details Starting Grant (StG), SH3, ERC-2015-STG
Summary ‘Consistency’ is regularly cited as a desirable attribute of border control, but it has received little critical social scientific attention. This interdisciplinary project, at the interface between critical human geography, border studies and law, will scrutinise the consistency of European asylum adjudication in order to develop a richer theoretical understanding of this lynchpin concept. It will move beyond the administrative legal concepts of substantive and procedural consistency by advancing a three-fold conceptualisation of consistency – as everyday practice, discursive deployment of facts and disciplinary technique. In order to generate productive intellectual tension it will also employ an explicitly antagonistic conceptualisation of the relationship between geography and law that views law as seeking to constrain and systematise lived space. The project will employ an innovative combination of methodologies that will produce unique and rich data sets including quantitative analysis, multi-sited legal ethnography, discourse analysis and interviews, and the findings are likely to be of interest both to academic communities like geographers, legal and border scholars and to policy makers and activists working in border control settings. In 2013 the Common European Asylum System (CEAS) was launched to standardise the procedures of asylum determination. But as yet no sustained multi-methodological assessment of the claims of consistency inherent to the CEAS has been carried out. This project offers not only the opportunity to assess progress towards harmonisation of asylum determination processes in Europe, but will also provide a new conceptual framework with which to approach the dilemmas and risks of inconsistency in an area of law fraught with political controversy and uncertainty around the world. Most fundamentally, the project promises to debunk the myths surrounding the possibility of fair and consistent border controls in Europe and elsewhere.
Max ERC Funding
1 252 067 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym ATMO
Project Atmospheres across the Universe
Researcher (PI) Pascal TREMBLIN
Host Institution (HI) COMMISSARIAT A L ENERGIE ATOMIQUE ET AUX ENERGIES ALTERNATIVES
Call Details Starting Grant (StG), PE9, ERC-2017-STG
Summary Which molecules are present in the atmosphere of exoplanets? What are their mass, radius and age? Do they have clouds, convection (atmospheric turbulence), fingering convection, or a circulation induced by irradiation? These questions are fundamental in exoplanetology in order to study issues such as planet formation and exoplanet habitability.
Yet, the impact of fingering convection and of the circulation induced by irradiation remains poorly understood:
- Fingering convection (triggered by gradients of mean molecular weight) has already been suggested to occur in stars (accumulation of heavy elements) and in brown dwarfs and exoplanets (chemical transitions, e.g. CO/CH4). Large-scale efficient turbulent transport of energy through the fingering instability can reduce the temperature gradient in the atmosphere and explain many observed spectral properties of brown dwarfs and exoplanets. Nonetheless, this large-scale efficiency is not yet characterized, and standard approximations (Boussinesq) cannot be used to achieve this goal.
- The interaction between atmospheric circulation and the fingering instability is an open question in the case of irradiated exoplanets. Fingering convection can change the location and magnitude of the hot spot induced by irradiation, whereas the hot deep atmosphere induced by irradiation can change the location of the chemical transitions that trigger the fingering instability.
This project will characterize the impact of fingering convection in the atmosphere of stars, brown dwarfs, and exoplanets and its interaction with the circulation in the case of irradiated planets. By developing innovative numerical models, we will characterize the reduction of the temperature gradient of the atmosphere induced by the instability and study the impact of the circulation. We will then predict and interpret the mass, radius, and chemical composition of exoplanets that will be observed with future missions such as the James Webb Space Telescope (JWST).
Max ERC Funding
1 500 000 €
Duration
Start date: 2018-02-01, End date: 2023-01-31
Project acronym ATTACK
Project Pressured to Attack: How Carrying-Capacity Stress Creates and Shapes Intergroup Conflict
Researcher (PI) Carsten DE DREU
Host Institution (HI) UNIVERSITEIT LEIDEN
Call Details Advanced Grant (AdG), SH3, ERC-2017-ADG
Summary Throughout history, groups of people fighting each other have caused tremendous suffering. While behavioral science research has advanced our understanding of such intergroup conflict, it has focused exclusively on micro-level processes within and between groups in conflict. Disciplines that employ a more historical perspective, such as climate studies and political geography, report that macro-level pressures due to changes in climate or economic scarcity can go along with social unrest and wars. How do these macro-level pressures relate to micro-level processes? Do they occur independently, or do macro-level pressures trigger micro-level processes that cause intergroup conflict? And if so, which micro-level processes are triggered, and how?
With unavoidable signs of climate change and increasing resource scarcities, answers to these questions are urgently needed. Here I propose carrying-capacity stress (CCS) as the missing link between macro-level pressures and micro-level processes. A group experiences CCS when its resources do not suffice to maintain its functionality. CCS is a function of macro-level pressures and creates intergroup conflict because it impacts micro-level motivation to contribute to one’s group’s fighting capacity and shapes the coordination of individual contributions to out-group aggression through emergent norms, communication and leadership.
To test these propositions I develop a parametric model of CCS that is amenable to measurement and experimentation, and use techniques from my earlier work on conflict and cooperation: meta-analyses and time-series analysis of macro-level historical data; experiments on intergroup conflict; and measurement of neuro-hormonal correlates of cooperation and conflict. In combination, this project provides a novel multi-level conflict theory that integrates macro-level discoveries in climate research and political geography with micro-level processes uncovered in the biobehavioral sciences.
Max ERC Funding
2 490 383 €
Duration
Start date: 2018-08-01, End date: 2023-07-31
Project acronym Auger-Horizon
Project A large-scale radio detector for the Pierre Auger cosmic-ray Observatory – precision measurements of ultra-high-energy cosmic rays
Researcher (PI) Jörg HÖRANDEL
Host Institution (HI) STICHTING KATHOLIEKE UNIVERSITEIT
Call Details Advanced Grant (AdG), PE9, ERC-2017-ADG
Summary Cosmic rays (ionized atomic nuclei) are the only matter from beyond our solar system, or even from extragalactic space, that we can directly investigate. Up to energies of 10^17 eV they most likely originate in our Galaxy. The highest-energy cosmic rays (>10^18 eV) can no longer be magnetically bound to the Galaxy and are most likely of extragalactic origin.
The mere existence of these particles raises questions about their origin – how and where are they accelerated? How do they propagate through the universe and interact? How can we directly probe extragalactic matter, and how can we locate its origin?
A key to understanding the origin of cosmic rays is measuring the particle species (atomic mass). A precise mass measurement will allow us to discriminate between astrophysical models and will clarify the reason for the observed suppression of the cosmic-ray flux at the highest energies: either the maximum energy of the accelerators or energy losses during propagation.
I address these questions by employing a new technique to precisely measure the cosmic-ray mass composition, which my group pioneered, the radio detection of air showers (induced by high-energy cosmic rays in the atmosphere) on very large scales, detecting horizontal air showers with zenith angles from 60° to 90°.
The new set-up will be the world's largest radio array, operated together with the well-established Auger surface and fluorescence detectors, forming a unique set-up to measure the properties of cosmic rays with unprecedented precision for energies above 10^17.5 eV. The radio technique is a cost-effective and robust method to measure the cosmic-ray energy and mass, complementary to established techniques. The energy scale of the radio measurements is established from first principles. The proposed detectors will also enhance the detection capabilities for high-energy neutrinos and the search for new physics through precision measurements of the electromagnetic and muonic shower components.
Max ERC Funding
3 499 249 €
Duration
Start date: 2018-10-01, End date: 2023-09-30
Project acronym AWESoMeStars
Project Accretion, Winds, and Evolution of Spins and Magnetism of Stars
Researcher (PI) Sean Patrick Matt
Host Institution (HI) THE UNIVERSITY OF EXETER
Call Details Consolidator Grant (CoG), PE9, ERC-2015-CoG
Summary This project focuses on Sun-like stars, which possess convective envelopes and universally exhibit magnetic activity (in the mass range 0.1 to 1.3 MSun). The rotation of these stars influences their internal structure, energy and chemical transport, and magnetic field generation, as well as their external magnetic activity and environmental interactions. Due to the huge range of timescales, spatial scales, and physics involved, understanding how each of these processes relates to the others and to long-term evolution remains an enormous challenge in astrophysics. To face this challenge, the AWESoMeStars project will develop a comprehensive, physical picture of the evolution of stellar rotation, magnetic activity, mass loss, and accretion.
In doing so, we will
(1) Discover how stars lose the vast majority of their angular momentum, which happens in the accretion phase
(2) Explain the observed rotation-activity relationship and saturation in terms of the evolution of magnetic properties & coronal physics
(3) Characterize coronal heating and mass loss across the full range of mass & age
(4) Explain the Skumanich (1972) relationship and distributions of spin rates observed in young clusters & old field stars
(5) Develop physics-based gyrochronology as a tool for using rotation rates to constrain stellar ages.
We will accomplish these goals using a fundamentally new and multi-faceted approach, which combines the power of multi-dimensional MHD simulations with long-timescale rotational-evolution models. Specifically, we will develop a next generation of MHD simulations of both star-disk interactions and stellar winds, to model stars over the full range of mass & age, and to characterize how magnetically active stars impact their environments. Simultaneously, we will create a new class of rotational-evolution models that include external torques derived from our simulations, compute the evolution of spin rates of entire star clusters, & compare with observations.
Max ERC Funding
2 206 205 €
Duration
Start date: 2016-07-01, End date: 2021-06-30
Project acronym B Massive
Project Binary massive black hole astrophysics
Researcher (PI) Alberto SESANA
Host Institution (HI) UNIVERSITA' DEGLI STUDI DI MILANO-BICOCCA
Call Details Consolidator Grant (CoG), PE9, ERC-2018-COG
Summary Massive black hole binaries (MBHBs) are the most extreme, fascinating yet elusive astrophysical objects in the Universe. Establishing their existence observationally will be a milestone for contemporary astronomy, providing a fundamental missing piece in the puzzle of galaxy formation, piercing through the (hydro)dynamical physical processes shaping dense galactic nuclei from parsec scales down to the event horizon, and probing gravity in extreme conditions.
We can both see and listen to MBHBs. Remarkably, besides arguably being among the brightest variable objects shining in the Cosmos, MBHBs are also the loudest gravitational wave (GW) sources in the Universe. As such, we shall take advantage of both types of messengers – photons and gravitons – that they are sending to us, which can now be probed by all-sky time-domain surveys and radio pulsar timing arrays (PTAs), respectively.
B MASSIVE leverages a unique, comprehensive approach combining theoretical astrophysics, radio and gravitational-wave astronomy, and time-domain surveys with state-of-the-art data analysis techniques to: i) observationally prove the existence of MBHBs, ii) understand and constrain their astrophysics and dynamics, and iii) enable and bring closer in time the direct detection of GWs with PTAs.
As a European PTA (EPTA) executive committee member and former International PTA (IPTA) chair, I am a driving force in the development of pulsar timing science world-wide, and the project will build on the profound knowledge, broad vision and wide collaboration network that established me as a world leader in the field of MBHB and GW astrophysics. B MASSIVE is extremely timely: a pulsar timing data set of unprecedented quality is being assembled by the EPTA/IPTA, and time-domain astronomy surveys are at their dawn. In the long term, B MASSIVE will be a fundamental milestone establishing European leadership in the cutting-edge field of MBHB astrophysics in the era of LSST, SKA and LISA.
Max ERC Funding
1 532 750 €
Duration
Start date: 2019-09-01, End date: 2024-08-31
Project acronym BACCO
Project Bias and Clustering Calculations Optimised: Maximising discovery with galaxy surveys
Researcher (PI) Raúl Esteban ANGULO de la Fuente
Host Institution (HI) FUNDACION CENTRO DE ESTUDIOS DE FISICA DEL COSMOS DE ARAGON
Call Details Starting Grant (StG), PE9, ERC-2016-STG
Summary A new generation of galaxy surveys will soon start measuring the spatial distribution of millions of galaxies over a broad range of redshifts, offering an imminent opportunity to discover new physics. A detailed comparison of these measurements with theoretical models of galaxy clustering may reveal a new fundamental particle, a breakdown of General Relativity, or a hint about the nature of cosmic acceleration. Despite considerable progress in the analytic treatment of structure formation in recent years, traditional clustering models still suffer from large uncertainties. This limits cosmological analyses to a very restricted range of scales and statistics, which will be one of the main obstacles to a comprehensive exploitation of future surveys.
Here I propose to develop a novel simulation-based approach to predicting galaxy clustering. Combining recent advances in computational cosmology, from cosmological N-body calculations to physically motivated galaxy formation models, I will develop a unified framework to directly predict the position and velocity of individual dark matter structures and galaxies as a function of cosmological and astrophysical parameters. In this formulation, galaxy clustering will be a prediction of a set of physical assumptions in a given cosmological setting. The new theoretical framework will be flexible, accurate and fast: it will provide predictions for any clustering statistic, down to scales 100 times smaller than in state-of-the-art perturbation-theory-based models, and in less than 1 minute of CPU time. These advances will enable major improvements in future cosmological constraints, significantly increasing the overall power of future surveys and maximising our potential to discover new physics.
Max ERC Funding
1 484 240 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym BactRNA
Project Bacterial small RNAs networks unravelling novel features of transcription and translation
Researcher (PI) Maude Audrey Guillier
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), LS2, ERC-2018-COG
Summary Regulation of gene expression plays a key role in the ability of bacteria to rapidly adapt to changing environments and to colonize extremely diverse habitats. The relatively recent discovery of a plethora of small regulatory RNAs (sRNAs) and the beginning of their characterization have unravelled new aspects of bacterial gene expression. First, the expression of many bacterial genes responds to a complex network of both transcriptional and post-transcriptional regulators. However, the effects of the resulting regulatory circuits on the dynamics of gene expression and on the bacterial adaptive response have been poorly addressed so far. In the first part of this project, we will tackle this question by characterizing the circuits formed between two widespread classes of bacterial regulators, the sRNAs and the two-component systems, which act at the post-transcriptional and the transcriptional level, respectively. The study of sRNAs has also led to major breakthroughs regarding the basic mechanisms of gene expression. In particular, we recently showed that repressor sRNAs can target activating stem-loop structures, located within the coding region of mRNAs, that promote translation initiation, in striking contrast with the previously recognized inhibitory role of mRNA structures in translation. The second objective of this project is thus to draw an unprecedented map of non-canonical translation initiation events and their regulation by sRNAs.
Overall, this project will greatly improve our understanding of how bacteria can so rapidly and successfully adapt to many different environments, and in the long term, provide clues towards the development of anti-bacterial strategies.
Max ERC Funding
1 999 754 €
Duration
Start date: 2019-09-01, End date: 2024-08-31
Project acronym BAHAMAS
Project A holistic approach to large-scale structure cosmology
Researcher (PI) Ian MCCARTHY
Host Institution (HI) LIVERPOOL JOHN MOORES UNIVERSITY
Call Details Consolidator Grant (CoG), PE9, ERC-2017-COG
Summary The standard model of cosmology, the ΛCDM model, is remarkably successful at explaining a wide range of observations of our Universe. However, it is now being subjected to much more stringent tests than ever before, and recent large-scale structure (LSS) measurements appear to be in tension with its predictions. Is this tension signalling that new physics is required? For example, time-varying dark energy, or perhaps a modified theory of gravity? A contribution from massive neutrinos? Before coming to such bold conclusions we must be certain that all of the important systematic errors in the LSS tests have been accounted for.
Presently, the largest source of systematic uncertainty is from the modelling of complicated astrophysical phenomena associated with galaxy formation. In particular, energetic feedback processes associated with star formation and black hole growth can heat and expel gas from collapsed structures and modify the large-scale distribution of matter. Furthermore, the LSS field is presently separated into many sub-fields (each using different models, that usually neglect feedback), preventing a coherent analysis.
Cosmological hydrodynamical simulations are the only method that can follow all the relevant matter components and self-consistently capture the effects of feedback. I have been leading the development of large-scale simulations with physically-motivated prescriptions for feedback that are unrivalled in their ability to reproduce the observed properties of massive systems. With ERC support, I will build a team to exploit these developments, to produce a suite of simulations designed specifically for LSS cosmology applications with the effects of feedback realistically accounted for, which will allow us to unite the different LSS tests. My team and I will make the first self-consistent comparisons with the full range of LSS cosmology tests, and critically assess the evidence for physics beyond the standard model.
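The systematic at stake is usually quantified as the ratio of the matter power spectrum with and without baryonic feedback. The sketch below is a toy illustration of that diagnostic only; both the spectrum shape and the suppression curve are made-up fitting forms, not output from these simulations:

```python
import numpy as np

# Illustrative toy: feedback expels gas from haloes, suppressing the matter
# power spectrum on intermediate-to-small scales. The suppression curve is a
# made-up fitting form, not a result from any simulation suite.

k = np.logspace(-2, 1, 200)               # wavenumber in h/Mpc
p_dmo = 1e4 * k / (1.0 + (k / 0.2)**3)    # toy dark-matter-only spectrum

def feedback_suppression(k, amp=0.2, k_half=3.0):
    """Fraction of power retained; suppression grows toward high k (toy)."""
    return 1.0 - amp * (k / k_half)**2 / (1.0 + (k / k_half)**2)

p_hydro = p_dmo * feedback_suppression(k)
ratio = p_hydro / p_dmo

print(ratio.min())   # roughly 20% suppression at the smallest scales here
```

The point of the diagnostic is that the ratio departs from unity precisely on the scales where weak-lensing and cluster measurements are most sensitive, which is why feedback must be modelled rather than ignored when testing ΛCDM.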
Max ERC Funding
1 725 982 €
Duration
Start date: 2018-06-01, End date: 2023-05-31
Project acronym BAM
Project Becoming A Minority
Researcher (PI) Maurice CRUL
Host Institution (HI) STICHTING VU
Call Details Advanced Grant (AdG), SH3, ERC-2016-ADG
Summary In the last forty years, researchers in the field of Migration and Ethnic Studies looked at the integration of migrants and their descendants. Concepts, methodological tools and theoretical frameworks have been developed to measure and predict integration outcomes both across different ethnic groups and in comparison with people of native descent. But are we also looking into the actual integration of the receiving group of native ‘white’ descent in city contexts where they have become a numerical minority themselves? In cities like Amsterdam, now only one in three youngsters under age fifteen is of native descent. This situation, referred to as a majority-minority context, is a new phenomenon in Western Europe and it presents itself as one of the most important societal and psychological transformations of our time. I argue that the field of migration and ethnic studies is stagnating because of the one-sided focus on migrants and their children. This is even more urgent given the increased anti-immigrant vote. These pressing scientific and societal reasons pushed me to develop the project BAM (Becoming A Minority). The project will be executed in three harbor cities, Rotterdam, Antwerp and Malmö, and three service sector cities, Amsterdam, Frankfurt and Vienna. BAM consists of 5 subprojects: (1) A meta-analysis of secondary data on people of native ‘white’ descent in the six research sites; (2) A newly developed survey for the target group; (3) An analysis of critical circumstances of encounter that trigger either positive or rather negative responses to increased ethnic diversity; (4) Experimental diversity labs to test under which circumstances people will change their attitudes or their actions towards increased ethnic diversity; (5) The formulation of a new theory of integration that includes the changed position of the group of native ‘white’ descent as an important actor.
Max ERC Funding
2 499 714 €
Duration
Start date: 2017-11-01, End date: 2022-10-31
Project acronym BAPS
Project Bayesian Agent-based Population Studies: Transforming Simulation Models of Human Migration
Researcher (PI) Jakub KAZIMIERZ BIJAK
Host Institution (HI) UNIVERSITY OF SOUTHAMPTON
Call Details Consolidator Grant (CoG), SH3, ERC-2016-COG
Summary The aim of BAPS is to develop a ground-breaking simulation model of international migration, based on a population of intelligent, cognitive agents, their social networks and institutions, all interacting with one another. The project will transform the study of migration – one of the most uncertain population processes and a top-priority EU policy area – by offering a step change in the way it can be understood, predicted and managed. In this way, BAPS will effectively integrate behavioural and social theory with modelling.
To develop micro-foundations for migration studies, model design will follow cutting-edge developments in demography, statistics, cognitive psychology and computer science. BAPS will also offer a pioneering environment for applying the findings in practice through a bespoke modelling language. Bayesian statistical principles will be used to design innovative computer experiments, and learn about modelling the simulated individuals and the way they make decisions.
In BAPS, we will collate available information for migration models; build and test the simulations by applying experimental design principles to enhance our knowledge of migration processes; collect information on the underpinning decision-making mechanisms through psychological experiments; and design software for implementing Bayesian agent-based models in practice. The project will use various information sources to build models bottom-up, filling an important epistemological gap in demography.
BAPS will be carried out by the Allianz European Demographer 2015, recognised as a leader in the field for methodological innovation, directing an interdisciplinary team with expertise in demography, agent-based models, statistical analysis of uncertainty, meta-cognition, and computer simulations. The project will open up exciting research possibilities beyond demography, and will generate both academic and practical impact, offering methodological advice for policy-relevant simulations.
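The core idea of an agent-based migration model, as described above, is that individual decisions depend both on personal circumstances and on information flowing through social networks. The following is a deliberately minimal sketch of that mechanism; the network structure, utility distribution and decision threshold are all invented for illustration, and the BAPS project would additionally calibrate such parameters with Bayesian methods:

```python
import random

# Minimal agent-based sketch of migration decisions over a social network:
# each agent weighs a personal net utility of moving against the share of
# contacts who have already migrated. Entirely illustrative parameters.

random.seed(42)

N = 500
contacts = [[random.randrange(N) for _ in range(5)] for _ in range(N)]
migrated = [False] * N
utility = [random.gauss(0.0, 1.0) for _ in range(N)]  # net gain from moving

for step in range(20):
    for i in range(N):
        if migrated[i]:
            continue
        # social influence: fraction of contacts who have already migrated
        peers = sum(migrated[j] for j in contacts[i]) / len(contacts[i])
        if utility[i] + peers > 1.0:   # decision threshold (assumed)
            migrated[i] = True

print(sum(migrated), "of", N, "agents migrated")
</antml>```

Even this toy version shows the cascade behaviour that makes such models hard to predict analytically: early movers lower the effective threshold for their contacts, so small parameter changes can shift aggregate flows substantially, which is exactly where Bayesian calibration against data becomes valuable.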
Max ERC Funding
1 455 590 €
Duration
Start date: 2017-06-01, End date: 2021-05-31