Project acronym 1st-principles-discs
Project A First Principles Approach to Accretion Discs
Researcher (PI) Martin Elias Pessah
Host Institution (HI) KOBENHAVNS UNIVERSITET
Country Denmark
Call Details Starting Grant (StG), PE9, ERC-2012-StG_20111012
Summary Most celestial bodies, from planets to stars to black holes, gain mass during their lives by means of an accretion disc. Understanding the physical processes that determine the rate at which matter accretes and energy is radiated in these discs is vital for unraveling the formation, evolution, and fate of almost every type of object in the Universe. Although magnetic fields have been known to be crucial in accretion discs since the early 1990s, the majority of astrophysical questions that depend on the details of how disc accretion proceeds are still being addressed using the “standard” accretion disc model (developed in the early 1970s), in which magnetic fields play no explicit role. This has prevented us from fully exploring the astrophysical consequences and observational signatures of realistic accretion disc models, leading to a profound disconnect between observations (usually interpreted with the standard paradigm) and modern accretion disc theory and numerical simulations (where magnetic turbulence is crucial). The goal of this proposal is to use several complementary approaches to finally move beyond the standard paradigm. The program has two main objectives: 1) Develop the theoretical framework to incorporate magnetic fields, and the ensuing turbulence, into self-consistent accretion disc models, and investigate their observational implications. 2) Investigate transport and radiative processes in collisionless disc regions, where non-thermal radiation originates, by employing a kinetic particle description of the plasma. To achieve these goals, we will use, and build upon, state-of-the-art magnetohydrodynamic and particle-in-cell codes in conjunction with theoretical modeling.
This framework will make it possible to address fundamental questions on stellar and planet formation, binary systems with a compact object, and supermassive black hole feedback in a way that has no counterpart within the standard paradigm.
Max ERC Funding
1 793 697 €
Duration
Start date: 2013-02-01, End date: 2018-01-31
Project acronym 2STEPPARKIN
Project A novel two-step model for neurodegeneration in Parkinson’s disease
Researcher (PI) Emi Nagoshi
Host Institution (HI) UNIVERSITE DE GENEVE
Country Switzerland
Call Details Starting Grant (StG), LS5, ERC-2012-StG_20111109
Summary Parkinson’s disease (PD) is the second most common neurodegenerative disorder, primarily caused by the progressive loss of dopaminergic (DA) neurons in the substantia nigra (SN). Despite advances in the discovery of genes associated with PD, our knowledge of PD pathogenesis is largely limited to the involvement of these genes in generic cell death pathways, and why degeneration is specific to DA neurons and why it is progressive remain enigmatic. The broad goal of our work is therefore to elucidate the mechanisms underlying the specific and progressive degeneration of DA neurons in PD. Our new Drosophila model of PD, a Fer2 gene loss-of-function mutation, is unusually well suited to address these questions. Fer2 mutants exhibit specific and progressive death of brain DA neurons as well as severe locomotor defects and a short life span. Strikingly, DA neuron death is initiated in a small cluster of Fer2-expressing DA neurons and subsequently propagates to Fer2-negative DA neurons. We therefore propose a novel two-step model of neurodegeneration in PD: primary cell death occurs in a genetically defined subset of dopaminergic neurons, and the ensuing failure of neuronal connectivity then triggers and propagates secondary cell death to the remaining DA neurons. In this research, we will test this hypothesis and investigate the underlying molecular mechanisms. This will be the first study to examine circuit dependency in DA neuron degeneration. Our approach will combine unbiased genomic techniques and candidate-based screening with the powerful Drosophila genetic toolbox. Furthermore, to test this hypothesis beyond the Drosophila model, we will establish new mouse models of PD that exhibit progressive DA neuron degeneration. The outcome of this research is likely to revolutionize our understanding of PD pathogenesis and open an avenue toward the discovery of effective therapeutic strategies against PD.
Max ERC Funding
1 518 960 €
Duration
Start date: 2013-06-01, End date: 2018-05-31
Project acronym ABINITIODGA
Project Ab initio Dynamical Vertex Approximation
Researcher (PI) Karsten Held
Host Institution (HI) TECHNISCHE UNIVERSITAET WIEN
Country Austria
Call Details Starting Grant (StG), PE3, ERC-2012-StG_20111012
Summary Some of the most fascinating physical phenomena are observed experimentally in strongly correlated electron systems yet remain, on the theoretical side, only poorly understood. The aim of the ERC project AbinitioDGA is the development, implementation, and application of a new, 21st-century method for the ab initio calculation of materials with such strong electronic correlations. AbinitioDGA includes strong electronic correlations on all time and length scales and hence is a big step beyond state-of-the-art methods such as the local density approximation, dynamical mean field theory, and the GW approach (Green function G times screened interaction W). It has the potential for an extraordinarily high impact not only in the field of computational materials science but also for a better understanding of quantum critical heavy fermion systems, high-temperature superconductors, and transport through nano- and heterostructures. These four physical problems and related materials will be studied within the ERC project, alongside the methodological development.
On the technical side, AbinitioDGA realizes Hedin's idea to include vertex corrections beyond the GW approximation. All vertex corrections that can be traced back to a fully irreducible local vertex and the bare non-local Coulomb interaction are included. This way, AbinitioDGA contains not only the GW physics of screened exchange and the strong local correlations of dynamical mean field theory, but also non-local correlations on all length scales. Through the latter, AbinitioDGA can prospectively describe phenomena such as quantum criticality, spin-fluctuation-mediated superconductivity, and weak localization corrections to the conductivity. Nonetheless, the computational effort remains manageable even for realistic materials calculations, making the considerable effort to implement AbinitioDGA worthwhile.
Max ERC Funding
1 491 090 €
Duration
Start date: 2013-01-01, End date: 2018-07-31
Project acronym ACTIVIA
Project Visual Recognition of Function and Intention
Researcher (PI) Ivan Laptev
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE EN INFORMATIQUE ET AUTOMATIQUE
Country France
Call Details Starting Grant (StG), PE6, ERC-2012-StG_20111012
Summary Computer vision is concerned with the automated interpretation of images and video streams. Today's research is mostly aimed at answering queries such as "Is this a picture of a dog?" (classification) or sometimes "Find the dog in this photo" (detection). While categorisation and detection are useful for many tasks, inferring correct class labels is not the final answer to visual recognition. The categories and locations of objects do not provide direct understanding of their function, i.e., how things work, what they can be used for, or how they can act and react. Such an understanding, however, would be highly desirable to answer currently unsolvable queries such as "Am I in danger?" or "What can happen in this scene?". Solving such queries is the aim of this proposal.
My goal is to uncover the functional properties of objects and the purpose of actions by addressing visual recognition from a different and yet unexplored perspective. The main novelty of this proposal is to leverage observations of people, i.e., their actions and interactions to automatically learn the use, the purpose and the function of objects and scenes from visual data. The project is timely as it builds upon the two key recent technological advances: (a) the immense progress in visual recognition of objects, scenes and human actions achieved in the last ten years, as well as (b) the emergence of a massive amount of public image and video data now available to train visual models.
ACTIVIA addresses fundamental research issues in automated interpretation of dynamic visual scenes, but its results are expected to serve as a basis for ground-breaking technological advances in practical applications. The recognition of functional properties and intentions as explored in this project will directly support high-impact applications such as detection of abnormal events, which are likely to revolutionise today's approaches to crime protection, hazard prevention, elderly care, and many others.
Max ERC Funding
1 497 420 €
Duration
Start date: 2013-01-01, End date: 2018-12-31
Project acronym ADAPT
Project Theory and Algorithms for Adaptive Particle Simulation
Researcher (PI) Stephane Redon
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE EN INFORMATIQUE ET AUTOMATIQUE
Country France
Call Details Starting Grant (StG), PE6, ERC-2012-StG_20111012
Summary During the twentieth century, the development of macroscopic engineering was largely stimulated by progress in digital prototyping: cars, planes, boats, etc. are nowadays designed and tested on computers. Digital prototypes have progressively replaced actual ones, and effective computer-aided engineering tools have helped cut costs and reduce production cycles of these macroscopic systems.
The twenty-first century is most likely to see a similar development at the atomic scale. Indeed, recent years have seen tremendous progress in nanotechnology, in particular in the ability to control matter at the atomic scale. Similar to what has happened with macroscopic engineering, powerful and generic computational tools will be needed to engineer complex nanosystems through modeling and simulation. As a result, a major challenge is to develop efficient simulation methods and algorithms.
NANO-D, the INRIA research group I started in January 2008 in Grenoble, France, aims at developing efficient computational methods for modeling and simulating complex nanosystems, both natural and artificial. In particular, NANO-D develops SAMSON (Software for Adaptive Modeling and Simulation Of Nanosystems), a software application which gathers all algorithms designed by the group and its collaborators.
In this project, I propose to develop a unified theory, and associated algorithms, for adaptive particle simulation. The proposed theory will avoid problems that plague current popular multi-scale or hybrid simulation approaches by simulating a single potential throughout the system, while allowing users to finely trade precision for computational speed.
I believe the full development of the adaptive particle simulation theory will have an important impact on current modeling and simulation practices, and will enable practical design of complex nanosystems on desktop computers, which should significantly boost the emergence of generic nano-engineering.
Max ERC Funding
1 476 882 €
Duration
Start date: 2012-09-01, End date: 2017-08-31
Project acronym ADAPT
Project Origins and factors governing adaptation: Insights from experimental evolution and population genomic data
Researcher (PI) Thomas Martin Jean Bataillon
Host Institution (HI) AARHUS UNIVERSITET
Country Denmark
Call Details Starting Grant (StG), LS8, ERC-2012-StG_20111109
Summary I propose a systematic study of the types of genetic variation enabling adaptation and of the factors that limit rates of adaptation in natural populations. New methods will be developed for analysing data from experimental evolution and population genomics, and applied to state-of-the-art data from both fields. Adaptation is generated by natural selection sieving through heritable variation. Examples of adaptation are available from the fossil record and from extant populations. Genomic studies have supplied many instances of genomic regions exhibiting footprints of natural selection favouring new variants. Despite ample proof that adaptation happens, we know little about beneficial mutations, the raw material enabling adaptation. Is adaptation mediated by genetic variation pre-existing in the population, or by variation supplied de novo through mutations? We know even less about what factors limit rates of adaptation. Answers to these questions are crucial for evolutionary biology, but also for credible quantification of the evolutionary potential of populations. Population genetic theory makes predictions and allows inference from patterns of polymorphism within species and divergence between species, yet models specifying the fitness effects of mutations are often missing. Fitness landscape models will be mobilized to fill this gap and to develop methods for inferring the distribution of fitness effects and the factors governing rates of adaptation. Insights into the processes underlying adaptation will thus be gained from experimental evolution and population genomics data, and the applicability of insights gained from experimental evolution to comprehending adaptation in nature will be scrutinized. We will unite two very different approaches for studying adaptation. The project will boost our understanding of how selection shapes genomes and open the way for further quantitative tests of theories of adaptation.
Max ERC Funding
1 159 857 €
Duration
Start date: 2013-04-01, End date: 2018-03-31
Project acronym APGREID
Project Ancient Pathogen Genomics of Re-Emerging Infectious Disease
Researcher (PI) Johannes Krause
Host Institution (HI) Klinik Max Planck Institut für Psychiatrie
Country Germany
Call Details Starting Grant (StG), LS8, ERC-2012-StG_20111109
Summary Here we propose a first step toward a direct reconstruction of the evolutionary history of human infectious disease agents by obtaining genome-wide data from historic pathogens. Through an extensive screening of skeletal collections from well-characterized catastrophe, or emergency, mass burials, we plan to detect and sequence pathogen DNA from various historic pandemics spanning at least 2,500 years, using a general-purpose molecular capture method that will screen for hundreds of pathogens in a single assay. Subsequent experiments will attempt to reconstruct full genomes of all pathogenic species identified. The molecular fossil record of human pathogens will provide insights into host adaptation and the evolutionary rates of infectious disease. In addition, human genomic regions relating to disease susceptibility and immunity will be characterized in the skeletal material in order to observe the direct effect that pathogens have had on the genetic makeup of human populations over time. The results of this project will allow a multidisciplinary interpretation of historical pandemics that have influenced the course of human history. They will provide priceless information for the fields of history, evolutionary biology, anthropology, and medicine, and will have direct consequences for how we manage emerging and re-emerging infectious disease in the future.
Max ERC Funding
1 474 560 €
Duration
Start date: 2013-01-01, End date: 2017-12-31
Project acronym APOQUANT
Project The quantitative Bcl-2 interactome in apoptosis: decoding how cancer cells escape death
Researcher (PI) Ana Jesus Garcia Saez
Host Institution (HI) EBERHARD KARLS UNIVERSITAET TUEBINGEN
Country Germany
Call Details Starting Grant (StG), LS3, ERC-2012-StG_20111109
Summary The proteins of the Bcl-2 family function as key regulators of apoptosis by controlling the permeabilization of the mitochondrial outer membrane. They form an intricate, fine-tuned interaction network which is altered in cancer cells to avoid cell death. Currently, we do not understand how signaling within this network, which combines events in the cytosol and in membranes, is orchestrated to decide cell fate. The main goal of this proposal is to unravel how apoptosis signaling is integrated by the Bcl-2 network, by determining the quantitative Bcl-2 interactome and building with it a mathematical model that identifies which interactions determine the overall outcome. To this aim, we have established a reconstituted system for the quantification of the interactions between Bcl-2 proteins, not only in solution but also in membranes, at the single-molecule level by fluorescence correlation spectroscopy (FCS).
(1) This project aims to quantify the relative affinities within a reconstituted Bcl-2 network by FCS.
(2) This will be combined with quantitative studies in living cells, which include the signaling pathway in its entirety. To this aim, we will develop new FCS methods for mitochondria.
(3) The structural and dynamic aspects of the Bcl-2 network will be studied by super resolution and live cell microscopy.
(4) The acquired knowledge will be used to build a mathematical model that uncovers how the multiple interactions within the Bcl-2 network are integrated and identifies critical steps in apoptosis regulation.
These studies are expected to broaden the general knowledge about the design principles of cellular signaling as well as how cancer cells alter the Bcl-2 network to escape cell death. This systems analysis will allow us to predict which perturbations in the Bcl-2 network of cancer cells can switch signaling towards cell death. Ultimately it could be translated into clinical applications for anticancer therapy.
Max ERC Funding
1 462 900 €
Duration
Start date: 2013-04-01, End date: 2019-03-31
Project acronym ARCHOFCON
Project The Architecture of Consciousness
Researcher (PI) Timothy John Bayne
Host Institution (HI) THE UNIVERSITY OF MANCHESTER
Country United Kingdom
Call Details Starting Grant (StG), SH4, ERC-2012-StG_20111124
Summary The nature of consciousness is one of the great unsolved mysteries of science. Although the global research effort dedicated to explaining how consciousness arises from neural and cognitive activity is now more than two decades old, as yet there is no widely accepted theory of consciousness. One reason why no adequate theory of consciousness has yet been found is that there is a lack of clarity about what exactly a theory of consciousness needs to explain. What is needed is thus a model of the general features of consciousness — a model of the ‘architecture’ of consciousness — that will systematize the structural differences between conscious states, processes and creatures on the one hand and unconscious states, processes and creatures on the other. The aim of this project is to remove one of the central impediments to the progress of the science of consciousness by constructing such a model.
A great many of the data required for this task already exist, but these data concern different aspects of consciousness and are distributed across many disciplines. As a result, there have been few attempts to develop a truly comprehensive model of the architecture of consciousness. This project will overcome the limitations of previous work by drawing on research in philosophy, psychology, psychiatry, and cognitive neuroscience to develop a model of the architecture of consciousness that is structured around five of its core features: its subjectivity, its temporality, its unity, its selectivity, and its dimensionality (that is, the relationship between the levels of consciousness and the contents of consciousness). By providing a comprehensive characterization of what a theory of consciousness needs to explain, this project will provide a crucial piece of the puzzle of consciousness, enabling future generations of researchers to bridge the gap between raw data on the one hand and a full-blown theory of consciousness on the other.
Max ERC Funding
1 477 483 €
Duration
Start date: 2013-03-01, End date: 2018-02-28
Project acronym ASTROLAB
Project Cold Collisions and the Pathways Toward Life in Interstellar Space
Researcher (PI) Holger Kreckel
Host Institution (HI) Max-Planck-Institut für Kernphysik
Country Germany
Call Details Starting Grant (StG), PE9, ERC-2012-StG_20111012
Summary Modern telescopes like Herschel and ALMA open up a new window into molecular astrophysics to investigate a surprisingly rich chemistry that operates even at low densities and low temperatures. Observations with these instruments have the potential of unraveling key questions of astrobiology, like the accumulation of water and pre-biotic organic molecules on (exo)planets from asteroids and comets. Hand-in-hand with the heightened observational activities comes a strong demand for a thorough understanding of the molecular formation mechanisms. The vast majority of interstellar molecules are formed in ion-neutral reactions that remain efficient even at low temperatures. Unfortunately, these reactions are hard to reproduce under terrestrial conditions, which makes their laboratory study extremely difficult.
To address these issues, I propose to build a versatile merged beams setup for laboratory studies of ion-neutral collisions at the Cryogenic Storage Ring (CSR), the most ambitious of the next-generation storage devices under development worldwide. With this experimental setup, I will make use of a low-temperature and low-density environment that is ideal to simulate the conditions prevailing in interstellar space. The cryogenic surroundings, in combination with laser-generated ground-state atom beams, will allow me to perform precise energy-resolved rate coefficient measurements for reactions between cold molecular ions (e.g., H2+, H3+, HCO+, CH2+ or CH3+) and neutral atoms (H, D, C or O), in order to shed light on long-standing problems of astrochemistry and the formation of organic molecules in space.
With the collision energy tunable over a wide range (corresponding to temperatures of 40-40,000 K), I will be able to provide data that are crucial for the interpretation of molecular observations in a variety of objects, ranging from cold molecular clouds to warm layers in protoplanetary disks.
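For context (a textbook estimate, not a project measurement): the efficiency of ion-neutral reactions at low temperature follows from the classical Langevin capture model, whose rate coefficient k_L = 2*pi*q*sqrt(alpha/mu) is independent of temperature. A sketch in CGS units, using illustrative values for a reaction like H3+ + O, together with the conversion of the quoted temperature range into collision energies:

```python
import math

# Langevin capture rate k_L = 2*pi*q*sqrt(alpha/mu) in CGS units.
# The values below are illustrative, not measured project data.
q = 4.803e-10        # elementary charge [esu]
alpha = 0.8e-24      # approx. dipole polarizability of atomic O [cm^3]
amu = 1.6605e-24     # atomic mass unit [g]
mu = (3.0 * 16.0) / (3.0 + 16.0) * amu   # reduced mass of H3+ + O [g]

k_L = 2.0 * math.pi * q * math.sqrt(alpha / mu)
print(f"k_L ~ {k_L:.1e} cm^3 s^-1")      # of order 1e-9, independent of T

# The 40-40,000 K range corresponds to collision energies E = k_B * T:
k_B = 8.617e-5                            # Boltzmann constant [eV/K]
print(f"{40 * k_B * 1e3:.1f} meV to {40000 * k_B:.1f} eV")
```

Because k_L carries no temperature dependence, ion-neutral chemistry stays fast even in 10 K molecular clouds — the energy-resolved CSR measurements would quantify where real reactions deviate from this simple capture limit.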
Max ERC Funding
1 486 800 €
Duration
Start date: 2012-09-01, End date: 2017-11-30