Project acronym ASTROGEOBIOSPHERE
Project An astronomical perspective on Earth's geological record and evolution of life
Researcher (PI) Birger Schmitz
Host Institution (HI) LUNDS UNIVERSITET
Call Details Advanced Grant (AdG), PE10, ERC-2011-ADG_20110209
Summary "This project will develop the use of relict, extraterrestrial minerals in Archean to Cenozoic slowly formed sediments as tracers of events in the solar system and cosmos, and to decipher the possible relation between such events and evolution of life and environmental change on Earth. There has been consensus that it would not be possible to reconstruct variations in the flux of different types of meteorites to Earth through the ages. Meteorite falls are rare and meteorites weather and decay rapidly on the Earth surface. However, the last years we have developed the first realistic approach to circumvent these problems. Almost all meteorite types contain a small fraction of spinel minerals that survives weathering and can be recovered from large samples of condensed sediments of any age. Inside the spinels we can locate by synchrotron-light X-ray tomography 1-30 micron sized inclusions of most of the other minerals that made up the original meteorite. With cutting-edge frontier microanalyses such as Ne-21 (solar wind, galactic rays), oxygen isotopes (meteorite group and type) and cosmic ray tracks (supernova densities) we will be able to unravel from the geological record fundamental new information about the solar system at specific times through the past 3.8 Gyr. Variations in flux and types of meteorites may reflect solar-system and galaxy gravity disturbances as well as the sequence of disruptions of the parent bodies for meteorite types known and not yet known. Cosmic-ray tracks in spinels may identify the galactic year (230 Myr) in the geological record. For the first time it will be possible to systematically relate major global biotic and tectonic events, changes in sea-level, climate and asteroid and comet impacts to what happened in the larger astronomical realm. In essence, the project is a robust approach to establish a pioneer ""astrostratigraphy"" for Earth's geological record, complementing existing bio-, chemo-, and magnetostratigraphies."
Max ERC Funding
1 950 000 €
Duration
Start date: 2012-04-01, End date: 2017-03-31
Project acronym ATMOGAIN
Project Atmospheric Gas-Aerosol Interface: From Fundamental Theory to Global Effects
Researcher (PI) Ilona Anniina Riipinen
Host Institution (HI) STOCKHOLMS UNIVERSITET
Call Details Starting Grant (StG), PE10, ERC-2011-StG_20101014
Summary Atmospheric aerosol particles are a major player in the Earth system: they impact the climate by scattering and absorbing solar radiation, as well as by regulating the properties of clouds. On regional scales, aerosol particles are among the main pollutants deteriorating air quality. Capturing the impact of aerosols is one of the main challenges in understanding the driving forces behind changing climate and air quality.
Atmospheric aerosol numbers are governed by ultrafine (< 100 nm in diameter) particles. Most of these particles have formed from atmospheric vapours, and their fate and impacts are governed by the mass transport processes between the gas and particulate phases. These transport processes are currently poorly understood. Correct representation of aerosol growth and shrinkage by condensation and evaporation of atmospheric vapours is thus a prerequisite for capturing the evolution and impacts of aerosols.
I propose to start a research group that will address the major current unknowns in atmospheric ultrafine particle growth and evaporation. First, we will develop a unified theoretical framework to describe the mass accommodation processes at aerosol surfaces, aiming to resolve the current ambiguity with respect to the uptake of atmospheric vapours by aerosols. Second, we will study the condensational properties of selected organic compounds and their mixtures. Organic compounds are known to contribute significantly to atmospheric aerosol growth, but the properties that govern their condensation, such as saturation vapour pressures and activities, are largely unknown. Third, we aim to resolve the gas- and particulate-phase processes that govern the growth of realistic atmospheric aerosol. Fourth, we will parameterize ultrafine aerosol growth, implement the parameterizations in chemical transport models, and quantify the impact of these condensation and evaporation processes on global and regional aerosol budgets.
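For orientation, single-vapour condensational growth across the free-molecular-to-continuum transition is commonly modeled with a Fuchs-Sutugin-corrected flux; the sketch below uses that standard textbook form, with placeholder parameter values and an assumed accommodation coefficient (none of it specific to the project):

```python
def fuchs_sutugin(kn, alpha):
    """Transition-regime correction factor beta(Kn, alpha)."""
    return (1 + kn) / (1 + (4/(3*alpha) + 0.377)*kn + (4/(3*alpha))*kn**2)

def ddp_dt(dp, D, M, p_inf, p_surf, T, rho_p, mfp, alpha=1.0):
    """Diameter growth rate (m/s) of a particle condensing a single vapour.

    dp: particle diameter (m); D: vapour diffusivity (m^2/s); M: molar mass
    (kg/mol); p_inf, p_surf: vapour pressure far from and at the particle
    surface (Pa); rho_p: particle density (kg/m^3); mfp: vapour mean free
    path (m); alpha: mass accommodation coefficient (assumed here).
    """
    R = 8.314  # J/(mol K)
    kn = 2 * mfp / dp
    beta = fuchs_sutugin(kn, alpha)
    return 4 * D * M * beta * (p_inf - p_surf) / (R * T * rho_p * dp)

# Placeholder example: a low-volatility organic vapour (~1e-7 Pa, roughly
# ppt level) condensing on a 10 nm particle at 293 K.
print(ddp_dt(10e-9, 1e-5, 0.3, 1e-7, 0.0, 293.0, 1400.0, 100e-9))
```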
Max ERC Funding
1 498 099 €
Duration
Start date: 2011-09-01, End date: 2016-08-31
Project acronym AXION
Project Axions: From Heaven to Earth
Researcher (PI) Frank Wilczek
Host Institution (HI) STOCKHOLMS UNIVERSITET
Call Details Advanced Grant (AdG), PE2, ERC-2016-ADG
Summary Axions are hypothetical particles whose existence would solve two major problems: the strong P, T problem (a major blemish on the standard model); and the dark matter problem. It is a most important goal to either observe or rule out the existence of a cosmic axion background. It appears that decisive observations may be possible, but only after orchestrating insight from specialities ranging from quantum field theory and astrophysical modeling to ultra-low noise quantum measurement theory. Detailed predictions for the magnitude and structure of the cosmic axion background depend on cosmological and astrophysical modeling, which can be constrained by theoretical insight and numerical simulation. In parallel, we must optimize strategies for extracting accessible signals from that very weakly interacting source.
While the existence of axions as fundamental particles remains hypothetical, the equations governing how axions interact with electromagnetic fields also govern (with different parameters) how certain materials interact with electromagnetic fields. Thus those materials embody “emergent” axions. The equations have remarkable properties, which one can test in these materials, and possibly put to practical use.
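For reference, the equations alluded to are those of axion electrodynamics; a standard textbook form is sketched below (signs and normalizations vary by convention, and this is an illustration rather than a quotation from the proposal):

```latex
% Axion electrodynamics in one common convention (illustrative only):
\mathcal{L} = -\tfrac{1}{4}F_{\mu\nu}F^{\mu\nu}
            + \tfrac{1}{2}\,\partial_\mu a\,\partial^\mu a
            - \tfrac{1}{2}\,m_a^2 a^2
            + \tfrac{g_{a\gamma\gamma}}{4}\, a\,F_{\mu\nu}\tilde F^{\mu\nu},
\qquad
\tfrac{g_{a\gamma\gamma}}{4}\, a\,F_{\mu\nu}\tilde F^{\mu\nu}
  \propto g_{a\gamma\gamma}\, a\,\mathbf{E}\cdot\mathbf{B},
% which modifies two of Maxwell's equations wherever a varies:
\nabla\cdot\mathbf{E} = \rho - g_{a\gamma\gamma}\,\nabla a\cdot\mathbf{B},
\qquad
\nabla\times\mathbf{B} - \dot{\mathbf{E}} =
  \mathbf{J} + g_{a\gamma\gamma}\bigl(\dot a\,\mathbf{B} + \nabla a\times\mathbf{E}\bigr).
```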
Closely related to axions, mathematically, are anyons. Anyons are particle-like excitations that elude the familiar classification into bosons and fermions. Theoretical and numerical studies indicate that they are common emergent features of highly entangled states of matter in two dimensions. Recent work suggests the existence of states of matter, both natural and engineered, in which anyon dynamics is both important and experimentally accessible. Since the equations for anyons and axions are remarkably similar, and both have common, deep roots in symmetry and topology, it will be fruitful to consider them together.
Max ERC Funding
2 324 391 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym CAAXPROCESSINGHUMDIS
Project CAAX Protein Processing in Human DIsease: From Cancer to Progeria
Researcher (PI) Martin Olof Bergö
Host Institution (HI) GOETEBORGS UNIVERSITET
Call Details Starting Grant (StG), LS6, ERC-2007-StG
Summary My objective is to understand the physiologic and medical importance of the posttranslational processing of CAAX proteins (e.g., K-RAS and prelamin A) and to define the suitability of the CAAX protein processing enzymes as therapeutic targets for the treatment of cancer and progeria. CAAX proteins undergo three posttranslational processing steps at a carboxyl-terminal CAAX motif. These processing steps, which are mediated by four different enzymes (FTase, GGTase-I, RCE1, and ICMT), increase the hydrophobicity of the carboxyl terminus of the protein and thereby facilitate interactions with membrane surfaces. Somatic mutations in K-RAS deregulate cell growth and are etiologically involved in the pathogenesis of many forms of cancer. A mutation in prelamin A causes Hutchinson-Gilford progeria syndrome—a pediatric progeroid syndrome associated with misshapen cell nuclei and a host of aging-like disease phenotypes. One strategy to render the mutant K-RAS and prelamin A less harmful is to interfere with their ability to bind to membrane surfaces (e.g., the plasma membrane and the nuclear envelope). This could be accomplished by inhibiting the enzymes that modify the CAAX motif. My Specific Aims are: (1) To define the suitability of the CAAX processing enzymes as therapeutic targets in the treatment of K-RAS-induced lung cancer and leukemia; and (2) To test the hypothesis that inactivation of FTase or ICMT will ameliorate disease phenotypes of progeria. I have developed genetic strategies to produce lung cancer or leukemia in mice by activating an oncogenic K-RAS and simultaneously inactivating different CAAX processing enzymes. I will also inactivate several CAAX processing enzymes in mice with progeria—both before the emergence of phenotypes and after the development of advanced disease phenotypes. These experiments should reveal whether the absence of the different CAAX processing enzymes affects the onset, progression, or regression of cancer and progeria.
Max ERC Funding
1 689 600 €
Duration
Start date: 2008-06-01, End date: 2013-05-31
Project acronym CC-MEM
Project Coordination and Composability: The Keys to Efficient Memory System Design
Researcher (PI) David BLACK-SCHAFFER
Host Institution (HI) UPPSALA UNIVERSITET
Call Details Starting Grant (StG), PE6, ERC-2016-STG
Summary Computer systems today are power-limited. As a result, efficiency gains can be translated into performance. Over the past decade we have been so effective at making computation more efficient that we now spend as much energy moving data (from memory to cache to processor) as we do computing the results. And this trend is only becoming worse as we demand more bandwidth for more powerful processors. To improve performance we need to revisit the way we design memory systems from an energy-first perspective, both at the hardware level and by coordinating data movement between hardware and software.
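The claim that data movement now costs as much energy as computation can be made concrete with a back-of-envelope calculation; the per-operation energies below are rough literature ballparks (assumed for illustration, not figures from the proposal):

```python
# Order-of-magnitude energy accounting for compute vs. data movement.
E_OP = 20e-12    # ~20 pJ per 64-bit arithmetic operation (assumed)
E_DRAM = 1.0e-9  # ~1 nJ to fetch a 64-bit word from DRAM (assumed)

ops_per_fetch = E_DRAM / E_OP
print(f"~{ops_per_fetch:.0f} ops per DRAM word before compute energy "
      f"matches movement energy")
# -> roughly 50: code doing fewer than ~50 ops per fetched word spends
#    more energy on data movement than on the computation itself.
```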
CC-MEM will address memory system efficiency by redesigning low-level hardware and high-level hardware/software integration for energy efficiency. The key novelty is in developing a framework for creating efficient memory systems. This framework will enable researchers and designers to compose solutions to different memory system problems (through a shared exchange of metadata) and coordinate them towards high-level system efficiency goals (through a shared policy framework). Central to this framework is a bilateral exchange of metadata and policy between hardware and software components. This novel communication will open new challenges and opportunities for fine-grained optimizations, system-level efficiency metrics, and more effective divisions of responsibility between hardware and software components.
CC-MEM will change how researchers and designers approach memory system design from today’s ad hoc development of local solutions to one wherein disparate components can be integrated (composed) and driven (coordinated) by system-level metrics. As a result, we will be able to more intelligently manage data, leading to dramatically lower memory system energy and increased performance, and open new possibilities for hardware and software optimizations.
Max ERC Funding
1 610 000 €
Duration
Start date: 2017-03-01, End date: 2022-02-28
Project acronym DisDyn
Project Distributed and Dynamic Graph Algorithms and Complexity
Researcher (PI) Danupon NA NONGKAI
Host Institution (HI) KUNGLIGA TEKNISKA HOEGSKOLAN
Call Details Starting Grant (StG), PE6, ERC-2016-STG
Summary This project aims to (i) resolve challenging graph problems in distributed and dynamic settings, with a focus on connectivity problems (such as computing edge connectivity and distances), and (ii) along the way, develop a systematic approach to attacking problems in these settings by thoroughly exploring the relevant algorithmic and complexity-theoretic landscapes. Tasks include
- building a hierarchy of intermediate computational models so that designing algorithms and proving lower bounds can be done in several intermediate steps,
- explaining the limits of algorithms by proving conditional lower bounds based on old and new reasonable conjectures, and
- connecting techniques in the two settings to generate new insights that are unlikely to emerge from the isolated viewpoint of a single field.
The project will take advantage of, and contribute to, developments in several young fields of theoretical computer science, such as fine-grained complexity and sublinear algorithms. Resolving even one of the connectivity problems would already be a groundbreaking result; given the approach, however, it is likely that one breakthrough will lead to many others.
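For orientation, the incremental special case of connectivity (edge insertions only) is classical and solved near-optimally by union-find; the fully dynamic and distributed variants targeted by the project are far harder. A minimal sketch of the easy case (illustrative, not one of the project's algorithms):

```python
class UnionFind:
    """Incremental connectivity: near-constant amortized time per operation."""
    def __init__(self, n):
        self.parent = list(range(n))
        self.rank = [0] * n

    def find(self, x):
        # Path halving keeps trees shallow.
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]
            x = self.parent[x]
        return x

    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra == rb:
            return False            # a and b were already connected
        if self.rank[ra] < self.rank[rb]:
            ra, rb = rb, ra
        self.parent[rb] = ra        # union by rank
        if self.rank[ra] == self.rank[rb]:
            self.rank[ra] += 1
        return True

uf = UnionFind(5)
uf.union(0, 1); uf.union(3, 4)
print(uf.find(0) == uf.find(1), uf.find(0) == uf.find(3))  # True False
```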
Max ERC Funding
1 500 000 €
Duration
Start date: 2017-02-01, End date: 2022-01-31
Project acronym ERIKLINDAHLERC2007
Project Multiscale and Distributed Computing Algorithms for Biomolecular Simulation and Efficient Free Energy Calculations
Researcher (PI) Erik Lindahl
Host Institution (HI) KUNGLIGA TEKNISKA HOEGSKOLAN
Call Details Starting Grant (StG), PE4, ERC-2007-StG
Summary The long-term goal of our research is to advance the state of the art in molecular simulation algorithms by 4-5 orders of magnitude, particularly in the context of the GROMACS software we are developing. This is an immense challenge, but with huge potential rewards: it will be an amazing virtual microscope for basic chemistry, polymer and materials science research; it could help us understand the molecular basis of diseases such as Creutzfeldt-Jakob; and it would enable rational design rather than random screening for future drugs. To realize it, we will focus on four critical topics:
• ALGORITHMS FOR SIMULATION ON GRAPHICS AND OTHER STREAMING PROCESSORS: Graphics cards and the experimental Intel 80-core chip are not only the most powerful processors available; this type of streaming architecture will also power many supercomputers in 3-5 years, so it is critical that we design new “streamable” MD algorithms.
• MULTISCALE MODELING: We will develop virtual-site-based methods to bridge atomic and mesoscopic dynamics, QM/MM, and mixed explicit/implicit solvent models with water layers around macromolecules.
• MULTI-LEVEL PARALLEL & DISTRIBUTED SIMULATION: Distributed computing provides virtually infinite computer power, but has been limited to small systems. We will address this by combining SMP parallelization with Markov state models that partition phase space into transition/local dynamics, enabling distributed simulation of arbitrary systems.
• EFFICIENT FREE ENERGY CALCULATIONS: We will design algorithms for multi-conformational parallel sampling, implement Bennett acceptance ratios in GROMACS, derive correction terms for PME lattice sums, and combine standard force fields with polarization/multipoles (e.g., AMOEBA).
We have a very strong track record of converting methodological advances into applications, and the results will have impact on a wide range of fields, from biomolecules and polymer science through materials simulation and nanotechnology.
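Since Bennett acceptance ratios are named explicitly, a minimal sketch of the equal-sample-size BAR estimator may help orient the reader: the free energy difference is the root of a monotone self-consistent equation over forward and reverse work values. Conventions and the toy data below are assumptions; production implementations (such as the one later added to GROMACS) also handle unequal sample sizes and error estimates.

```python
import numpy as np
from scipy.optimize import brentq

def bar_delta_f(w_f, w_r, beta=1.0):
    """Bennett acceptance ratio, equal numbers of forward/reverse samples.

    Solves sum_i f(beta*(w_f_i - dF)) = sum_j f(beta*(w_r_j + dF)),
    where f(x) = 1/(1 + exp(x)) is the Fermi function, w_f are forward
    work values and w_r reverse work values (sign conventions assumed).
    """
    def imbalance(df):
        fwd = 1.0 / (1.0 + np.exp(beta * (w_f - df)))
        rev = 1.0 / (1.0 + np.exp(beta * (w_r + df)))
        return fwd.sum() - rev.sum()
    # imbalance is monotone in df, so a wide bracket guarantees a root.
    span = np.abs(w_f).max() + np.abs(w_r).max() + 10.0 / beta
    return brentq(imbalance, -span, span)

# Toy Gaussian work distributions, roughly consistent with dF ~ 1 (kT units)
rng = np.random.default_rng(1)
w_f = rng.normal(2.0, 1.5, 1000)   # forward work: dF + dissipation
w_r = rng.normal(0.0, 1.5, 1000)   # reverse work: -dF + dissipation
print(bar_delta_f(w_f, w_r))       # -> roughly 1 for this toy data
```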
Max ERC Funding
992 413 €
Duration
Start date: 2008-09-01, End date: 2013-08-31
Project acronym FatemapB
Project High Resolution Mapping of Fetal and Adult B Cell Fates During Ontogeny
Researcher (PI) Joan YUAN
Host Institution (HI) LUNDS UNIVERSITET
Call Details Starting Grant (StG), LS6, ERC-2016-STG
Summary FateMapB aims to understand how the unique differentiation potential of fetal hematopoietic stem and progenitor cells (HSPCs) contributes to functionally distinct cell types of the adult immune system. While most immune cells are replenished by HSPCs throughout life, others emerge during a limited window in fetal life and are sustained through self-renewal in situ. The lineage identity of fetal HSPCs, and the extent of their contribution to the adult immune repertoire, remain surprisingly unclear. I previously identified the fetal-specific RNA-binding protein Lin28b as a post-transcriptional molecular switch capable of inducing fetal-like hematopoiesis in adult bone marrow HSPCs (Yuan et al., Science, 2012). This discovery has afforded me unique perspectives on the formation of the mammalian immune system. The concept that the mature immune system is a mosaic of fetal- and adult-derived cell types is addressed herein, with an emphasis on the B cell lineage. We will use two complementary lineage-tracing technologies to stratify the immune system as a function of developmental time, generating fundamental insight into the division of labor between fetal and adult HSPCs that ultimately provides effective host protection.
Aim 1. Determine the qualitative and quantitative contribution of fetal HSPCs to the mature immune repertoire in situ through Cre-recombination-mediated lineage tracing.
Aim 2. Resolve the disputed lineage relationship between fetal-derived B1a cells and adult-derived B2 cells by single-cell lineage tracing using cellular barcoding in vivo.
Aim 3. Characterize the mechanism and effector functions of Lin28b-induced B1a cell development, to assess the clinical utility of inducible fetal-like lymphopoiesis.
The implications of FateMapB extend beyond normal development to immune regeneration and age-related features of leukemogenesis. Finally, our combinatorial lineage-tracing approach enables dissection of cell fates with previously unattainable resolution.
Max ERC Funding
1 499 905 €
Duration
Start date: 2017-10-01, End date: 2022-09-30
Project acronym FLEXBOT
Project Flexible object manipulation based on statistical learning and topological representations
Researcher (PI) Danica Kragic Jensfelt
Host Institution (HI) KUNGLIGA TEKNISKA HOEGSKOLAN
Call Details Starting Grant (StG), PE6, ERC-2011-StG_20101014
Summary A vision for the future is autonomous and semi-autonomous systems that perform complex tasks safely and robustly in interaction with humans and the environment. The actions of such a system need to be carefully planned and executed, taking into account the available sensory feedback and knowledge about the environment. Many existing approaches view motion planning as a purely geometrical problem and do not take uncertainty into account. Our goal is to study how different types of representations and algorithms from machine learning and classical mathematics can be used to solve some of the open problems in action recognition and action generation.
FLEXBOT will explore how topological representations can be used for an integrated approach toward (i) vision-based understanding of complex human hand motion, (ii) mapping and control of robotic hands, and (iii) integrating the topological representations with models for high-level task encoding and planning.
Our research opens new and important directions, scientifically and technologically. Scientifically, we push for a new way of thinking in an area that has traditionally grown out of the mechanical modeling of bodies. Technologically, we will provide methods suitable for evaluating new designs of robotic and prosthetic hands. Further development of machine learning and computer vision methods will allow for scene understanding that goes beyond the assumption of a world of rigid bodies, encompassing complex articulated objects such as hands.
Max ERC Funding
1 398 720 €
Duration
Start date: 2012-01-01, End date: 2017-12-31
Project acronym GLOBALVISION
Project Global Optimization Methods in Computer Vision, Pattern Recognition and Medical Imaging
Researcher (PI) Fredrik Kahl
Host Institution (HI) LUNDS UNIVERSITET
Call Details Starting Grant (StG), PE5, ERC-2007-StG
Summary Computer vision concerns itself with understanding the real world through the analysis of images. Typical problems are object recognition, medical image segmentation, geometric reconstruction problems and the navigation of autonomous vehicles. Such problems often lead to complicated optimization problems with a mixture of discrete and continuous variables, or even infinite-dimensional variables in terms of curves and surfaces. Today, the state of the art in solving these problems generally relies on heuristic methods that generate only local optima of varying quality. During the last few years, work by the applicant, co-workers, and others has opened new possibilities; this research project builds on that work. We will focus on developing new global optimization methods for computing high-quality solutions for a broad class of problems. A guiding principle will be to relax the original, complicated problem to an approximate, simpler one to which globally optimal solutions can more easily be computed; technically, this relaxed problem is often convex. A crucial point in this approach is to estimate the quality of the exact solution of the approximate problem relative to the (unknown) global optimum of the original problem. Preliminary results have been well received by the research community, and we now wish to extend this work to more difficult and more general problem settings, resulting in a thorough re-examination of algorithms used widely in different and trans-disciplinary fields. This project is to be considered a basic research project with relevance to industry. The expected outcome is new knowledge, spread to a wide community through scientific papers published in international journals and conferences as well as publicly available software.
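The relax-then-bound principle described above can be made concrete in a few lines: relaxing a binary least-squares problem to its convex box-constrained version yields a lower bound on the unknown global optimum, while rounding the relaxed solution gives a feasible point and hence an upper bound. A toy sketch (the problem and data are invented for illustration):

```python
import numpy as np
from scipy.optimize import lsq_linear

# Toy combinatorial problem: min ||Ax - b||^2 over x in {0,1}^n.
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 10))
x_true = rng.integers(0, 2, 10).astype(float)
b = A @ x_true + 0.05 * rng.standard_normal(30)

# Convex relaxation: replace the binary set {0,1} with the box [0,1].
res = lsq_linear(A, b, bounds=(0.0, 1.0))
lower = np.sum((A @ res.x - b) ** 2)   # lower-bounds the global binary optimum

# Any feasible (rounded) point upper-bounds the global optimum.
x_round = np.round(res.x)
upper = np.sum((A @ x_round - b) ** 2)

print(f"global binary optimum lies in [{lower:.4f}, {upper:.4f}]")
# A small gap (upper - lower) certifies the rounded solution as near-optimal.
```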
Max ERC Funding
1 440 000 €
Duration
Start date: 2008-07-01, End date: 2013-06-30