Project acronym BRAIN-MATCH
Project Matching CNS Lineage Maps with Molecular Brain Tumor Portraits for Translational Exploitation
Researcher (PI) Stefan PFISTER
Host Institution (HI) DEUTSCHES KREBSFORSCHUNGSZENTRUM HEIDELBERG
Country Germany
Call Details Consolidator Grant (CoG), LS2, ERC-2018-COG
Summary Brain tumors represent an extremely heterogeneous group of more than 100 different molecularly distinct diseases, many of which are still almost uniformly lethal despite five decades of clinical trials. In contrast to hematologic malignancies and carcinomas, the cell-of-origin for the vast majority of these entities is unknown. This knowledge gap currently precludes a comprehensive understanding of tumor biology and also limits translational exploitation (e.g., utilizing lineage targets for novel therapies and circulating brain tumor cells for liquid biopsies).
The BRAIN-MATCH project represents an ambitious program to address this challenge and unmet medical need by taking an approach that (i) extensively utilizes existing molecular profiles of more than 30,000 brain tumor samples covering more than 100 different entities, publicly available single-cell sequencing data of normal brain regions, and bulk normal tissue data at different times of development across different species; (ii) generates unprecedented maps of normal human CNS development using state-of-the-art technologies; (iii) matches these molecular portraits of normal cell types with tumor datasets in order to identify specific cell-of-origin populations for individual tumor entities; and (iv) validates the most promising cell-of-origin populations and tumor-specific lineage and/or surface markers in vivo.
The expected outputs of BRAIN-MATCH are four-fold: (i) delivery of an unprecedented atlas of normal human CNS development, which will also be of great relevance for diverse fields other than cancer; (ii) functional validation of at least three lineage targets; (iii) isolation and molecular characterization of circulating brain tumor cells from patients' blood for at least five tumor entities; and (iv) generation of at least three novel mouse models of brain tumor entities for which no faithful models currently exist.
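As a rough illustration of the matching step in (iii), the sketch below ranks candidate cells of origin by correlating bulk tumor expression profiles against mean cell-type signatures derived from single-cell data. The function name, inputs, and the use of a simple rank correlation are illustrative assumptions, not the project's actual pipeline.

```python
# Sketch: rank candidate cells of origin by correlating each tumor's bulk
# expression profile against mean expression signatures of normal cell types.
# Assumes both matrices are log-normalized and share the same gene order
# (illustrative assumption; the project's real matching is more elaborate).
import numpy as np
from scipy.stats import spearmanr

def rank_cells_of_origin(tumor_profiles, celltype_signatures, celltype_names):
    """tumor_profiles: (n_tumors, n_genes); celltype_signatures: (n_types, n_genes)."""
    scores = np.zeros((tumor_profiles.shape[0], celltype_signatures.shape[0]))
    for i, tumor in enumerate(tumor_profiles):
        for j, sig in enumerate(celltype_signatures):
            rho, _ = spearmanr(tumor, sig)  # rank correlation is robust to platform effects
            scores[i, j] = rho
    # For each tumor, order cell types from most to least similar.
    order = np.argsort(-scores, axis=1)
    return [[celltype_names[j] for j in row] for row in order], scores
```

In practice such a ranking would be complemented by signature enrichment and deconvolution methods, but the correlation step conveys the matching logic.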
Max ERC Funding
1 999 875 €
Duration
Start date: 2019-05-01, End date: 2024-04-30
Project acronym CaLA
Project The Capillary Lock Actuator: A novel bistable microfluidic actuator for cost-effective high-density actuator arrays suitable for large-scale graphical tactile displays
Researcher (PI) Bastian Rapp
Host Institution (HI) ALBERT-LUDWIGS-UNIVERSITAET FREIBURG
Country Germany
Call Details Consolidator Grant (CoG), PE7, ERC-2018-COG
Summary According to the World Health Organization, more than 285 million people worldwide are visually impaired. In a world where graphics and online content (images, webpages) become increasingly important, the inability to perceive information visually is the primary obstacle to inclusion. In contrast to display technology for sighted people, tactile displays, which translate text and graphics into touchable pixels (taxels), have seen little progress in recent decades. So-called Braille lines, which display only a single line of text, are still the norm. The reason graphical tactile displays do not exist is the lack of a suitable actuator technology for generating massively parallelized, individually addressable, cost-effective taxel arrays.
This ERC Consolidator project aims at a revolution in microactuator array technology with a fundamentally new concept termed the Capillary Lock Actuator (CaLA). CaLA is a novel bistable, massively parallelizable microfluidic microactuator which overcomes many of the limitations currently associated with microactuators. It can be operated with low-voltage control signals and requires virtually no power for actuation. CaLA harnesses three concepts inherent to microfluidics: positive capillary pressure, segmented flow, and controllable, locally confined changes in wetting. The project will use CaLA actuator arrays to build the very first portable graphical tactile display with 30,000 individually addressable taxels, significantly outperforming the state-of-the-art. It will build on manufacturing techniques for highly complex glass microstructures invented by my group.
CaLA will be a significant breakthrough in actuator technology and an enabler for many applications in microsystems technology. Most importantly, it will be a significant step towards making information technology inclusive for the visually impaired by providing the first robust, cost-effective solution for large-scale tactile displays.
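For orientation, the positive capillary pressure that CaLA harnesses is governed by the Young-Laplace relation. The note below uses a generic cylindrical channel and example numbers, since the summary does not specify the actual CaLA geometry.

```latex
% Young-Laplace pressure of a meniscus in a cylindrical channel of radius r,
% with surface tension \gamma and contact angle \theta (generic illustration;
% the actual CaLA channel geometry is not specified in the summary):
\[
  \Delta p \;=\; \frac{2\gamma\cos\theta}{r}
\]
% For water (\gamma \approx 72\,\mathrm{mN/m}), \theta = 30^{\circ}, and
% r = 50\,\mu\mathrm{m}, this gives \Delta p \approx 2.5\,\mathrm{kPa}.
% Locally switching wettability (driving \theta past 90^{\circ}) flips the
% sign of \Delta p, which is one way a meniscus can be locked or released,
% consistent with the "controllable locally confined changes in wetting"
% named above.
```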
Max ERC Funding
1 999 750 €
Duration
Start date: 2019-05-01, End date: 2024-04-30
Project acronym CancerHetero
Project Dissection of tumor heterogeneity in vivo
Researcher (PI) Haikun Liu
Host Institution (HI) DEUTSCHES KREBSFORSCHUNGSZENTRUM HEIDELBERG
Country Germany
Call Details Consolidator Grant (CoG), LS2, ERC-2014-CoG
Summary It is now widely accepted that tumors are composed of heterogeneous populations of cells, which contribute to many aspects of the treatment resistance observed in the clinic. Despite this recognition, little evidence exists on the complexity and dynamics of this heterogeneity in vivo, mainly for lack of flexible genetic tools that allow sophisticated analysis of primary tumors. We recently developed a highly efficient somatic mouse brain tumor model with full penetrance of high-grade glioma development. Combining this model with several transgenic mouse lines allows us to isolate and track different populations of cells in primary tumors; importantly, we have confirmed that this can be done at the single-cell level. Here I propose to use this set of valuable genetic tools to dissect cellular heterogeneity in mouse gliomas. First, we will perform single-cell lineage-tracing experiments to quantify the contributions of brain tumor stem cells, tumor progenitors, and relatively differentiated cells, providing a complete dataset of the clonal dynamics of the different tumor cell types. Second, we will repeat these tracing experiments in the presence of conventional chemotherapy. Third, we will perform single-cell RNA sequencing to capture the molecular signatures that underlie the cellular heterogeneity uncovered by single-cell tracing. These results will be further validated by analyzing the molecular signatures in human primary tumors. We will also use our established in vivo target-validation approach to manipulate candidate molecular regulators and establish a functional link between molecular signatures and phenotypic heterogeneity. This project will greatly improve our understanding of tumor heterogeneity and may provide novel approaches and strategies for targeting human glioblastomas.
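As a small illustration of the clonal-dynamics read-out described above, the sketch below summarizes lineage-tracing data into clone sizes over time and per-clone cell-state composition. The table layout and column names are illustrative, not the project's actual data model.

```python
# Sketch: summarize clonal dynamics from single-cell lineage-tracing data.
# Assumes a table with one row per traced cell, carrying a clone barcode,
# the harvest time point (days), and an assigned cell state. All columns
# and values here are illustrative placeholders.
import pandas as pd

cells = pd.DataFrame({
    "clone": ["c1", "c1", "c1", "c2", "c2", "c3"],
    "day":   [14, 28, 28, 14, 28, 28],
    "state": ["stem", "stem", "progenitor", "progenitor",
              "differentiated", "differentiated"],
})

# Clone size over time: how many cells each clone contributes per time point.
clone_sizes = cells.groupby(["clone", "day"]).size().unstack(fill_value=0)

# State composition per clone: which cell types a single founder gives rise
# to, i.e. the read-out needed to compare stem-, progenitor-, and
# differentiated-cell-derived clones.
composition = (cells.groupby(["clone", "state"]).size()
                    .unstack(fill_value=0)
                    .pipe(lambda df: df.div(df.sum(axis=1), axis=0)))
print(clone_sizes)
print(composition)
```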
Max ERC Funding
2 000 000 €
Duration
Start date: 2015-06-01, End date: 2020-05-31
Project acronym CHRiSHarMa
Project Commutators, Hilbert and Riesz transforms, Shifts, Harmonic extensions and Martingales
Researcher (PI) Stefanie Petermichl
Host Institution (HI) JULIUS-MAXIMILIANS-UNIVERSITAT WURZBURG
Country Germany
Call Details Consolidator Grant (CoG), PE1, ERC-2015-CoG
Summary This project aims to develop two arrays of questions at the heart of harmonic analysis, probability and operator theory:
Multi-parameter harmonic analysis.
Through the use of wavelet methods in harmonic analysis, we plan to shed new light on characterizations of boundedness for multi-parameter versions of classical Hankel operators in a variety of settings. Nehari's classical theorem on the disk (1957) has found an important generalization to Hilbert-space-valued functions, known as Page's theorem. A relevant extension of Nehari's theorem to the bi-disk had been a long-standing problem, finally solved in 2000 through novel harmonic analysis methods. Its operator analog remains unknown and constitutes part of this proposal.
Sharp estimates for Calderón-Zygmund operators and martingale inequalities.
We make use of the interplay between objects central to harmonic analysis, such as the Hilbert transform, and objects central to probability theory, namely martingales. This connection has seen many faces, such as the UMD space classification by Bourgain and Burkholder, or the formula of Gundy-Varopoulos, which uses orthogonal martingales to model the behavior of the Hilbert transform. Martingale methods in combination with optimal control have advanced an array of questions in harmonic analysis in recent years. In this proposal we wish to continue in this direction as well as to exploit advances in dyadic harmonic analysis for use in questions central to probability. One focus lies on weighted estimates in non-commutative and scalar settings, on the understanding of discretizations of classical operators such as the Hilbert transform, and on the role these operators play when acting on functions defined on discrete groups. From a martingale standpoint, jump processes come into play. Another direction is the use of numerical methods, in combination with achievements of harmonic analysis, for martingale estimates.
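For readers outside the field, the two central objects can be made concrete. The display below recalls the standard definition of the Hilbert transform and Burkholder's sharp martingale-transform bound, on which such transference arguments rely; these are standard facts, not results of this project.

```latex
% The Hilbert transform on the real line:
\[
  Hf(x) \;=\; \mathrm{p.v.}\,\frac{1}{\pi}\int_{\mathbb{R}}\frac{f(y)}{x-y}\,dy .
\]
% Its probabilistic counterpart is the martingale transform
% g_n = \sum_{k \le n} v_k d_k, with martingale differences d_k and
% predictable coefficients |v_k| \le 1, for which Burkholder's sharp
% inequality reads
\[
  \|g\|_{L^p} \;\le\; (p^{*}-1)\,\|f\|_{L^p}, \qquad
  p^{*} = \max\!\Big(p,\ \tfrac{p}{p-1}\Big), \quad 1 < p < \infty .
\]
```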
Max ERC Funding
1 523 963 €
Duration
Start date: 2017-01-01, End date: 2021-12-31
Project acronym CO-MAN
Project Safe data-driven control for human-centric systems
Researcher (PI) Sandra Hirche
Host Institution (HI) TECHNISCHE UNIVERSITAET MUENCHEN
Country Germany
Call Details Consolidator Grant (CoG), PE7, ERC-2019-COG
Summary Many control systems of the future involve a tight interaction, or even symbiosis, with the human user. High-impact application domains of human-centric systems include healthcare, mobility, and infrastructure systems. In human-centric systems the human is both an element of the control system and a design criterion with individual requirements that need to be satisfied. Safety, despite the high uncertainty of human behavior, and maximization of the individual user experience are the primary objectives of control design for human-centric systems. The visionary goal of CO-MAN is to contribute to a fundamental understanding of, and a principled approach to, the control of smart human-centric systems. We will develop a novel framework for user-adaptive, data-driven control with performance guarantees in order to address the scientific challenges of high uncertainty and individual user requirements. The grand challenge is to unify two previously separate paradigms: model-based control, with its rigorous guarantees but limited modeling base, and machine learning, with its flexible modeling concepts but lack of guarantees. The breakthrough enabling idea is to merge probabilistic non-parametric modeling techniques from statistical learning theory with novel risk-aware control methodologies, while including active user modeling. The game changer is the current push towards reliable machine learning, with novel results on theoretical bounds for learning behavior. Because of their favorable properties, we will focus on Gaussian processes to model user behavior and preferences, and translate the naturally quantified model uncertainty into closed-loop behavior guarantees through a confidence-driven human-interactive control approach. The PI is in a strong position to achieve the envisioned goal of super-individualized data-driven control with performance guarantees, given highly visible preliminary results and leadership in the area of human-cyber-physical systems.
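A minimal sketch of the confidence-driven idea named above, assuming a Gaussian process model of the human input whose predictive uncertainty schedules the control gain; the kernel choice, gains, and switching rule are illustrative, not the project's design.

```python
# Sketch: translate GP predictive uncertainty into cautious control action.
# A GP learned from observed human behavior predicts the input the human
# adds to the plant; when the model is confident the controller relies on
# the prediction, otherwise it falls back to a conservative gain.
# All models and gains here are illustrative assumptions.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Training data: plant state x -> observed human input u_h (toy 1-D example).
X = np.linspace(-2, 2, 25).reshape(-1, 1)
y = np.tanh(X).ravel() + 0.05 * np.random.default_rng(0).standard_normal(25)

gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gp.fit(X, y)

def control(x, k_nominal=1.5, k_safe=0.3, beta=2.0, sigma_max=0.1):
    """Confidence-driven gain scheduling (hypothetical rule)."""
    mu, sigma = gp.predict(np.atleast_2d(x), return_std=True)
    # A wide beta-sigma confidence bound signals unreliable predictions and
    # triggers the conservative, safety-oriented gain.
    k = k_nominal if beta * sigma[0] < sigma_max else k_safe
    return -k * (x + mu[0])  # compensate predicted human input, drive x to 0
```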
Max ERC Funding
1 999 975 €
Duration
Start date: 2020-09-01, End date: 2025-08-31
Project acronym COMBAT
Project Clearance Of Microbial Biofilms by Advancing diagnostics and Therapy
Researcher (PI) Susanne Christiane Haeussler
Host Institution (HI) HELMHOLTZ-ZENTRUM FUR INFEKTIONSFORSCHUNG GMBH
Country Germany
Call Details Consolidator Grant (CoG), LS6, ERC-2016-COG
Summary Every year, chronic infections caused by biofilm formation of pathogenic bacteria place a multi-billion-euro burden on national healthcare systems. Despite improvements in technology and medical services, morbidity and mortality due to chronic infections have remained unchanged over the past decades. The growing burden of chronic infectious disease calls for the development of modern diagnostics for biofilm resistance profiling and for new therapeutic strategies to eradicate biofilm-associated infections. However, many unsuccessful attempts to address this need teach us that alternative perspectives are needed to meet these challenges.
The project is committed to developing innovative diagnostics and to striving for therapeutic solutions for patients suffering from biofilm-associated infections. The objective is to apply data-driven science to unlock the potential of microbial genomics. This new approach uses tools of advanced microbial genomics and machine learning in genome-wide association studies on an existing dataset of unprecedented size. This dataset has been generated in my group within the last five years and comprises sequence variation and gene expression information for a plethora of clinical Pseudomonas aeruginosa isolates. The wealth of patterns and characteristics of biofilm resistance is invisible at smaller scales and will be interpreted within context- and domain-specific knowledge.
The unique combination of basic molecular biology research, technology-driven approaches, and data-driven science enables pioneering research dedicated to advancing strategies to combat biofilm-associated infections. My approach not only provides a prediction of biofilm resistance based on the bacteria's genotype, but also holds promise to transform treatment paradigms for the management of chronic infections and, by interfering with bacterial stress responses, to promote the effectiveness of antimicrobials in clinical use to eradicate biofilm infections.
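In the spirit of the machine-learning-assisted genome-wide association described above, here is a minimal sketch of genotype-based biofilm-resistance prediction; the variant matrix, labels, and choice of a random forest are illustrative assumptions, not the project's actual pipeline.

```python
# Sketch: predict a biofilm-resistance phenotype from genotype. Rows are
# clinical isolates, columns are sequence variants (0/1 presence/absence);
# y is a measured resistance label. Data shapes, the synthetic two-locus
# signal, and the random-forest choice are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_isolates, n_variants = 400, 2000
X = rng.integers(0, 2, size=(n_isolates, n_variants))  # variant matrix
y = X[:, 10] & X[:, 42]  # toy "resistance" driven by two interacting loci

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
print("cross-validated AUC:", scores.mean())

# Variant importances point back at candidate resistance determinants.
clf.fit(X, y)
top = np.argsort(clf.feature_importances_)[::-1][:10]
print("top candidate variants:", top)
```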
Max ERC Funding
1 998 750 €
Duration
Start date: 2017-05-01, End date: 2022-04-30
Project acronym ComplexAssembly
Project The birth of protein complexes
Researcher (PI) Martin BECK
Host Institution (HI) EUROPEAN MOLECULAR BIOLOGY LABORATORY
Country Germany
Call Details Consolidator Grant (CoG), LS2, ERC-2016-COG
Summary Protein complexes are central to many cellular functions, but our knowledge of how cells assemble protein complexes remains very sparse. Biophysical and structural data on assembly intermediates are extremely rare. Particularly in higher eukaryotes, it has become clear that complex assembly by random collision of subunits cannot cope with the spatial and temporal complexity of the intricate architecture of many cellular machines. Here I propose to combine systems biology approaches with in situ structural biology methods to visualize protein complex assembly. I want to investigate experimentally in which order the interfaces of protein complexes are formed and to what extent the structures of assembly intermediates resemble those observed in fully assembled complexes. I want to develop methods to systematically screen for additional factors involved in assembly pathways. I furthermore want to test the hypothesis that mechanisms must exist in eukaryotes that coordinate local mRNA translation with the ordered formation of protein complex interfaces. I believe that in order to understand assembly pathways, these processes, which so far have often been studied in isolation, need to be considered jointly and in a protein-complex-centric manner. The research proposed here will bridge across these different scientific disciplines. In the long term, a better mechanistic understanding of protein complex assembly and the structural characterization of critical intermediates will be of high relevance for scenarios under which a cell’s protein quality control system has to cope with stress, such as aging and neurodegenerative diseases. It might also facilitate the more efficient industrial production of therapeutically relevant proteins.
Max ERC Funding
1 957 717 €
Duration
Start date: 2018-02-01, End date: 2023-01-31
Project acronym CONSYN
Project Contextualizing biomolecular circuit models for synthetic biology
Researcher (PI) Heinz KOEPPL
Host Institution (HI) TECHNISCHE UNIVERSITAT DARMSTADT
Country Germany
Call Details Consolidator Grant (CoG), PE7, ERC-2017-COG
Summary Synthetic biology is the bottom-up engineering of new molecular functionality inside a biological cell. Although it aims at a quantitative and compositional approach, most of today’s implementations of synthetic circuits are based on inefficient trial-and-error runs. This approach to circuit design does not scale with circuit complexity and runs counter to the basic paradigm of synthetic biology. This unsatisfactory state of affairs is partly due to the lack of the right computational methodology to support the quantitative characterization of circuits and their significant context dependency, i.e., their change in behavior upon interactions with the host machinery and with other circuit elements.
CONSYN will contribute computational methodology to overcome the trial-and-error approach and to ultimately turn synthetic circuit design into a rational bottom-up process that relies heavily on computational analysis before any biomolecular implementation is attempted. To achieve this goal, we will work on the following agenda: (i) develop biophysical and statistical models of biomolecular contexts into which a synthetic circuit or synthetic part can be embedded in silico; (ii) devise new statistical inference methods that can deliver accurate characterization of circuits and their context dependency by making use of cutting-edge single-cell experimental data; (iii) derive new context-insensitive circuit designs through in silico sensitivity analysis and the application of filtering theory; (iv) optimize protocols and measurement infrastructure using model-based experimental design, yielding a better characterization of circuits and contexts; (v) experimentally build synthetic circuits in vivo and in cell-free systems in order to validate, and bring to life, the above theoretical investigations. We are in the unique position of also addressing (v) in-house, thanks to the experimental wet-lab facilities in our group.
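As a toy instance of agenda item (i), the sketch below models one well-known biomolecular context effect, competition for a shared translation resource, with a small ODE model; all parameters and the resource-splitting rule are illustrative assumptions.

```python
# Sketch of agenda item (i): a toy biophysical model of context dependency.
# Two synthetic genes compete for a shared, finite translation resource
# (free ribosomes R); expressing the second gene changes the output of the
# first even without any direct regulatory link. Parameters are illustrative.
import numpy as np
from scipy.integrate import solve_ivp

R_total, k_tl, d = 100.0, 1.0, 0.1  # ribosome pool, translation rate, dilution
K1, K2 = 20.0, 20.0                 # ribosome binding (Michaelis) constants

def circuit(t, p, demand2):
    p1, p2 = p
    R = R_total / (1 + demand2)            # crude resource split with gene-2 load
    dp1 = k_tl * R / (K1 + R) - d * p1     # gene-1 protein
    dp2 = demand2 * k_tl * R / (K2 + R) - d * p2
    return [dp1, dp2]

for demand2 in (0.0, 1.0, 4.0):            # increasing load from the second gene
    sol = solve_ivp(circuit, (0, 200), [0.0, 0.0], args=(demand2,))
    print(f"gene-2 demand {demand2}: steady-state p1 = {sol.y[0, -1]:.1f}")
```

Increasing the load from the second gene lowers the steady-state output of the first, which is exactly the kind of host-coupled context dependency the project aims to model and design against.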
Max ERC Funding
1 996 579 €
Duration
Start date: 2018-04-01, End date: 2023-03-31
Project acronym ConTExt
Project Connecting the Extreme
Researcher (PI) Sune Toft
Host Institution (HI) KOBENHAVNS UNIVERSITET
Country Denmark
Call Details Consolidator Grant (CoG), PE9, ERC-2014-CoG
Summary Advances in technology and methodology over the last decade have enabled the study of galaxies out to the highest redshifts. This has revolutionized our understanding of the origin and evolution of galaxies. I have played a central role in this revolution by discovering that at z=2, when the universe was only 3 Gyr old, half of the most massive galaxies were extremely compact and had already completed their star formation. During the last five years I have led a successful group of postdocs and students dedicated to investigating the extreme properties of these galaxies and placing them in a cosmological context. Combining a series of high-profile observational studies published by my group and others, I recently proposed an evolutionary sequence that ties together the most extreme galaxies in the universe: from the most intense dusty starbursts at cosmic dawn, through quasars, the brightest sources in the universe, driven by feedback from supermassive black holes, and galaxy cores hosting the densest conglomerations of stellar mass known, to the sleeping giants of the local universe, the giant ellipticals. The proposed research program will explore whether such an evolutionary sequence exists, with the ultimate goal of reaching, for the first time, a coherent physical understanding of how the most massive galaxies in the universe formed. While there is a chance the rigorous tests may ultimately reveal the proposed sequence to be too simplistic, a guaranteed outcome of the program is a significantly improved understanding of the physical mechanisms that shape galaxies and drive their star formation and quenching.
Max ERC Funding
1 999 526 €
Duration
Start date: 2015-09-01, End date: 2021-02-28
Project acronym COSMIC-LITMUS
Project Turning cosmic shear into a litmus test for the standard model of cosmology
Researcher (PI) Hendrik Jurgen HILDEBRANDT
Host Institution (HI) RUHR-UNIVERSITAET BOCHUM
Country Germany
Call Details Consolidator Grant (CoG), PE9, ERC-2017-COG
Summary The standard model of cosmology is impressively consistent with a large number of observations. Its parameters have been determined with great accuracy by the Planck CMB (cosmic microwave background) mission. Recently, however, local determinations of the Hubble constant as well as observations of strong and weak gravitational lensing have shown some tension with Planck. Are those observations first glimpses of a crack in the standard model, and hints of an evolving dark energy component? With this ERC Consolidator Grant I will answer these questions by greatly increasing the robustness of one of those cosmological probes: the weak lensing effect of the large-scale structure of the Universe, also called cosmic shear.
In order to reach this goal I will concentrate on the largest outstanding source of systematic error: photometric redshifts (photo-z). I will exploit the unique combination of two European imaging surveys in the optical and infrared wavelength regimes, an additional narrow-band imaging survey with extremely precise photo-z, and spectroscopic calibration data from a recently approved ESO large program on the VLT. Using angular cross-correlations and machine learning, I will calibrate the photo-z in a two-stage process, making sure that this crucial systematic uncertainty keeps pace with the growing statistical power of imaging surveys. This will yield an uncertainty on the amplitude of the clustering of dark matter that is smaller than the best constraints from the CMB.
I will also apply these methods to ESA’s Euclid mission launching in 2020, which will fail if photo-z are not better understood by then. If the discrepancy between lensing and CMB measurements holds, this would potentially result in a revolution of our understanding of the Universe. Regardless of this spectacular short-term possibility, I will turn cosmic shear, one of the most powerful cosmological probes of dark energy, into a litmus test for our cosmological paradigm.
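A toy sketch of the cross-correlation ("clustering-redshift") calibration idea named above: the angular cross-correlation amplitude between the photometric sample and thin spectroscopic slices traces the photometric n(z), up to galaxy bias. The flat-sky geometry, the simple Davis-Peebles estimator, and the uniform placeholder catalogues are simplifying assumptions; a real analysis would use proper estimators on the sphere.

```python
# Sketch: estimate the redshift distribution n(z) of a photometric sample
# from its angular cross-correlation with thin spectroscopic z-slices.
# Toy flat-sky version with a naive DD/DR - 1 estimator; positions are
# uniform placeholders, so the recovered signal here is ~0 by construction.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(2)

def pair_count(tree_a, tree_b, r_min, r_max):
    """Pairs between two catalogues with separation in [r_min, r_max]."""
    return tree_a.count_neighbors(tree_b, r_max) - tree_a.count_neighbors(tree_b, r_min)

def cross_w(photo_xy, spec_xy, rand_xy, r_min=0.05, r_max=0.5):
    """Davis-Peebles estimator w = DD/DR - 1 in an annulus (degrees)."""
    t_p, t_s, t_r = cKDTree(photo_xy), cKDTree(spec_xy), cKDTree(rand_xy)
    dd = pair_count(t_p, t_s, r_min, r_max)
    dr = pair_count(t_p, t_r, r_min, r_max)
    return dd * len(rand_xy) / (dr * len(spec_xy)) - 1.0

# Hypothetical inputs: photometric positions, spectroscopic z-slices, randoms.
photo_xy = rng.uniform(0, 10, (20000, 2))
rand_xy = rng.uniform(0, 10, (50000, 2))
z_slices = [rng.uniform(0, 10, (3000, 2)) for _ in range(5)]  # placeholder

w_z = np.array([cross_w(photo_xy, spec, rand_xy) for spec in z_slices])
n_z = np.clip(w_z, 1e-12, None)   # clip: n(z) must be non-negative
n_z /= n_z.sum()                  # normalized n(z); bias evolution ignored
print("recovered n(z) per slice:", n_z)
```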
Max ERC Funding
1 931 493 €
Duration
Start date: 2018-06-01, End date: 2023-05-31