Project acronym 3D-loop
Project Mechanism of homology search and the logic of homologous chromosome pairing in meiosis
Researcher (PI) Aurele PIAZZA
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Country France
Call Details Starting Grant (StG), LS2, ERC-2019-STG
Summary Homologous recombination (HR) is a conserved DNA double-strand break (DSB) repair pathway that uniquely uses an intact DNA molecule as a template. Genome-wide homology search is carried out by a nucleoprotein filament (NPF) assembled on the ssDNA flanking the DSB, whose product is a “D-loop” joint molecule. Beyond accurate DSB repair, this capacity of HR to spatially associate homologous molecules is also harnessed for homolog pairing in meiosis. The goal of “3D-loop” is to tackle two long-standing conundrums: the fundamental homology search mechanism that achieves accurate and efficient identification of a single homologous donor in the vastness of the genome and nucleus, and how this mechanism is adapted for the purpose of homolog attachment in meiosis.
I overcame the main hurdle to studying these core steps of HR by developing a suite of proximity ligation-based methodologies and experimental systems to physically detect joint molecules in yeast cells. This work revealed elaborate regulation controlling D-loop dynamics and a novel class of joint molecules. This proposal builds upon these methodologies and findings, first, to address basic properties of the homology sampling process by the NPF and the role of D-loop dynamics, with the long-term goal of establishing a quantitative framework of homology search in mitotic cells (WP1). Second, the meiosis-specific regulation of homology search leading to homolog pairing likely integrates chromosomal-scale information. Genome re-synthesis and engineering approaches will be deployed to (i) achieve a quantitative and dynamic cartography of the cytological and molecular events of meiosis over a large chromosomal region, (ii) probe cis-acting regulation at the chromosomal scale, and (iii) revisit the molecular paradigm for crossover formation (WP2). We expect this project to shed light on the fundamental process of homology search and on its involvement in the chromosome pairing phenomenon that lies at the basis of sexual reproduction.
Max ERC Funding
1 499 779 €
Duration
Start date: 2020-01-01, End date: 2024-12-31
Project acronym 4D-GenEx
Project Spatio-temporal Organization and Expression of the Genome
Researcher (PI) Antoine COULON
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Country France
Call Details Starting Grant (StG), LS2, ERC-2017-STG
Summary This project investigates the two-way relationship between spatio-temporal genome organization and coordinated gene regulation, through an approach at the interface between physics, computer science and biology.
In the nucleus, preferred positions are observed from chromosomes down to single genes, in relation to normal and pathological cellular states. Evidence indicates a complex spatio-temporal coupling between co-regulated genes: e.g. certain genes cluster spatially when responding to similar factors, and transcriptional noise patterns suggest domain-wide mechanisms. Yet no individual experiment allows probing transcriptional coordination in 4 dimensions (FISH, live locus tracking, Hi-C...). Interpreting such data also critically requires theory (stochastic processes, statistical physics…). The lack of appropriate experimental and analytical approaches impairs our understanding of the 4D genome.
Our proposal combines cutting-edge single-molecule imaging, signal-theory data analysis and physical modeling to study how genes coordinate in space and time in a single nucleus. Our objectives are to understand (a) competition/recycling of shared resources between genes within subnuclear compartments, (b) how enhancers communicate with genes domain-wide, and (c) the role of local conformational dynamics and supercoiling in gene co-regulation. Our organizing hypothesis is that, by acting on their microenvironment, genes shape their co-expression with other genes.
Building upon my expertise, we will use dual-color MS2/PP7 RNA labeling to visualize for the first time transcription and motion of pairs of hormone-responsive genes in real time. With our innovative signal analysis tools, we will extract spatio-temporal signatures of underlying processes, which we will investigate with stochastic modeling and validate through experimental perturbations. We expect to uncover how the functional organization of the linear genome relates to its physical properties and dynamics in 4D.
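As a flavour of the signal-theory analysis envisaged, a standard first step on dual-colour transcription traces is their temporal cross-correlation, where a regulatory delay between two co-regulated genes shows up as an off-centre peak. The sketch below is a minimal illustration on synthetic data; the burst model, decay kernel and all parameters are our assumptions, not the project's.

```python
import numpy as np

rng = np.random.default_rng(1)
T, true_lag = 2000, 50            # time points; gene B fires 50 steps after gene A
drive = (rng.random(T) < 0.02).astype(float)     # shared upstream bursts (toy model)
kernel = np.exp(-np.arange(100) / 20.0)          # exponential transcription decay

gene_a = np.convolve(drive, kernel)[:T] + 0.1 * rng.standard_normal(T)
gene_b = np.convolve(np.roll(drive, true_lag), kernel)[:T] + 0.1 * rng.standard_normal(T)

def cross_correlation(x, y, max_lag=200):
    """Normalized cross-correlation of two traces for lags in [-max_lag, max_lag]."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    lags = np.arange(-max_lag, max_lag + 1)
    cc = [np.mean(x[max(0, -k):T - max(0, k)] * y[max(0, k):T - max(0, -k)])
          for k in lags]
    return lags, np.array(cc)

lags, cc = cross_correlation(gene_a, gene_b)
print(f"cross-correlation peak at lag {lags[np.argmax(cc)]} (ground truth: {true_lag})")
```

On real MS2/PP7 traces the same statistic, combined with stochastic models, helps distinguish shared upstream drive from direct gene-to-gene coupling.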
Max ERC Funding
1 499 750 €
Duration
Start date: 2018-04-01, End date: 2023-03-31
Project acronym ACAP
Project Agency Costs and Asset Pricing
Researcher (PI) Thomas Mariotti
Host Institution (HI) FONDATION JEAN JACQUES LAFFONT, TOULOUSE SCIENCES ECONOMIQUES
Country France
Call Details Starting Grant (StG), SH1, ERC-2007-StG
Summary The main objective of this research project is to contribute to bridging the gap between the two main branches of financial theory, namely corporate finance and asset pricing. It is motivated by the conviction that these two aspects of financial activity should and can be analyzed within a unified framework. This research will borrow from both approaches in order to construct theoretical models that allow one to analyze the design and issuance of financial securities, as well as the dynamics of their valuations. Unlike asset pricing, which takes the price of the fundamentals as given, the goal is to derive security price processes from a precise description of firms’ operations and internal frictions. Regarding the latter, and in line with traditional corporate finance theory, the analysis will emphasize the role of agency costs within the firm for the design of its securities. But the analysis will be pushed one step further by studying the impact of these agency costs on key financial variables such as stock and bond prices, leverage, book-to-market ratios, default risk, and firms’ holdings of liquid assets. One of the contributions of this research project is to show how these variables are interrelated when firms and investors agree upon optimal financial arrangements. The final objective is to derive a rich set of testable asset pricing implications that can eventually be brought to the data.
Max ERC Funding
1 000 000 €
Duration
Start date: 2008-11-01, End date: 2014-10-31
Project acronym AlgoQIP
Project Beyond Shannon: Algorithms for optimal information processing
Researcher (PI) Omar Fawzi
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE EN INFORMATIQUE ET AUTOMATIQUE
Country France
Call Details Starting Grant (StG), PE6, ERC-2019-STG
Summary On the road towards quantum technologies capable of exploiting the revolutionary potential of quantum theory for information technology, a major bottleneck is the large overhead needed to correct errors caused by unwanted noise. Despite intense research activity and great progress in designing better error-correcting codes and fault-tolerant schemes, the fundamental limits of communication and computation over a noisy quantum medium are far from being understood. In fact, no satisfactory quantum analogue of Shannon’s celebrated noisy coding theorem is known.
The objective of this project is to leverage tools from mathematical optimization in order to build an algorithmic theory of optimal information processing that would go beyond the statistical approach pioneered by Shannon. Our goal will be to establish efficient algorithms that determine optimal methods for achieving a given task, rather than only characterizing the best achievable rates in the asymptotic limit in terms of entropic expressions. This approach will address three limitations — that are particularly severe in the quantum context — faced by the statistical approach: the non-additivity of entropic expressions, the asymptotic nature of the theory and the independence assumption.
Our aim is to develop efficient algorithms that take as input a description of a noise model and output a near-optimal method for reliable communication under this model. For example, our algorithms will answer: how many logical qubits can be reliably stored using 100 physical qubits that undergo depolarizing noise with parameter 5%? We will also develop generic and efficient decoding algorithms for quantum error correcting codes. These algorithms will have direct applications to the development of quantum technologies. Moreover, we will establish methods to compute the relevant uncertainty of large structured systems and apply them to obtain tight and non-asymptotic security bounds for (quantum) cryptographic protocols.
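For context on the worked example above, one classical benchmark is the hashing bound, a known lower bound on the quantum capacity of the depolarizing channel: Q ≥ 1 − H({1−p, p/3, p/3, p/3}), with H the Shannon entropy. The short sketch below (ours, not the project's) evaluates it; being an asymptotic i.i.d. rate, it is exactly the kind of answer the project aims to go beyond at finite blocklength.

```python
import math

def hashing_bound(p: float) -> float:
    """Hashing lower bound on the quantum capacity of the depolarizing
    channel: Pauli errors X, Y, Z each occur with probability p/3."""
    probs = [1 - p, p / 3, p / 3, p / 3]
    H = -sum(q * math.log2(q) for q in probs if q > 0)
    return 1 - H

p = 0.05
rate = hashing_bound(p)
print(f"hashing bound at p = {p}: {rate:.3f}")          # ~0.634
print(f"=> at least ~{int(100 * rate)} logical qubits "
      f"from 100 physical qubits, asymptotically")
```

The true one-shot answer for a block of 100 qubits may differ, which is precisely why algorithmic, non-asymptotic methods are needed.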
Max ERC Funding
1 492 733 €
Duration
Start date: 2021-01-01, End date: 2025-12-31
Project acronym aLzINK
Project Alzheimer's disease and Zinc: the missing link ?
Researcher (PI) Christelle Sandrine Florence HUREAU-SABATER
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Country France
Call Details Starting Grant (StG), PE5, ERC-2014-STG
Summary Alzheimer's disease (AD) is one of the most serious diseases mankind is now facing, as its social and economic impacts are increasing rapidly. AD is very complex, and the amyloid-β (Aβ) peptide as well as metal ions (mainly copper and zinc) have been linked to its aetiology. While the deleterious impact of Cu is widely acknowledged, the involvement of Zn is certain but remains to be elucidated.
The main objective of the present proposal, which is strongly anchored in the bio-inorganic chemistry field at the interface with spectroscopy and biochemistry, is to design, synthesize and study new drug candidates (ligands L) capable of (i) targeting Cu(II) bound to Aβ within the synaptic cleft, where Zn is co-localized, with the ultimate aim of achieving Zn-driven Cu(II) removal from Aβ, and (ii) disrupting the aberrant Cu(II)-Aβ interactions involved in ROS production and Aβ aggregation, two deleterious events in AD. The drug candidates will thus have high Cu(II)-over-Zn selectivity to preserve the crucial physiological role of Zn in the neurotransmission process. Zn is often underestimated (if not completely neglected) in current therapeutic approaches targeting Cu(II), despite the known interference of Zn with Cu(II) binding.
To reach this objective, it is first necessary to understand metal-ion trafficking in the presence of Aβ alone at a molecular level (i.e. without the drug candidates). This includes: (i) determination of the Zn binding site on Aβ and its impact on Aβ aggregation and cell toxicity, and (ii) determination of the mutual influence of Zn and Cu on their coordination to Aβ and its impact on Aβ aggregation, ROS production and cell toxicity.
Methods will span from organic synthesis to studies in neuronal model cells, with a major contribution from a wide panel of spectroscopic techniques including NMR, EPR, mass spectrometry, fluorescence, UV-Vis, circular dichroism, X-ray absorption spectroscopy...
Max ERC Funding
1 499 948 €
Duration
Start date: 2015-03-01, End date: 2021-08-31
Project acronym ATMOFLEX
Project Turbulent Transport in the Atmosphere: Fluctuations and Extreme Events
Researcher (PI) Jeremie Bec
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Country France
Call Details Starting Grant (StG), PE3, ERC-2009-StG
Summary A major part of the physical and chemical processes occurring in the atmosphere involves the turbulent transport of tiny particles. Current studies and models use a formulation in terms of mean fields, where the strong variations in the dynamical and statistical properties of the particles are neglected and where the underlying fluctuations of the fluid flow velocity are oversimplified. Devising an accurate understanding of the influence of air turbulence and of the extreme fluctuations that it generates in the dispersed phase remains a challenging issue. This project aims at coordinating and integrating theoretical, numerical, experimental, and observational efforts to develop a new statistical understanding of the role of fluctuations in atmospheric transport processes. The proposed work will cover individual as well as collective behaviors and will provide a systematic and unified description of targeted specific processes involving suspended drops or particles: the dispersion of pollutants from a source, the growth by condensation and coagulation of droplets and ice crystals in clouds, the scavenging, settling and re-suspension of aerosols, and the radiative and climatic effects of particles. The proposed approach is based on the use of tools borrowed from statistical physics and field theory, and from the theory of large deviations and of random dynamical systems in order to design new observables that will be simultaneously tractable analytically in simplified models and of relevance for the quantitative handling of such physical mechanisms. One of the outcomes will be to provide a new framework for improving and refining the methods used in meteorology and atmospheric sciences and to answer the long-standing question of the effects of suspended particles on climate.
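A common leading-order model behind such studies is Stokes drag for small heavy particles, dv/dt = (u(x,t) − v)/τ, where τ is the particle response time (gravity and other forces omitted). The toy sketch below, with a single travelling wave standing in for a turbulent velocity field (our simplification, not the project's), shows how inertia low-pass filters the fluid velocity seen along the particle path.

```python
import numpy as np

tau = 0.5                                   # particle response (Stokes) time
u = lambda x, t: np.sin(x - 0.7 * t)        # surrogate 1D 'fluid' velocity field

def integrate(x0=0.0, v0=0.0, dt=1e-3, steps=20000):
    """Explicit Euler integration of dx/dt = v, dv/dt = (u(x,t) - v)/tau."""
    x, v, t, traj = x0, v0, 0.0, []
    for _ in range(steps):
        a = (u(x, t) - v) / tau             # Stokes drag acceleration
        x, v, t = x + dt * v, v + dt * a, t + dt
        traj.append((t, x, v))
    return np.array(traj)

t, x, v = integrate().T
print(f"std of fluid velocity along the path: {u(x, t).std():.3f}")
print(f"std of particle velocity:             {v.std():.3f}")  # smaller: inertia filters
```

Replacing the surrogate field with a synthetic or simulated turbulent flow is what turns this toy into the kind of statistical study the project proposes.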
Max ERC Funding
1 200 000 €
Duration
Start date: 2009-11-01, End date: 2014-10-31
Project acronym BIOMIM
Project Biomimetic films and membranes as advanced materials for studies on cellular processes
Researcher (PI) Catherine Cecile Picart
Host Institution (HI) INSTITUT POLYTECHNIQUE DE GRENOBLE
Country France
Call Details Starting Grant (StG), PE5, ERC-2010-StG_20091028
Summary The main objective nowadays in the field of biomaterials is to design high-performance bioinspired materials by learning from natural processes. Importantly, biochemical and physical cues are key parameters that can affect cellular processes. Controlling processes that occur at the cell/material interface is also of prime importance to guide the cell response. The main aim of the current project is to develop novel functional bio-nanomaterials for in vitro biological studies. Our strategy is based on two related projects.
The first project deals with the rational design of smart films with foreseen applications in musculoskeletal tissue engineering. We will gain knowledge of key cellular processes by designing well-defined self-assembled thin coatings. These multi-functional surfaces, combining bioactivity (incorporation of growth factors) with mechanical (film stiffness) and topographical properties (spatial control of the film's properties), will serve as tools to mimic the complexity of natural materials in vivo and to present bioactive molecules in the solid phase. We will gain a better fundamental understanding of how cellular functions, including adhesion and differentiation of muscle cells, are affected by the material's surface properties.
In the second project, we will investigate at the molecular level a crucial aspect of cell adhesion and motility: the intracellular linkage between the plasma membrane and the cell cytoskeleton. We aim to elucidate the role of ERM proteins, especially ezrin and moesin, in the direct linkage between the plasma membrane and actin filaments. Here again, we will use a well-defined microenvironment in vitro to simplify the complexity of the interactions that occur in cellulo. To this end, lipid membranes containing PIP2, a key regulatory lipid from the phosphoinositide family, will be employed in conjunction with purified proteins to investigate actin regulation by ERM proteins in the presence of PIP2-membranes.
Max ERC Funding
1 499 996 €
Duration
Start date: 2011-06-01, End date: 2016-05-31
Project acronym BioPoweredCL
Project Bright and biologically powered chemiluminescent labels for cell and tissue imaging
Researcher (PI) Alessandro ALIPRANDI
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Country France
Call Details Starting Grant (StG), PE5, ERC-2020-STG
Summary Imaging is one of the most powerful techniques for visualizing molecules and tissues and for following processes, and it is the most widely used diagnostic tool in vitro and in vivo. Current biomedical imaging techniques can offer high sensitivity, good spatial/temporal resolution and, in some cases, deep tissue penetration, but they cannot combine all of these desired properties without resorting to harmful radiation (or toxic labels) or very expensive equipment. Optical imaging techniques represent the best compromise among them; however, they cannot scale to the whole human body. The main restriction of fluorescence imaging is that it requires light excitation, which is limited by tissue absorption and scattering. Such limitations are absent in chemiluminescence imaging, since light is produced by a chemical reaction, resulting in greater penetration depth and better sensitivity. However, both natural and artificial chemiluminescent systems require a continuous supply of exogenous reactants, since all substrates are irreversibly consumed. BioPoweredCL aims to develop an unprecedented strategy for molecular imaging by realizing near-infrared luminophores that harvest energy from the cellular respiratory chain in order to emit light without being consumed themselves. BioPoweredCL takes advantage of the most recent progress in artificial light production to develop a novel imaging technique in which the absence of an excitation source overcomes the current limitations of fluorescence imaging, while the regeneration of the luminophore overcomes the limitations of bioluminescence imaging. If successful, it could replace current techniques based on harmful ionizing radiation such as X-rays or γ-rays. To reach this grand challenge, the work plan is articulated into three phases: 1) synthesis of new luminophores; 2) electrochemical characterization and cellular energy harvesting; 3) in vitro experiments in which the full potential of the approach will be validated.
Max ERC Funding
1 449 750 €
Duration
Start date: 2021-10-01, End date: 2026-09-30
Project acronym BLACKJACK
Project Fast Monte Carlo integration with repulsive processes
Researcher (PI) Remi BARDENET
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Country France
Call Details Starting Grant (StG), PE6, ERC-2019-STG
Summary Expensive computer simulations have become routine in the experimental sciences. Astrophysicists design complex models of the evolution of galaxies, biologists develop intricate models of cells, ecologists model the dynamics of ecosystems at the global scale. A single evaluation of such complex models takes minutes or hours on today's hardware. On the other hand, fitting these models to data can require millions of serial evaluations. Monte Carlo methods, for example, are ubiquitous in statistical inference for scientific data, but they scale poorly with the number of model evaluations. Meanwhile, the use of parallel computing architectures for Monte Carlo is often limited to running independent copies of the same algorithm. Blackjack will provide Monte Carlo methods that unlock inference for expensive models in biology by directly addressing both their slow rate of convergence and their parallelization.
The key to improving on the Monte Carlo rate is to introduce repulsiveness between the quadrature nodes. For instance, we recently proved that determinantal point processes, a prototypical repulsive distribution introduced in physics, improve the Monte Carlo convergence rate, just as electrons lead to low-variance estimation of volumes by efficiently filling a box. Such results lead to open computational and statistical challenges. We propose to solve these challenges, and to make repulsive processes a novel tool for applied statisticians, signal processors, and machine learners.
Still with repulsiveness as a hammer, we will design the first parallel Markov chain Monte Carlo algorithms that are qualitatively different from running independent copies of known algorithms, i.e., that explicitly improve the order of convergence of the single-machine algorithm. To this end, we will turn mathematical tools such as repulsive particle systems and non-colliding processes into computationally cheap, communication-efficient Monte Carlo schemes with fast convergence.
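To make the role of repulsiveness concrete, here is a toy comparison (ours; jittered stratified sampling is used as a simple repulsive stand-in, not the project's DPP machinery): forbidding nodes from clumping already beats the i.i.d. N^{-1/2} error rate for a smooth 1D integrand.

```python
import numpy as np

rng = np.random.default_rng(0)
f = lambda x: np.sin(2 * np.pi * x) ** 2    # smooth test integrand on [0, 1]
exact = 0.5                                 # its true integral

def iid_estimate(n):
    """Plain Monte Carlo: n i.i.d. uniform nodes, O(n^{-1/2}) error."""
    return f(rng.random(n)).mean()

def jittered_estimate(n):
    """One uniform node per cell [i/n, (i+1)/n): nodes repel by construction."""
    return f((np.arange(n) + rng.random(n)) / n).mean()

for n in (100, 1000, 10000):
    err_iid = np.mean([abs(iid_estimate(n) - exact) for _ in range(200)])
    err_jit = np.mean([abs(jittered_estimate(n) - exact) for _ in range(200)])
    print(f"n = {n:5d}   iid error {err_iid:.1e}   jittered error {err_jit:.1e}")
```

Determinantal point processes play an analogous role in higher dimensions, with a spectral theory that underpins the faster convergence rate.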
Max ERC Funding
1 489 000 €
Duration
Start date: 2020-02-01, End date: 2025-01-31
Project acronym CALI
Project The Cambodian Archaeological Lidar Initiative: Exploring Resilience in the Engineered Landscapes of Early SE Asia
Researcher (PI) Damian Evans
Host Institution (HI) ECOLE FRANCAISE D'EXTREME-ORIENT
Country France
Call Details Starting Grant (StG), SH6, ERC-2014-STG
Summary For over half a millennium, the great medieval capital of Angkor lay at the heart of a vast empire stretching across much of mainland SE Asia. Recent research has revealed that the famous monuments of Angkor were merely the epicentre of an immense settlement complex, with highly elaborate engineering works designed to manage water and mitigate the uncertainty of monsoon rains. Compelling evidence is now emerging that other temple complexes of the medieval Khmer Empire may also have formed the urban cores of dispersed, low-density settlements with similar systems of hydraulic engineering.
Using innovative airborne laser scanning (‘lidar’) technology, CALI will uncover, map and compare archaeological landscapes around all the major temple complexes of Cambodia, with a view to understanding what role these complex and vulnerable water management schemes played in the growth and decline of early civilisations in SE Asia. CALI will evaluate the hypothesis that the Khmer civilisation, in a bid to overcome the inherent constraints of a monsoon environment, became locked into rigid and inflexible traditions of urban development and large-scale hydraulic engineering that constrained their ability to adapt to rapidly-changing social, political and environmental circumstances.
By integrating data and techniques from fast-developing archaeological sciences like remote sensing, palaeoclimatology and geoinformatics, this work will provide important insights into the reasons for the collapse of inland agrarian empires in the middle of the second millennium AD, a transition that marks the emergence of modern mainland SE Asia. The lidar data will provide a comprehensive and internally-consistent archive of urban form at a regional scale, and offer a unique experimental space for evaluating socio-ecological resilience, persistence and transformation over two thousand years of human history, with clear implications for our understanding of contemporary urbanism and of urban futures.
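On the geoinformatics side, one widely used way to make subtle earthworks visible in lidar-derived terrain is the local relief model: subtracting a smoothed copy of the digital elevation model (DEM) from the DEM itself removes large-scale topography and leaves metre-scale features. A minimal sketch, assuming a DEM already gridded as a 2D array (synthetic here; sizes and the smoothing scale are illustrative):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(2)

# Synthetic DEM: a broad hillslope plus a small rectangular mound
# standing in for an archaeological earthwork.
dem = 0.05 * np.arange(200)[None, :] + 0.5 * rng.standard_normal((200, 200))
dem[90:110, 90:140] += 1.0                  # 1 m high 'mound'

def local_relief_model(dem, sigma=15):
    """DEM minus its Gaussian-smoothed version; sigma (in pixels)
    sets which feature sizes survive."""
    return dem - gaussian_filter(dem, sigma)

lrm = local_relief_model(dem)
print(f"mean LRM inside the mound: {lrm[90:110, 90:140].mean():+.2f} m")
print(f"mean LRM elsewhere:        {lrm[:80, :80].mean():+.2f} m")
```

In practice the same operation, applied to ground-classified lidar returns over hundreds of square kilometres, is what reveals canals, mounds and field systems around the temple complexes.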
Max ERC Funding
1 482 844 €
Duration
Start date: 2015-03-01, End date: 2020-02-29