Project acronym 3D-nanoMorph
Project Label-free 3D morphological nanoscopy for studying sub-cellular dynamics in live cancer cells with high spatio-temporal resolution
Researcher (PI) Krishna AGARWAL
Host Institution (HI) UNIVERSITETET I TROMSOE - NORGES ARKTISKE UNIVERSITET
Call Details Starting Grant (StG), PE7, ERC-2018-STG
Summary Label-free optical nanoscopy, free from photobleaching and photochemical toxicity of fluorescence labels and yielding 3D morphological resolution of <50 nm, is the future of live cell imaging. 3D-nanoMorph breaks the diffraction barrier and shifts the paradigm in label-free nanoscopy, providing isotropic 3D resolution of <50 nm. To achieve this, 3D-nanoMorph performs non-linear inverse scattering for the first time in nanoscopy and decodes scattering between sub-cellular structures (organelles).
3D-nanoMorph innovatively devises complementary roles for the light-measurement system and the computational nanoscopy algorithm. A novel illumination system and a novel light-collection system together enable measurement of only the most relevant intensity component and create a fresh perspective on label-free measurements. A new computational nanoscopy approach employs non-linear inverse scattering. Harnessing non-linear inverse scattering for resolution enhancement opens new possibilities in label-free 3D nanoscopy.
I will apply 3D-nanoMorph to study organelle degradation (autophagy) in live cancer cells over extended durations with high spatial and temporal resolution, a study presently limited by the lack of high-resolution label-free 3D morphological nanoscopy. Successful 3D mapping of the nanoscale biological process of autophagy will open new avenues for cancer treatment and showcase 3D-nanoMorph for wider applications.
My 14 years of cross-disciplinary expertise, spanning inverse problems, electromagnetism, optical microscopy, integrated optics and live-cell nanoscopy, pave the way for the successful implementation of 3D-nanoMorph.
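To indicate the kind of forward model that non-linear inverse scattering works with (a textbook scalar formulation, not the proposal's specific algorithm), the field at the detector obeys an integral equation of Lippmann-Schwinger type,

$$ E(\mathbf{r}) = E_{\mathrm{inc}}(\mathbf{r}) + k_0^{2}\int G(\mathbf{r},\mathbf{r}')\,\chi(\mathbf{r}')\,E(\mathbf{r}')\,\mathrm{d}\mathbf{r}', $$

where \chi is the unknown refractive-index contrast of the sub-cellular structures and G is the free-space Green's function. Because the total field E inside the integral itself depends on \chi, the measured intensity depends non-linearly on the contrast; solving for \chi without linearizing (as a Born approximation would) retains the multiple scattering between organelles that carries the sub-diffraction information.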
Max ERC Funding
1 499 999 €
Duration
Start date: 2019-07-01, End date: 2024-06-30
Project acronym ActiveWindFarms
Project Active Wind Farms: Optimization and Control of Atmospheric Energy Extraction in Gigawatt Wind Farms
Researcher (PI) Johan Meyers
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Starting Grant (StG), PE8, ERC-2012-StG_20111012
Summary With the recognition that wind energy will become an important contributor to the world’s energy portfolio, several wind farms with a capacity of over 1 gigawatt are in the planning phase. In the past, engineering of wind farms focused on a bottom-up approach, in which atmospheric wind availability was considered to be fixed by climate and weather. However, farms of gigawatt size slow down the Atmospheric Boundary Layer (ABL) as a whole, reducing the availability of wind at turbine hub height. In Denmark’s large offshore farms, this leads to turbine underperformance that can reach 40%–50% compared with the same turbine in a lone-standing case. For large wind farms, the vertical structure and turbulence physics of the flow in the ABL become crucial ingredients in their design and operation. This introduces a new set of scientific challenges related to the design and control of large wind farms. The major ambition of the present research proposal is to employ optimal control techniques to control the interaction between large wind farms and the ABL, and to optimize overall farm-power extraction. Individual turbines are used as flow actuators by dynamically pitching their blades on time scales ranging from 10 to 500 seconds. The application of such control efforts to the atmospheric boundary layer has never been attempted before, and introduces flow control at a physical scale that is currently unprecedented. The PI possesses a unique combination of expertise and tools enabling these developments: efficient parallel large-eddy simulations of wind farms, multi-scale turbine modeling, and gradient-based optimization in large optimization-parameter spaces using adjoint formulations. To ensure maximum impact on the wind-engineering field, the project aims at optimal control, experimental wind-tunnel validation, and the inclusion of multidisciplinary aspects related to structural mechanics, power quality, and controller design.
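Schematically, and with notation assumed here purely for illustration (the abstract does not spell out the formulation), the receding-horizon control problem takes the form

$$ \max_{\beta_{1}(t),\dots,\beta_{N}(t)} \; \frac{1}{T}\int_{0}^{T}\sum_{i=1}^{N} P_{i}\big(\beta_{i}(t),\mathbf{u}(t)\big)\,\mathrm{d}t \qquad \text{subject to the filtered (LES) flow equations for } \mathbf{u}, $$

where \beta_i(t) is the dynamic blade-pitch signal of turbine i and P_i its power. The gradient of the time-averaged farm power with respect to all pitch signals can be obtained from a single additional simulation of the adjoint LES equations, which is what makes optimization over such a large parameter space tractable.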
Max ERC Funding
1 499 241 €
Duration
Start date: 2012-10-01, End date: 2017-09-30
Project acronym AEROSPACEPHYS
Project Multiphysics models and simulations for reacting and plasma flows applied to the space exploration program
Researcher (PI) Thierry Edouard Bertrand Magin
Host Institution (HI) INSTITUT VON KARMAN DE DYNAMIQUE DES FLUIDES
Call Details Starting Grant (StG), PE8, ERC-2010-StG_20091028
Summary Space exploration is one of the boldest and most exciting endeavors that humanity has undertaken, and it holds enormous promise for the future. Our next challenges in the conquest of space include bringing back samples to Earth by means of robotic missions and continuing the manned exploration program, which aims at sending human beings to Mars and bringing them home safely. Inaccurate prediction of the heat flux to the surface of the spacecraft heat shield can be fatal for the crew or for the success of a robotic mission. This quantity is estimated during the design phase. An accurate prediction is a particularly complex task, as it requires modelling of the following phenomena, which are potential “mission killers”: 1) Radiation of the plasma in the shock layer, 2) Complex surface chemistry on the thermal protection material, 3) Flow transition from laminar to turbulent. Our poor understanding of the coupled mechanisms of radiation, ablation, and transition leads to the difficulties in flux prediction. To avoid failure and ensure the safety of the astronauts and payload, engineers resort to “safety factors” to determine the thickness of the heat shield, at the expense of the mass of embarked payload. Thinking out of the box and basic research are thus necessary to advance the models that will better define the environment and requirements for the design and safe operation of tomorrow’s space vehicles and planetary probes for manned space exploration. The three basic ingredients of predictive science are: 1) Physico-chemical models, 2) Computational methods, 3) Experimental data. We propose to follow a complementary approach for prediction. The proposed research aims at: “Integrating new advanced physico-chemical models and computational methods, based on a multidisciplinary approach developed together with physicists, chemists, and applied mathematicians, to create a top-notch multiphysics and multiscale numerical platform for simulations of planetary atmosphere entries, crucial to the new challenges of the manned space exploration program. Experimental data will also be used for validation, following state-of-the-art uncertainty quantification methods.”
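For orientation only, a classical engineering correlation of the Sutton-Graves type (a standard textbook estimate, not part of the proposal) gives the convective stagnation-point heat flux as

$$ q_{s} \;\propto\; \sqrt{\rho_{\infty}/R_{n}}\;V_{\infty}^{3}, $$

with \rho_\infty the free-stream density, R_n the nose radius and V_\infty the entry velocity. The radiative, surface-chemistry and laminar-turbulent-transition contributions targeted by the project add to and couple with this baseline, which is precisely why current designs fall back on large safety factors.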
Max ERC Funding
1 494 892 €
Duration
Start date: 2010-09-01, End date: 2015-08-31
Project acronym AFRIVAL
Project African river basins: catchment-scale carbon fluxes and transformations
Researcher (PI) Steven Bouillon
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Starting Grant (StG), PE10, ERC-2009-StG
Summary This proposal aims to fundamentally improve our understanding of the role of tropical freshwater ecosystems in carbon (C) cycling on the catchment scale. It uses an unprecedented combination of state-of-the-art proxies such as stable isotope, 14C and biomarker signatures to characterize organic matter, radiogenic isotope signatures to determine particle residence times, as well as field measurements of relevant biogeochemical processes. We focus on tropical systems since there is a striking lack of data on such systems, even though riverine C transport is thought to be disproportionately high in tropical areas. Furthermore, the presence of landscape-scale contrasts in vegetation (in particular, C3 vs. C4 plants) is an important asset in the use of stable isotopes as natural tracers of C cycling processes on this scale. Freshwater ecosystems are an important component of the global C cycle, and the primary link between terrestrial and marine ecosystems. Recent estimates indicate that ~2 Pg C per year (Pg = petagram) enter freshwater systems, i.e., about twice the estimated global terrestrial C sink. More than half of this is thought to be remineralized before it reaches the coastal zone, and for the Amazon basin this has even been suggested to be ~90% of the lateral C inputs. The question of how general these patterns are is a matter of debate, and assessing the mechanisms determining the degree of processing versus transport of organic carbon in lakes and river systems is critical to further constrain their role in the global C cycle. This proposal provides an interdisciplinary approach to describe and quantify catchment-scale C transport and cycling in tropical river basins. Besides conceptual and methodological advances, and a significant expansion of our dataset on C processes in such systems, the new data gathered in this project are likely to provide exciting and novel hypotheses on the functioning of freshwater systems and their linkage to the terrestrial C budget.
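As an illustration of how the C3/C4 vegetation contrast acts as a natural tracer (standard two-end-member mixing; the end-member values quoted are typical literature values, not taken from this proposal), the C4-derived fraction of riverine organic carbon follows from

$$ f_{\mathrm{C4}} = \frac{\delta^{13}\mathrm{C}_{\mathrm{sample}} - \delta^{13}\mathrm{C}_{\mathrm{C3}}}{\delta^{13}\mathrm{C}_{\mathrm{C4}} - \delta^{13}\mathrm{C}_{\mathrm{C3}}}, $$

with typical end members of about -27‰ for C3 and -12‰ for C4 plants; a sample measured at -20‰ would then imply that roughly half of its organic carbon is C4-derived.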
Max ERC Funding
1 745 262 €
Duration
Start date: 2009-10-01, End date: 2014-09-30
Project acronym ALUFIX
Project Friction stir processing based local damage mitigation and healing in aluminium alloys
Researcher (PI) Aude SIMAR
Host Institution (HI) UNIVERSITE CATHOLIQUE DE LOUVAIN
Call Details Starting Grant (StG), PE8, ERC-2016-STG
Summary ALUFIX proposes an original strategy for the development of aluminium-based materials involving damage mitigation and extrinsic self-healing concepts, exploiting the new opportunities of the solid-state friction stir process. Friction stir processing locally extrudes and drags material from the front to the back and around the tool pin. It involves short durations at moderate temperatures (typically 80% of the melting temperature), fast cooling rates and large plastic deformations leading to far-from-equilibrium microstructures. The idea is that commercial aluminium alloys can be locally improved and healed in regions of stress concentration where damage is likely to occur. Self-healing in metal-based materials is still in its infancy and existing strategies can hardly be extended to applications. Friction stir processing can enhance the damage and fatigue resistance of aluminium alloys by microstructure homogenisation and refinement. In parallel, friction stir processing can be used to integrate secondary phases into an aluminium matrix. In the ALUFIX project, healing phases will thus be integrated into aluminium in addition to refining and homogenising the microstructure. The “local stress management strategy” favours crack closure and crack deviation at the sub-millimetre scale thanks to a controlled residual stress field. The “transient liquid healing agent” strategy involves the in-situ generation of an out-of-equilibrium, compositionally graded microstructure at the aluminium/healing-agent interface capable of liquid-phase healing after a thermal treatment. Along the way, a variety of new scientific questions concerning the damage mechanisms will have to be addressed.
Max ERC Funding
1 497 447 €
Duration
Start date: 2017-01-01, End date: 2021-12-31
Project acronym ANISOTROPIC UNIVERSE
Project The anisotropic universe -- a reality or fluke?
Researcher (PI) Hans Kristian Kamfjord Eriksen
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Starting Grant (StG), PE9, ERC-2010-StG_20091028
Summary "During the last decade, a strikingly successful cosmological concordance model has been established. With only six free parameters, nearly all observables, comprising millions of data points, may be fitted with outstanding precision. However, in this beautiful picture a few ""blemishes"" have turned up, apparently not consistent with the standard model: While the model predicts that the universe is isotropic (i.e., looks the same in all directions) and homogeneous (i.e., the statistical properties are the same everywhere), subtle hints of the contrary are now seen. For instance, peculiar preferred directions and correlations are observed in the cosmic microwave background; some studies considering nearby galaxies suggest the existence of anomalous large-scale cosmic flows; a study of distant quasars hints towards unexpected large-scale correlations. All of these reports are individually highly intriguing, and together they hint toward a more complicated and interesting universe than previously imagined -- but none of the reports can be considered decisive. One major obstacle in many cases has been the relatively poor data quality.
This is currently about to change, as the next generation of new and far more powerful experiments are coming online. Of special interest to me are Planck, an ESA-funded CMB satellite currently taking data; QUIET, a ground-based CMB polarization experiment located in Chile; and various large-scale structure (LSS) data sets, such as the SDSS and 2dF surveys, and in the future Euclid, a proposed galaxy survey satellite also funded by ESA. By combining the world s best data from both CMB and LSS measurements, I will in the proposed project attempt to settle this question: Is our universe really anisotropic? Or are these recent claims only the results of systematic errors or statistical flukes? If the claims turn out to hold against this tide of new and high-quality data, then cosmology as a whole may need to be re-written."
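A minimal sketch of the kind of statistical question involved is given below: it compares the power (here, simply the pixel variance) in the two hemispheres of a simulated, statistically isotropic CMB map using the public healpy package. The map resolution and the toy power spectrum are assumptions chosen purely for illustration; this is not the project's analysis pipeline.

import numpy as np
import healpy as hp

nside = 64                                    # map resolution, chosen for illustration
lmax = 2 * nside
cl = 1.0 / (np.arange(lmax + 1) + 10.0) ** 2  # toy angular power spectrum, not LCDM

cmb_map = hp.synfast(cl, nside)               # Gaussian, statistically isotropic realisation

# Split the sky into two hemispheres along the z-axis and compare their variance;
# a significantly unequal ratio in real data would hint at hemispherical power asymmetry.
npix = hp.nside2npix(nside)
theta, phi = hp.pix2ang(nside, np.arange(npix))
north = theta < np.pi / 2.0

ratio = np.var(cmb_map[north]) / np.var(cmb_map[~north])
print("north/south variance ratio:", ratio)

In a real analysis the comparison would be carried out on masked, beam- and noise-corrected data, with significance assessed against ensembles of simulations; the snippet only illustrates the underlying question of whether the two halves of the sky carry the same statistical power.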
Max ERC Funding
1 500 000 €
Duration
Start date: 2011-01-01, End date: 2015-12-31
Project acronym BIVAQUM
Project Bivariational Approximations in Quantum Mechanics and Applications to Quantum Chemistry
Researcher (PI) Simen Kvaal
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Starting Grant (StG), PE4, ERC-2014-STG
Summary The standard variational principles (VPs) are cornerstones of quantum mechanics, and one can hardly overestimate their usefulness as tools for generating approximations to the time-independent and time-dependent Schrödinger equations. The aim of the proposal is to study and apply a generalization of these, the bivariational principles (BIVPs), which arise naturally when one does not assume a priori that the system Hamiltonian is Hermitian. This unconventional approach may have a transformative impact on the development of ab initio methodology, both for electronic structure and dynamics.
The first objective is to establish the mathematical foundation for the BIVPs. This opens up a whole new axis of method development for ab initio approaches. For instance, it is a largely ignored fact that the popular traditional coupled cluster (TCC) method can be neatly formulated with the BIVPs, and TCC is both polynomially scaling with the number of electrons and size-consistent. No “variational” method enjoys these properties simultaneously; indeed, this seems to be incompatible with the standard VPs.
Armed with the BIVPs, the project aims to develop new ab initio methods and to understand existing ones. The second objective is thus a systematic multireference coupled cluster theory (MRCC) based on the BIVPs. This is in itself a novel approach that carries large potential benefits and impact. The third and last objective is the implementation of a new coupled-cluster-type method in which the orbitals are bivariational parameters. This gives a size-consistent hierarchy of approximations to multiconfiguration Hartree–Fock.
The PI's broad contact with and background in scientific disciplines such as applied mathematics and nuclear physics, in addition to quantum chemistry, increases the feasibility of the project.
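In compact notation (the standard form of the bivariational energy functional in the literature, sketched here for orientation rather than quoted from the proposal), the BIVP considers

$$ \mathcal{E}[\tilde{\psi},\psi] = \frac{\langle \tilde{\psi}\,|\,\hat{H}\,|\,\psi\rangle}{\langle \tilde{\psi}\,|\,\psi\rangle} $$

and requires stationarity under independent variations of the bra and the ket. For a Hermitian Hamiltonian and \tilde{\psi} = \psi this reduces to the ordinary Rayleigh-Ritz variational principle, which is why non-Hermitian effective Hamiltonians and non-symmetric parameterizations such as traditional coupled cluster fit the bivariational framework naturally.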
Max ERC Funding
1 499 572 €
Duration
Start date: 2015-04-01, End date: 2020-03-31
Project acronym BRIDGE
Project Biomimetic process design for tissue regeneration: from bench to bedside via in silico modelling
Researcher (PI) Liesbet Geris
Host Institution (HI) UNIVERSITE DE LIEGE
Call Details Starting Grant (StG), PE8, ERC-2011-StG_20101014
Summary "Tissue engineering (TE), the interdisciplinary field combining biomedical and engineering sciences in the search for functional man-made organ replacements, has key issues with the quantity and quality of the generated products. Protocols followed in the lab are mainly trial and error based, requiring a huge amount of manual interventions and lacking clear early time-point quality criteria to guide the process. As a result, these processes are very hard to scale up to industrial production levels. BRIDGE aims to fortify the engineering aspects of the TE field by adding a higher level of understanding and control to the manufacturing process (MP) through the use of in silico models. BRIDGE will focus on the bone TE field to provide proof of concept for its in silico approach.
The combination of the applicant's well-received published and ongoing work on a wide range of modelling tools in the bone field combined with the state-of-the-art experimental techniques present in the TE lab of the additional participant allows envisaging following innovation and impact:
1. proof-of-concept of the use of an in silico blue-print for the design and control of a robust modular TE MP;
2. model-derived optimised culture conditions for patient-derived cell populations, increasing the modular robustness of in vitro chondrogenesis/endochondral ossification;
3. in silico identification of a limited set of in vitro biomarkers that is predictive of the in vivo outcome;
4. model-derived optimised culture conditions increasing quantity and quality of the in vivo outcome of the TE MP;
5. incorporation of congenital defects in the in silico MP design, constituting a further validation of BRIDGE’s in silico approach and a necessary step towards personalised medical care.
We believe that the systematic – and unprecedented – integration of (bone) TE and mathematical modelling, as proposed in BRIDGE, is required to come to a rationalized engineering approach to the design and control of bone TE MPs.
Max ERC Funding
1 191 440 €
Duration
Start date: 2011-12-01, End date: 2016-11-30
Project acronym BRuSH
Project Oral bacteria as determinants for respiratory health
Researcher (PI) Randi BERTELSEN
Host Institution (HI) UNIVERSITETET I BERGEN
Call Details Starting Grant (StG), LS7, ERC-2018-STG
Summary The oral cavity is the gateway to the lower respiratory tract, and oral bacteria are likely to play a role in lung health. This may be the case for pathogens as well as for commensal bacteria and the balance between species. The oral bacterial community of patients with periodontitis is dominated by gram-negative bacteria and shows higher lipopolysaccharide (LPS) activity than healthy microbiota. Furthermore, bacteria with especially potent pro-inflammatory LPS have been shown to be more common in the lungs of asthmatic individuals than in those of healthy individuals. The working hypothesis of BRuSH is that microbiome communities dominated by LPS-producing bacteria that induce a particularly strong pro-inflammatory immune response in the host will have a negative effect on respiratory health. I will test this hypothesis in two longitudinally designed population-based lung health studies. I aim to identify whether specific bacterial compositions and types of LPS-producing bacteria in oral and dust samples predict lung function and respiratory health over time, and whether the different types of LPS-producing bacteria affect LPS in saliva and dust. BRuSH will apply functional genome annotation that can assign biological significance to raw bacterial DNA sequences. With this bioinformatics tool I will cluster microbiome data into various LPS producers: bacteria whose LPS has strong inflammatory effects and others whose LPS has weak or antagonistic effects. The epidemiological studies will be supported by mouse models of asthma and assays of human bronchial epithelial cells, in which mice and bronchial cells are exposed to chemically synthesized lipid A (the component that drives the LPS-induced immune responses) of varying potency. The goal of BRuSH is to prove a causal relationship between the oral microbiome and lung health, and to gain knowledge that will enable us to make oral health a feasible target for intervention programs aimed at optimizing lung health and preventing respiratory disease.
Max ERC Funding
1 499 938 €
Duration
Start date: 2019-01-01, End date: 2023-12-31
Project acronym C4T
Project Climate change across Cenozoic cooling steps reconstructed with clumped isotope thermometry
Researcher (PI) Anna Nele Meckler
Host Institution (HI) UNIVERSITETET I BERGEN
Call Details Starting Grant (StG), PE10, ERC-2014-STG
Summary The Earth's climate system contains a highly complex interplay of numerous components, such as atmospheric greenhouse gases, ice sheets, and ocean circulation. Due to nonlinearities and feedbacks, changes to the system can result in rapid transitions to radically different climate states. In light of rising greenhouse gas levels there is an urgent need to better understand climate at such tipping points. Reconstructions of profound climate changes in the past provide crucial insight into our climate system and help to predict future changes. However, all proxies we use to reconstruct past climate depend on assumptions that are, in addition, increasingly uncertain further back in time. A new kind of temperature proxy, the carbonate ‘clumped isotope’ thermometer, has great potential to overcome these obstacles. The proxy relies on thermodynamic principles, taking advantage of the temperature dependence of the binding strength between different isotopes of carbon and oxygen, which makes it independent of other variables. Yet, widespread application of this technique in paleoceanography is currently prevented by the required large sample amounts, which are difficult to obtain from ocean sediments. If applied to the minute carbonate shells preserved in the sediments, this proxy would allow robust reconstructions of past temperatures in the surface and deep ocean, as well as global ice volume, far back in time. Here I propose to considerably decrease the sample amount requirements of clumped isotope thermometry, building on recent successful modifications of the method and ideas for further analytical improvements. This will enable my group and me to thoroughly ground-truth the proxy for application in paleoceanography and, for the first time, apply it to aspects of past climate change across major climate transitions, where clumped isotope thermometry can immediately contribute to solving long-standing first-order questions and allow for major progress in the field.
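For context, the standard definition used throughout the clumped-isotope literature (given here in simplified form; it is not taken from the proposal) is

$$ \Delta_{47} \;\approx\; \left(\frac{R_{47}}{R_{47}^{*}} - 1\right) \times 1000\ \text{‰}, $$

where R_47 is the measured abundance ratio of mass-47 CO2 (dominantly 13C-18O 'clumps') to mass-44 CO2 liberated from the carbonate, and R_47^* is the ratio expected if the isotopes were distributed stochastically. Because clumping is thermodynamically favoured at low temperature, \Delta_47 decreases roughly as 1/T^2 with increasing formation temperature, which is what makes the proxy a thermometer that does not require assumptions about the isotopic composition of the water from which the carbonate grew.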
Max ERC Funding
1 877 209 €
Duration
Start date: 2015-08-01, End date: 2020-07-31