Project acronym ACCOPT
Project ACcelerated COnvex OPTimization
Researcher (PI) Yurii NESTEROV
Host Institution (HI) UNIVERSITE CATHOLIQUE DE LOUVAIN
Call Details Advanced Grant (AdG), PE1, ERC-2017-ADG
Summary The rapid progress in computer technologies and telecommunications presents many new challenges for Optimization Theory. New problems are usually very large in size, very special in structure, and often rely on distributed data. This makes them unsolvable by standard optimization methods. In these situations, the old theoretical models, based on hidden Black-Box information, cannot work. New theoretical and algorithmic solutions are urgently needed. In this project we will concentrate on the development of fast optimization methods for problems of large and very large size. All the new methods will be endowed with provable efficiency guarantees for large classes of optimization problems arising in practical applications. Our main tool is the acceleration technique developed for standard Black-Box methods as applied to smooth convex functions. However, we will have to adapt it to deal with different situations.
The first line of development will be based on the smoothing technique as applied to non-smooth functions. We propose to substantially extend this approach to generate approximate solutions in relative scale. The second line of research will be related to applying acceleration techniques to second-order methods minimizing functions with sparse Hessians. Finally, we aim to develop fast gradient methods for huge-scale problems. The size of these problems is so large that even the usual vector operations become extremely expensive. Thus, we propose to develop new methods with sublinear iteration costs. In our approach, the main source of improvement will be the proper use of problem structure.
Our overall aim is to be able to solve, in a routine way, many important problems that currently look unsolvable. Moreover, the theoretical development of Convex Optimization will reach a state in which there is no gap between theory and practice: the theoretically most efficient methods will definitely outperform any home-brewed heuristics.
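The acceleration technique the summary refers to is the classical fast gradient scheme for L-smooth convex functions. A minimal sketch in Python follows; the least-squares test problem and all variable names are illustrative, not the project's code:

```python
import numpy as np

def accelerated_gradient(grad, x0, L, n_iters=200):
    """Minimal sketch of the fast gradient method for an L-smooth convex
    function, given only a first-order (Black-Box) oracle `grad`.
    Achieves the O(1/k^2) rate versus O(1/k) for plain gradient descent."""
    x_prev = x0.copy()
    y = x0.copy()
    t_prev = 1.0
    for _ in range(n_iters):
        x = y - grad(y) / L                          # gradient step at the extrapolated point
        t = (1.0 + np.sqrt(1.0 + 4.0 * t_prev ** 2)) / 2.0
        y = x + ((t_prev - 1.0) / t) * (x - x_prev)  # momentum (extrapolation) step
        x_prev, t_prev = x, t
    return x_prev

# Illustrative smooth convex problem: f(x) = 0.5 * ||Ax - b||^2, with L = ||A||_2^2.
rng = np.random.default_rng(0)
A, b = rng.standard_normal((50, 20)), rng.standard_normal(50)
x = accelerated_gradient(lambda v: A.T @ (A @ v - b), np.zeros(20),
                         L=np.linalg.norm(A, 2) ** 2)
```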
Max ERC Funding
2 090 038 €
Duration
Start date: 2018-09-01, End date: 2023-08-31
Project acronym BRuSH
Project Oral bacteria as determinants for respiratory health
Researcher (PI) Randi BERTELSEN
Host Institution (HI) UNIVERSITETET I BERGEN
Call Details Starting Grant (StG), LS7, ERC-2018-STG
Summary The oral cavity is the gateway to the lower respiratory tract, and oral bacteria are likely to play a role in lung health. This may be the case for pathogens as well as for commensal bacteria and the balance between species. The oral bacterial community of patients with periodontitis is dominated by Gram-negative bacteria and shows higher lipopolysaccharide (LPS) activity than a healthy microbiota. Furthermore, bacteria with especially potent pro-inflammatory LPS have been shown to be more common in the lungs of asthmatic individuals than in healthy ones. The working hypothesis of BRuSH is that microbiome communities dominated by LPS-producing bacteria that induce a particularly strong pro-inflammatory immune response in the host will have a negative effect on respiratory health. I will test this hypothesis in two longitudinally designed population-based lung health studies. I aim to identify whether specific bacterial compositions and types of LPS-producing bacteria in oral and dust samples predict lung function and respiratory health over time, and whether the different types of LPS-producing bacteria affect LPS in saliva and dust. BRuSH will apply functional genome annotation that can assign biological significance to raw bacterial DNA sequences. With this bioinformatics tool I will cluster microbiome data into various LPS producers: bacteria whose LPS has strong inflammatory effects and others with weak or antagonistic effects. The epidemiological studies will be supported by mouse models of asthma and cell assays of human bronchial epithelial cells, exposing mice and bronchial cells to chemically synthesized Lipid A (the component that drives the LPS-induced immune responses) of varying potency. The goal of BRuSH is to prove a causal relationship between the oral microbiome and lung health, and to gain knowledge that will enable us to make oral health a feasible target for intervention programs aimed at optimizing lung health and preventing respiratory disease.
Max ERC Funding
1 499 938 €
Duration
Start date: 2019-01-01, End date: 2023-12-31
Project acronym CALCULUS
Project Commonsense and Anticipation enriched Learning of Continuous representations sUpporting Language UnderStanding
Researcher (PI) Marie-Francine MOENS
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Advanced Grant (AdG), PE6, ERC-2017-ADG
Summary Natural language understanding (NLU) by machines is of great scientific, economic and social value. Humans perform the NLU task efficiently by relying on their capability to imagine or anticipate situations. They engage commonsense and world knowledge, often acquired through perceptual experiences, to make explicit what is left implicit in language. Inspired by these characteristics, CALCULUS will design, implement and evaluate innovative paradigms supporting NLU, combining old but powerful ideas for language understanding from the early days of artificial intelligence with new approaches from machine learning. The project focuses on the effective learning of anticipatory, continuous, non-symbolic representations of event frames and narrative structures of events, trained on language and visual data. The grammatical structure of language is grounded in the geometric structure of visual data while embodying aspects of commonsense and world knowledge. The reusable representations are evaluated in a selection of NLU tasks requiring efficient real-time retrieval of the representations and parsing of the targeted written texts. Finally, we will evaluate the inference potential of the anticipatory representations in situations not seen in the training data and when inferring spatial and temporal information in metric real-world spaces that is not mentioned in the processed language. The machine learning methods focus on latent variable models relying on Bayesian probabilistic models and neural networks, targeting settings with limited, manually annotated training data. The best models will be integrated in a demonstrator that translates the language of stories into events happening in a 3-D virtual world. The PI has the interdisciplinary expertise in natural language processing, joint processing of language and visual data, information retrieval and machine learning needed for the successful realization of the project.
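To make the grounding idea concrete, here is a hypothetical toy sketch, with random linear maps standing in for the learned encoders and none of the project's actual models: language and visual inputs are mapped into one continuous space, and alignment is scored by cosine similarity.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stand-ins for learned encoders: random linear maps into a
# shared continuous representation space.
d_text, d_img, d_shared = 5, 7, 16
W_text = rng.standard_normal((d_shared, d_text))
W_img = rng.standard_normal((d_shared, d_img))

def embed(W, features):
    z = W @ features
    return z / np.linalg.norm(z)              # unit vector in the shared space

caption = rng.standard_normal(d_text)         # features of one sentence
regions = rng.standard_normal((3, d_img))     # features of 3 image regions

# Cosine similarity between the caption and each region; the highest-scoring
# region is the one predicted to visually ground the sentence.
scores = np.array([embed(W_text, caption) @ embed(W_img, r) for r in regions])
best_region = int(np.argmax(scores))
```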
Max ERC Funding
2 227 500 €
Duration
Start date: 2018-09-01, End date: 2023-08-31
Project acronym DISPATCH Neuro-Sense
Project Distributed Signal Processing Algorithms for Chronic Neuro-Sensor Networks
Researcher (PI) Alexander BERTRAND
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Starting Grant (StG), PE6, ERC-2018-STG
Summary The possibility to chronically monitor the brain 24/7 in daily-life activities would revolutionize human-machine interactions and health care, e.g., in the context of neuroprostheses, neurological disorders, and brain-computer interfaces (BCI). Such chronic systems must satisfy challenging energy and miniaturization constraints, leading to modular designs in which multiple networked miniature neuro-sensor modules form a ‘neuro-sensor network’ (NSN).
However, current multi-channel neural signal processing (NSP) algorithms were designed for traditional neuro-sensor arrays with central access to all channels. These algorithms are not suited for NSNs, as they require unrealistic bandwidth budgets to centralize the data, yet a joint neural data analysis across NSN modules is crucial.
The central idea of this project is to remove this algorithm bottleneck by designing novel scalable, distributed NSP algorithms to let the modules of an NSN jointly process the recorded neural data through in-network data fusion and with a minimal exchange of data.
To guarantee impact, we mainly focus on establishing a new non-invasive NSN concept based on electroencephalography (EEG). By combining multiple ‘smart’ mini-EEG modules into an ‘EEG sensor network’ (EEG-Net), we compensate for the lack of spatial information captured by current stand-alone mini-EEG devices, without compromising on ‘wearability’. Equipping such EEG-Nets with distributed NSP algorithms will make it possible to process high-density EEG data at viable energy levels, which is a game changer for high-performance chronic EEG in, e.g., epilepsy monitoring, neuroprostheses, and BCI.
We will validate these claims in an EEG-Net prototype in the three use cases above, benefiting from ongoing collaborations with the KUL university hospital. In addition, to demonstrate the general applicability of our novel NSP algorithms, we will validate them in other emerging NSN types as well, such as modular or untethered neural probes.
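As a minimal illustration of in-network fusion with limited data exchange, the classic consensus-averaging toy below (not the project's algorithms; topology and sizes are invented for the sketch) has each module exchange only its current fused estimate with its neighbours, yet all modules converge to the network-wide average:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy 'neuro-sensor network': 4 modules observe the same source signal with
# local noise; the ring topology means each module talks to 2 neighbours only.
n_nodes, n_samples = 4, 256
source = np.sin(np.linspace(0, 8 * np.pi, n_samples))
local = source + 0.5 * rng.standard_normal((n_nodes, n_samples))

# Metropolis weights for the ring graph (every node has degree 2).
W = np.zeros((n_nodes, n_nodes))
for i in range(n_nodes):
    for j in ((i - 1) % n_nodes, (i + 1) % n_nodes):
        W[i, j] = 1.0 / 3.0
np.fill_diagonal(W, 1.0 - W.sum(axis=1))

x = local.copy()
for _ in range(50):
    # One fused vector per link per iteration: no raw multi-channel data
    # is ever centralized.
    x = W @ x

# Every module now holds (approximately) the network-wide average estimate.
assert np.allclose(x, local.mean(axis=0), atol=1e-6)
```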
Max ERC Funding
1 489 656 €
Duration
Start date: 2019-01-01, End date: 2023-12-31
Project acronym FHiCuNCAG
Project Foundations for Higher and Curved Noncommutative Algebraic Geometry
Researcher (PI) Wendy Joy Lowen
Host Institution (HI) UNIVERSITEIT ANTWERPEN
Call Details Consolidator Grant (CoG), PE1, ERC-2018-COG
Summary With this research programme, inspired by open problems within noncommutative algebraic geometry (NCAG) as well as by recent developments in algebraic topology, it is our aim to lay out new foundations for NCAG. On the one hand, the categorical approach to geometry put forth in NCAG has seen a wide range of applications both in mathematics and in theoretical physics. On the other hand, algebraic topology has received a vast impetus from the development of higher topos theory by Lurie and others. The current project is aimed at cross-fertilisation between the two subjects, in particular through the development of “higher linear topos theory”. We will approach the higher structure on Hochschild-type complexes from two angles. Firstly, focusing on intrinsic incarnations of spaces as large categories, we will use the tensor products developed jointly with Ramos González and Shoikhet to obtain a “large version” of the Deligne conjecture. Secondly, focusing on concrete representations, we will develop new operadic techniques in order to endow complexes like the Gerstenhaber-Schack complex for prestacks (due to Dinh Van-Lowen) and the deformation complexes for monoidal categories and pasting diagrams (due to Shrestha and Yetter) with new combinatorial structure. In another direction, we will move from Hochschild cohomology of abelian categories (in the sense of Lowen-Van den Bergh) to Mac Lane cohomology for exact categories (in the sense of Kaledin-Lowen), extending the scope of NCAG to “non-linear deformations”. One of the mysteries in algebraic deformation theory is the curvature problem: in the process of deformation we are brought to the boundaries of NCAG territory through the introduction of a curvature component which disables the standard approaches to cohomology. Ultimately, it is our goal to set up a new framework for NCAG which incorporates curved objects, drawing inspiration from the realm of higher categories.
Max ERC Funding
1 171 360 €
Duration
Start date: 2019-06-01, End date: 2024-05-31
Project acronym FitteR-CATABOLIC
Project Survival of the Fittest: On how to enhance recovery from critical illness through learning from evolutionary conserved catabolic pathways
Researcher (PI) Greta Herman VAN DEN BERGHE
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Advanced Grant (AdG), LS7, ERC-2017-ADG
Summary For a few decades now, patients suffering from severe illnesses or multiple trauma, conditions that were previously lethal, have been treated in intensive care units (ICUs). Modern intensive care medicine bridges patients from life-threatening conditions to recovery with the use of mechanical devices, vasoactive drugs and powerful anti-microbial agents. By postponing death, a new, unnatural condition has been created: intensive-care-dependent prolonged (>1 week) critical illness. About 25% of ICU patients today require prolonged intensive care, sometimes for weeks or months, and these patients are at high risk of death while consuming 75% of resources. Although the primary insult has been adequately dealt with, many long-stay patients typically suffer from hypercatabolism, ICU-acquired brain dysfunction and polyneuropathy/myopathy leading to severe muscle weakness, further increasing the risk of late death. As hypercatabolism was considered the culprit, several anabolic interventions were tested, but these showed harm instead of benefit. We previously showed that fasting early during illness is superior to forceful feeding, pointing to certain benefits of catabolic responses. In healthy humans, fasting activates catabolism to provide substrates essential to protect and maintain brain and muscle function. This proposal aims to investigate whether evolutionarily conserved catabolic fasting pathways, specifically lipolysis and ketogenesis, can be exploited in the search for prevention of brain dysfunction and muscle weakness in long-stay ICU patients, with the goal of identifying a new metabolic intervention to enhance their recovery. The project builds further on our experience with bi-directional translational research - using human material whenever possible and a validated mouse model of sepsis-induced critical illness for objectives that cannot be addressed in patients - and aims to close the loop, from a novel concept to a large randomized controlled trial in patients.
Max ERC Funding
2 500 000 €
Duration
Start date: 2018-10-01, End date: 2023-09-30
Project acronym HealthierWomen
Project A woman's reproductive experience: Long-term implications for chronic disease and death
Researcher (PI) Rolv SKJAERVEN
Host Institution (HI) UNIVERSITETET I BERGEN
Call Details Advanced Grant (AdG), LS7, ERC-2018-ADG
Summary Pregnancy complications such as preeclampsia and preterm birth are known to affect infant health, but their influence on mothers’ long-term health is not well understood. Most previous studies are seriously limited by their reliance on information from the first pregnancy only. Often they lack the data to study women’s complete reproductive histories. Without a complete reproductive history, the relationship between pregnancy complications and women’s long-term health cannot be reliably studied. The Medical Birth Registry of Norway, covering all births from 1967 onwards, includes information on more than 3 million births and 1.5 million sibships. Linking this to population-based death and cancer registries provides a worldwide unique source of population-based data which can be analysed to identify heterogeneities in risk by lifetime parity and the cumulative experience of pregnancy complications. Having worked in this field of research for many years, I see many erroneous conclusions in studies based on insufficient data. For instance, both after preeclampsia and after a stillbirth, the high risk of heart disease observed in one-child mothers is strongly attenuated in women with subsequent pregnancies. I will study different patterns of pregnancy complications that occur alone or in combination across pregnancies, and analyse their associations with cause-specific maternal mortality. Using this unique methodology, I will challenge the idea that placental dysfunction is the origin of preeclampsia and test the hypothesis that pregnancy complications may cause direct long-term effects on maternal health. The findings of this research have the potential to advance our understanding of how pregnancy complications affect long-term maternal health and to help develop more effective chronic disease prevention strategies.
Max ERC Funding
2 500 000 €
Duration
Start date: 2019-09-01, End date: 2024-08-31
Project acronym ImmunoBioSynth
Project Synergistic engineering of anti-tumor immunity by synthetic biomaterials
Researcher (PI) Bruno DE GEEST
Host Institution (HI) UNIVERSITEIT GENT
Call Details Consolidator Grant (CoG), LS7, ERC-2018-COG
Summary Immunotherapy holds the potential to dramatically improve the curative prognosis of cancer patients. However, despite significant progress, a huge gap remains to be bridged to achieve broad success in the clinic. A first limiting factor in cancer immunotherapy is the low response rate in a large fraction of patients, and an unmet need exists for more efficient - potentially synergistic - immunotherapies that improve upon or complement existing strategies. The second limiting factor is immune-related toxicity, which can cause life-threatening situations as well as seriously impair the quality of life of patients. Therefore, there is an urgent need for safer immunotherapies that allow for a more target-specific engineering of the immune system. Strategies to engineer the immune system via a materials chemistry approach, i.e. immuno-engineering, have gathered major attention over the past decade, could complement or replace biologicals, and hold promise to contribute to resolving the current issues faced by the immunotherapy field. I hypothesize that synthetic biomaterials can play an important role in anti-cancer immunotherapy with regard to synergistic, safe, but potent instruction of innate and adaptive anti-cancer immunity, and in reverting the tumor microenvironment from an immune-suppressive into an immune-susceptible state. To this end, the overall scientific objective of this proposal is to fully embrace the potential of immuno-engineering and develop several highly synergistic biomaterials strategies to engineer the immune system to fight cancer. I will develop a series of biomaterials and address a number of fundamental questions with regard to optimal biomaterial design for immuno-engineering. Based on these findings, I will elucidate those therapeutic strategies that lead to synergistic engineering of innate and adaptive immunity in combination with remodeling the tumor microenvironment from an immune-suppressive into an immune-susceptible state.
Max ERC Funding
2 000 000 €
Duration
Start date: 2019-04-01, End date: 2024-03-31
Project acronym ImPRESS
Project Imaging Perfusion Restrictions from Extracellular Solid Stress
Researcher (PI) Kyrre Eeg Emblem
Host Institution (HI) OSLO UNIVERSITETSSYKEHUS HF
Call Details Starting Grant (StG), LS7, ERC-2017-STG
Summary Even the perfect cancer drug must reach its target to have an effect. The ImPRESS project's main objective is to develop a novel imaging paradigm coined Restricted Perfusion Imaging (RPI) to reveal - for the first time in humans - vascular restrictions in solid cancers caused by mechanical solid stress, and to use RPI to demonstrate that alleviating this force will repair the cancerous microenvironment and improve therapeutic response. Delivery of anti-cancer drugs to the tumor is critically dependent on a functional vascular bed. Developing biomarkers that can measure how mechanical forces in a solid tumor impair perfusion and promote therapy resistance is essential for the treatment of disease.
The ImPRESS project is based on the following observations: (I) pre-clinical work suggests that therapies targeting the tumor microenvironment and extracellular matrix may enhance drug delivery by decompressing tumor vessels; (II) results from animal models may not be transferable because compressive forces in human tumors in vivo can be many times higher; and (III) there are no available imaging technologies for medical diagnostics of solid stress in human cancers. Using RPI, ImPRESS will conduct a comprehensive series of innovative studies in brain cancer patients to answer three key questions: (Q1) Can we image vascular restrictions in human cancers and map how the vasculature changes with tumor growth or treatment? (Q2) Can we use medical engineering to image solid stress in vivo? (Q3) Can RPI show that matrix-depleting drugs improve patient response to conventional chemo- and radiation therapy as well as to new targeted therapies?
The ImPRESS project holds a unique position to answer these questions thanks to our unrivaled experience with advanced imaging of cancer patients. With successful delivery, ImPRESS will have a direct impact on patient treatment and establish an imaging paradigm that will pave the way for new scientific knowledge on how to revitalize cancer therapies.
Max ERC Funding
1 499 638 €
Duration
Start date: 2018-01-01, End date: 2022-12-31
Project acronym LOPRE
Project Lossy Preprocessing
Researcher (PI) Saket SAURABH
Host Institution (HI) UNIVERSITETET I BERGEN
Call Details Consolidator Grant (CoG), PE6, ERC-2018-COG
Summary A critical component of computational processing of data sets is the ‘preprocessing’ or ‘compression’ step: the computation of a succinct, sufficiently accurate representation of the given data. Preprocessing is ubiquitous, and a rigorous mathematical understanding of preprocessing algorithms is crucial in order to reason about and understand the limits of preprocessing.
Unfortunately, there is no mathematical framework to analyze and objectively compare two preprocessing routines while simultaneously taking into account all three dimensions:
- the efficiency of computing the succinct representation,
- the space required to store this representation, and
- the accuracy with which the original data is captured in the succinct representation.
The overarching goal of this proposal is the development of a mathematical framework for the rigorous analysis of preprocessing algorithms. We will achieve this goal by designing new algorithmic techniques for preprocessing, by developing a framework of analysis to make qualitative comparisons between various preprocessing routines based on the criteria above, and by developing the lower-bound tools required to understand the limitations of preprocessing for concrete problems.
This project will lift our understanding of algorithmic preprocessing to new heights and lead to a groundbreaking shift in the set of basic research questions attached to the study of preprocessing for specific problems. It will significantly advance the analysis of preprocessing and yield substantial technology transfer between adjacent subfields of computer science such as dynamic algorithms, streaming algorithms, property testing and graph theory.
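A toy illustration of the three-way trade-off (a uniform-sampling sketch invented for this example, not the project's framework): time stays low because one pass suffices, space drops from n numbers to k, and accuracy degrades gracefully as roughly sigma/sqrt(k) for mean estimation.

```python
import numpy as np

rng = np.random.default_rng(3)
data = rng.exponential(scale=2.0, size=1_000_000)

def lossy_preprocess(values, k):
    """Toy lossy compression: keep a uniform random sample of size k.
    Efficiency: computable in a single pass (e.g., reservoir sampling);
    np.random's choice is used here for brevity.
    Space: k numbers instead of len(values).
    Accuracy: the sample mean estimates the true mean to within
    about sigma/sqrt(k)."""
    return rng.choice(values, size=k, replace=False)

sketch = lossy_preprocess(data, k=1_000)
approx, exact = sketch.mean(), data.mean()
print(f"space: {sketch.size}/{data.size}  |mean error|: {abs(approx - exact):.4f}")
```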
Max ERC Funding
2 000 000 €
Duration
Start date: 2019-05-01, End date: 2024-04-30