Project acronym 3D-OA-HISTO
Project Development of 3D Histopathological Grading of Osteoarthritis
Researcher (PI) Simo Jaakko Saarakkala
Host Institution (HI) OULUN YLIOPISTO
Call Details Starting Grant (StG), LS7, ERC-2013-StG
Summary "Background: Osteoarthritis (OA) is a common musculoskeletal disease occurring worldwide. Despite extensive research, etiology of OA is still poorly understood. Histopathological grading (HPG) of 2D tissue sections is the gold standard reference method for determination of OA stage. However, traditional 2D-HPG is destructive and based only on subjective visual evaluation. These limitations induce bias to clinical in vitro OA diagnostics and basic research that both rely strongly on HPG.
Objectives: 1) To establish and validate the very first 3D-HPG of OA based on cutting-edge nano/micro-CT (Computed Tomography) technologies in vitro; 2) To use the established method to clarify the early phases of OA; and 3) To validate 3D-HPG of OA for in vivo use.
Methods: Several hundred human osteochondral samples from patients undergoing total knee arthroplasty will be collected. The samples will be imaged in vitro with nano/micro-CT and clinical high-end extremity CT devices using specific contrast agents to quantify tissue constituents and structure in 3D over large volumes. From this information, a novel 3D-HPG will be developed with statistical classification algorithms. Finally, the novel 3D-HPG of OA will be applied clinically in vivo.
Significance: This is the very first study to establish 3D-HPG of OA pathology in vitro and in vivo. Furthermore, the developed technique will greatly improve our understanding of the early phases of OA. Ultimately, the study will contribute to improving OA patients’ quality of life by slowing disease progression, and to providing powerful tools for developing new OA therapies."
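As a hedged illustration of the classification step mentioned in the Methods paragraph above, the sketch below maps CT-derived feature vectors to histopathological grades with a standard supervised classifier. The feature names, the synthetic data, and the 0–3 grade scale are illustrative assumptions, not the project's actual pipeline.

```python
# Hedged sketch: grading osteochondral samples from CT-derived features.
# Feature names, synthetic data and the 0-3 grade scale are illustrative assumptions only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_samples = 300
# Synthetic stand-ins for e.g. contrast-agent uptake, cartilage thickness, bone density.
X = rng.normal(size=(n_samples, 3))
# Synthetic histopathological grades (0 = intact ... 3 = severe OA).
y = np.clip((X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=n_samples)).round(), 0, 3).astype(int)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)  # cross-validated grading accuracy
print(f"mean CV accuracy: {scores.mean():.2f}")
```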
Max ERC Funding
1 500 000 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym A-DATADRIVE-B
Project Advanced Data-Driven Black-box modelling
Researcher (PI) Johan Adelia K Suykens
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Advanced Grant (AdG), PE7, ERC-2011-ADG_20110209
Summary Making accurate predictions is a crucial factor in many systems (such as in modelling energy consumption, power load forecasting, traffic networks, process industry, environmental modelling, biomedicine, brain-machine interfaces) for cost savings, efficiency, health, safety and organizational purposes. In this proposal we aim to realize a new generation of more advanced black-box modelling techniques for estimating predictive models from measured data. We will study different optimization modelling frameworks in order to obtain improved black-box modelling approaches. This will be done by specifying models as constrained optimization problems, studying different candidate core models (parametric models, support vector machines and kernel methods) together with additional sets of constraints and regularization mechanisms. Different candidate mathematical frameworks will be considered, with models that possess primal and (Lagrange) dual model representations, functional analysis in reproducing kernel Hilbert spaces, operator splitting and optimization in Banach spaces. Several aspects that are relevant to black-box models will be studied, including incorporation of prior knowledge, structured dynamical systems, tensorial data representations, interpretability and sparsity, and general-purpose optimization algorithms. The methods should be suitable for handling large data sets and high-dimensional input spaces. The final goal is also to realize a next-generation software tool (including symbolic generation of models and handling different supervised and unsupervised learning tasks, static and dynamic systems) that can be generically applied to data from different application areas. The proposal A-DATADRIVE-B aims to connect end-users to these more advanced methods through a user-friendly data-driven black-box modelling tool. The methods and tool will be tested in connection with several real-life applications.
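As a minimal illustration of the kernel-based core models mentioned above (support vector machines and kernel methods with primal and Lagrange-dual representations), the sketch below fits a least-squares kernel regression in its dual form. This is a generic textbook formulation on synthetic data, not the project's software tool.

```python
# Hedged sketch: dual-form least-squares kernel regression (textbook formulation,
# not the project's tool). Dual model: f(x) = sum_i alpha_i k(x_i, x).
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(100, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=100)

lam = 1e-2                                            # regularization constant
K = rbf_kernel(X, X)
alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)  # dual variables

X_test = np.linspace(-3, 3, 200)[:, None]
y_pred = rbf_kernel(X_test, X) @ alpha                # evaluate the dual representation
print("training fit RMSE:", np.sqrt(np.mean((K @ alpha - y) ** 2)))
```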
Max ERC Funding
2 485 800 €
Duration
Start date: 2012-04-01, End date: 2017-03-31
Project acronym ABACUS
Project Advancing Behavioral and Cognitive Understanding of Speech
Researcher (PI) Bart De Boer
Host Institution (HI) VRIJE UNIVERSITEIT BRUSSEL
Call Details Starting Grant (StG), SH4, ERC-2011-StG_20101124
Summary I intend to investigate what cognitive mechanisms give us combinatorial speech. Combinatorial speech is the ability to make new words using pre-existing speech sounds. Humans are the only apes that can do this, yet we do not know how our brains do it, nor how exactly we differ from other apes. Using new experimental techniques to study human behavior and new computational techniques to model human cognition, I will find out how we deal with combinatorial speech.
The experimental part will study individual and cultural learning. Experimental cultural learning is a new technique that simulates cultural evolution in the laboratory. Two types of cultural learning will be used: iterated learning, which simulates language transfer across generations, and social coordination, which simulates the emergence of norms in a language community. Using the two types of cultural learning together with individual learning experiments will help to zero in, from three angles, on how humans deal with combinatorial speech. In addition, it will make a methodological contribution by comparing the strengths and weaknesses of the three methods.
The computer modeling part will formalize hypotheses about how our brains deal with combinatorial speech. Two models will be built: a high-level model that will establish the basic algorithms with which combinatorial speech is learned and reproduced, and a neural model that will establish in more detail how the algorithms are implemented in the brain. In addition, the models, through increasing understanding of how humans deal with speech, will help bridge the performance gap between human and computer speech recognition.
The project will advance science in four ways: it will provide insight into how our unique ability to use combinatorial speech works, it will tell us how this is implemented in the brain, it will extend the novel methodology of experimental cultural learning, and it will create new computer models for dealing with human speech.
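To make the iterated-learning paradigm mentioned above concrete, here is a hedged toy simulation of a transmission chain: each generation learns a small "lexicon" of continuous sound values from noisy observations of the previous generation's output, with a simple bias standing in for learning constraints. It illustrates the general paradigm only and makes no claim about the project's actual experiments or models.

```python
# Hedged toy model of iterated learning: a lexicon of continuous "speech sounds"
# is passed down a chain of learners, each observing the previous generation with
# noise and regularizing toward the mean (a crude stand-in for learning biases).
import numpy as np

rng = np.random.default_rng(42)

def learn(observed, prior_weight=0.3):
    """Learner infers sounds as a compromise between the data and a bias toward the centre."""
    return (1 - prior_weight) * observed + prior_weight * observed.mean()

lexicon = rng.uniform(0, 1, size=8)          # generation 0: eight arbitrary sound values
for generation in range(20):
    observed = lexicon + rng.normal(scale=0.05, size=lexicon.shape)  # noisy transmission
    lexicon = learn(observed)
    print(f"gen {generation:2d}  spread of the sound system: {lexicon.std():.3f}")
```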
Max ERC Funding
1 276 620 €
Duration
Start date: 2012-02-01, End date: 2017-01-31
Project acronym ACCOPT
Project ACelerated COnvex OPTimization
Researcher (PI) Yurii NESTEROV
Host Institution (HI) UNIVERSITE CATHOLIQUE DE LOUVAIN
Call Details Advanced Grant (AdG), PE1, ERC-2017-ADG
Summary The amazing rate of progress in computer technologies and telecommunications presents many new challenges for Optimization Theory. New problems are usually very large in size, very special in structure and possibly have distributed data support. This makes them unsolvable by standard optimization methods. In these situations, old theoretical models, based on hidden Black-Box information, cannot work. New theoretical and algorithmic solutions are urgently needed. In this project we will concentrate on the development of fast optimization methods for problems of large and very large size. All the new methods will be endowed with provable efficiency guarantees for large classes of optimization problems arising in practical applications. Our main tool is the acceleration technique developed for the standard Black-Box methods as applied to smooth convex functions. However, we will have to adapt it to deal with different situations.
The first line of development will be based on the smoothing technique as applied to non-smooth functions. We propose to substantially extend this approach to generate approximate solutions in relative scale. The second line of research will be related to applying acceleration techniques to second-order methods for minimizing functions with sparse Hessians. Finally, we aim to develop fast gradient methods for huge-scale problems. The size of these problems is so big that even the usual vector operations are extremely expensive. Thus, we propose to develop new methods with sublinear iteration costs. In our approach, the main source for achieving improvements will be the proper use of problem structure.
Our overall aim is to be able to solve in a routine way many important problems that currently look unsolvable. Moreover, the theoretical development of Convex Optimization will reach a state where there is no gap between theory and practice: the theoretically most efficient methods will definitely outperform any homebred heuristics.
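The acceleration technique the proposal builds on can be illustrated by the classical accelerated gradient scheme for smooth convex functions; the quadratic test problem, dimensions and step sizes below are illustrative choices, not part of the proposal.

```python
# Hedged sketch: Nesterov-style accelerated gradient method on a smooth convex
# quadratic f(x) = 0.5 x^T A x - b^T x. Test problem and constants are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n = 50
M = rng.normal(size=(n, n))
A = M.T @ M + np.eye(n)          # positive definite => smooth, strongly convex
b = rng.normal(size=n)
L = np.linalg.eigvalsh(A).max()  # Lipschitz constant of the gradient

grad = lambda x: A @ x - b
x = y = np.zeros(n)
t = 1.0
for k in range(200):
    x_new = y - grad(y) / L                    # gradient step at the extrapolated point
    t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
    y = x_new + (t - 1) / t_new * (x_new - x)  # momentum / extrapolation step
    x, t = x_new, t_new

x_star = np.linalg.solve(A, b)
print("distance to optimum:", np.linalg.norm(x - x_star))
```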
Max ERC Funding
2 090 038 €
Duration
Start date: 2018-09-01, End date: 2023-08-31
Project acronym AcTafactors
Project AcTafactors: Tumor Necrosis Factor-based immuno-cytokines with superior therapeutic indexes
Researcher (PI) Jan Honoré L Tavernier
Host Institution (HI) VIB
Call Details Proof of Concept (PoC), ERC-2015-PoC
Summary Tumor Necrosis Factor (TNF) is a homotrimeric pro-inflammatory cytokine that was originally discovered based on its extraordinary antitumor activity. However, its shock-inducing properties, causing hypotension, leukopenia and multiple organ failure, prevented its systemic use in cancer treatment. With this proof-of-concept study we want to evaluate a novel class of cell-targeted TNFs with strongly reduced systemic toxicities (AcTafactors). In these engineered immuno-cytokines, single-chain TNFs that harbor mutations reducing the affinity for their receptor(s) are fused to a cell-specific targeting domain. Whilst almost no biological activity is observed on non-targeted cells, thus preventing systemic toxicity, avidity effects at the targeted cell membrane lead to recovery of over 90% of the TNF signaling activity. In this project we propose a lead optimization program to further improve the lead AcTafactors identified in the context of the ERC Advanced Grant project and to evaluate the resulting molecules for their ability to target the tumor (neo)vasculature in clinically relevant murine tumor models. The pre-clinical proof-of-concept we aim for represents a first step towards clinical development and ultimately potential market approval of an effective AcTafactor anti-cancer therapy.
Max ERC Funding
149 320 €
Duration
Start date: 2015-11-01, End date: 2017-04-30
Project acronym ActiveWindFarms
Project Active Wind Farms: Optimization and Control of Atmospheric Energy Extraction in Gigawatt Wind Farms
Researcher (PI) Johan Meyers
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Starting Grant (StG), PE8, ERC-2012-StG_20111012
Summary With the recognition that wind energy will become an important contributor to the world’s energy portfolio, several wind farms with a capacity of over 1 gigawatt are in the planning phase. In the past, engineering of wind farms focused on a bottom-up approach, in which atmospheric wind availability was considered to be fixed by climate and weather. However, farms of gigawatt size slow down the Atmospheric Boundary Layer (ABL) as a whole, reducing the availability of wind at turbine hub height. In Denmark’s large offshore farms, this leads to underperformance of turbines which can reach levels of 40%–50% compared to the same turbine in a lone-standing case. For large wind farms, the vertical structure and turbulence physics of the flow in the ABL become crucial ingredients in their design and operation. This introduces a new set of scientific challenges related to the design and control of large wind farms. The major ambition of the present research proposal is to employ optimal control techniques to control the interaction between large wind farms and the ABL, and optimize overall farm-power extraction. Individual turbines are used as flow actuators by dynamically pitching their blades on time scales ranging from 10 to 500 seconds. The application of such control efforts to the atmospheric boundary layer has never been attempted before, and introduces flow control on a physical scale which is currently unprecedented. The PI possesses a unique combination of expertise and tools enabling these developments: efficient parallel large-eddy simulations of wind farms, multi-scale turbine modeling, and gradient-based optimization in large optimization-parameter spaces using adjoint formulations. To ensure a maximum impact on the wind-engineering field, the project aims at optimal control, experimental wind-tunnel validation, and at including multi-disciplinary aspects related to structural mechanics, power quality, and controller design.
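A hedged toy version of the farm-level optimization idea described above: with a crude one-dimensional wake model (a made-up stand-in for the proposal's large-eddy simulations and adjoint gradients), the axial induction of each turbine in a row is tuned by finite-difference gradient ascent to maximize total farm power. All constants and the wake model are illustrative assumptions.

```python
# Hedged toy model: a single row of turbines where each wake reduces the wind seen
# downstream (crude, made-up decay law). Axial induction factors a_i are tuned by
# finite-difference gradient ascent to maximize total power. This is a stand-in for
# the LES + adjoint optimization described above, not an implementation of it.
import numpy as np

def farm_power(a, u0=10.0, wake_loss=0.5):
    """Total power of a row of turbines with simple cumulative wake losses."""
    u, total = u0, 0.0
    for ai in a:
        total += 4 * ai * (1 - ai) ** 2 * u ** 3   # actuator-disc power (arbitrary units)
        u *= (1 - wake_loss * 2 * ai)              # velocity deficit passed downstream
    return total

a = np.full(5, 1 / 3)            # start from the single-turbine (Betz) optimum
eps, lr = 1e-6, 1e-5
for step in range(2000):
    g = np.array([(farm_power(a + eps * np.eye(5)[i]) - farm_power(a)) / eps for i in range(5)])
    a = np.clip(a + lr * g, 0.0, 0.5)
print("optimized inductions:", np.round(a, 3), " farm power:", round(farm_power(a), 1))
```

In this toy setting the upstream turbines end up with reduced induction so that more kinetic energy is left for the turbines behind them, which is the qualitative effect farm-level control aims to exploit.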
Max ERC Funding
1 499 241 €
Duration
Start date: 2012-10-01, End date: 2017-09-30
Project acronym AD-VIP
Project Alzheimer’s disease and AAV9: Use of a virus-based delivery system for vectored immunoprophylaxis in dementia.
Researcher (PI) MATTHEW GUY HOLT
Host Institution (HI) VIB
Call Details Proof of Concept (PoC), PC1, ERC-2015-PoC
Summary Alzheimer’s disease (AD) is the most common form of dementia in the Western World, representing an economic and social cost of billions of euros a year. Given the changing demographics of society, these costs will only increase over the coming decades.
Amyloid plaques, composed of amyloid beta peptide (Abeta), are a defining characteristic of AD. Evidence now suggests that Abeta is central to disease pathogenesis due to its toxicity, which leads to cell loss and eventual cognitive decline. Abeta is generated by proteolytic cleavage of amyloid precursor protein, a process that involves the protein BACE1.
Knock-down of BACE1 is sufficient to prevent amyloid pathology and cognitive deficits in transgenic mouse models of AD, so BACE1 is an attractive target for therapeutic intervention. Although many small-molecule inhibitors of BACE1 have been developed, most suffer from imperfect selectivity, posing a substantial risk of off-target toxicity in vivo. In contrast, antibody-based therapeutics provide an attractive alternative given their excellent molecular selectivity. However, the success of antibody therapies in AD is hampered by the blood-brain barrier, which limits antibody entry into the brain from the systemic circulation.
Recent studies have shown that adeno-associated virus serotype 9 (AAV9) effectively crosses the blood-brain barrier. Here, we propose to evaluate the use of AAV9 as a delivery system for a highly specific and potent inhibitory nanobody targeted against BACE1 as a treatment for AD.
Max ERC Funding
150 000 €
Duration
Start date: 2016-12-01, End date: 2018-05-31
Project acronym ADAPTEM
Project Adaptive transmission electron microscopy: development of a programmable phase plate
Researcher (PI) Johan VERBEECK
Host Institution (HI) UNIVERSITEIT ANTWERPEN
Call Details Proof of Concept (PoC), ERC-2017-PoC
Summary Adaptive optics, the technology to dynamically program the phase of optical waves, has sparked an avalanche of scientific discoveries and innovations in light optics applications. Nowadays, the phase of optical waves can be dynamically programmed, enabling research on exotic optical beams and unprecedented control over the performance of optical instruments. Although electron waves share many similarities with their optical counterparts, a generic programmable phase plate for electrons is still missing. This project aims at developing a prototype of a programmable electrostatic phase plate that allows the user to freely change the phase of electron waves, and at demonstrating it to potential licensees for further upscaling and introduction to the market. The target of this POC project is the realization of a tunable, easy-to-use 5x5-pixel prototype that will demonstrate the potential of adaptive optics in electron microscopy. Its realization will be based on lithographic technology to allow for future upscaling. It is expected that such a phase plate can dramatically increase the information obtained at a given electron dose, limiting the detrimental effects of beam damage that currently hinders the use of electron microscopy in e.g. life sciences. As such, it has the potential to disrupt the electron microscopy market with novel applications while at the same time reducing cost and complexity and increasing the potential for fully automated instruments.
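As a hedged numerical illustration of what a 5x5 programmable phase plate does to a wave, the sketch below imprints an arbitrary 5x5 phase pattern onto an incoming plane wave and computes the resulting far-field intensity with a Fourier transform. The grid size, phase values and sampling are illustrative choices, not device parameters.

```python
# Hedged illustration: imprint a programmable 5x5 phase pattern on a plane wave and
# look at the far-field diffraction pattern (Fraunhofer approximation via FFT).
# Pattern values and sampling are arbitrary illustrative choices.
import numpy as np

N = 256                                  # simulation grid (samples per side)
phase_pixels = np.random.default_rng(0).uniform(0, 2 * np.pi, size=(5, 5))

# Upsample the 5x5 pixel pattern to the simulation grid (nearest-neighbour blocks).
reps = N // 5
phase = np.kron(phase_pixels, np.ones((reps, reps)))
phase = np.pad(phase, ((0, N - phase.shape[0]), (0, N - phase.shape[1])))

wave_in = np.ones((N, N), dtype=complex)          # incoming plane wave
wave_out = wave_in * np.exp(1j * phase)           # phase-plate modulation
far_field = np.fft.fftshift(np.fft.fft2(wave_out))
intensity = np.abs(far_field) ** 2
print("peak / total far-field intensity:", intensity.max() / intensity.sum())
```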
Max ERC Funding
148 500 €
Duration
Start date: 2018-03-01, End date: 2019-08-31
Project acronym ADHESWITCHES
Project Adhesion switches in cancer and development: from in vivo to synthetic biology
Researcher (PI) Mari Johanna Ivaska
Host Institution (HI) TURUN YLIOPISTO
Call Details Consolidator Grant (CoG), LS3, ERC-2013-CoG
Summary Integrins are transmembrane cell adhesion receptors controlling cell proliferation and migration. Our objective is to gain fundamentally novel mechanistic insight into the emerging new roles of integrins in cancer and to generate a road map of integrin-dependent pathways critical in mammary gland development and integrin signalling, thus opening new targets for therapeutic interventions. We will combine an in vivo-based translational approach with cell and molecular biological studies aiming to identify entirely novel concepts in integrin function, using cutting-edge techniques and synthetic-biology tools.
The specific objectives are:
1) Integrin inactivation in branching morphogenesis and cancer invasion. Integrins regulate mammary gland development and cancer invasion, but the role of integrin-inactivating proteins in these processes is currently completely unknown. We will investigate this using genetically modified mice, ex vivo organoid models and human tissues, with the aim of identifying beneficial combination treatments against cancer invasion.
2) Endosomal adhesomes – cross-talk between integrin activity and integrin “inside-in signaling”. We hypothesize that endocytosed active integrins engage in specialized endosomal signaling that governs cell survival especially in cancer. RNAi cell arrays, super-resolution STED imaging and endosomal proteomics will be used to investigate integrin signaling in endosomes.
3) Spatio-temporal co-ordination of adhesion and endocytosis. Several cytosolic proteins compete for integrin binding to regulate activation, endocytosis and recycling. Photoactivatable protein-traps and predefined matrix micropatterns will be employed to mechanistically dissect the spatio-temporal dynamics and hierarchy of their recruitment.
We will employ innovative and unconventional techniques to address three major unanswered questions in the field and significantly advance our understanding of integrin function in development and cancer.
Max ERC Funding
1 887 910 €
Duration
Start date: 2014-05-01, End date: 2019-04-30
Project acronym AEROSOL
Project Astrochemistry of old stars: direct probing of unique chemical laboratories
Researcher (PI) Leen Katrien Els Decin
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Consolidator Grant (CoG), PE9, ERC-2014-CoG
Summary The gas and dust in the interstellar medium (ISM) drive the chemical evolution of galaxies, the formation of stars and planets, and the synthesis of complex prebiotic molecules. The prime birth places for this interstellar material are the winds of evolved (super)giant stars. These winds are unique chemical laboratories, in which a large variety of gas and dust species radially expand away from the star.
Recent progress on the observations of these winds has been impressive thanks to Herschel and ALMA. The next challenge is to unravel the wealth of chemical information contained in these data. This is an ambitious task since (1) a plethora of physical and chemical processes interact in a complex way, (2) laboratory data to interpret these interactions are lacking, and (3) theoretical tools to analyse the data do not meet current needs.
To boost the knowledge of the physics and chemistry characterizing these winds, I propose a world-leading multi-disciplinary project combining (1) high-quality data, (2) novel theoretical wind models, and (3) targeted laboratory experiments. The aim is to pinpoint the dominant chemical pathways, unravel the transition from gas-phase to dust species, elucidate the role of clumps on the overall wind structure, and study the reciprocal effect between various dynamical and chemical phenomena.
Now is the right time for this ambitious project thanks to the availability of (1) high-quality multi-wavelength data, including ALMA and Herschel data of the PI, (2) supercomputers enabling a homogeneous analysis of the data using sophisticated theoretical wind models, and (3) novel laboratory equipment to measure the gas-phase reaction rates of key species.
This project will have far-reaching impact on (1) the field of evolved stars, (2) the understanding of the chemical lifecycle of the ISM, (3) chemical studies of dynamically more complex systems, such as exoplanets, protostars, supernovae etc., and (4) it will guide new instrument development.
Max ERC Funding
2 605 897 €
Duration
Start date: 2016-01-01, End date: 2020-12-31
Project acronym AEROSPACEPHYS
Project Multiphysics models and simulations for reacting and plasma flows applied to the space exploration program
Researcher (PI) Thierry Edouard Bertrand Magin
Host Institution (HI) INSTITUT VON KARMAN DE DYNAMIQUE DES FLUIDES
Call Details Starting Grant (StG), PE8, ERC-2010-StG_20091028
Summary Space exploration is one of the boldest and most exciting endeavors that humanity has undertaken, and it holds enormous promise for the future. Our next challenges for the conquest of space include bringing back samples to Earth by means of robotic missions and continuing the manned exploration program, which aims at sending human beings to Mars and bringing them home safely. Inaccurate prediction of the heat flux to the surface of the spacecraft heat shield can be fatal for the crew or the success of a robotic mission. This quantity is estimated during the design phase. An accurate prediction is a particularly complex task, as it requires modelling of the following phenomena, which are potential “mission killers:” 1) Radiation of the plasma in the shock layer, 2) Complex surface chemistry on the thermal protection material, 3) Flow transition from laminar to turbulent. Our poor understanding of the coupled mechanisms of radiation, ablation, and transition leads to difficulties in flux prediction. To avoid failure and ensure the safety of the astronauts and payload, engineers resort to “safety factors” to determine the thickness of the heat shield, at the expense of the mass of embarked payload. Thinking out of the box and basic research are thus necessary for advancing the models that will better define the environment and requirements for the design and safe operation of tomorrow’s space vehicles and planetary probes for manned space exploration. The three basic ingredients for predictive science are: 1) Physico-chemical models, 2) Computational methods, 3) Experimental data. We propose to follow a complementary approach for prediction. The proposed research aims at integrating new advanced physico-chemical models and computational methods, based on a multidisciplinary approach developed together with physicists, chemists, and applied mathematicians, to create a top-notch multiphysics and multiscale numerical platform for simulations of planetary atmosphere entries, crucial to the new challenges of the manned space exploration program. Experimental data will also be used for validation, following state-of-the-art uncertainty quantification methods.
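To illustrate the uncertainty-quantification idea mentioned at the end of the summary, the sketch below propagates uncertain entry conditions through a simple Sutton-Graves-type stagnation-point heating correlation by Monte Carlo sampling. The correlation constant, input distributions and nose radius are rough textbook-style assumptions, and units are deliberately glossed over; only the relative uncertainty is reported. This is not the proposal's multiphysics platform.

```python
# Hedged illustration of Monte Carlo uncertainty quantification for entry heating:
# propagate uncertain freestream conditions through a simple stagnation-point
# correlation q ~ k * sqrt(rho / R_n) * V^3. Constant, inputs and units are
# illustrative assumptions; only relative spread is meaningful here.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
k_sg = 1.74e-4                                  # Sutton-Graves-type constant (units glossed over)
rho = rng.normal(3e-4, 3e-5, n)                 # freestream density [kg/m^3], uncertain
V = rng.normal(7500.0, 150.0, n)                # entry velocity [m/s], uncertain
Rn = 1.0                                        # nose radius [m], assumed fixed

q = k_sg * np.sqrt(rho / Rn) * V ** 3           # stagnation-point heat flux (arbitrary units)
print(f"relative 1-sigma uncertainty on heat flux: {q.std() / q.mean():.1%}")
print(f"95th percentile / mean: {np.percentile(q, 95) / q.mean():.2f}")
```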
Max ERC Funding
1 494 892 €
Duration
Start date: 2010-09-01, End date: 2015-08-31
Project acronym AfricanWomen
Project Women in Africa
Researcher (PI) catherine GUIRKINGER
Host Institution (HI) UNIVERSITE DE NAMUR ASBL
Call Details Starting Grant (StG), SH1, ERC-2017-STG
Summary Rates of domestic violence and the relative risk of premature death for women are higher in sub-Saharan Africa than in any other region. Yet we know remarkably little about the economic forces, incentives and constraints that drive discrimination against women in this region, making it hard to identify policy levers to address the problem. This project will help fill this gap.
I will investigate gender discrimination from two complementary perspectives. First, through the lens of economic history, I will investigate the forces driving trends in women’s relative well-being since slavery. To quantify the evolution of well-being of sub-Saharan women relative to men, I will use three types of historical data: anthropometric indicators (relative height), vital statistics (to compute numbers of missing women), and outcomes of formal and informal family law disputes. I will then investigate how major economic developments and changes in family laws differentially affected women’s welfare across ethnic groups with different norms on women’s roles and rights.
Second, using intra-household economic models, I will provide new insights into domestic violence and gender bias in access to crucial resources in present-day Africa. I will develop a new household model that incorporates gender identity and endogenous outside options to explore the relationship between women’s empowerment and the use of violence. Using the notion of strategic delegation, I will propose a new rationale for the separation of budgets often observed in African households and generate predictions of how improvements in women’s outside options affect welfare. Finally, with first-hand data, I will investigate intra-household differences in nutrition and work effort in times of food shortage from the points of view of efficiency and equity. I will use activity trackers as an innovative means of collecting high-quality data on work effort and thus overcome data limitations restricting the existing literature.
Max ERC Funding
1 499 313 €
Duration
Start date: 2018-08-01, End date: 2023-07-31
Project acronym AFRIVAL
Project African river basins: catchment-scale carbon fluxes and transformations
Researcher (PI) Steven Bouillon
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Starting Grant (StG), PE10, ERC-2009-StG
Summary This proposal aims to fundamentally improve our understanding of the role of tropical freshwater ecosystems in carbon (C) cycling on the catchment scale. It uses an unprecedented combination of state-of-the-art proxies such as stable isotope, 14C and biomarker signatures to characterize organic matter, radiogenic isotope signatures to determine particle residence times, as well as field measurements of relevant biogeochemical processes. We focus on tropical systems since there is a striking lack of data on such systems, even though riverine C transport is thought to be disproportionately high in tropical areas. Furthermore, the presence of landscape-scale contrasts in vegetation (in particular, C3 vs. C4 plants) is an important asset in the use of stable isotopes as natural tracers of C cycling processes on this scale. Freshwater ecosystems are an important component in the global C cycle, and the primary link between terrestrial and marine ecosystems. Recent estimates indicate that ~2 Pg C y-1 (Pg=Petagram) enter freshwater systems, i.e., about twice the estimated global terrestrial C sink. More than half of this is thought to be remineralized before it reaches the coastal zone, and for the Amazon basin this has even been suggested to be ~90% of the lateral C inputs. The question of how general these patterns are is a matter of debate, and assessing the mechanisms determining the degree of processing versus transport of organic carbon in lakes and river systems is critical to further constrain their role in the global C cycle. This proposal provides an interdisciplinary approach to describe and quantify catchment-scale C transport and cycling in tropical river basins. Besides conceptual and methodological advances, and a significant expansion of our dataset on C processes in such systems, new data gathered in this project are likely to provide exciting and novel hypotheses on the functioning of freshwater systems and their linkage to the terrestrial C budget.
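The C3-versus-C4 vegetation contrast mentioned above is typically exploited with a two-endmember stable-isotope mixing model. The sketch below shows the standard calculation with commonly cited approximate δ13C endmember values; the endmembers and sample signatures are illustrative, not the project's calibration.

```python
# Hedged sketch: two-endmember d13C mixing model to estimate the C4-plant fraction of
# riverine organic carbon. Endmember values are commonly cited approximations
# (roughly -27 permil for C3 and -12 permil for C4 vegetation), used here for illustration.
def c4_fraction(d13c_sample, d13c_c3=-27.0, d13c_c4=-12.0):
    """Fraction of organic carbon derived from C4 vegetation (simple linear mixing)."""
    f = (d13c_sample - d13c_c3) / (d13c_c4 - d13c_c3)
    return min(max(f, 0.0), 1.0)   # clamp to the physically meaningful range

for d13c in (-26.0, -20.0, -14.0):   # hypothetical riverine organic-carbon signatures
    print(f"d13C = {d13c:5.1f} permil  ->  C4 fraction ~ {c4_fraction(d13c):.2f}")
```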
Max ERC Funding
1 745 262 €
Duration
Start date: 2009-10-01, End date: 2014-09-30
Project acronym Age Asymmetry
Project Age-Selective Segregation of Organelles
Researcher (PI) Pekka Aleksi Katajisto
Host Institution (HI) HELSINGIN YLIOPISTO
Call Details Starting Grant (StG), LS3, ERC-2015-STG
Summary Our tissues are constantly renewed by stem cells. Over time, stem cells accumulate cellular damage that compromises renewal and results in aging. As stem cells can divide asymmetrically, segregation of harmful factors to the differentiating daughter cell could be one possible mechanism for slowing damage accumulation in the stem cell. However, current evidence for such mechanisms comes mainly from analogous findings in yeast, and studies have concentrated on only a few types of cellular damage.
I hypothesize that the chronological age of a subcellular component is a proxy for all the damage it has sustained. In order to secure regeneration, mammalian stem cells may therefore specifically sort old cellular material asymmetrically. To study this, I have developed a novel strategy and tools to address the age-selective segregation of any protein in stem cell division. Using this approach, I have already discovered that stem-like cells of the human mammary epithelium indeed apportion chronologically old mitochondria asymmetrically in cell division, and enrich old mitochondria to the differentiating daughter cell. We will investigate the mechanisms underlying this novel phenomenon, and its relevance for mammalian aging.
We will first identify how old and young mitochondria differ, and how stem cells recognize them to facilitate the asymmetric segregation. Next, we will analyze the extent of asymmetric age-selective segregation by targeting several other subcellular compartments in a stem cell division. Finally, we will determine whether the discovered age-selective segregation is a general property of stem cells in vivo, and assess its functional relevance for the maintenance of stem cells and tissue regeneration. Our discoveries may open new possibilities to target aging-associated functional decline by induction of asymmetric age-selective organelle segregation.
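To illustrate how such age-selective segregation could be quantified from imaging data, the sketch below computes a simple asymmetry index between two daughter cells from hypothetical fluorescence intensities of an "old-organelle" label. The intensity values, label names and the 0.5 reference are illustrative assumptions, not the project's actual analysis pipeline.

```python
# Hedged sketch: quantify asymmetric apportioning of "old" organelles between two
# daughter cells from hypothetical fluorescence measurements. The intensities and
# the 0.5 symmetric reference value are illustrative only.
def asymmetry_index(old_signal_daughter_a, old_signal_daughter_b):
    """Fraction of the old-organelle signal in daughter A; 0.5 means a symmetric division."""
    total = old_signal_daughter_a + old_signal_daughter_b
    return old_signal_daughter_a / total if total > 0 else 0.5

divisions = [(120.0, 340.0), (150.0, 310.0), (230.0, 240.0)]   # hypothetical intensity pairs
for a, b in divisions:
    idx = asymmetry_index(a, b)
    print(f"daughter A share of old mitochondria: {idx:.2f}  "
          f"({'asymmetric' if abs(idx - 0.5) > 0.1 else 'roughly symmetric'})")
```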
Max ERC Funding
1 500 000 €
Duration
Start date: 2016-05-01, End date: 2021-04-30
Project acronym AGNES
Project ACTIVE AGEING – RESILIENCE AND EXTERNAL SUPPORT AS MODIFIERS OF THE DISABLEMENT OUTCOME
Researcher (PI) Taina Tuulikki RANTANEN
Host Institution (HI) JYVASKYLAN YLIOPISTO
Call Details Advanced Grant (AdG), SH3, ERC-2015-AdG
Summary The goals are: 1. To develop a scale assessing the diversity of active ageing with four dimensions, which are ability (what people can do), activity (what people actually do), ambition (what valued activities people want to do), and autonomy (how satisfied people are with the opportunity to do valued activities); 2. To examine health and physical and psychological functioning as the determinants, and social and built environment, resilience and personal skills as modifiers, of active ageing; 3. To develop a multicomponent sustainable intervention aiming to promote active ageing (methods: counselling, information technology, help from volunteers); 4. To test the feasibility and effectiveness of the intervention; and 5. To study cohort effects on the phenotypes on the pathway to active ageing.
“If You Can Measure It, You Can Change It.” Active ageing assessment needs conceptual progress, which I propose to make. A quantifiable scale will be developed that captures the diversity of active ageing, stemming from the WHO definition of active ageing as the process of optimizing opportunities for health and participation in society for all people in line with their needs, goals and capacities as they age. I will collect cross-sectional data (N=1000, ages 75, 80 and 85 years) and model the pathway to active ageing with state-of-the-art statistical methods. By doing this I will create novel knowledge on the preconditions for active ageing. The collected cohort data will be compared to pre-existing cohort data collected 25 years ago to obtain knowledge about changes over time in the functioning of older people. A randomized controlled trial (N=200) will be conducted to assess the effectiveness of the envisioned intervention promoting active ageing through participation. The project will regenerate ageing research by launching a novel scale, by training young scientists, by creating new concepts and developing theory, and by producing evidence for active ageing promotion.
Max ERC Funding
2 044 364 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym AI-CU
Project Automated Improvement of Continuous User interfaces
Researcher (PI) BART GERBEN DE BOER
Host Institution (HI) VRIJE UNIVERSITEIT BRUSSEL
Call Details Proof of Concept (PoC), ERC-2017-PoC
Summary We propose to develop two tools for creating, in a systematic way, better user interfaces based on continuous, non-symbolic actions, such as swipes on a touch screen, 3-D motions with a hand-held device, or breath patterns in a user interface for otherwise paralyzed patients. The tools are based on two experimental/computational techniques developed in the ABACUS project: iterated learning and social coordination.
In iterated learning, sets of signals produced by one user are learned and reproduced by another user. The reproductions are then in turn learned by the next user. In the ABACUS project, it has been shown that this results in more learnable sets of signals. We propose to show how this can be applied to creating learnable and usable signals in a systematic way when designing a user interface for a device that allows continuous actions.
In social coordination, it has been shown that signals become simplified and more abstract when people communicate over an extended period of time. The ABACUS project has developed techniques to detect and quantify this. We propose to show how these can be used for a user interface that adapts to its user. This will allow novice users to use more extended and therefore more learnable versions of actions, while the system adapts when users become more adept at using the interface and reduce their actions. Because the system is adaptive, the user is not constrained in how they do this.
Concretely, we propose to implement these two tools, investigate how they can be used optimally, and advertise them to interested companies, starting with those with which we already have contact and extending our network at the start of the project through business case development. In order to disseminate the results, we propose to involve a user committee and organize one or more workshops.
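To make the iterated-learning idea above concrete, here is a minimal Python sketch that simulates a chain of users who each imperfectly learn and reproduce a set of continuous gesture signals. The smoothing-based "learner", the complexity measure, and all names are illustrative assumptions; they are not taken from the ABACUS project or the proposed AI-CU tools.

```python
# Minimal sketch of iterated learning over continuous gesture signals.
# The smoothing "learner" and the complexity proxy are illustrative
# assumptions, not the ABACUS/AI-CU implementation.
import numpy as np

def reproduce(signal, noise=0.05, window=9):
    """One user's imperfect learning and reproduction of a gesture:
    the trajectory is smoothed (limited memory/motor fidelity) and
    perturbed with a little noise."""
    kernel = np.ones(window) / window
    smoothed = np.convolve(signal, kernel, mode="same")
    return smoothed + np.random.normal(0.0, noise, size=signal.shape)

def complexity(signal):
    """Crude learnability proxy: total variation of the trajectory."""
    return float(np.sum(np.abs(np.diff(signal))))

np.random.seed(0)
signals = [np.random.normal(0.0, 1.0, 200) for _ in range(3)]  # initial random "gestures"

for generation in range(10):  # each generation is a new user in the chain
    signals = [reproduce(s) for s in signals]
    mean_complexity = np.mean([complexity(s) for s in signals])
    print(f"generation {generation}: mean complexity {mean_complexity:.2f}")
# Complexity typically drops over generations: the signal set becomes
# smoother and hence easier to learn, mirroring the iterated-learning effect.
```

In a real interface, the reproduction step would be an actual user imitating the previous user's gestures, rather than a smoothing operation standing in for imperfect learning.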
Max ERC Funding
150 000 €
Duration
Start date: 2018-06-01, End date: 2019-11-30
Project acronym AIDA
Project Architectural design In Dialogue with dis-Ability Theoretical and methodological exploration of a multi-sensorial design approach in architecture
Researcher (PI) Ann Heylighen
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Starting Grant (StG), SH2, ERC-2007-StG
Summary This research project is based on the notion that, because of their specific interaction with space, people with particular dis-abilities are able to appreciate spatial qualities or detect misfits in the environment that most architects—or other designers—are not even aware of. This notion holds for sensory dis-abilities such as blindness or visual impairment, but also for mental dis-abilities like autism or Alzheimer’s dementia. The experiences and subsequent insights of these dis-abled people, so it is argued, represent a considerable knowledge resource that would complement and enrich the professional expertise of architects and designers in general. This argument forms the basis for a methodological and theoretical exploration of a multi-sensorial design approach in architecture. On the one hand, a series of retrospective case studies will be conducted to identify and describe the motives and elements that trigger or stimulate architects’ attention to the multi-sensorial spatial experiences of people with dis-abilities when designing spaces. On the other hand, the research project will investigate experimentally in real time to what extent design processes and products in architecture can be enriched by establishing a dialogue between the multi-sensorial ‘knowing-in-action’ of people with dis-abilities and the expertise of professional architects/designers. In this way, the research project aims to develop a more profound understanding of how the concept of Design for All can be realised in architectural practice. At least as important, however, is its contribution to innovation in architecture tout court. The research results are expected to give a powerful impulse to quality improvement of the built environment by stimulating and supporting the development of innovative design concepts.
Max ERC Funding
1 195 385 €
Duration
Start date: 2008-05-01, End date: 2013-10-31
Project acronym ALEM
Project ADDITIONAL LOSSES IN ELECTRICAL MACHINES
Researcher (PI) Matti Antero Arkkio
Host Institution (HI) AALTO KORKEAKOULUSAATIO SR
Call Details Advanced Grant (AdG), PE8, ERC-2013-ADG
Summary "Electrical motors consume about 40 % of the electrical energy produced in the European Union. About 90 % of this energy is converted to mechanical work. However, 0.5-2.5 % of it goes to so called additional load losses whose exact origins are unknown. Our ambitious aim is to reveal the origins of these losses, build up numerical tools for modeling them and optimize electrical motors to minimize the losses.
Our research hypothesis is that the additional losses mainly result from deterioration of the core materials during the manufacturing of the machine. By calorimetric measurements, we have found that the core losses of electrical machines may be twice as large as comprehensive loss models predict. The electrical steel sheets are punched, welded together and shrink-fitted to the frame. This causes residual strains in the core sheets, deteriorating their magnetic characteristics. The cutting burrs make galvanic contacts between the sheets and form paths for inter-lamination currents. Another potential source of additional losses is the circulating currents between the parallel strands of random-wound armature windings. The stochastic nature of these potential loss sources adds a further challenge to the research.
We shall develop a physical loss model that couples the mechanical strains and electromagnetic losses in electrical steel sheets, and apply the new model to comprehensive loss analysis of electrical machines. The stochastic variables related to the core losses and circulating-current losses will be discretized together with the temporal and spatial discretization of the electromechanical field variables. The numerical stochastic loss model will be used to search for machine constructions that are insensitive to manufacturing defects. We shall validate the new numerical loss models by electromechanical and calorimetric measurements."
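As a purely illustrative sketch of where manufacturing-induced deterioration could enter a loss model of this kind, the snippet below adds a hypothetical strain-dependent penalty to a classical Bertotti-style loss decomposition. The coefficients and the degradation law are assumptions made for illustration; they are not the project's model.

```python
# Illustrative sketch only: a Bertotti-style iron-loss decomposition with a
# hypothetical strain-dependent degradation factor, showing where
# manufacturing-induced deterioration could enter such a model.
def core_loss(f_hz, b_peak, strain=0.0,
              k_h=0.02, k_c=5e-5, k_e=5e-4, k_strain=50.0):
    """Specific core loss [W/kg] at frequency f_hz [Hz] and peak flux density
    b_peak [T]. 'strain' is a residual plastic strain; deterioration is
    modelled here as a multiplicative penalty on the hysteresis and excess
    terms (an assumption, not a measured law)."""
    deterioration = 1.0 + k_strain * strain            # hypothetical penalty
    p_hyst = k_h * f_hz * b_peak ** 2 * deterioration  # hysteresis loss
    p_eddy = k_c * (f_hz * b_peak) ** 2                # classical eddy-current loss
    p_excess = k_e * (f_hz * b_peak) ** 1.5 * deterioration  # excess loss
    return p_hyst + p_eddy + p_excess

# Example: a 50 Hz, 1.5 T operating point with and without cutting-induced strain
print(core_loss(50.0, 1.5))               # undamaged sheet
print(core_loss(50.0, 1.5, strain=0.01))  # strained region near a cut edge
```

In the project itself, the loss model is to couple full mechanical strain and electromagnetic field solutions, together with stochastic manufacturing variables, rather than a scalar penalty of this kind.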
Max ERC Funding
2 489 949 €
Duration
Start date: 2014-03-01, End date: 2019-02-28
Project acronym ALGOA
Project Novel algorithm for treatment planning of patients with osteoarthritis
Researcher (PI) Rami Kristian KORHONEN
Host Institution (HI) ITA-SUOMEN YLIOPISTO
Call Details Proof of Concept (PoC), PC1, ERC-2016-PoC
Summary Osteoarthritis (OA) is a common joint disease affecting over 40 million Europeans. The most common consequences of OA are pain, disability and social isolation. Alarmingly, the number of patients will increase by 50% in developed countries during the next 20 years. Moreover, the economic costs of OA are considerable: 1) direct healthcare costs (hospital admissions, medical examinations, drug therapy, etc.) and 2) productivity costs due to reduced performance at work and absence from work have together been estimated at between 1% and 2.5% of gross domestic product (GDP) in Western countries.
We have developed an algorithm that can predict the progression of OA in overweight subjects, while correctly predicting that healthy subjects do not develop OA. When employed in clinical use, preventive and personalised treatments can be started before clinically significant symptoms are observed. This marks a major breakthrough in improving the quality of life of OA patients and of patients prone to OA. Our discovery will directly lead to longer working careers and less absence from work, and will subsequently result in increased productivity. Moreover, patients are expected to live longer due to reduced disability and social isolation.
Moreover, the discovery provides long-term economic relief for the healthcare system, which is burdened by an increasing geriatric population and a stringent economic environment. As an example, by using our tool to eliminate 25% of the medical examinations performed annually due to overweight or obesity in Finland (150,000 patients), we estimate that annual direct costs would decrease by 140 M€ and indirect costs by 185 M€.
In the PoC project we will carry out a technical proof-of-concept and perform pre-commercialisation actions to shorten the time to market. The ultimate goal after the project is to develop our innovation into a software product that aids OA diagnostics in hospitals and has commercialisation potential among medical device companies.
Max ERC Funding
150 000 €
Duration
Start date: 2018-01-01, End date: 2019-06-30
Project acronym ALGOCom
Project Novel Algorithmic Techniques through the Lens of Combinatorics
Researcher (PI) Parinya Chalermsook
Host Institution (HI) AALTO KORKEAKOULUSAATIO SR
Call Details Starting Grant (StG), PE6, ERC-2017-STG
Summary Real-world optimization problems pose major challenges to algorithmic research. For instance, (i) many important problems are believed to be intractable (i.e. NP-hard), and (ii) with the growth of data size, modern applications often require decision making under {\em incomplete and dynamically changing input data}. After several decades of research, central problems in these domains remain poorly understood (e.g. is there an asymptotically most efficient binary search tree?). Existing algorithmic techniques either reach their limitations or are inherently tailored to special cases.
This project attempts to bridge this gap in the state of the art and seeks new interplay across multiple areas of algorithms, such as approximation algorithms, online algorithms, fixed-parameter tractable (FPT) algorithms, exponential time algorithms, and data structures. We propose new directions from the {\em structural perspective} that connect the aforementioned algorithmic problems to basic questions in combinatorics.
Our approaches fall into one of three broad schemes: (i) new structural theory, (ii) intermediate problems, and (iii) transfer of techniques. These directions partially build on the PI's successes in resolving more than ten classical problems in this context.
Resolving the proposed problems will likely revolutionize our understanding of algorithms and data structures and potentially unify techniques across multiple algorithmic regimes. Any progress would, in fact, already be a significant contribution to the algorithms community. We also suggest concrete intermediate goals that are of independent interest and carry lower risk, making them suitable for Ph.D. students.
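As background for the binary-search-tree question cited above, the sketch below implements a toy self-adjusting BST that rotates each accessed key toward the root. Splay trees, which refine this move-to-root heuristic with double rotations, are the standard candidate for an "asymptotically most efficient" BST (the dynamic optimality conjecture); the code is an illustrative assumption and is not part of the project.

```python
# Toy self-adjusting BST: each accessed key is promoted to the root by
# single rotations along the search path ("move-to-root" heuristic).
from dataclasses import dataclass
from typing import Optional

@dataclass
class Node:
    key: int
    left: "Optional[Node]" = None
    right: "Optional[Node]" = None

def access(root: Optional[Node], key: int) -> Optional[Node]:
    """Search for 'key' and rotate it toward the root along the search path."""
    if root is None or root.key == key:
        return root
    if key < root.key:
        root.left = access(root.left, key)
        if root.left is not None and root.left.key == key:
            # right rotation: promote the accessed left child one level
            child, root.left = root.left, root.left.right
            child.right = root
            return child
    else:
        root.right = access(root.right, key)
        if root.right is not None and root.right.key == key:
            # left rotation: promote the accessed right child one level
            child, root.right = root.right, root.right.left
            child.left = root
            return child
    return root

def insert(root: Optional[Node], key: int) -> Node:
    """Standard (non-balancing) BST insertion."""
    if root is None:
        return Node(key)
    if key < root.key:
        root.left = insert(root.left, key)
    elif key > root.key:
        root.right = insert(root.right, key)
    return root

root = None
for k in [5, 2, 8, 1, 3, 7, 9]:
    root = insert(root, k)
root = access(root, 3)
print(root.key)  # 3 has been rotated all the way to the root
```

Whether any such online BST matches the offline optimum up to a constant factor remains open; that is the flavour of long-standing question behind the example mentioned in the summary.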
Max ERC Funding
1 411 258 €
Duration
Start date: 2018-02-01, End date: 2023-01-31