Project acronym 4D-EEG
Project 4D-EEG: A new tool to investigate the spatial and temporal activity patterns in the brain
Researcher (PI) Franciscus C.T. Van Der Helm
Host Institution (HI) TECHNISCHE UNIVERSITEIT DELFT
Call Details Advanced Grant (AdG), PE7, ERC-2011-ADG_20110209
Summary Our first goal is to develop a new tool to determine brain activity with high temporal (< 1 msec) and spatial (about 2 mm) resolution, with a focus on motor control. High-density EEG (up to 256 electrodes) will be used for EEG source localization. Advanced force-controlled robot manipulators will be used to impose continuous force perturbations on the joints. Advanced closed-loop system identification algorithms will identify the dynamic EEG response of multiple brain areas to the perturbation, leading to a functional interpretation of the EEG. The propagation of the signal in time and 3D space through the cortex can be monitored: 4D-EEG. Preliminary experiments with EEG source localization have shown that continuous force perturbations result in a better signal-to-noise ratio and coherence than the current method using transient perturbations.
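A minimal sketch, for illustration only, of the kind of coherence estimate the summary refers to: it correlates a continuous, band-limited force perturbation with a simulated response channel. The sample rate, filter constants and toy signal model below are assumptions, not the project's actual pipeline.

import numpy as np
from scipy import signal

fs = 1000.0                          # sample rate in Hz (assumed)
t = np.arange(0, 60.0, 1.0 / fs)     # 60 s recording
rng = np.random.default_rng(0)

# Continuous force perturbation: band-limited random torque (toy stand-in).
sos = signal.butter(4, [0.5, 20.0], btype="band", fs=fs, output="sos")
perturbation = signal.sosfiltfilt(sos, rng.standard_normal(t.size))

# Toy "cortical" response: a delayed, low-pass-filtered copy plus noise.
delay = int(0.025 * fs)              # 25 ms conduction delay (assumed)
b, a = signal.butter(2, 30.0, fs=fs)
eeg = signal.lfilter(b, a, np.roll(perturbation, delay))
eeg += 0.5 * rng.standard_normal(t.size)   # measurement noise

# Magnitude-squared coherence: values near 1 mean the EEG at that
# frequency is well explained as a linear response to the perturbation.
f, coh = signal.coherence(perturbation, eeg, fs=fs, nperseg=2048)
for fi, ci in zip(f[1:8], coh[1:8]):
    print(f"{fi:6.2f} Hz  coherence = {ci:.2f}")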
4D-EEG will provide a direct measure of neural activity in the brain with excellent temporal resolution, and it is easy to use in combination with motor control tasks. The new 4D-EEG method is expected to provide a breakthrough in comparison with functional MRI (fMRI) in elucidating the meaning of cortical map plasticity in motor learning.
Our second goal is to generate and validate new hypotheses about the longitudinal relationship between motor learning and cortical map plasticity by using 4D-EEG clinically in an intensive, repeated-measurement design in patients suffering from stroke. The application of 4D-EEG combined with haptic robots will allow us to discover how dynamics in cortical map plasticity are related to upper-limb recovery after stroke, in terms of neural repair and of behavioral compensation strategies used while performing meaningful motor tasks. The non-invasive 4D-EEG technique combined with haptic robots will open a window onto what and how patients (re)learn when showing motor recovery after stroke, allowing us to develop more effective patient-tailored therapies in neuro-rehabilitation.
Max ERC Funding
3 477 202 €
Duration
Start date: 2012-06-01, End date: 2017-05-31
Project acronym AAATSI
Project Advanced Antenna Architecture for THZ Sensing Instruments
Researcher (PI) Andrea Neto
Host Institution (HI) TECHNISCHE UNIVERSITEIT DELFT
Call Details Starting Grant (StG), PE7, ERC-2011-StG_20101014
Summary The Tera-Hertz (THz) portion of the spectrum presents unique potential for advanced applications. Currently, the THz spectrum is revealing the mechanisms at the origin of our universe and provides the means to monitor the health of our planet via satellite-based sensing of critical gases. In the future, time-domain sensing of the THz spectrum could be the ideal tool for a vast variety of medical and security applications.
Presently, systems in the THz regime are extremely expensive, and consequently the THz spectrum is still the domain of niche (expensive) scientific applications. The main problems are the lack of power and sensitivity. The wide, unused THz spectral bandwidth is itself the only widely available resource that can compensate for these problems in the future. But so far, when scientists try to really use this bandwidth, they run into an insurmountable physical limit: antenna dispersion. Antenna dispersion modifies a signal's spectrum in a wavelength-dependent manner in all types of radiation, but it is particularly deleterious to THz signals because the spectrum is too wide to be digitized with foreseeable technology.
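To make the dispersion problem concrete, here is a toy numerical illustration (all numbers are assumptions, not values from the proposal): a group delay that varies across the band smears an ultra-wideband pulse in the time domain, which is exactly the effect described above.

import numpy as np

fs = 20e12                                   # 20 THz sampling (assumed)
t = (np.arange(4096) - 2048) / fs
pulse = np.exp(-0.5 * (t / 0.2e-12) ** 2)    # ~0.2 ps Gaussian pulse

spectrum = np.fft.rfft(pulse)
f = np.fft.rfftfreq(t.size, 1.0 / fs)

# Hypothetical dispersive antenna: group delay growing linearly with
# frequency (quadratic spectral phase), 1 ps of delay per THz.
slope = 1e-12 / 1e12                         # seconds of delay per Hz
dispersed = np.fft.irfft(spectrum * np.exp(-1j * np.pi * slope * f**2),
                         n=t.size)

def rough_width(x, y):
    # Rough width: span between first and last samples above half maximum.
    above = np.where(y >= 0.5 * y.max())[0]
    return x[above[-1]] - x[above[0]]

print(f"input pulse width: {rough_width(t, pulse) * 1e12:.2f} ps")
print(f"after dispersion:  {rough_width(t, np.abs(dispersed)) * 1e12:.2f} ps")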
The goal of this proposal is to introduce breakthrough antenna technology that will eliminate the dispersion bottleneck and revolutionize Time Domain sensing and Spectroscopic Space Science. By achieving these goals, the project will vault THz imaging technology into the 21st century and develop critically important enabling technologies that will satisfy the electrical engineering needs of the next 30 years and, in the long run, enable multi-Terabit wireless communications.
In order to achieve these goals, I will first build upon two major breakthrough radiation mechanisms that I pioneered: Leaky Lenses and Connected Arrays. Eventually, ultra-wideband imaging arrays comprising thousands of components will be designed on the basis of the new theoretical findings and demonstrated.
Max ERC Funding
1 499 487 €
Duration
Start date: 2011-11-01, End date: 2017-10-31
Project acronym ActiveBioFluids
Project Origins of Collective Motion in Active Biofluids
Researcher (PI) Daniel TAM
Host Institution (HI) TECHNISCHE UNIVERSITEIT DELFT
Call Details Starting Grant (StG), PE3, ERC-2016-STG
Summary The emergence of coherent behaviour is ubiquitous in the natural world and has long captivated biologists and physicists alike. One area of growing interest is the collective motion and synchronization arising within and between simple motile organisms. My goal is to develop and use a novel experimental approach to unravel the origins of spontaneous coherent motion in three model systems of biofluids: (1) the synchronization of the two flagella of the green alga Chlamydomonas reinhardtii, (2) the metachronal wave in the cilia of the protist Paramecium and (3) the collective motion of swimming microorganisms in active suspensions. Understanding the mechanisms leading to collective motion is of tremendous importance because it is crucial to many biological processes such as mechanical signal transduction, embryonic development and biofilm formation.
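A standard toy model from the flagellar-synchronization literature makes the first question concrete: two weakly coupled phase oscillators lock when the coupling exceeds their frequency detuning. The beat rates and coupling strength below are illustrative assumptions, not measured Chlamydomonas values.

import numpy as np

omega1, omega2 = 2 * np.pi * 50.0, 2 * np.pi * 52.0  # beat rates ~50 Hz
epsilon = 2 * np.pi * 3.0        # coupling strength in rad/s (assumed)
dt, steps = 1e-4, 50000          # 5 s of simulated beating

phi1, phi2 = 0.0, 1.0
for _ in range(steps):
    # Kuramoto-style coupling through the instantaneous phase difference.
    d1 = omega1 + epsilon * np.sin(phi2 - phi1)
    d2 = omega2 + epsilon * np.sin(phi1 - phi2)
    phi1, phi2 = phi1 + dt * d1, phi2 + dt * d2

print(f"steady phase difference: {(phi2 - phi1) % (2 * np.pi):.3f} rad")
# For this model the flagella phase-lock iff |omega2 - omega1| < 2*epsilon.
print("phase-locked:", abs(omega2 - omega1) < 2 * epsilon)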
Until now, most of the work has been theoretical, and it has led to the dominant view that hydrodynamic interactions are the main driving force for synchronization and collective motion. Recent experiments have challenged this view and highlighted the importance of direct mechanical contact. New experimental studies are now crucially needed. The state of the art in experimental approaches consists of observations of unperturbed cells. The key innovation in our approach is to dynamically interact with microorganisms in real time, at the relevant time and length scales. I will investigate the origins of coherent motion by synthetically reproducing the mechanical signatures of physiological flows and direct mechanical interactions, and by precisely tracking the response of the organism to the perturbations. Our new approach will incorporate optical tweezers to interact with motile cells, and a unique μ-Tomographic PIV setup to track their 3D micron-scale motion.
This proposal tackles a timely question in biophysics and will yield new insight into the fundamental principles underlying collective motion in active biological matter.
Max ERC Funding
1 500 000 €
Duration
Start date: 2017-04-01, End date: 2022-03-31
Project acronym ADULT
Project Analysis of the Dark Universe through Lensing Tomography
Researcher (PI) Hendrik Hoekstra
Host Institution (HI) UNIVERSITEIT LEIDEN
Call Details Starting Grant (StG), PE9, ERC-2011-StG_20101014
Summary The discoveries that the expansion of the universe is accelerating due to an unknown “dark energy” and that most of the matter is invisible highlight our lack of understanding of the major constituents of the universe. These surprising findings set the stage for research in cosmology at the start of the 21st century. The objective of this proposal is to advance observational constraints to a level where we can distinguish between physical mechanisms that aim to explain the properties of dark energy and the observed distribution of dark matter throughout the universe. We use a relatively new technique called weak gravitational lensing: the accurate measurement of correlations in the orientations of distant galaxies enables us to map the dark matter distribution directly and to extract the cosmological information that is encoded by the large-scale structure.
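For readers unfamiliar with the technique, the sketch below shows the basic statistic it rests on: the two-point correlation of galaxy ellipticities, binned in angular separation. The catalogue here is random placeholder data; a real analysis uses dedicated tools (e.g. tree-based pair counting), weights, and a tangential/cross decomposition.

import numpy as np

rng = np.random.default_rng(1)
n = 2000
ra = rng.uniform(0.0, 1.0, n)                 # degrees (toy 1-deg field)
dec = rng.uniform(0.0, 1.0, n)
eps = rng.normal(0, 0.02, n) + 1j * rng.normal(0, 0.02, n)  # e1 + i*e2

bins = np.linspace(0.01, 0.5, 11)             # angular bins in degrees
xi_plus = np.zeros(bins.size - 1)
counts = np.zeros(bins.size - 1)

for i in range(n - 1):
    # Flat-sky separations are adequate for a 1-degree toy field.
    theta = np.hypot(ra[i + 1:] - ra[i], dec[i + 1:] - dec[i])
    prod = np.real(eps[i] * np.conj(eps[i + 1:]))  # e1*e1' + e2*e2'
    idx = np.digitize(theta, bins) - 1
    ok = (idx >= 0) & (idx < xi_plus.size)
    np.add.at(xi_plus, idx[ok], prod[ok])
    np.add.at(counts, idx[ok], 1)

xi_plus /= np.maximum(counts, 1)
print(xi_plus)   # consistent with zero here; lensing imprints a signal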
To study the dark universe we will analyse data from a new state-of-the-art imaging survey: the Kilo-Degree Survey (KiDS), which will cover 1500 square degrees in 9 filters. The combination of its large survey area and the availability of exquisite photometric redshifts for the sources makes KiDS the first project that can place interesting constraints on the dark energy equation of state using lensing data alone. Combined with complementary results from Planck, our measurements will provide one of the best views of the dark side of the universe before much larger space-based projects commence.
To reach the desired accuracy we need to measure the shapes of distant background galaxies carefully. We also need to account for any intrinsic alignments that arise through tidal interactions rather than through lensing. Reducing these observational and physical biases to negligible levels is a necessary step to ensure the success of KiDS and an important part of our preparation for more challenging projects such as the European-led space mission Euclid.
Max ERC Funding
1 316 880 €
Duration
Start date: 2012-01-01, End date: 2016-12-31
Project acronym AGGLONANOCOAT
Project The interplay between agglomeration and coating of nanoparticles in the gas phase
Researcher (PI) Jan Rudolf Van Ommen
Host Institution (HI) TECHNISCHE UNIVERSITEIT DELFT
Call Details Starting Grant (StG), PE8, ERC-2011-StG_20101014
Summary This proposal aims to develop a generic synthesis approach for core-shell nanoparticles by unravelling the relevant mechanisms. Core-shell nanoparticles have high potential in heterogeneous catalysis, energy storage, and medical applications. However, on a fundamental level there is currently a poor understanding of how to produce such nanostructured particles in a controllable and scalable manner.
The main barriers to achieving this goal are understanding how nanoparticles agglomerate into loose, dynamic clusters and controlling the agglomeration process in gas flows during coating, such that uniform coatings can be made. This is very challenging because of the two-way coupling between agglomeration and coating: during the coating we change the particle surfaces, and thus the way the particles stick together; correspondingly, the stickiness of the particles determines how easily reactants can reach their surfaces.
The project will be the first systematic study of this multi-scale phenomenon, with investigations at all relevant length scales. Current synthesis approaches – mostly carried out in the liquid phase – are typically developed case by case. I will coat nanoparticles in the gas phase with atomic layer deposition (ALD), a technique from the semiconductor industry that can deposit a wide range of materials and, applied to flat substrates, offers excellent control over layer thickness. I will investigate the modification of single-particle surfaces, particle-particle interactions, the structure of agglomerates, and the flow behaviour of large numbers of agglomerates. To this end, I will apply a multidisciplinary approach, combining disciplines such as physical chemistry, fluid dynamics, and reaction engineering.
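As a side note on why ALD gives such precise thickness control, the toy calculation below assumes simple Langmuir-type saturation of surface sites in each precursor pulse; the rate constant and growth per cycle are invented numbers, not project data.

import numpy as np

k = 5.0      # effective adsorption rate constant in 1/s (assumed)
gpc = 0.1    # growth per fully saturated ALD cycle in nm (assumed)

def coverage(pulse_time):
    # Fraction of surface sites reacted after one precursor pulse.
    return 1.0 - np.exp(-k * pulse_time)

# Because each half-reaction is self-limiting, the thickness after many
# cycles becomes insensitive to the exact pulse length once saturated.
for pulse in (0.2, 0.5, 1.0, 2.0):           # pulse lengths in seconds
    film = 100 * gpc * coverage(pulse)       # 100 cycles
    print(f"pulse {pulse:.1f} s -> coverage {coverage(pulse):.3f}, "
          f"film {film:.2f} nm after 100 cycles")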
Max ERC Funding
1 409 952 €
Duration
Start date: 2011-12-01, End date: 2016-11-30
Project acronym ALGSTRONGCRYPTO
Project Algebraic Methods for Stronger Crypto
Researcher (PI) Ronald John Fitzgerald CRAMER
Host Institution (HI) STICHTING NEDERLANDSE WETENSCHAPPELIJK ONDERZOEK INSTITUTEN
Call Details Advanced Grant (AdG), PE6, ERC-2016-ADG
Summary Our field is cryptology. Our overarching objective is to significantly advance the frontiers in the design and analysis of high-security cryptography for the future generation. In particular, we wish to enhance the efficiency, functionality and, last but not least, fundamental understanding of cryptographic security against very powerful adversaries. Our approach is to develop completely novel methods by deepening, strengthening and broadening the algebraic foundations of the field.
Concretely, our lens builds on the arithmetic codex. This is a general, abstract cryptographic primitive whose basic theory we recently developed, and whose asymptotic part, which relies on algebraic geometry, enjoys crucial applications in surprising foundational results on constant communication-rate two-party cryptography. A codex is a linear (error-correcting) code that, when its ambient vector space is endowed simply with coordinate-wise multiplication, can be viewed as simulating, up to some degree, richer arithmetical structures such as finite fields (or products thereof) or, more generally, finite-dimensional algebras over finite fields. Besides this degree, the coordinate-localities for which simulation holds, and those for which it does not hold at all, are also captured.
Our method is based on novel perspectives on codices which significantly widen their scope and strengthen their utility. In particular, we bring symmetries, computational and complexity-theoretic aspects, and connections with algebraic number theory, algebraic geometry and algebraic combinatorics into play in novel ways. Our applications range from public-key cryptography to secure multi-party computation.
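The multiplicative property at the heart of the codex notion can be seen concretely in Reed-Solomon codes: the coordinate-wise product of two codewords (evaluations of degree-at-most-t polynomials) is a codeword of the degree-2t code, so the product of the encoded secrets is recovered by interpolation. The sketch below is our own minimal demonstration over a small prime field, not the proposal's construction.

import random

P = 97                               # small prime field GF(97)
T = 2                                # degree bound of each polynomial
POINTS = list(range(1, 2 * T + 2))   # 2t+1 evaluation points

def share(secret):
    # Random polynomial of degree <= T with f(0) = secret, evaluated
    # at POINTS: a Reed-Solomon codeword encoding the secret.
    coeffs = [secret] + [random.randrange(P) for _ in range(T)]
    return [sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
            for x in POINTS]

def interpolate_at_zero(values):
    # Lagrange interpolation of (POINTS[i], values[i]) evaluated at x=0.
    total = 0
    for i, xi in enumerate(POINTS):
        num = den = 1
        for j, xj in enumerate(POINTS):
            if j != i:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        total = (total + values[i] * num * pow(den, P - 2, P)) % P
    return total

a, b = 5, 7
product_word = [x * y % P for x, y in zip(share(a), share(b))]
print(interpolate_at_zero(product_word), "== expected", a * b % P)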
Our proposal is subdivided into 3 interconnected modules:
(1) Algebraic- and Number Theoretical Cryptanalysis
(2) Construction of Algebraic Crypto Primitives
(3) Advanced Theory of Arithmetic Codices
Max ERC Funding
2 447 439 €
Duration
Start date: 2017-10-01, End date: 2022-09-30
Project acronym ALPROS
Project Artificial Life-like Processive Systems
Researcher (PI) Roeland Johannes Maria Nolte
Host Institution (HI) STICHTING KATHOLIEKE UNIVERSITEIT
Call Details Advanced Grant (AdG), PE5, ERC-2011-ADG_20110209
Summary Toroidal processive enzymes (i.e. enzymes/proteins that are able to thread onto biopolymers and to perform stepwise reactions along the polymer chain) are among the most fascinating tools involved in the clockwork machinery of life. Processive catalysis is ubiquitous in Nature, viz. DNA polymerases and endo- and exo-nucleases; it plays a crucial role in numerous events of the cell’s life, including most of the replication, transmission, expression and repair processes of the genetic information. In the case of DNA polymerases, the protein catalyst encircles the DNA and, whilst moving along it, makes copies of high fidelity. Although numerous works have been reported on the synthesis of analogues of natural enzymes, comparatively few efforts have been made to mimic these processive properties. It is the goal of this proposal to rectify this oversight and unravel the essential components of Nature’s polymer catalysts. The individual projects are designed to specifically target the essential aspects of processive catalysis, i.e. rate of motion, rate of catalysis, and transfer of information. One project is aimed at extending the research into a processive catalytic system that is more suitable for industrial application. Two projects involve more farsighted studies and are designed to push the research well beyond the current boundaries, into the area of Turing machines and bio-rotaxane catalysts which can modify DNA in a non-natural process. The vision of this proposal is to open up the field of ‘processive catalysis’ and invigorate the next generation of chemists to develop information-transfer and toroidal processive catalysts. The construction of synthetic analogues of processive enzymes could open a gate toward a large range of applications, ranging from intelligent tailoring of polymers to information storage and processing.
Max ERC Funding
1 603 699 €
Duration
Start date: 2012-02-01, End date: 2017-01-31
Project acronym APROCS
Project Automated Linear Parameter-Varying Modeling and Control Synthesis for Nonlinear Complex Systems
Researcher (PI) Roland TOTH
Host Institution (HI) TECHNISCHE UNIVERSITEIT EINDHOVEN
Call Details Starting Grant (StG), PE7, ERC-2016-STG
Summary Linear Parameter-Varying (LPV) systems are flexible mathematical models capable of representing the Nonlinear (NL)/Time-Varying (TV) dynamical behaviors of complex physical systems (e.g., wafer scanners, car engines, chemical reactors), often encountered in engineering, via a linear structure. The LPV framework provides computationally efficient and robust approaches to synthesize digital controllers that can ensure the desired operation of such systems - making it attractive to (i) high-tech mechatronic, (ii) automotive and (iii) chemical-process applications. Such a framework is important to meet the increasing operational demands of systems in these industrial sectors and to realize future technological targets. However, recent studies have shown that, to fully exploit the potential of the LPV framework, a number of limiting factors of the underlying theory ask for serious innovation, as it is currently not understood how to (1) automate exact and low-complexity LPV modeling of real-world applications and refine uncertain aspects of these models efficiently with the help of measured data, (2) incorporate control objectives directly into modeling and develop model reduction approaches for control, and (3) treat modeling & control synthesis as a unified, closed-loop system synthesis approach directly oriented toward the underlying NL/TV system. Furthermore, due to the increasingly cyber-physical nature of applications, (4) control synthesis is needed in a plug & play fashion, where, if sub-systems are modified or exchanged, the control design and the model of the whole system are only incrementally updated. This project aims to surmount Challenges (1)-(4) by establishing an innovative revolution of the LPV framework, supported by a software suite and extensive empirical studies on real-world industrial applications, with the potential to secure a leading role for EU technological innovation in the high-impact industrial sectors (i)-(iii).
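To fix ideas, the sketch below simulates the basic object the summary describes: a discrete-time state-space model whose dynamics matrix depends on a measured scheduling parameter p, here a toy mass-spring-damper whose stiffness varies with p. All values are illustrative assumptions, not from the proposal.

import numpy as np

dt = 0.01

def A(p):
    # State matrix at scheduling value p: stiffness varies in [1, 5].
    k = 1.0 + 4.0 * p
    return np.array([[1.0, dt],
                     [-k * dt, 1.0 - 0.2 * dt]])

B = np.array([0.0, dt])

x = np.array([1.0, 0.0])                     # initial position, velocity
for step in range(500):
    p = 0.5 * (1 + np.sin(0.02 * step))      # scheduling signal in [0, 1]
    u = 0.0                                  # free response, no input
    x = A(p) @ x + B * u                     # x_{k+1} = A(p_k) x_k + B u_k
print("state after 5 s:", x)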
Max ERC Funding
1 493 561 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym BinCosmos
Project The Impact of Massive Binaries Through Cosmic Time
Researcher (PI) Selma DE MINK
Host Institution (HI) UNIVERSITEIT VAN AMSTERDAM
Call Details Starting Grant (StG), PE9, ERC-2016-STG
Summary Massive stars play many key roles in astrophysics. As COSMIC ENGINES they transformed the pristine Universe left after the Big Bang into our modern Universe. We use massive stars, their explosions and their products as COSMIC PROBES to study the conditions in the distant Universe and the extreme physics inaccessible on Earth. Models of massive stars are thus widely applied. A central common assumption is that massive stars are non-rotating single objects, in stark contrast with new data: recent studies show that the majority (70% according to our data) will experience severe interaction with a companion (Sana, de Mink et al., Science 2012).
I propose to conduct the most ambitious and extensive exploration to date of the effects of binarity and rotation on the lives and fates of massive stars, to (I) transform our understanding of the complex physical processes and how they operate in the vast parameter space, and (II) explore the cosmological implications after calibrating and verifying the models. To achieve this ambitious objective I will use an innovative computational approach that combines the strengths of two highly complementary codes, and I will seek direct confrontation with observations to overcome the computational challenges that inhibited previous work.
This timely project will provide the urgent theory framework needed for interpretation and guiding of observing programs with the new facilities (JWST, LSST, aLIGO/VIRGO). Public release of the model grids and code will ensure wide impact of this project. I am in the unique position to successfully lead this project because of my (i) extensive experience modeling the complex physical processes, (ii) leading role in introducing large statistical simulations in the massive star community and (iii) direct involvement in surveys that will be used in this project.
Max ERC Funding
1 926 634 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym BRiCPT
Project Basic Research in Cryptographic Protocol Theory
Researcher (PI) Jesper Buus Nielsen
Host Institution (HI) AARHUS UNIVERSITET
Call Details Starting Grant (StG), PE6, ERC-2011-StG_20101014
Summary In cryptographic protocol theory, we consider a situation where a number of entities want to solve some problem over a computer network. Each entity has some secret data it does not want the other entities to learn, yet they all want to learn something about the common set of data. For instance, in an electronic election, they want to know the number of yes-votes without revealing who voted what; in an electronic auction, they want to find the winner without leaking the bids of the losers.
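The election example has a classical, concrete instantiation: additive secret sharing, sketched below. This is textbook material illustrating the setting, not one of the protocols this project will develop, and a deployable system would also need integrity and robustness guarantees.

import random

P = 2**31 - 1       # public prime modulus
SERVERS = 3

def share_vote(vote):
    # Split a 0/1 vote into SERVERS random shares summing to it mod P;
    # any single share (indeed any SERVERS-1 of them) reveals nothing.
    shares = [random.randrange(P) for _ in range(SERVERS - 1)]
    shares.append((vote - sum(shares)) % P)
    return shares

votes = [1, 0, 1, 1, 0, 1]                   # voters' private inputs
server_totals = [0] * SERVERS
for v in votes:
    for s, sh in enumerate(share_vote(v)):
        server_totals[s] = (server_totals[s] + sh) % P

# Each server publishes only its total; summing them yields the tally.
print("yes-votes:", sum(server_totals) % P)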
A main focus of the project is to develop new techniques for solving such protocol problems. We are particularly interested in techniques which can automatically construct a protocol solving a problem given only a description of what the problem is. My focus will be theoretical basic research, but I believe that advancing the theory of secure protocol compilers will have an immense impact on the practice of developing secure protocols.
When one develops complex protocols, it is important to be able to verify their correctness before they are deployed, particularly when the purpose of the protocols is to protect information: if and when an error is found and corrected, the sensitive data may already have been compromised. Therefore, cryptographic protocol theory develops models of what it means for a protocol to be secure, and techniques for analyzing whether a given protocol is secure or not.
Another main focus of the project is to develop better security models, as existing security models suffer either from the problem that some protocols can be proven secure which are not secure in practice, or from the problem that it is impossible to prove the security of some protocols which are believed to be secure in practice. My focus will again be on theoretical basic research, but I believe that better security models are important for advancing a practice where protocols are verified as secure before they are deployed.
Max ERC Funding
1 171 019 €
Duration
Start date: 2011-12-01, End date: 2016-11-30