Project acronym ACCOPT
Project ACelerated COnvex OPTimization
Researcher (PI) Yurii NESTEROV
Host Institution (HI) UNIVERSITE CATHOLIQUE DE LOUVAIN
Call Details Advanced Grant (AdG), PE1, ERC-2017-ADG
Summary The rapid progress of computer technologies and telecommunications presents many new challenges for Optimization Theory. New problems are usually very large in size, very special in structure, and possibly have distributed data support. This makes them unsolvable by standard optimization methods. In these situations, the old theoretical models, based on hidden Black-Box information, cannot work. New theoretical and algorithmic solutions are urgently needed. In this project we will concentrate on the development of fast optimization methods for problems of big and very big size. All the new methods will be endowed with provable efficiency guarantees for large classes of optimization problems arising in practical applications. Our main tool is the acceleration technique developed for standard Black-Box methods as applied to smooth convex functions. However, we will have to adapt it to deal with different situations.
The first line of development will be based on the smoothing technique as applied to non-smooth functions. We propose to substantially extend this approach to generate approximate solutions in relative scale. The second line of research will be related to applying acceleration techniques to second-order methods for minimizing functions with sparse Hessians. Finally, we aim to develop fast gradient methods for huge-scale problems. The size of these problems is so large that even the usual vector operations are extremely expensive. Thus, we propose to develop new methods with sublinear iteration costs. In our approach, the main source for achieving improvements will be the proper use of problem structure.
Our overall aim is to be able to solve in a routine way many important problems which currently look unsolvable. Moreover, the theoretical development of Convex Optimization will reach a state in which there is no gap between theory and practice: the theoretically most efficient methods will definitely outperform any homebred heuristics.
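For illustration only, the following is a minimal sketch of the classical accelerated (fast) gradient scheme for a smooth convex function with an L-Lipschitz gradient, the kind of Black-Box acceleration the summary refers to. The quadratic test problem and all names below are hypothetical and not taken from the proposal.

import numpy as np

def accelerated_gradient(grad, L, x0, iterations=200):
    """Basic accelerated gradient method for a convex function whose
    gradient is L-Lipschitz (constant step 1/L, FISTA-style momentum)."""
    x = x0.copy()
    y = x0.copy()
    t = 1.0
    for _ in range(iterations):
        x_next = y - grad(y) / L                              # gradient step from the extrapolated point
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)      # momentum / extrapolation step
        x, t = x_next, t_next
    return x

# Hypothetical example: minimize 0.5 * ||A x - b||^2, whose gradient is A^T (A x - b).
A = np.random.randn(30, 10)
b = np.random.randn(30)
L = np.linalg.norm(A, 2) ** 2                                 # Lipschitz constant of the gradient
x_star = accelerated_gradient(lambda x: A.T @ (A @ x - b), L, np.zeros(10))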
Max ERC Funding
2 090 038 €
Duration
Start date: 2018-09-01, End date: 2023-08-31
Project acronym BIOTENSORS
Project Biomedical Data Fusion using Tensor based Blind Source Separation
Researcher (PI) Sabine Jeanne A Van Huffel
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Advanced Grant (AdG), PE6, ERC-2013-ADG
Summary The quest for a general functional tensor framework for blind source separation
Our overall objective is the development of a general functional framework for solving tensor-based blind source separation (BSS) problems in biomedical data fusion, using tensor decompositions (TDs) as the basic core. We claim that TDs will allow the extraction of fairly complicated sources of biomedical activity from fairly complicated sets of uni- and multimodal data. The power of the new techniques will be demonstrated for three well-chosen representative biomedical applications for which extensive expertise and fully validated datasets are available in the PI’s team, namely:
• Metabolite quantification and brain tumour tissue typing using Magnetic Resonance Spectroscopic Imaging,
• Functional monitoring including seizure detection and polysomnography,
• Cognitive brain functioning and seizure zone localization using simultaneous Electroencephalography-functional MR Imaging integration.
Solving these challenging problems requires that algorithmic progress is made in several directions:
• Algorithms need to be based on multilinear extensions of numerical linear algebra.
• New grounds for separation, such as representability in a given function class, need to be explored.
• Prior knowledge needs to be exploited via appropriate health-relevant constraints.
• Biomedical data fusion requires the combination of TDs, coupled via relevant constraints.
• Algorithms for TD updating are important for continuous long-term patient monitoring.
The algorithms are eventually integrated in an easy-to-use open source software platform that is general enough for use in other BSS applications.
Having been involved in biomedical signal processing over a period of 20 years, the PI has a good overview of the field and the opportunities. By working directly at the forefront in close collaboration with the clinical scientists who actually use our software, we can have a huge impact.
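As background for the kind of TD mentioned above, here is a toy sketch of one standard decomposition, CP/PARAFAC computed by alternating least squares. It is purely illustrative and is not the project's algorithm; the synthetic array and all names are hypothetical.

import numpy as np

def unfold(T, mode):
    """Mode-n unfolding of a 3-way tensor (the chosen mode becomes the row index)."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def khatri_rao(A, B):
    """Column-wise Kronecker (Khatri-Rao) product of two factor matrices."""
    r = A.shape[1]
    return np.einsum('ir,jr->ijr', A, B).reshape(-1, r)

def cp_als(T, rank, iterations=100):
    """Rank-R CP/PARAFAC decomposition of a 3-way array by alternating least squares:
    T[i,j,k] is approximated by sum_r A[i,r] * B[j,r] * C[k,r]."""
    rng = np.random.default_rng(0)
    A, B, C = (rng.random((d, rank)) for d in T.shape)
    for _ in range(iterations):
        A = unfold(T, 0) @ np.linalg.pinv(khatri_rao(B, C).T)
        B = unfold(T, 1) @ np.linalg.pinv(khatri_rao(A, C).T)
        C = unfold(T, 2) @ np.linalg.pinv(khatri_rao(A, B).T)
    return A, B, C

# Hypothetical use: decompose a small synthetic 3-way array (e.g. channel x time x trial).
T = np.einsum('ir,jr,kr->ijk', np.random.rand(8, 3), np.random.rand(20, 3), np.random.rand(5, 3))
A, B, C = cp_als(T, rank=3)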
Max ERC Funding
2 500 000 €
Duration
Start date: 2014-04-01, End date: 2019-03-31
Project acronym CALCULUS
Project Commonsense and Anticipation enriched Learning of Continuous representations sUpporting Language UnderStanding
Researcher (PI) Marie-Francine MOENS
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Advanced Grant (AdG), PE6, ERC-2017-ADG
Summary Natural language understanding (NLU) by the machine is of great scientific, economic and social value. Humans perform the NLU task in an efficient way by relying on their capability to imagine or anticipate situations. They engage commonsense and world knowledge that is often acquired through perceptual experiences to make explicit what is left implicit in language. Inspired by these characteristics, CALCULUS will design, implement and evaluate innovative paradigms supporting NLU, combining old but powerful ideas for language understanding from the early days of artificial intelligence with new approaches from machine learning. The project focuses on the effective learning of anticipatory, continuous, non-symbolic representations of event frames and narrative structures of events that are trained on language and visual data. The grammatical structure of language is grounded in the geometric structure of visual data while embodying aspects of commonsense and world knowledge. The reusable representations are evaluated in a selection of NLU tasks requiring efficient real-time retrieval of the representations and parsing of the targeted written texts. Finally, we will evaluate the inference potential of the anticipatory representations in situations not seen in the training data and when inferring spatial and temporal information in metric real-world spaces that is not mentioned in the processed language. The machine learning methods focus on learning latent variable models, relying on Bayesian probabilistic models and neural networks, in settings with limited manually annotated training data. The best models will be integrated in a demonstrator that translates the language of stories into events happening in a 3-D virtual world. The PI has the interdisciplinary expertise in natural language processing, joint processing of language and visual data, information retrieval and machine learning needed for the successful realization of the project.
Max ERC Funding
2 227 500 €
Duration
Start date: 2018-09-01, End date: 2023-08-31
Project acronym CerQuS
Project Certified Quantum Security
Researcher (PI) Dominique Peer Ghislain UNRUH
Host Institution (HI) TARTU ULIKOOL
Call Details Consolidator Grant (CoG), PE6, ERC-2018-COG
Summary Digital communication permeates all areas of today's daily life. Cryptographic protocols are used to secure that communication. Quantum communication and the advent of quantum computers both threaten existing cryptographic solutions, and create new opportunities for secure protocols. The security of cryptographic systems is normally ensured by mathematical proofs. Due to human error, however, these proofs often contain errors, limiting their usefulness. This is especially true in the case of quantum protocols, since human intuition is well adapted to the classical world, but not to quantum mechanics. To resolve this problem, methods for verifying cryptographic security proofs using computers (i.e., for "certifying" the security) have been developed. Yet, all existing verification approaches handle classical cryptography only - for quantum protocols, no approaches exist.
This project will lay the foundations for the verification of quantum cryptography. We will design logics and software tools for developing and verifying security proofs on the computer, both for classical protocols secure against quantum computers (post-quantum security) and for protocols that use quantum communication.
Our main approach is the design of a logic (quantum relational Hoare logic, qRHL) for reasoning about the relationship between pairs of quantum programs, together with an ecosystem of manual and automated reasoning tools, culminating in fully certified security proofs for real-world quantum protocols.
As a final result, the project will improve the security of protocols in the quantum age, by removing one possible source of human error. In addition, the project directly impacts the research community, by providing new foundations in program verification, and by providing cryptographers with new tools for the verification of their protocols.
Max ERC Funding
1 716 475 €
Duration
Start date: 2019-06-01, End date: 2024-05-31
Project acronym COGNIMUND
Project Cognitive Image Understanding: Image representations and Multimodal learning
Researcher (PI) Tinne Tuytelaars
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Starting Grant (StG), PE6, ERC-2009-StG
Summary One of the primary and most appealing goals of computer vision is to automatically understand the content of images on a cognitive level. Ultimately we want computers to interpret images as we humans do, recognizing all the objects, scenes, and people as well as their relations as they appear in natural images or video. With this project, I want to advance the state of the art in this field in two directions, which I believe to be crucial for building the next generation of image understanding tools. First, novel, more robust yet descriptive image representations will be designed that incorporate the intrinsic structure of images. These should already go a long way towards removing irrelevant sources of variability while capturing the essence of the image content. I believe the importance of further research into image representations is currently underestimated within the research community, yet I claim this is a crucial step with lots of opportunities: good learning cannot easily make up for bad features. Second, weakly supervised methods to learn from multimodal input (especially the combination of images and text) will be investigated, making it possible to leverage the large amount of weak annotations available via the internet. This is essential if we want to scale the methods to a larger number of object categories (several hundreds instead of a few tens). As more data can be used for training, such weakly supervised methods might in the end even come on par with or outperform supervised schemes. Here we will call upon the latest results in semi-supervised learning, data mining, and computational linguistics.
Max ERC Funding
1 538 380 €
Duration
Start date: 2010-02-01, End date: 2015-01-31
Project acronym COLORAMAP
Project Constrained Low-Rank Matrix Approximations: Theoretical and Algorithmic Developments for Practitioners
Researcher (PI) Nicolas Benoit P Gillis
Host Institution (HI) UNIVERSITE DE MONS
Call Details Starting Grant (StG), PE6, ERC-2015-STG
Summary Low-rank matrix approximation (LRA) techniques such as principal component analysis (PCA) are powerful tools for the representation and analysis of high-dimensional data, and are used in a wide variety of areas such as machine learning, signal and image processing, data mining, and optimization. Without any constraints and using the least-squares error, LRA can be solved via the singular value decomposition. However, in practice, this model is often not suitable, mainly because (i) the data might be contaminated with outliers, missing data and non-Gaussian noise, and (ii) the low-rank factors of the decomposition might have to satisfy some specific constraints. Hence, in recent years, many variants of LRA have been introduced, using different constraints on the factors and different objective functions to assess the quality of the approximation; e.g., sparse PCA, PCA with missing data, independent component analysis and nonnegative matrix factorization. Although these new constrained LRA models have become very popular and standard in some fields, there is still a significant gap between theory and practice. In this project, our goal is to reduce this gap by attacking the problem in an integrated way, making connections between LRA variants, and by using four very different but complementary perspectives: (1) computational complexity issues, (2) provably correct algorithms, (3) heuristics for difficult instances, and (4) application-oriented aspects. This unified and multidisciplinary approach will enable us to understand these problems better, to develop and analyze new and existing algorithms, and then to use them for applications. Our ultimate goal is to provide practitioners with new tools and to allow them to decide which method to use in which situation and what to expect from it.
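As a concrete illustration of one constrained LRA variant named above, the sketch below implements the classical multiplicative updates for nonnegative matrix factorization under the Frobenius-norm objective. It is standard textbook material given for orientation, not one of the project's new algorithms; the data and parameters are hypothetical.

import numpy as np

def nmf(X, rank, iterations=200, eps=1e-9):
    """Nonnegative matrix factorization X ~ W H with the classical
    multiplicative updates (Frobenius-norm objective, nonnegative X)."""
    m, n = X.shape
    W = np.random.rand(m, rank)
    H = np.random.rand(rank, n)
    for _ in range(iterations):
        H *= (W.T @ X) / (W.T @ W @ H + eps)   # update H with W fixed
        W *= (X @ H.T) / (W @ H @ H.T + eps)   # update W with H fixed
    return W, H

# Hypothetical use on a random nonnegative data matrix.
X = np.abs(np.random.randn(50, 30))
W, H = nmf(X, rank=5)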
Max ERC Funding
1 291 750 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym COSMOS
Project Semiparametric Inference for Complex and Structural Models in Survival Analysis
Researcher (PI) Ingrid VAN KEILEGOM
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Advanced Grant (AdG), PE1, ERC-2015-AdG
Summary In survival analysis, investigators are interested in modelling and analysing the time until an event happens. It often happens that the available data are right-censored, which means that only a lower bound of the time of interest is observed. This feature substantially complicates the statistical analysis of this kind of data. The aim of this project is to solve a number of open problems related to time-to-event data, which would represent a major step forward in the area of survival analysis.
The project has three objectives:
[1] Cure models take into account that a certain fraction of the subjects under study will never experience the event of interest. Because of the complex nature of these models, many problems are still open and rigorous theory is rather scarce in this area. Our goal is to fill this gap, which will be a challenging but important task.
[2] Copulas are nowadays widespread in many areas of statistics. However, they can contribute more substantially to resolving a number of the outstanding issues in survival analysis, such as quantile regression and dependent censoring. Finding answers to these open questions would open up new horizons for a wide variety of problems.
[3] We wish to develop new methods for doing correct inference in some of the common models in survival analysis in the presence of endogeneity or measurement errors. The present methodology has serious shortcomings, and we would like to propose, develop and validate new methods, which would be a major breakthrough if successful.
The above objectives will be achieved by using mostly semiparametric models. The development of mathematical properties under these models is often a challenging task, as complex tools from the theory of empirical processes and semiparametric efficiency are required. The project will therefore require an innovative combination of highly complex mathematical skills and cutting-edge results from modern theory for semiparametric models.
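For orientation on objective [1], one standard formulation is the mixture cure model (given here as generic background, not necessarily the specific model the project will study). With generic notation,

S(t \mid x) = 1 - p(x) + p(x)\, S_u(t \mid x),

where p(x) is the probability of being susceptible (uncured) given covariates x and S_u is the conditional survival function of the susceptible sub-population; since \lim_{t\to\infty} S(t \mid x) = 1 - p(x) > 0, the survival function plateaus, reflecting the cured fraction.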
Max ERC Funding
2 318 750 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym CRAMIS
Project Critical phenomena in random matrix theory and integrable systems
Researcher (PI) Tom Claeys
Host Institution (HI) UNIVERSITE CATHOLIQUE DE LOUVAIN
Call Details Starting Grant (StG), PE1, ERC-2012-StG_20111012
Summary The main goal of the project is to create a research group on critical phenomena in random matrix theory and integrable systems at the Université Catholique de Louvain, where the PI was recently appointed.
Random matrix ensembles, integrable partial differential equations and Toeplitz determinants will be the main research topics in the project. These three models show intimate connections and they all share certain properties that are, to a large extent, universal. In the recent past it has been shown that Painlevé equations play an important and universal role in the description of critical behaviour in each of these areas. In random matrix theory, they describe the local correlations between eigenvalues in appropriate double scaling limits; for integrable partial differential equations such as the Korteweg-de Vries equation and the nonlinear Schrödinger equation, they arise near points of gradient catastrophe in the small dispersion limit; for Toeplitz determinants they describe phase transitions for underlying models in statistical physics.
The aim of the project is to study new types of critical behaviour and to obtain a better understanding of the remarkable similarities between random matrices on one hand and integrable partial differential equations on the other hand. The focus will be on asymptotic questions, and one of the tools we plan to use is the Deift/Zhou steepest descent method to obtain asymptotics for Riemann-Hilbert problems. Although many of the problems in this project have their origin or motivation in mathematical physics, the proposed techniques are mostly based on complex and classical analysis.
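A classical instance of the Painlevé connection mentioned above, included as background rather than as a project result, is the Tracy-Widom law for the largest eigenvalue in the Gaussian Unitary Ensemble:

F_2(s) = \exp\!\Big(-\int_s^{\infty} (x-s)\, q(x)^2 \, dx\Big), \qquad q''(x) = x\, q(x) + 2\, q(x)^3, \qquad q(x) \sim \mathrm{Ai}(x) \ \ (x \to +\infty),

where q is the Hastings-McLeod solution of the Painlevé II equation; exactly this type of formula arises in the double scaling limits referred to above.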
Max ERC Funding
1 130 400 €
Duration
Start date: 2012-08-01, End date: 2017-07-31
Project acronym CRASH
Project CRyptographic Algorithms and Secure Hardware
Researcher (PI) François-Xavier Standaert
Host Institution (HI) UNIVERSITE CATHOLIQUE DE LOUVAIN
Call Details Starting Grant (StG), PE6, ERC-2011-StG_20101014
Summary Side-channel attacks are an important threat against cryptographic implementations, in which an adversary takes advantage of physical leakages, such as the power consumption of a smart card, in order to recover secret information. By circumventing the models in which standard security proofs are obtained, they can lead to powerful attacks against a large class of devices. As a consequence, formalizing implementation security and efficiently preventing side-channel attacks is one of the most challenging open problems in modern cryptography. Physical attacks imply new optimization criteria, with potential impact on the way we conceive algorithms and the way we design circuits. By putting together mathematical and electrical engineering problems, just as they are raised in reality, the CRASH project is expected to develop concrete foundations for the next generation of cryptographic algorithms and their implementation. For this purpose, three main directions will be considered. First, we will investigate sound evaluation tools for side-channel attacks and validate them on different prototype chips. Second, we will consider the impact of physical attacks on the mathematical aspects of cryptography, both destructively (i.e. by developing new attacks and advanced cryptanalysis tools) and constructively (i.e. by investigating new cipher designs and security proof techniques). Third, we will evaluate the possibility of integrating physical security analysis into the design tools of integrated circuits (e.g. in order to obtain “physical security aware” compilers). Summarizing, this project aims to break the barrier between the abstractions of mathematical cryptography and the concrete peculiarities of physical security in present microelectronic devices. By considering the system and algorithmic issues in a unified way, it is expected to get rid of the incompatibilities between the separate formalisms that are usually considered in order to explain these concurrent realities.
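To make the attack model concrete, here is a deliberately simplified, illustrative correlation-based side-channel attack on synthetic traces: a toy single-byte Hamming-weight leakage model without any real cipher, not the project's evaluation tools. All data are simulated and all names are hypothetical.

import numpy as np

def hamming_weight(v):
    """Hamming weight of each byte in a uint8 array."""
    return np.unpackbits(np.atleast_1d(v).astype(np.uint8).reshape(-1, 1), axis=1).sum(axis=1)

rng = np.random.default_rng(0)

# Simulate a leaking device: power ~ HW(plaintext XOR secret_key) + noise.
secret_key = 0x3C
plaintexts = rng.integers(0, 256, size=2000, dtype=np.uint8)
traces = hamming_weight(plaintexts ^ secret_key) + rng.normal(0.0, 1.0, size=plaintexts.size)

# Correlation attack: correlate the hypothetical leakage for every key guess with the traces.
correlations = []
for guess in range(256):
    hypothesis = hamming_weight(plaintexts ^ np.uint8(guess))
    correlations.append(abs(np.corrcoef(hypothesis, traces)[0, 1]))
recovered_key = int(np.argmax(correlations))   # matches secret_key with high probability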
Max ERC Funding
1 498 874 €
Duration
Start date: 2011-10-01, End date: 2016-09-30
Project acronym DEMIURGE
Project Automatic Design of Robot Swarms
Researcher (PI) Mauro Birattari
Host Institution (HI) UNIVERSITE LIBRE DE BRUXELLES
Call Details Consolidator Grant (CoG), PE6, ERC-2015-CoG
Summary The scope of this project is the automatic design of robot swarms. Swarm robotics is an appealing approach to the coordination of large groups of robots. Up to now, robot swarms have been designed via some labor-intensive process.
My goal is to advance the state of the art in swarm robotics by developing the DEMIURGE: an intelligent system that is able to design and realize robot swarms in a totally integrated and automatic way.
The DEMIURGE is a novel concept. Starting from requirements expressed in a specification language that I will define, the DEMIURGE will design all aspects of a robot swarm - hardware and control software.
The DEMIURGE will cast a design problem into an optimization problem and will tackle it in a computation-intensive way. In this project, I will study different control software structures, optimization algorithms, ways to specify requirements, validation protocols, on-line adaptation mechanisms and techniques for re-design at run time.
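A toy illustration of the "design as optimization" idea above: candidate control-software parameters are scored in an entirely fictitious simulator and the best design is kept. The simulator, parameters and objective are placeholders and this is not the DEMIURGE itself, merely a sketch of casting design as computation-intensive optimization.

import numpy as np

def simulate_swarm(params, rng):
    """Placeholder for a swarm-robotics simulation: returns a performance score
    for a candidate controller parameterized by `params`. In reality this would
    run the mission in a physics-based simulator."""
    target = np.array([0.7, 0.2, 0.9])                 # fictitious optimum
    return -np.sum((params - target) ** 2) + rng.normal(0.0, 0.01)

def automatic_design(n_params=3, budget=500, seed=1):
    """Random-search sketch: sample candidate designs, evaluate them, keep the best."""
    rng = np.random.default_rng(seed)
    best_params, best_score = None, -np.inf
    for _ in range(budget):
        candidate = rng.uniform(0.0, 1.0, n_params)
        score = simulate_swarm(candidate, rng)
        if score > best_score:
            best_params, best_score = candidate, score
    return best_params, best_score

best_params, best_score = automatic_design()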
Max ERC Funding
2 000 000 €
Duration
Start date: 2016-10-01, End date: 2021-09-30
Project acronym DISPATCH Neuro-Sense
Project Distributed Signal Processing Algorithms for Chronic Neuro-Sensor Networks
Researcher (PI) Alexander BERTRAND
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Starting Grant (StG), PE6, ERC-2018-STG
Summary The possibility to chronically monitor the brain 24/7 in daily-life activities would revolutionize human-machine interactions and health care, e.g., in the context of neuroprostheses, neurological disorders, and brain-computer interfaces (BCI). Such chronic systems must satisfy challenging energy and miniaturization constraints, leading to modular designs in which multiple networked miniature neuro-sensor modules form a ‘neuro-sensor network’ (NSN).
However, current multi-channel neural signal processing (NSP) algorithms were designed for traditional neuro-sensor arrays with central access to all channels. These algorithms are not suited for NSNs, as they require unrealistic bandwidth budgets to centralize the data, yet a joint neural data analysis across NSN modules is crucial.
The central idea of this project is to remove this algorithm bottleneck by designing novel scalable, distributed NSP algorithms to let the modules of an NSN jointly process the recorded neural data through in-network data fusion and with a minimal exchange of data.
To guarantee impact, we mainly focus on establishing a new non-invasive NSN concept based on electroencephalography (EEG). By combining multiple ‘smart’ mini-EEG modules into an ‘EEG sensor network’ (EEG-Net), we compensate for the lack of spatial information captured by current stand-alone mini-EEG devices, without compromising on ‘wearability’. Equipping such EEG-Nets with distributed NSP algorithms will allow high-density EEG data to be processed at viable energy levels, which is a game changer towards high-performance chronic EEG for, e.g., epilepsy monitoring, neuroprostheses, and BCI.
We will validate these claims in an EEG-Net prototype in the above 3 use cases, benefiting from ongoing collaborations with the KUL university hospital. In addition, to demonstrate the general applicability of our novel NSP algorithms, we will validate them in other emerging NSN types as well, such as modular or untethered neural probes.
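As an elementary illustration of in-network fusion with only neighbor-to-neighbor exchanges, the sketch below runs simple average consensus on a fixed graph. This is a generic textbook scheme, not the distributed NSP algorithms to be developed in the project; the topology and values are hypothetical.

import numpy as np

# Fictitious EEG-Net topology: module i exchanges data only with its listed neighbors.
neighbors = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
local_estimates = np.array([3.0, 5.0, 9.0, 7.0])   # e.g., per-module feature values

x = local_estimates.copy()
step = 0.3                                          # must be < 1/max_degree for convergence
for _ in range(100):
    x_new = x.copy()
    for i, nbrs in neighbors.items():
        # each module only uses values received from its direct neighbors
        x_new[i] = x[i] + step * sum(x[j] - x[i] for j in nbrs)
    x = x_new
# all entries of x converge to the network-wide average (6.0 here) without any central node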
Max ERC Funding
1 489 656 €
Duration
Start date: 2019-01-01, End date: 2023-12-31
Project acronym DPMP
Project Dependable Performance on Many-Thread Processors
Researcher (PI) Lieven Eeckhout
Host Institution (HI) UNIVERSITEIT GENT
Call Details Starting Grant (StG), PE6, ERC-2010-StG_20091028
Summary Contemporary microprocessors seek to improve performance through thread-level parallelism by co-executing multiple threads on a single microprocessor chip. Projections suggest that future processors will feature multiple tens to hundreds of threads, hence the name many-thread processors. Many-thread processors, however, lead to non-dependable performance: co-executing threads affect each other's performance in unpredictable ways because of resource sharing across threads. Failure to deliver dependable performance leads to missed deadlines, priority inversion, unbalanced parallel execution, etc., which will severely impact the usage model and the performance growth path for many important future and emerging application domains (e.g., media, medical, datacenter).
DPMP envisions that performance introspection using a cycle accounting architecture that tracks per-thread performance will be the breakthrough to delivering dependable performance in future many-thread processors. To this end, DPMP will develop a hardware cycle accounting architecture that estimates single-thread progress during many-thread execution. The ability to track per-thread progress enables system software to deliver dependable performance by assigning hardware resources to threads depending on their relative progress. Through this cooperative hardware-software approach, this project addresses a fundamental problem in multi-threaded and multi/many-core processing.
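A toy illustration (in Python, not hardware) of how per-thread progress estimates could drive resource assignment: the threads that fell furthest behind their estimated isolated execution receive a larger share of the shared resources. All numbers and the policy itself are hypothetical, not the DPMP architecture.

# Hypothetical per-thread cycle accounting: cycles a thread would have needed in isolation
# vs. cycles it actually consumed while co-running with others.
threads = {
    "t0": {"isolated_cycles": 1.0e9, "shared_cycles": 1.6e9},
    "t1": {"isolated_cycles": 1.0e9, "shared_cycles": 1.1e9},
    "t2": {"isolated_cycles": 2.0e9, "shared_cycles": 2.2e9},
}

# Slowdown = shared / isolated; threads that suffered the most interference get more resources.
slowdowns = {t: v["shared_cycles"] / v["isolated_cycles"] for t, v in threads.items()}
total = sum(slowdowns.values())
shares = {t: s / total for t, s in slowdowns.items()}   # fraction of shared resources per thread
print(shares)   # t0 receives the largest share because it fell furthest behind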
Max ERC Funding
1 389 000 €
Duration
Start date: 2010-10-01, End date: 2016-09-30
Project acronym E-SWARM
Project Engineering Swarm Intelligence Systems
Researcher (PI) Marco Dorigo
Host Institution (HI) UNIVERSITE LIBRE DE BRUXELLES
Call Details Advanced Grant (AdG), PE6, ERC-2009-AdG
Summary Swarm intelligence is the discipline that deals with natural and artificial systems composed of many individuals that coordinate using decentralized control and self-organization. In this project, we focus on the design and implementation of artificial swarm intelligence systems for the solution of complex problems. Our current understanding of how to use swarms of artificial agents largely relies on rules of thumb and intuition based on the experience of individual researchers. This is not sufficient for us to design swarm intelligence systems at the level of complexity required by many real-world applications, or to accurately predict the behavior of the systems we design. The goal of E-SWARM is to develop a rigorous engineering methodology for the design and implementation of artificial swarm intelligence systems. We believe that in the future, swarm intelligence will be an important tool for researchers and engineers interested in solving certain classes of complex problems. To build the foundations of this discipline and to develop an appropriate methodology, we will proceed in parallel both at an abstract level and by tackling a number of challenging problems in selected research domains. The research domains we have chosen are optimization, robotics, networks, and data mining.
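As a concrete, well-known example of the kind of decentralized coordination rule studied in swarm intelligence (given as background, not as a project result), the pheromone update of basic ant colony optimization reads

\tau_{ij} \leftarrow (1-\rho)\,\tau_{ij} + \sum_{k=1}^{m} \Delta\tau_{ij}^{k}, \qquad \Delta\tau_{ij}^{k} = Q/L_k \ \text{if ant } k \text{ traversed edge } (i,j), \ \text{and } 0 \text{ otherwise},

where \rho is the evaporation rate, m the number of ants, L_k the length of ant k's tour and Q a constant; edges used by good solutions accumulate pheromone and bias the probabilistic choices of subsequent ants, without any central controller.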
Max ERC Funding
2 016 000 €
Duration
Start date: 2010-06-01, End date: 2015-05-31
Project acronym FHiCuNCAG
Project Foundations for Higher and Curved Noncommutative Algebraic Geometry
Researcher (PI) Wendy Joy Lowen
Host Institution (HI) UNIVERSITEIT ANTWERPEN
Call Details Consolidator Grant (CoG), PE1, ERC-2018-COG
Summary With this research programme, inspired by open problems within noncommutative algebraic geometry (NCAG) as well as by recent developments in algebraic topology, it is our aim to lay out new foundations for NCAG. On the one hand, the categorical approach to geometry put forth in NCAG has seen a wide range of applications both in mathematics and in theoretical physics. On the other hand, algebraic topology has received a vast impetus from the development of higher topos theory by Lurie and others. The current project is aimed at cross-fertilisation between the two subjects, in particular through the development of “higher linear topos theory”. We will approach the higher structure on Hochschild type complexes from two angles. Firstly, focusing on intrinsic incarnations of spaces as large categories, we will use the tensor products developed jointly with Ramos González and Shoikhet to obtain a “large version” of the Deligne conjecture. Secondly, focusing on concrete representations, we will develop new operadic techniques in order to endow complexes like the Gerstenhaber-Schack complex for prestacks (due to Dinh Van-Lowen) and the deformation complexes for monoidal categories and pasting diagrams (due to Shrestha and Yetter) with new combinatorial structure. In another direction, we will move from Hochschild cohomology of abelian categories (in the sense of Lowen-Van den Bergh) to Mac Lane cohomology for exact categories (in the sense of Kaledin-Lowen), extending the scope of NCAG to “non-linear deformations”. One of the mysteries in algebraic deformation theory is the curvature problem: in the process of deformation we are brought to the boundaries of NCAG territory through the introduction of a curvature component which disables the standard approaches to cohomology. Eventually, it is our goal to set up a new framework for NCAG which incorporates curved objects, drawing inspiration from the realm of higher categories.
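For orientation, the Hochschild-type complexes referred to above generalize the classical Hochschild cochain complex of an associative algebra A, whose differential on an n-cochain \varphi\colon A^{\otimes n} \to A is (standard background, not project-specific):

(d\varphi)(a_1,\dots,a_{n+1}) = a_1\,\varphi(a_2,\dots,a_{n+1}) + \sum_{i=1}^{n} (-1)^{i}\,\varphi(a_1,\dots,a_i a_{i+1},\dots,a_{n+1}) + (-1)^{n+1}\,\varphi(a_1,\dots,a_n)\,a_{n+1},

and the second cohomology of this complex classifies the first-order deformations of the multiplication of A, which is the link to the algebraic deformation theory mentioned in the summary.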
Max ERC Funding
1 171 360 €
Duration
Start date: 2019-06-01, End date: 2024-05-31
Project acronym FOREFRONT
Project Frontiers of Extended Formulations
Researcher (PI) Samuel Fiorini
Host Institution (HI) UNIVERSITE LIBRE DE BRUXELLES
Call Details Consolidator Grant (CoG), PE6, ERC-2013-CoG
Summary Linear programming has proved to be an invaluable tool both in theory and practice. Semidefinite programming surpasses linear programming in terms of expressivity while remaining tractable. This project proposal investigates the modeling power of linear and semidefinite programming in the context of combinatorial optimization. Within the emerging framework of extended formulations (EFs), I seek a decisive answer to the following question: which problems can be modeled by a linear or semidefinite program, when the number of constraints and variables is limited? EFs are based on the idea that one should choose the “right” variables to model a problem. By extending the set of variables of a problem by a few carefully chosen variables, the number of constraints can in some cases dramatically decrease, making the problem easier to solve. Despite previous high-quality research, the theory of EFs is still at square one. This project proposal aims at (i) transforming our current zero-dimensional state of knowledge to a truly three-dimensional state of knowledge by pushing the boundaries of EFs in three directions (models, types and problems); (ii) using EFs as a lens on complexity by proving strong consequences of important conjectures such as P != NP, and leveraging strong connections to geometry to make progress on the log-rank conjecture. The proposed methodology is: (i) experiment-aided; (ii) interdisciplinary; (iii) constructive.
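A textbook example of the phenomenon exploited by EFs, included here for illustration and not as a contribution of the proposal: the cross-polytope

P = \{ x \in \mathbb{R}^n : \textstyle\sum_{i=1}^{n} |x_i| \le 1 \}

has 2^n facet-defining inequalities, yet it is the projection onto the x-variables of the extended formulation

Q = \{ (x,y) \in \mathbb{R}^n \times \mathbb{R}^n : -y_i \le x_i \le y_i \ (i=1,\dots,n), \ \textstyle\sum_{i=1}^{n} y_i \le 1 \},

which uses only 2n variables and 2n + 1 linear constraints; choosing the “right” extra variables y collapses an exponential description into a linear-size one.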
Max ERC Funding
1 455 479 €
Duration
Start date: 2014-09-01, End date: 2019-08-31
Project acronym FORSIED
Project Formalizing Subjective Interestingness in Exploratory Data Mining
Researcher (PI) Tijl De Bie
Host Institution (HI) UNIVERSITEIT GENT
Call Details Consolidator Grant (CoG), PE6, ERC-2013-CoG
Summary "The rate at which research labs, enterprises and governments accumulate data is high and fast increasing. Often, these data are collected for no specific purpose, or they turn out to be useful for unanticipated purposes: Companies constantly look for new ways to monetize their customer databases; Governments mine various databases to detect tax fraud; Security agencies mine and cross-associate numerous heterogeneous information streams from publicly accessible and classified databases to understand and detect security threats. The objective in such Exploratory Data Mining (EDM) tasks is typically ill-defined, i.e. it is unclear how to formalize how interesting a pattern extracted from the data is. As a result, EDM is often a slow process of trial and error.
During this fellowship we aim to develop the mathematical principles of what makes a pattern interesting in a very subjective sense. Crucial in this endeavour will be research into automatic mechanisms to model and duly consider the prior beliefs and expectations of the user for whom the EDM patterns are intended, thus relieving the users of the complex task to attempt to formalize themselves what makes a pattern interesting to them.
This project will represent a radical change in how EDM research is done. Currently, researchers typically imagine a specific purpose for the patterns, try to formalize interestingness of such patterns given that purpose, and design an algorithm to mine them. However, given the variety of users, this strategy has led to a multitude of algorithms. As a result, users need to be data mining experts to understand which algorithm applies to their situation. To resolve this, we will develop a theoretically solid framework for the design of EDM systems that model the user's beliefs and expectations as much as the data itself, so as to maximize the amount of useful information transmitted to the user. This will ultimately bring the power of EDM within reach of the non-expert."
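As a deliberately simplified reading of "maximizing the useful information transmitted to the user", one could score a pattern by its self-information under a background model encoding the user's prior beliefs, traded off against the cost of communicating it. The sketch below is illustrative only: the function names and the particular score are assumptions for exposition, not the project's formalism.

```python
import math

def surprisal(pattern_prob_under_prior: float) -> float:
    """Self-information (in bits) of a pattern to which the user's background
    model assigns probability `pattern_prob_under_prior`."""
    return -math.log2(pattern_prob_under_prior)

def subjective_interestingness(pattern_prob_under_prior: float,
                               description_length_bits: float) -> float:
    """Toy score: information gained per bit needed to communicate the pattern.
    A pattern the user already expects (probability close to 1) scores ~0,
    however striking it may look 'objectively'."""
    return surprisal(pattern_prob_under_prior) / description_length_bits

# Two patterns of equal description length (20 bits); the one the user's prior
# finds less plausible is the more interesting one for that user.
expected_pattern   = subjective_interestingness(0.50, 20.0)    # ~0.05
surprising_pattern = subjective_interestingness(0.001, 20.0)   # ~0.50
```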
Max ERC Funding
1 549 315 €
Duration
Start date: 2014-05-01, End date: 2019-04-30
Project acronym HHNCDMIR
Project Hochschild cohomology, non-commutative deformations and mirror symmetry
Researcher (PI) Wendy Lowen
Host Institution (HI) UNIVERSITEIT ANTWERPEN
Call Details Starting Grant (StG), PE1, ERC-2010-StG_20091028
Summary "Our research programme addresses several interesting current issues in non-commutative algebraic geometry, and important links with symplectic geometry and algebraic topology. Non-commutative algebraic geometry is concerned with the study of algebraic objects in geometric ways. One of the basic philosophies is that, in analogy with (derived) categories of (quasi-)coherent sheaves over schemes and (derived) module categories, non-commutative spaces can be represented by suitable abelian or triangulated categories. This point of view has proven extremely useful in non-commutative algebra, algebraic geometry and more recently in string theory thanks to the Homological Mirror Symmetry conjecture. One of our main aims is to set up a deformation framework for non-commutative spaces represented by ""enhanced"" triangulated categories, encompassing both the non-commutative schemes represented by derived abelian categories and the derived-affine spaces, represented by dg algebras. This framework should clarify and resolve some of the important problems known to exist in the deformation theory of derived-affine spaces. It should moreover be applicable to Fukaya-type categories, and yield a new way of proving and interpreting instances of ""deformed mirror symmetry"". This theory will be developed in interaction with concrete applications of the abelian deformation theory developed in our earlier work, and with the development of new decomposition and comparison techniques for Hochschild cohomology. By understanding the links between the different theories and fields of application, we aim to achieve an interdisciplinary understanding of non-commutative spaces using abelian and triangulated structures."
Max ERC Funding
703 080 €
Duration
Start date: 2010-10-01, End date: 2016-09-30
Project acronym IMPaCT
Project Implementing Multi-Party Computation Technology
Researcher (PI) Nigel Paul Smart
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Advanced Grant (AdG), PE6, ERC-2015-AdG
Summary The goal of IMPaCT is to turn Multi-Party Computation (MPC) from a stage in which we are beginning to obtain practical feasibility results to a stage in which we have fully practical systems. It has long been acknowledged that MPC has the potential to provide a transformative change in the way security solutions are enabled. As it presently stands, this is only possible in limited applications; deployments in restricted scenarios are beginning to emerge. However, in turning MPC into a fully practical technology a number of key scientific challenges need to be solved, many of which have not yet even been considered in the theoretical literature. The IMPaCT project aims to address this scientific gap, bridge it, and so provide the tools for a future road-map in which MPC can be deployed as a widespread tool, as ubiquitous as encryption and digital signatures are today.
Our scientific approach will be to investigate new MPC protocols and techniques which take into account practical constraints and issues that would arise in future application scenarios. Our work, despite being scientifically rigorous and driven by deep theoretical insight, will be grounded in practical considerations. All systems and protocols proposed will be prototyped so as to ensure that practical real-world issues are taken into account. In addition, we will use our extensive industrial linkages to ensure a two-way dialogue between potential users and the developers of MPC technology, thus helping to embed the future impact of the work in IMPaCT.
Our workplan is focused on key scientific challenges which we have identified on the road to fully practical MPC applications. These include the design of methodologies to cope with the asynchronicity of networks, how to realistically measure and model the performance of MPC protocols, how to utilize low-round-complexity protocols in practice, how to deal with problems with large input sizes (e.g. streaming data), and many more.
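To make the object of study concrete, the sketch below shows additive secret sharing, a generic textbook building block underlying many MPC protocols; it is illustrative background, not one of the protocols proposed in the project.

```python
import secrets

P = 2**61 - 1  # a prime modulus; any sufficiently large prime works

def share(secret: int, n_parties: int) -> list[int]:
    """Split `secret` into n additive shares: uniformly random values whose sum
    modulo P equals the secret. Any n-1 of the shares reveal nothing about it."""
    shares = [secrets.randbelow(P) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % P)
    return shares

def reconstruct(shares: list[int]) -> int:
    """Recombine all shares to recover the secret."""
    return sum(shares) % P

# Each party adds its shares of x and y locally; the resulting shares
# reconstruct x + y although no party ever sees x or y in the clear.
x_shares, y_shares = share(12, 3), share(30, 3)
z_shares = [(a + b) % P for a, b in zip(x_shares, y_shares)]
assert reconstruct(z_shares) == 42
```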
Max ERC Funding
2 499 938 €
Duration
Start date: 2016-10-01, End date: 2021-09-30
Project acronym INVEST
Project inVEST: Foundations for a Shift from Verification to Synthesis
Researcher (PI) Jean-Francois Raskin
Host Institution (HI) UNIVERSITE LIBRE DE BRUXELLES
Call Details Starting Grant (StG), PE6, ERC-2011-StG_20101014
Summary Reactive systems are computer systems that maintain a continuous interaction with the environment in which they execute. Examples of reactive systems are controllers embedded in cars or planes, system-level software, device drivers, communication protocols, etc. On the one hand, those systems are notoriously difficult to develop correctly (because of characteristics like concurrency, real-time constraints, parallelism, etc.). On the other hand, their correctness is often critical, as they are used in contexts where safety is an issue, or for economic reasons related to mass production.
To ensure the reliability of reactive systems, advanced verification techniques have been developed. One particularly successful approach is model-checking. Nevertheless, model-checking is used to find bugs in designs, but it does not support the design itself.
In this project, we want to develop new algorithms and tools to support the automatic synthesis of modern reactive systems (instead of their verification a posteriori). We aim at a shift from verification to synthesis. To allow this shift, we need new foundations: we propose to generalize transition systems and automata – the models of computation in the classical approach to verification – to the more flexible, and mathematically deeper, game-theoretic framework. Our work will be of a fundamental nature but will also aim at the development of algorithms and tools. These new foundations will allow for the development of a new generation of computer-aided design tools that will support the automatic synthesis of modern reactive systems and ensure correctness by construction.
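As a small illustration of the game-theoretic view of synthesis (a standard textbook construction, not the project's contribution): for a finite two-player, turn-based game with a safety objective, the controller's winning region is the greatest fixpoint of the controllable-predecessor operator, and a correct-by-construction strategy simply stays inside that region.

```python
def solve_safety_game(owner, successors, safe):
    """owner[s] is 0 (controller) or 1 (environment); successors[s] is an
    iterable of successor states. Returns the set of states from which the
    controller can keep the play inside `safe` forever, computed as a
    greatest fixpoint."""
    win = set(safe)
    changed = True
    while changed:
        changed = False
        for s in list(win):
            succ = list(successors[s])
            if owner[s] == 0:
                can_stay = any(t in win for t in succ)   # controller picks one move
            else:
                can_stay = all(t in win for t in succ)   # environment may pick any
            if not can_stay:
                win.remove(s)
                changed = True
    return win

# Toy arena: from state "a" the controller can loop on "a" or fall into the
# unsafe sink "bad"; staying on the loop is the synthesized (winning) strategy.
owner = {"a": 0, "bad": 1}
successors = {"a": ["a", "bad"], "bad": ["bad"]}
assert solve_safety_game(owner, successors, safe={"a"}) == {"a"}
```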
Max ERC Funding
1 415 255 €
Duration
Start date: 2012-01-01, End date: 2017-09-30
Project acronym INVPROB
Project Inverse Problems
Researcher (PI) Lassi Juhani Päivärinta
Host Institution (HI) TALLINNA TEHNIKAULIKOOL
Call Details Advanced Grant (AdG), PE1, ERC-2010-AdG_20100224
Summary Inverse problems constitute an interdisciplinary field of science concentrating on the mathematical theory and practical interpretation of indirect measurements. Their applications include medical imaging, atmospheric remote sensing, industrial process monitoring, and astronomical imaging. The common feature is extreme sensitivity to measurement noise. Computerized tomography, MRI, and the exploration of the interior of the Earth using earthquake data are typical inverse problems where mathematics has played an important role. By using the methods of inverse problems it is possible to bring modern mathematics to a vast number of applied fields. Genuine scientific innovations found in mathematical research, say in geometry, stochastics, or analysis, can be brought to real-life applications through modelling. The solutions are often found by combining recent theoretical and computational advances. The study of inverse problems is one of the most active and fastest-growing areas of modern applied mathematics, and the most interdisciplinary field of mathematics, or even of science in general.
The exciting but high-risk problems in the research plan of the PI include the mathematics of invisibility cloaking, invisible patterns, practical algorithms for imaging, and random quantum systems. Progress on these problems could have a considerable impact in applications such as the construction of metamaterials for invisible optic fibre cables, scopes for MRI devices, and early screening for breast cancer. Progress here necessitates international collaboration. This will be realized in upcoming programs on inverse problems. The PI is involved in organizing semester programs on inverse problems at MSRI in 2010, the Isaac Newton Institute in 2011, and the Mittag-Leffler Institute in 2012.
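To illustrate the "extreme sensitivity to measurement noise" mentioned above (a generic numerical example, not taken from the project): for an ill-conditioned forward operator, naive inversion amplifies tiny data errors enormously, whereas Tikhonov regularization trades a small bias for stability.

```python
import numpy as np

rng = np.random.default_rng(0)

# A classic ill-conditioned forward operator: the 12 x 12 Hilbert matrix.
n = 12
A = 1.0 / (np.arange(n)[:, None] + np.arange(n)[None, :] + 1.0)
x_true = np.ones(n)
y = A @ x_true + 1e-8 * rng.standard_normal(n)   # data perturbed by tiny noise

# Naive inversion: the error is blown up by the huge condition number of A.
x_naive = np.linalg.solve(A, y)

# Tikhonov regularization: minimize ||A x - y||^2 + alpha ||x||^2.
alpha = 1e-10
x_tikh = np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ y)

print(np.linalg.cond(A))                  # around 1e16 for the Hilbert matrix
print(np.linalg.norm(x_naive - x_true))   # typically orders of magnitude larger ...
print(np.linalg.norm(x_tikh - x_true))    # ... than the regularized reconstruction error
```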
Max ERC Funding
1 800 000 €
Duration
Start date: 2011-03-01, End date: 2016-02-29