Project acronym ALGILE
Project Foundations of Algebraic and Dynamic Data Management Systems
Researcher (PI) Christoph Koch
Host Institution (HI) ECOLE POLYTECHNIQUE FEDERALE DE LAUSANNE
Call Details Starting Grant (StG), PE6, ERC-2011-StG_20101014
Summary "Contemporary database query languages are ultimately founded on logic and feature an additive operation – usually a form of (multi)set union or disjunction – that is asymmetric in that additions or updates do not always have an inverse. This asymmetry puts a greater part of the machinery of abstract algebra for equation solving outside the reach of databases. However, such equation solving would be a key functionality that problems such as query equivalence testing and data integration could be reduced to: In the current scenario of the presence of an asymmetric additive operation they are undecidable. Moreover, query languages with a symmetric additive operation (i.e., which has an inverse and is thus based on ring theory) would open up databases for a large range of new scientific and mathematical applications.
The goal of the proposed project is to reinvent database management systems with a foundation in abstract algebra and specifically in ring theory. The presence of an additive inverse makes it possible to cleanly define differences between queries. This gives rise to a database analog of differential calculus that leads to radically new incremental and adaptive query evaluation algorithms that substantially outperform state-of-the-art techniques. These algorithms enable a new class of systems which I call Dynamic Data Management Systems. Such systems can maintain continuously fresh query views at extremely high update rates and have important applications in interactive Large-scale Data Analysis. There is a natural connection between differences and updates, motivating the group-theoretic study of updates that will lead to better ways of creating out-of-core data processing algorithms for new storage devices. Basing queries on ring theory leads to a new class of systems, Algebraic Data Management Systems, which herald a convergence of database systems and computer algebra systems."
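As a small editorial illustration of the ring-based view maintenance idea described in the summary (a sketch under assumed names, not code from the project), relations can be represented as maps from tuples to signed multiplicities; because multiplicities form a ring, a deletion is just an addition with a negative multiplicity, and a join view can be refreshed by applying only the delta of an update:

    from collections import defaultdict

    def add(r, s):
        # ring addition of two relations (signed multiset union)
        out = defaultdict(int)
        for rel in (r, s):
            for t, m in rel.items():
                out[t] += m
        return {t: m for t, m in out.items() if m != 0}

    def join(r, s):
        # natural join on the first attribute; multiplicities multiply (ring product)
        out = defaultdict(int)
        for (a, b), m in r.items():
            for (a2, c), n in s.items():
                if a == a2:
                    out[(a, b, c)] += m * n
        return dict(out)

    # Incremental maintenance of the view Q = R join S: for an update dR,
    # the delta rule dQ = dR join S refreshes Q without recomputing the full join.
    R = {("a", 1): 1, ("b", 2): 1}
    S = {("a", 10): 1}
    Q = join(R, S)                   # {('a', 1, 10): 1}
    dR = {("a", 1): -1}              # a deletion: the additive inverse exists
    Q = add(Q, join(dR, S))          # apply only the change; Q is now empty
    print(Q)                         # {}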
Max ERC Funding
1 480 548 €
Duration
Start date: 2012-01-01, End date: 2016-12-31
Project acronym BATMAN
Project Development of Quantitative Metrologies to Guide Lithium Ion Battery Manufacturing
Researcher (PI) Vanessa Wood
Host Institution (HI) EIDGENOESSISCHE TECHNISCHE HOCHSCHULE ZUERICH
Call Details Starting Grant (StG), PE8, ERC-2015-STG
Summary Lithium ion batteries offer tremendous potential as an enabling technology for sustainable transportation and development. However, their widespread usage as the energy storage solution for electric mobility and grid-level integration of renewables is impeded by the fact that current state-of-the-art lithium ion batteries have energy densities that are too small, charge and discharge rates that are too low, and costs that are too high. Highly publicized instances of catastrophic failure of lithium ion batteries raise questions of safety. Understanding the limits to battery performance and the origins of degradation and failure is highly complex due to the difficulties in studying interrelated processes that take place at different length and time scales in a corrosive environment. In this project, we will (1) develop and implement quantitative methods to study the complex interrelations between structure and electrochemistry occurring at the nano-, micro-, and milli-scales in lithium ion battery active materials and electrodes, (2) conduct systematic experimental studies with our new techniques to understand the origins of performance limitations and to develop design guidelines for achieving high-performance and safe batteries, and (3) investigate economically viable engineering solutions based on these guidelines to achieve high-performance and safe lithium ion batteries.
Max ERC Funding
1 500 000 €
Duration
Start date: 2016-05-01, End date: 2021-04-30
Project acronym BIGCODE
Project Learning from Big Code: Probabilistic Models, Analysis and Synthesis
Researcher (PI) Martin Vechev
Host Institution (HI) EIDGENOESSISCHE TECHNISCHE HOCHSCHULE ZUERICH
Call Details Starting Grant (StG), PE6, ERC-2015-STG
Summary The goal of this proposal is to fundamentally change the way we build and reason about software. We aim to develop new kinds of statistical programming systems that provide probabilistically likely solutions to tasks that are difficult or impossible to solve with traditional approaches.
These statistical programming systems will be based on probabilistic models of massive codebases (also known as "Big Code") built via a combination of advanced programming languages and powerful machine learning and natural language processing techniques. To solve a particular challenge, a statistical programming system will query a probabilistic model, compute the most likely predictions, and present those to the developer.
Based on probabilistic models of "Big Code", we propose to investigate new statistical techniques in the context of three fundamental research directions: i) statistical program synthesis, where we develop techniques that automatically synthesize and predict new programs, ii) statistical prediction of program properties, where we develop new techniques that can predict important facts (e.g., types) about programs, and iii) statistical translation of programs, where we investigate new techniques for translating programs (e.g., from one programming language to another, or to a natural language).
We believe the research direction outlined in this interdisciplinary proposal opens a new and exciting area of computer science. This area will combine sophisticated statistical learning and advanced programming language techniques for building the next-generation statistical programming systems.
We expect the results of this proposal to have an immediate impact upon millions of developers worldwide, triggering a paradigm shift in the way tomorrow's software is built, as well as a long-lasting impact on scientific fields such as machine learning, natural language processing, programming languages and software engineering.
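As a toy editorial illustration of the workflow described in the summary (my own simplification, not the proposal's system), the following trains a simple frequency-based model that, given a surrounding code token, returns the most likely prediction for a program property such as a type:

    from collections import Counter, defaultdict

    class ContextModel:
        """Empirical model of P(label | context token) estimated from a code corpus."""
        def __init__(self):
            self.counts = defaultdict(Counter)

        def train(self, examples):
            # examples: iterable of (context_token, label) pairs, e.g. ("len(", "int")
            for ctx, label in examples:
                self.counts[ctx][label] += 1

        def predict(self, ctx, k=1):
            # return the k most likely labels with their empirical probabilities
            c = self.counts[ctx]
            total = sum(c.values()) or 1
            return [(label, n / total) for label, n in c.most_common(k)]

    model = ContextModel()
    model.train([("len(", "int"), ("len(", "int"), ("open(", "TextIO")])
    print(model.predict("len("))   # [('int', 1.0)]

A real system of the kind the proposal envisions would replace the frequency counts with richer probabilistic models learned from massive codebases, but the query-then-rank interaction is the same.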
Max ERC Funding
1 500 000 €
Duration
Start date: 2016-04-01, End date: 2021-03-31
Project acronym CATACOAT
Project Nanostructured catalyst overcoats for renewable chemical production from biomass
Researcher (PI) Jeremy Scott LUTERBACHER
Host Institution (HI) ECOLE POLYTECHNIQUE FEDERALE DE LAUSANNE
Call Details Starting Grant (StG), PE8, ERC-2017-STG
Summary In the CATACOAT project, we will develop layer-by-layer solution-processed catalyst overcoating methods, which will result in catalysts that have both targeted and broad impacts. We will produce highly active, stable and selective catalysts for the upgrading of lignin – the largest natural source of aromatic chemicals – into commodity chemicals, which will have an important targeted impact. The broader impact of our work will lie in the production of catalytic materials with unprecedented control over the active site architecture.
There is an urgent need to provide these cheap, stable, selective, and highly active catalysts for renewable molecule production. Thanks to its availability and relatively low cost, lignocellulosic biomass is an attractive source of renewable carbon. However, unlike petroleum, biomass-derived molecules are highly oxygenated, and often produced in dilute aqueous streams. Heterogeneous catalysts – the workhorses of the petrochemical industry – are sensitive to water and contain many metals that easily sinter and leach in liquid-phase conditions. The production of renewable chemicals from biomass, especially valuable aromatics, often requires expensive platinum group metals and suffers from low selectivity.
Catalyst overcoating presents a potential solution to this problem. Recent breakthroughs using catalyst overcoating with atomic layer deposition (ALD) showed that base metal catalysts can be stabilized against sintering and leaching in liquid phase conditions. However, ALD creates dramatic drops in activity due to excessive coverage, and forms an overcoat that cannot be tuned.
Our materials will feature the controlled placement of metal sites (including single atoms), several oxide sites, and even molecular imprints with sub-nanometer precision within highly accessible nanocavities. We anticipate that such materials will create unprecedented opportunities for reducing cost and increasing sustainability in the chemical industry and beyond.
Max ERC Funding
1 785 195 €
Duration
Start date: 2017-12-01, End date: 2022-11-30
Project acronym CEMOS
Project Crystal Engineering for Molecular Organic Semiconductors
Researcher (PI) Kevin Sivula
Host Institution (HI) ECOLE POLYTECHNIQUE FEDERALE DE LAUSANNE
Call Details Starting Grant (StG), PE8, ERC-2013-StG
Summary "The urgent need to develop inexpensive and ubiquitous solar energy conversion cannot be overstated. Solution processed organic semiconductors can enable this goal as they support drastically less expensive fabrication techniques compared to traditional semiconductors. Molecular organic semiconductors (MOSs) offer many advantages to their more-common pi-conjugated polymer counterparts, however a clear and fundamental challenge to enable the goal of high performance solution-processable molecular organic semiconductor devices is to develop the ability to control the crystal packing, crystalline domain size, and mixing ability (for multicomponent blends) in the thin-film device geometry. The CEMOS project will accomplish this by pioneering innovative methods of “bottom-up” crystal engineering for organic semiconductors. We will employ specifically tailored molecules designed to leverage both thermodynamic and kinetic aspects of molecular organic semiconductor systems to direct and control crystalline packing, promote crystallite nucleation, compatibilize disparate phases, and plasticize inelastic materials. We will demonstrate that our new classes of materials can enable the tuning of the charge carrier transport and morphology in MOS thin films, and we will evaluate their performance in actual thin-film transistor (TFT) and organic photovoltaic (OPV) devices. Our highly interdisciplinary approach, combining material synthesis and device fabrication/evaluation, will not only lead to improvements in the performance and stability of OPVs and TFTs but will also give deep insights into how the crystalline packing—independent from the molecular structure—affects the optoelectronic properties. The success of CEMOS will rapidly advance the performance of MOS devices by enabling reproducible and tuneable performance comparable to traditional semiconductors—but at radically lower processing costs."
Max ERC Funding
1 477 472 €
Duration
Start date: 2014-01-01, End date: 2018-12-31
Project acronym COMET
Project foundations of COmputational similarity geoMETry
Researcher (PI) Michael Bronstein
Host Institution (HI) UNIVERSITA DELLA SVIZZERA ITALIANA
Call Details Starting Grant (StG), PE6, ERC-2012-StG_20111012
Summary "Similarity is one of the most fundamental notions encountered in problems practically in every branch of science, and is especially crucial in image sciences such as computer vision and pattern recognition. The need to quantify similarity or dissimilarity of some data is central to broad categories of problems involving comparison, search, matching, alignment, or reconstruction. The most common way to model a similarity is using metrics (distances). Such constructions are well-studied in the field of metric geometry, and there exist numerous computational algorithms allowing, for example, to represent one metric using another by means of isometric embeddings.
However, in many applications such a model appears to be too restrictive: many types of similarity are non-metric; it is not always possible to model the similarity precisely or completely e.g. due to missing data; some objects might be mutually incomparable e.g. if they are coming from different modalities. Such deficiencies of the metric similarity model are especially pronounced in large-scale computer vision, pattern recognition, and medical imaging applications.
The ambitious goal of this project is to introduce a paradigm shift in the way we model and compute similarity. We will develop a unifying framework of computational similarity geometry that extends the theoretical metric model, and will allow developing efficient numerical and computational tools for the representation and computation of generic similarity models. The methods will be developed all the way from mathematical concepts to efficiently implemented code and will be applied to today’s most important and challenging problems in Internet-scale computer vision and pattern recognition, shape analysis, and medical imaging."
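To make the metric model referred to above concrete (an editorial sketch, not part of the proposal), the following checks whether a pairwise dissimilarity matrix satisfies the metric axioms; similarities that violate them, for example asymmetric costs or costs breaking the triangle inequality, are exactly the non-metric cases the project targets:

    import numpy as np

    def is_metric(D, tol=1e-9):
        D = np.asarray(D, dtype=float)
        n = D.shape[0]
        if not np.allclose(np.diag(D), 0, atol=tol):   # identity: d(x, x) = 0
            return False
        if not np.allclose(D, D.T, atol=tol):          # symmetry: d(x, y) = d(y, x)
            return False
        for k in range(n):                             # triangle inequality via point k
            if np.any(D > D[:, [k]] + D[[k], :] + tol):
                return False
        return True

    D_metric = np.array([[0, 1, 2], [1, 0, 1], [2, 1, 0]])
    D_nonmetric = np.array([[0, 1, 5], [1, 0, 1], [5, 1, 0]])   # violates the triangle inequality
    print(is_metric(D_metric), is_metric(D_nonmetric))          # True False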
Max ERC Funding
1 495 020 €
Duration
Start date: 2012-10-01, End date: 2017-09-30
Project acronym COMPLEXDATA
Project Statistics for Complex Data: Understanding Randomness, Geometry and Complexity with a view Towards Biophysics
Researcher (PI) Victor Michael Panaretos
Host Institution (HI) ECOLE POLYTECHNIQUE FEDERALE DE LAUSANNE
Call Details Starting Grant (StG), PE1, ERC-2010-StG_20091028
Summary The ComplexData project aims to advance our understanding of the statistical treatment of varied types of complex data by generating new theory and methods, and to make progress on concrete current biophysical problems through the implementation of the new tools developed. Complex Data are data where the basic object of observation cannot be described in the standard Euclidean context of statistics, but rather needs to be thought of as an element of an abstract mathematical space with special properties. Scientific progress has, in recent years, begun to generate an increasing number of new and complex types of data that require statistical understanding and analysis. Four such types of data arising in current scientific research, and on which the project will focus, are: random integral transforms, random unlabelled shapes, random flows of functions, and random tensor fields. In these unconventional contexts for statistics, the strategy of the project will be to carefully exploit the special aspects involved due to geometry, dimension and randomness in order to either adapt and synthesize existing statistical methods or generate new statistical ideas altogether. However, the project will not restrict itself to merely studying the theoretical aspects of complex data, but will be truly interdisciplinary. The connecting thread among all the above data types is that their study is motivated by, and will be applied to, concrete practical problems arising in the study of biological structure, dynamics, and function: biophysics. For this reason, the programme will be in interaction with local and international contacts from this field. In particular, the theoretical/methodological output of the four programme research foci will be applied to gain insights in the following corresponding four application areas: electron microscopy, protein homology, DNA molecular dynamics, and brain imaging.
Max ERC Funding
681 146 €
Duration
Start date: 2011-06-01, End date: 2016-05-31
Project acronym CONQUEST
Project Controlled quantum effects and spin technology - from non-equilibrium physics to functional magnetics
Researcher (PI) Henrik Ronnow
Host Institution (HI) ECOLE POLYTECHNIQUE FEDERALE DE LAUSANNE
Call Details Starting Grant (StG), PE3, ERC-2010-StG_20091028
Summary The technology of the 20th century was dominated by a single material class: the semiconductors, whose properties can be tuned between those of metals and insulators, all of which is describable by single-electron effects. In contrast, quantum magnets and strongly correlated electron systems offer a full palette of quantum mechanical many-electron states. CONQUEST aims to discover, understand and demonstrate control over such quantum states. A new experimental approach, building on established powerful laboratory and neutron scattering techniques combined with dynamical control-perturbations, will be developed to study correlated quantum effects in magnetic materials. The immediate goal is to open a new field of non-equilibrium and time-dependent studies in solid state physics. The long-term vision is that the approach might nurture the materials of the 21st century.
Max ERC Funding
1 500 000 €
Duration
Start date: 2011-04-01, End date: 2016-03-31
Project acronym CONSTAMIS
Project Connecting Statistical Mechanics and Conformal Field Theory: an Ising Model Perspective
Researcher (PI) CLEMENT HONGLER
Host Institution (HI) ECOLE POLYTECHNIQUE FEDERALE DE LAUSANNE
Call Details Starting Grant (StG), PE1, ERC-2016-STG
Summary The developments of Statistical Mechanics and Quantum Field Theory are among the major achievements of 20th-century science. During the second half of the century, these two subjects started to converge. In two dimensions, this resulted in a most remarkable chapter of mathematical physics: Conformal Field Theory (CFT) reveals deep structures allowing for extremely precise investigations, making such theories powerful building blocks of many subjects of mathematics and physics. Unfortunately, this convergence has remained non-rigorous, leaving most of the spectacular field-theoretic applications to Statistical Mechanics conjectural.
About 15 years ago, several mathematical breakthroughs shed new light on this picture. The development of SLE curves and discrete complex analysis has enabled one to connect various statistical mechanics models with conformally symmetric processes. Recently, major progress was made on a key statistical mechanics model, the Ising model: the connection with SLE was established, and many formulae predicted by CFT were proven.
Important advances towards connecting Statistical Mechanics and CFT now appear possible. This is the goal of this proposal, which is organized in three objectives:
(I) Build a deep correspondence between the Ising model and CFT: reveal clear links between the objects and structures arising in the Ising and CFT frameworks.
(II) Gather the insights of (I) to study new connections to CFT, particularly for minimal models, current algebras and parafermions.
(III) Combine (I) and (II) to go beyond conformal symmetry: link the Ising model with massive integrable field theories.
The aim is to build one of the first rigorous bridges between Statistical Mechanics and CFT. It will help to close the gap between physical derivations and mathematical theorems. By linking the deep structures of CFT to concrete models that are applicable in many subjects, it will be potentially useful to theoretical and applied scientists.
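For reference (an editorial note, not part of the proposal text), the Ising model at the centre of objectives (I)-(III) is the standard nearest-neighbour model: a random assignment of spins whose probability is given by the Boltzmann weight

    \mathbb{P}[\sigma] \;=\; \frac{1}{Z_\beta}\,\exp\Big(\beta \sum_{i \sim j} \sigma_i \sigma_j\Big), \qquad \sigma_i \in \{-1, +1\},

where the sum runs over pairs of neighbouring sites, \beta is the inverse temperature and Z_\beta is the partition function; the conformal structure the project studies emerges in the scaling limit at the critical value of \beta.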
Max ERC Funding
998 005 €
Duration
Start date: 2017-03-01, End date: 2022-02-28
Project acronym COSPSENA
Project Coherence of Spins in Semiconductor Nanostructures
Researcher (PI) Dominik Max Zumbühl
Host Institution (HI) UNIVERSITAT BASEL
Call Details Starting Grant (StG), PE3, ERC-2007-StG
Summary Macroscopic control of quantum states is a major theme in much of modern physics because quantum coherence enables the study of fundamental physics and has promising applications for quantum information processing. The potential significance of quantum computing is recognized well beyond the physics community. For electron spins in GaAs quantum dots, it has become clear that decoherence caused by interactions with the nuclear spins is a major challenge. We propose to investigate and reduce hyperfine-induced decoherence with two complementary approaches: nuclear spin state narrowing and nuclear spin polarization. We propose a new projective state narrowing technique: a large, Coulomb-blockaded dot measures the qubit nuclear ensemble, resulting in enhanced spin coherence times. Further, mediated by an interacting 2D electron gas via the hyperfine interaction, a low-temperature nuclear ferromagnetic spin state was predicted, which we propose to investigate using a quantum point contact as a nuclear polarization detector. Estimates indicate that the nuclear ferromagnetic transition occurs in the sub-millikelvin range, well below the already hard-to-reach temperatures around 10 mK. However, the exciting combination of interacting electron and nuclear spin physics as well as applications in spin qubits gives ample incentive to strive for sub-millikelvin temperatures in nanostructures. We propose to build a novel type of nuclear demagnetization refrigerator aiming to reach electron temperatures of 0.1 mK in semiconductor nanostructures. This interdisciplinary project combines microkelvin physics and nanophysics, going well beyond the status quo. It is a challenging project that could be the beginning of a new era of coherent spin physics with unprecedented quantum control. This project requires a several-year commitment and a team of two graduate students plus one postdoctoral fellow.
Max ERC Funding
1 377 000 €
Duration
Start date: 2008-06-01, End date: 2013-05-31
Project acronym COSYM
Project Computational Symmetry for Geometric Data Analysis and Design
Researcher (PI) Mark Pauly
Host Institution (HI) ECOLE POLYTECHNIQUE FEDERALE DE LAUSANNE
Call Details Starting Grant (StG), PE6, ERC-2010-StG_20091028
Summary The analysis and synthesis of complex 3D geometric data sets is of crucial importance in many scientific disciplines (e.g. bio-medicine, material science, mechanical engineering, physics) and industrial applications (e.g. drug design, entertainment, architecture). We are currently witnessing a tremendous increase in the size and complexity of geometric data, largely fueled by significant advances in 3D acquisition and digital production technology. However, existing computational tools are often not suited to handle this complexity.
The goal of this project is to explore a fundamentally different way of processing 3D geometry. We will investigate a new generalized model of geometric symmetry as a unifying concept for studying spatial organization in geometric data. This model allows exposing the inherent redundancies in digital 3D data and will enable truly scalable algorithms for analysis, processing, and design of large-scale geometric data sets. The proposed research will address a number of fundamental questions: What is the information content of 3D geometric models? How can we represent, store, and transmit geometric data most efficiently? Can we use symmetry to repair deficiencies and reduce noise in acquired data? What is the role of symmetry in the design process and how can it be used to reduce complexity?
I will investigate these questions with an integrated approach that combines thorough theoretical studies with practical solutions for real-world applications.
The proposed research has a strong interdisciplinary component and will consider the same fundamental questions from different perspectives, closely interacting with scientists of various disciplines, as well as artists, architects, and designers.
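As a minimal editorial illustration of the kind of redundancy a symmetry model can expose (a toy example, not the project's method), the following tests whether a 2D point set is approximately invariant under a reflection:

    import numpy as np

    def has_reflection_symmetry(points, tol=1e-6):
        # reflect about the y-axis and check that the point set maps onto itself
        pts = np.asarray(points, dtype=float)
        mirrored = pts * np.array([-1.0, 1.0])
        dists = np.linalg.norm(mirrored[:, None, :] - pts[None, :, :], axis=-1)
        return bool(np.all(dists.min(axis=1) < tol))

    square = [(-1, -1), (1, -1), (1, 1), (-1, 1)]    # symmetric under x -> -x
    skewed = [(-1, -1), (2, -1), (1, 1), (-1, 1)]    # not symmetric
    print(has_reflection_symmetry(square), has_reflection_symmetry(skewed))   # True False

A generalized symmetry model of the kind proposed would go beyond exact global reflections to partial, approximate and structured symmetries, but the underlying question, whether part of the data is redundant under some transformation, is the same.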
Max ERC Funding
1 160 302 €
Duration
Start date: 2011-02-01, End date: 2016-01-31
Project acronym DAPP
Project Data-centric Parallel Programming
Researcher (PI) Torsten Hoefler
Host Institution (HI) EIDGENOESSISCHE TECHNISCHE HOCHSCHULE ZUERICH
Call Details Starting Grant (StG), PE6, ERC-2015-STG
Summary We address a fundamental and increasingly important challenge in computer science: how to program large-scale heterogeneous parallel computers. Society relies on these computers to satisfy the growing demands of important applications such as drug design, weather prediction, and big data analytics. Architectural trends make heterogeneous parallel processors the fundamental building blocks of computing platforms ranging from quad-core laptops to million-core supercomputers; failing to exploit these architectures efficiently will severely limit the technological advance of our society. Computationally demanding problems are often inherently parallel and can readily be compiled for various target architectures. Yet, efficiently mapping data to the target memory system is notoriously hard, and fetching two operands from remote memory is already orders of magnitude more expensive than any arithmetic operation. Data access cost grows with the amount of parallelism, which makes data layout optimizations crucial. Prevalent parallel programming abstractions largely ignore data access and guide programmers to design threads of execution that are scheduled to the machine. We depart from this control-centric model to a data-centric program formulation where we express programs as collections of values, called memlets, that are mapped as first-class objects by the compiler and runtime system. Our holistic compiler and runtime system aims to substantially advance the state of the art in parallel computing by combining static and dynamic scheduling of memlets to complex heterogeneous target architectures. We will demonstrate our methods on three challenging real-world applications in scientific computing, data analytics, and graph processing. We strongly believe that, without holistic data-centric programming, the growing complexity and inefficiency of parallel programming will create a scaling wall that will limit our future computational capabilities.
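As a rough, purely illustrative sketch of what a data-centric formulation could look like (the names and interfaces below are editorial inventions, not the project's actual design), a program can be written as a collection of explicit data-movement declarations that a runtime maps onto hardware; independent movements are then free to be scheduled in parallel:

    from dataclasses import dataclass
    from typing import Callable, Dict
    import numpy as np

    @dataclass
    class Memlet:            # hypothetical container, name borrowed from the summary
        src: str             # container the data is read from
        dst: str             # container the data is written to
        subset: slice        # which elements move
        op: Callable         # computation applied to the data as it moves

    def run(containers: Dict[str, np.ndarray], memlets):
        # a sequential stand-in for a compiler/runtime that maps memlets to hardware
        for m in memlets:
            containers[m.dst][m.subset] = m.op(containers[m.src][m.subset])

    data = {"A": np.arange(8.0), "B": np.zeros(8)}
    program = [Memlet("A", "B", slice(0, 4), lambda x: 2 * x),
               Memlet("A", "B", slice(4, 8), lambda x: x + 1)]   # disjoint subsets: parallelizable
    run(data, program)
    print(data["B"])   # [0. 2. 4. 6. 5. 6. 7. 8.]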
Max ERC Funding
1 499 672 €
Duration
Start date: 2016-06-01, End date: 2021-05-31
Project acronym DYNCORSYS
Project Real-time dynamics of correlated many-body systems
Researcher (PI) Philipp Werner
Host Institution (HI) UNIVERSITE DE FRIBOURG
Call Details Starting Grant (StG), PE3, ERC-2011-StG_20101014
Summary "Strongly correlated materials exhibit some of the most remarkable phenonomena found in condensed matter systems. They typically involve many active degrees of freedom (spin, charge, orbital), which leads to numerous competing states and complicated phase diagrams. A new perspective on correlated many-body systems is provided by the nonequilibrium dynamics, which is being explored in transport studies on nanostructures, pump-probe experiments on correlated solids, and in quench experiments on ultra-cold atomic gases.
An advanced theoretical framework for the study of correlated lattice models, which can be adapted to nonequilibrium situations, is dynamical mean field theory (DMFT). One aim of this proposal is to develop "nonequilibrium DMFT" into a powerful tool for the simulation of excitation and relaxation processes in interacting many-body systems. The big challenge in these simulations is the calculation of the real-time evolution of a quantum impurity model. Recently developed real-time impurity solvers have, however, opened the door to a wide range of applications. We will improve the efficiency and flexibility of these methods and develop complementary approaches, which will extend the accessible parameter regimes. This machinery will be used to study correlated lattice models under nonequilibrium conditions. The ultimate goal is to explore and qualitatively understand the nonequilibrium properties of "real" materials with active spin, charge, orbital and lattice degrees of freedom.
The ability to simulate the real-time dynamics of correlated many-body systems will be crucial for the interpretation of experiments and the discovery of correlation effects which manifest themselves only in the form of transient states. A proper understanding of the most basic nonequilibrium phenomena in correlated solids will help guide future experiments and hopefully lead to new technological applications such as ultra-fast switches or storage devices."
Max ERC Funding
1 493 178 €
Duration
Start date: 2012-02-01, End date: 2017-01-31
Project acronym ELECTROCHEMBOTS
Project MAGNETOELECTRIC CHEMONANOROBOTICS FOR CHEMICAL AND BIOMEDICAL APPLICATIONS
Researcher (PI) Salvador Pané Vidal
Host Institution (HI) EIDGENOESSISCHE TECHNISCHE HOCHSCHULE ZUERICH
Call Details Starting Grant (StG), PE8, ERC-2013-StG
Summary "The ability to generate electric fields at small scales is becoming increasingly important in many fields of research including plasmonics-based sensing, micro- and nanofabrication, microfluidics and spintronics. The localized generation of electrical fields at extremely small scales has the potential to revolutionize conventional methods of electrically stimulating cells. The objective of this proposal is the development of miniaturized untethered devices capable of delivering electric currents to cells for the stimulation of their vital functions. To this end, we propose the construction of micro- and nanoscale magnetoelectric structures that can be triggered using external magnetic fields. These small devices will consist of composite hybrid structures containing piezoelectric and magnetostrictive layers. By applying an oscillating magnetic field in the presence of a DC bias field, the magnetostrictive element will deform, thereby generating stress in a piezoelectric shell, which in turn will become electrically polarized. Small devices capable of wirelessly generating electric fields offer an innovative way of studying the electrical and electrochemical stimulation of cells. For example, by concentrating electric fields at specific locations in a cell, the behavior of protein membrane components such as cell adhesion molecules or transport proteins can be altered to modulate the stiction of proliferating cells or ion channel gating kinetics."
Max ERC Funding
1 491 701 €
Duration
Start date: 2013-09-01, End date: 2018-08-31
Project acronym EllipticPDE
Project Regularity and singularities in elliptic PDE's: beyond monotonicity formulas
Researcher (PI) Xavier ROS-OTON
Host Institution (HI) UNIVERSITAT ZURICH
Call Details Starting Grant (StG), PE1, ERC-2018-STG
Summary One of the oldest and most important questions in PDE theory is that of regularity. A classical example is Hilbert's XIXth problem (1900), solved by De Giorgi and Nash in 1956. During the second half of the XXth century, the regularity theory for elliptic and parabolic PDE's experienced a huge development, and many fundamental questions were answered by Caffarelli, Nirenberg, Krylov, Evans, Nadirashvili, Friedman, and many others. Still, there are problems of crucial importance that remain open.
The aim of this project is to go significantly beyond the state of the art in some of the most important open questions in this context. In particular, three key objectives of the project are the following. First, to introduce new techniques to obtain a fine description of singularities in nonlinear elliptic PDE's. Aside from its intrinsic interest, a good regularity theory for singular points is likely to provide insightful applications in other contexts. A second aim of the project is to establish generic regularity results for free boundaries and other PDE problems. The development of methods which would allow one to prove generic regularity results may be viewed as one of the greatest challenges not only for free boundary problems, but for PDE problems in general. Finally, the third main objective is to achieve a complete regularity theory for nonlinear elliptic PDE's that does not rely on monotonicity formulas. These three objectives, while seemingly different, are in fact deeply interrelated.
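For orientation (an editorial note, not part of the proposal), Hilbert's XIXth problem mentioned above can be phrased as follows: minimizers u of variational integrals

    \mathcal{E}(u) \;=\; \int_{\Omega} F(\nabla u)\, dx, \qquad F \ \text{smooth and uniformly convex},

are smooth; the key step in the De Giorgi-Nash solution is the Hölder regularity of solutions to uniformly elliptic equations in divergence form with merely measurable coefficients.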
Max ERC Funding
1 335 250 €
Duration
Start date: 2019-01-01, End date: 2023-12-31
Project acronym ETOPEX
Project Engineering Topological Phases and Excitations in Nanostructures with Interactions
Researcher (PI) Jelena KLINOVAJA
Host Institution (HI) UNIVERSITAT BASEL
Call Details Starting Grant (StG), PE3, ERC-2017-STG
Summary The main goal of this theory project is to propose engineered topological phases emerging only in strongly interacting systems and to identify the most feasible systems for experimental implementation. First, we will focus on setups hosting topological states localized at domain walls in one-dimensional channels such as parafermions, which are a new class of non-Abelian anyons and most promising candidates for topological quantum computing schemes. Second, in the framework of weakly coupled wires and planes, we will develop schemes for novel fractional topological phases in two- and three-dimensional interacting systems. To achieve these two goals, my team will identify necessary ingredients such as strong electron-electron interactions, helical magnetic order, or crossed Andreev proximity-induced superconductivity and address each of them separately. Later, we combine them to lead us to the desired topological phases and states. On our way to the main goal, as test cases, we will also study non-interacting analogies of the proposed effects such as Majorana fermions and integer topological insulators and pay close attention to the rapid experimental progress to come up with the most feasible proposals. We will study transport properties, scanning tunneling and atomic force microscopy. Especially for systems driven out of equilibrium, we will develop a Floquet-Luttinger liquid technique. We will explore the stability of engineered topological phases, error rates of topological qubits based on them, and computation schemes allowing for a set of universal qubit gates. We will strive to find a reasonable balance between topological stability and experimental
feasibility of setups. Our main theoretical tools are Luttinger liquid techniques (bosonization and renormalization group), Green functions, Floquet formalism, and numerical simulations in non-interacting test models.
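As a concrete example of the non-interacting test cases mentioned above, the standard toy model hosting Majorana end modes is the Kitaev chain (textbook material, quoted here only for orientation):
\[
H = \sum_{j} \Big[ -t\,\big(c_j^{\dagger} c_{j+1} + \mathrm{h.c.}\big) + \Delta\,\big(c_j c_{j+1} + \mathrm{h.c.}\big) - \mu\, c_j^{\dagger} c_j \Big],
\]
which sits in its topological phase, with one unpaired Majorana mode at each end, for $|\mu| < 2t$ and $\Delta \neq 0$. Parafermions can be viewed as fractional, interaction-enabled generalizations of such boundary modes.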
Max ERC Funding
1 158 403 €
Duration
Start date: 2018-01-01, End date: 2022-12-31
Project acronym FLATRONICS
Project Electronic devices based on nanolayers
Researcher (PI) Andras Kis
Host Institution (HI) ECOLE POLYTECHNIQUE FEDERALE DE LAUSANNE
Call Details Starting Grant (StG), PE3, ERC-2009-StG
Summary The main objective of this research proposal is to explore the electrical properties of nanoscale devices and circuits based on nanolayers. Nanolayers cover a wide span of possible electronic properties, ranging from semiconducting to superconducting. The possibility to form electrical circuits by varying their geometry offers rich research and practical opportunities. Together with graphene, nanolayers could form the material library for future nanoelectronics where different materials could be mixed and matched to different functionalities.
Max ERC Funding
1 799 996 €
Duration
Start date: 2009-09-01, End date: 2014-08-31
Project acronym FLIRT
Project Fluid Flows and Irregular Transport
Researcher (PI) Gianluca Crippa
Host Institution (HI) UNIVERSITAT BASEL
Call Details Starting Grant (StG), PE1, ERC-2015-STG
Summary "Several important partial differential equations (PDEs) arising in the mathematical description of physical phenomena exhibit transport features: physical quantities are advected by velocity fields that drive the dynamics of the system. This is the case for instance for the Euler equation of fluid dynamics, for conservation laws, and for kinetic equations.
A ubiquitous feature of these phenomena is their intrinsic lack of regularity. From the mathematical point of view this stems from the nonlinearity and/or nonlocality of the PDEs. Moreover, the lack of regularity also encodes actual properties of the underlying physical systems: conservation laws develop shocks (discontinuities that propagate in time), and solutions to the Euler equation exhibit rough and "disordered" behaviors. This irregularity is the major difficulty in the mathematical analysis of such problems, since it prevents the use of many standard methods, foremost the classical (and powerful) theory of characteristics.
For these reasons, the study, in a non smooth setting, of transport and continuity equations, and of flows of ordinary differential equations, is a fundamental tool for approaching challenging and important questions concerning these PDEs.
This project aims at establishing:
(1) deep insight into the structure of solutions of nonlinear PDEs, in particular the Euler equation and multidimensional systems of conservation laws,
(2) rigorous bounds for mixing phenomena in fluid flows, phenomena for which giving a precise mathematical formulation is extremely challenging.
The unifying factor of this proposal is that the analysis will rely on major advances in the theory of flows of ordinary differential equations in a non smooth setting, thus providing a robust formulation via characteristics for the PDEs under consideration. The guiding thread is the crucial role of geometric measure theory techniques, which are extremely efficient to describe and investigate irregular phenomena."
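For readers outside the field, the objects referred to above can be made explicit in standard notation (this is the classical smooth picture, not a result of the proposal):
\[
\partial_t u + b(t,x)\cdot\nabla u = 0, \qquad \partial_t \rho + \operatorname{div}\big(b(t,x)\,\rho\big) = 0
\]
are the transport and continuity equations driven by a velocity field $b$. When $b$ is smooth, both are solved along the characteristics, i.e. the flow of the ordinary differential equation
\[
\frac{d}{dt} X(t,x) = b\big(t, X(t,x)\big), \qquad X(0,x) = x,
\]
and the project aims at a robust analogue of this picture when $b$ is non smooth (e.g. of Sobolev or BV regularity).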
Max ERC Funding
1 009 351 €
Duration
Start date: 2016-06-01, End date: 2021-05-31
Project acronym GALATEA
Project Tailoring Material Properties Using Femtosecond Lasers: A New Paradigm for Highly Integrated Micro-/Nano- Scale Systems
Researcher (PI) Yves, Jérôme Bellouard
Host Institution (HI) ECOLE POLYTECHNIQUE FEDERALE DE LAUSANNE
Call Details Starting Grant (StG), PE8, ERC-2012-StG_20111012
Summary Using recent progress in laser technology, and in particular in the field of ultra-fast lasers, we are getting close to accomplishing the alchemist's dream of transforming materials. Compact lasers can generate pulses with ultra-high peak powers in the terawatt or even petawatt range. These high-power pulses lead to a radically different laser-matter interaction than the one obtained with conventional lasers. Non-linear multi-photon processes are observed; they open new and exciting opportunities to tailor matter in its intimate structure with sub-wavelength spatial resolution and in all three dimensions.
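A quick order-of-magnitude check of the quoted peak powers; the pulse energy and duration below are assumed, illustrative values.

# Peak power of an ultrafast pulse: energy divided by duration.
# Pulse parameters are illustrative assumptions.
pulse_energy = 1e-3        # J (1 mJ, assumed)
pulse_duration = 100e-15   # s (100 fs, assumed)
peak_power = pulse_energy / pulse_duration
print(f"Peak power ~ {peak_power:.1e} W")   # ~1e10 W = 10 GW per mJ at 100 fs
# Reaching the terawatt range at this pulse duration therefore takes roughly
# 100 mJ pulses, or correspondingly shorter pulses.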
This project aims at exploring the use of these ultrafast lasers to locally tailor the physical properties of glass materials. More specifically, our objective is to create polymorphs embedded in bulk structures and to demonstrate their use as a means to introduce new functionalities into the material.
The long-term objective is to develop the scientific understanding and technological know-how to create three-dimensional objects with nanoscale features where optics, fluidics and micromechanical elements as well as active functions are integrated in a single monolithic piece of glass and to do so using a single process.
This is a multidisciplinary research effort that pushes the frontier of our current knowledge of femtosecond laser interaction with glass to demonstrate a novel design platform for future micro-/nano-systems.
Max ERC Funding
1 757 396 €
Duration
Start date: 2012-12-01, End date: 2017-11-30
Project acronym GEQIT
Project Generalized (quantum) information theory
Researcher (PI) Renato Renner
Host Institution (HI) EIDGENOESSISCHE TECHNISCHE HOCHSCHULE ZUERICH
Call Details Starting Grant (StG), PE6, ERC-2010-StG_20091028
Summary Information theory is a branch of science that studies, from a mathematical perspective, the processing, transmission, and storage of information. The classical theory was established in 1948 by Claude Shannon and was later extended to incorporate processes where information is represented by the state of quantum systems.
A major limitation of the present theory of information is that many of its concepts and methods require, as an assumption, that the processes to be studied are iterated many times. For example, Shannon's well-known result that the Shannon entropy equals the optimal data compression rate assumes a source that repeatedly emits data according to the same given distribution. In addition, such results are often only valid asymptotically, as the number of iterations tends to infinity.
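A small illustration of the i.i.d. assumption behind the source-coding result mentioned above; the source distribution is an arbitrary example.

import math

# Shannon entropy of a memoryless source. By the source coding theorem this equals
# the optimal asymptotic compression rate (bits per symbol), but only under the
# assumption that the source repeatedly emits symbols from the same distribution.
p = {"a": 0.5, "b": 0.25, "c": 0.25}   # arbitrary example distribution
H = -sum(q * math.log2(q) for q in p.values())
print(f"H = {H} bits/symbol")          # 1.5 bits/symbol for this source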
While this limitation is normally acceptable when studying classical information-processing tasks such as channel coding (since communication channels are typically used repeatedly), it turns out to be a severe obstacle when analyzing new types of applications such as quantum cryptography. For instance, there is generally no sensible way to describe the attack strategy of an adversary against a quantum key distribution scheme as a recurrent process.
The goal of this project is to overcome this limitation and develop a theory of (classical and quantum) information which is completely general. Among the potential applications are new types of cryptographic schemes providing device-independent security. That is, their security guarantees hold independently of the details (and imperfections) of the actual implementations.
Max ERC Funding
1 288 792 €
Duration
Start date: 2010-12-01, End date: 2015-11-30
Project acronym GRAPHCPX
Project A graph complex valued field theory
Researcher (PI) Thomas Hans Willwacher
Host Institution (HI) EIDGENOESSISCHE TECHNISCHE HOCHSCHULE ZUERICH
Call Details Starting Grant (StG), PE1, ERC-2015-STG
Summary The goal of the proposed project is to create a universal (AKSZ type) topological field theory with values in graph complexes, capturing the rational homotopy types of manifolds, configuration and embedding spaces.
If successful, such a theory will unite certain areas of mathematical physics, topology, homological algebra and algebraic geometry. More concretely, from the physical viewpoint it would give a precise topological interpretation of a class of well studied topological field theories, as opposed to the current state of the art, in which these theories are defined by giving formulae without guarantees on the non-triviality of the produced invariants.
From the topological viewpoint such a theory will provide new tools to study much sought after objects like configuration and embedding spaces, and tentatively also diffeomorphism groups, through small combinatorial models given by Feynman diagrams. In particular, this will unite and extend existing graphical models of configuration and embedding spaces due to Kontsevich, Lambrechts, Volic, Arone, Turchin and others.
From the homological algebra viewpoint a field theory as above provides a wealth of additional algebraic structures on the graph complexes, which are some of the most central and most mysterious objects in the field.
Such algebraic structures are expected to yield constraints on the graph cohomology, as well as ways to construct series of previously unknown classes.
Max ERC Funding
1 162 500 €
Duration
Start date: 2016-07-01, End date: 2021-06-30
Project acronym HYBRIDQED
Project Hybrid Cavity Quantum Electrodynamics with Atoms and Circuits
Researcher (PI) Andreas Joachim Wallraff
Host Institution (HI) EIDGENOESSISCHE TECHNISCHE HOCHSCHULE ZUERICH
Call Details Starting Grant (StG), PE3, ERC-2009-StG
Summary We plan to investigate the strong coherent interaction of light and matter on the level of individual photons and atoms or atom-like systems. In particular, we will explore large dipole moment superconducting artificial atoms and natural Rydberg atoms interacting with radiation fields contained in quasi-one-dimensional on-chip microwave frequency resonators. In these resonators photons generate field strengths that exceed those in conventional mirror based resonators by orders of magnitude and they can also be stored for long times. This allows us to reach the strong coupling limit of cavity quantum electrodynamics (QED) using superconducting circuits, an approach known as circuit QED. In this project we will explore novel approaches to perform quantum optics experiments in circuits. We will develop techniques to generate and detect non-classical radiation fields using nonlinear resonators and chip-based interferometers. We will also further advance the circuit QED approach to quantum information processing. Our main goal is to develop an interface between circuit and atom based realizations of cavity QED. In particular, we will couple Rydberg atoms to on-chip resonators. To achieve this goal we will first investigate the interaction of ensembles of atoms in a beam with the coherent fields in a transmission line or a resonator. We will perform spectroscopy and we will investigate on-chip dispersive detection schemes for Rydberg atoms. We will also explore the interaction of Rydberg atoms with chip surfaces as a function of materials, temperature and geometry. Experiments will be performed from 300 K down to millikelvin temperatures. We will realize and characterize on-chip traps for Rydberg atoms. Using trapped atoms we will explore their coherent dynamics. Finally, we aim at investigating the single atom and single photon limit. When realized, this system will be used to explore the first quantum coherent interface between atomic and solid state qubits.
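The canonical model behind the strong-coupling regime described above is the Jaynes-Cummings Hamiltonian, quoted here in standard notation purely for orientation:
\[
H = \hbar\,\omega_r\, a^{\dagger} a \;+\; \frac{\hbar\,\omega_a}{2}\,\sigma_z \;+\; \hbar g\,\big(a^{\dagger}\sigma^{-} + a\,\sigma^{+}\big),
\]
where $a$ annihilates a resonator photon, $\sigma^{\pm}$ raise and lower the two-level (artificial or Rydberg) atom, and strong coupling means that $g$ exceeds both the cavity decay rate $\kappa$ and the atomic linewidth $\gamma$.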
Max ERC Funding
1 954 464 €
Duration
Start date: 2009-09-01, End date: 2014-08-31
Project acronym iModel
Project Intelligent Shape Modeling
Researcher (PI) Olga Sorkine
Host Institution (HI) EIDGENOESSISCHE TECHNISCHE HOCHSCHULE ZUERICH
Call Details Starting Grant (StG), PE6, ERC-2012-StG_20111012
Summary Digital 3D content creation and modeling has become an indispensable part of our technology-driven society. Any modern design and manufacturing process involves manipulation of digital 3D shapes. Many industries have long been expecting ubiquitous 3D as the next revolution in multimedia. Yet, contrary to “traditional” media such as digital music and video, 3D content creation and editing is not accessible to the general public, and 3D geometric data is not nearly as widespread as anticipated. Despite extensive geometric modeling research in the past two decades, 3D modeling is still a restricted domain and demands tedious, time-consuming and expensive effort even from trained professionals, namely engineers, designers, and digital artists. Geometric modeling is reported to constitute one of the lowest-productivity components of the product life cycle.
The major reason for 3D shape modeling remaining inaccessible and tedious is that our current geometry representation and modeling algorithms focus on low-level mathematical properties of the shapes, entirely missing structural, contextual or semantic information. As a consequence, current modeling systems are unintuitive, inefficient and difficult for humans to work with. We believe that instead of continuing on the current incremental research path, a concentrated effort is required to fundamentally rethink the shape modeling process and re-align research agendas, putting high-level shape structure and function at the core. We propose a research plan that will lead to intelligent digital 3D modeling tools that integrate semantic knowledge about the objects being modeled and provide the user an intuitive and logical response, fostering creativity and eliminating unnecessary low-level manual modeling tasks. Achieving these goals will represent a fundamental change to our current notion of 3D modeling, and will finally enable us to leverage the true potential of digital 3D content for society.
Max ERC Funding
1 497 442 €
Duration
Start date: 2012-09-01, End date: 2017-08-31
Project acronym IMPRO
Project Implicit Programming
Researcher (PI) Viktor Kuncak
Host Institution (HI) ECOLE POLYTECHNIQUE FEDERALE DE LAUSANNE
Call Details Starting Grant (StG), PE6, ERC-2012-StG_20111012
Summary "I propose implicit programming, a paradigm for developing reliable software using new programming language specification constructs and tools, supported through the new notion of software synthesis procedures. The paradigm will enable developers to use specifications as executable programming language constructs and will automate some of the program construction tasks to the point where they become feasible for the end users. Implicit programming will increase developer productivity by enabling developers to focus on the desired software functionality instead of worrying about low-level implementation details. Implicit programming will also improve software reliability, because the presence of specifications will make programs easier to analyze.
From the algorithmic perspective, I propose a new agenda for research in algorithms for decidable logical theories. An input to such an algorithm is a logical formula (or a boolean-valued programming language expression). Whereas a decision procedure for satisfiability merely checks whether there exists a satisfying assignment for the formula, we propose to develop synthesis procedures. A synthesis procedure views the input as a relation between inputs and outputs, and produces a function from input variables to output variables. In other words, it transforms a specification into a computable function. We will design synthesis procedures for important classes of formulas motivated by useful programming language fragments. We will use synthesis procedures as a compilation mechanism for declarative programming language constructs, ensuring correctness by construction. To develop practical synthesis procedures we will combine insights from decision procedure research (including the results on SMT solvers) with research on compiler construction, program analysis, and program transformation. The experience from the rich model toolkit initiative (http://RichModels.org) will help us address these goals."
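A minimal sketch of the difference between a decision procedure and a synthesis procedure, using a toy enumeration over linear integer functions; the specification, grammar, and function names here are invented for illustration and are not part of the project's toolchain.

from itertools import product

# Toy "synthesis procedure": given a boolean-valued specification relating an input x
# to an output y, produce an executable function from x to y. The candidate space is a
# tiny grammar of linear functions y = a*x + b with small integer coefficients.
# A decision procedure would only check satisfiability; a synthesis procedure must
# return the witness as code.
def synthesize(spec, test_inputs, coeff_range=range(-5, 6)):
    for a, b in product(coeff_range, repeat=2):
        candidate = lambda x, a=a, b=b: a * x + b
        if all(spec(x, candidate(x)) for x in test_inputs):
            return candidate, (a, b)
    return None, None

# Illustrative specification: the output must be one more than twice the input.
spec = lambda x, y: y == 2 * x + 1
f, coeffs = synthesize(spec, test_inputs=range(-10, 11))
print(coeffs, f(7))   # -> (2, 1) 15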
Max ERC Funding
1 439 240 €
Duration
Start date: 2012-12-01, End date: 2017-11-30
Project acronym LASER-ARPES
Project Laser based photoemission: revolutionizing the spectroscopy of correlated electrons
Researcher (PI) Felix Baumberger
Host Institution (HI) UNIVERSITE DE GENEVE
Call Details Starting Grant (StG), PE3, ERC-2007-StG
Summary It is proposed to develop a novel instrument for angle-resolved photoelectron spectroscopy (ARPES) by combining a laser-based ultraviolet light source with a state-of-the-art electron spectrometer. This combination will be unique in Europe and will push this important technique to an entirely new level of resolution, comparable to the thermal broadening at 1 K and nearly an order of magnitude lower than the resolution achievable in practical ARPES experiments with the latest synchrotron light sources. The low photon energy of this new source will also markedly enhance the bulk sensitivity of ARPES and thus enable the investigation of interesting materials that were not accessible so far. These new capabilities will be used to study the subtle quantum many-body states of correlated electrons in transition metal oxides, a frontier topic in condensed-matter physics. Specifically, we will focus on electronic instabilities in perovskites and elucidate how different degrees of freedom play together to determine the often vastly different properties of chemically closely related materials. Moreover, we will apply modern electron spectroscopy to correlated molecular solids with complex phase diagrams that challenge existing theory for satisfactory explanations. This field is largely unexplored but is fundamental for advances in molecular electronics.
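To put the quoted resolution target in context, the thermal energy scale at 1 K is easily computed; the factor of roughly 4 k_B T is a common rule of thumb for the width of the Fermi edge, used here only for illustration.

# Thermal energy scale of the Fermi-Dirac cutoff at T = 1 K.
k_B = 8.617e-5   # Boltzmann constant, eV/K
T = 1.0          # K
kT = k_B * T
print(f"k_B*T   = {kT*1e3:.3f} meV")     # ~0.086 meV
print(f"4*k_B*T = {4*kT*1e3:.2f} meV")   # ~0.34 meV, rule-of-thumb Fermi-edge width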
Max ERC Funding
1 450 825 €
Duration
Start date: 2008-08-01, End date: 2013-07-31
Project acronym MAQD
Project Mathematical Aspects of Quantum Dynamics
Researcher (PI) Benjamin Schlein
Host Institution (HI) UNIVERSITAT ZURICH
Call Details Starting Grant (StG), PE1, ERC-2009-StG
Summary The main goal of this proposal is to reach a better mathematical understanding of the dynamics of quantum mechanical systems. In particular I plan to work on the following three projects along this direction.
A. Effective Evolution Equations for Macroscopic Systems. The derivation of effective evolution equations from first principle microscopic theories is a fundamental task of statistical mechanics. I have been involved in several projects related to the derivation of the Hartree and the Gross-Pitaevskii equation from many body quantum dynamics. I plan to continue to work on these problems and to use these results to obtain new information on the many body dynamics.
B. Spectral Properties of Random Matrices. The correlations among eigenvalues of large random matrices are expected to be independent of the distribution of the entries. This conjecture, known as universality, is of great importance for random matrix theory. In collaboration with L. Erdos and H.-T. Yau, we established the validity of Wigner's semicircle law on microscopic scales, and we proved the emergence of eigenvalue repulsion. In the future, we plan to continue to study Wigner matrices to prove, on the longer term, universality.
C. Locality Estimates in Quantum Dynamics. Anharmonic lattice systems are very important models in non-equilibrium statistical mechanics. With B. Nachtergaele, H. Raz, and R. Sims, we proved Lieb-Robinson type inequalities (giving an upper bound on the speed of propagation of signals) for a certain class of anharmonicity. Next, we plan to extend these results to a larger class of anharmonic potentials, and to apply these bounds to establish other fundamental properties of the dynamics of anharmonic systems, such as the existence of the thermodynamic limit.
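For reference, the effective equations mentioned in part A read, in standard textbook form,
\[
i\,\partial_t \varphi_t = -\Delta \varphi_t + \big(V * |\varphi_t|^2\big)\,\varphi_t \qquad \text{(Hartree)},
\]
\[
i\,\partial_t \varphi_t = -\Delta \varphi_t + 8\pi a\,|\varphi_t|^2\,\varphi_t \qquad \text{(Gross--Pitaevskii)},
\]
where $V$ is the pair interaction and $a$ the scattering length of the potential; both arise as mean-field descriptions of the many-body Schrödinger dynamics in suitable scaling limits.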
Max ERC Funding
750 000 €
Duration
Start date: 2009-12-01, End date: 2014-11-30
Project acronym MEGA-XUV
Project Efficient megahertz coherent XUV light source
Researcher (PI) Thomas Südmeyer
Host Institution (HI) UNIVERSITE DE NEUCHATEL
Call Details Starting Grant (StG), PE3, ERC-2011-StG_20101014
Summary "Coherent extreme ultraviolet (XUV) light sources open up new opportunities for science and technology. Promising examples are attosecond metrology, spectroscopic and structural analysis of matter on a nanometer scale, high resolution XUV-microscopy and lithography. The most promising technique for table-top sources is femtosecond laser-driven high-harmonic generation (HHG) in gases. Unfortunately, their XUV photon flux is not sufficient for most applications. This is caused by the low average power of the kHz repetition rate driving lasers (<10 W) and the poor conversion efficiency (<10-6). Following the traditional path of increasing the power, numerous research teams are engineering larger and more complex femtosecond high-power amplifier systems, which are supposed to provide several kilowatts of average power in the next decade. However, it is questionable if such systems can easily serve as tool for further scientific studies with XUV light.
The goal of this proposal is the realization of a simpler and more efficient source of high-flux XUV radiation. Instead of amplifying a laser beam to several kW of power and dumping it after the HHG interaction, the generation of high harmonics is placed directly inside the intra-cavity multi-kilowatt beam of a femtosecond laser. Thus, the unconverted light is “recycled”, and the laser medium only needs to compensate for the low losses of the resonator. Achieving passive femtosecond pulse formation at these record-high power levels will require eliminating any destabilizing effects inside the resonator. This appears to be only feasible with ultrafast thin disk lasers, because all key components are used in reflection.
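A rough figure for why intracavity "recycling" is so effective: in steady state the power the laser medium supplies per round trip only has to balance the round-trip losses. The numbers below are illustrative assumptions, not values from the proposal.

# Steady-state intracavity power: the circulating power is the supplied power
# divided by the fractional round-trip loss. Numbers are illustrative assumptions.
P_supplied = 50.0        # W, average power the laser medium supplies (assumed)
round_trip_loss = 0.01   # fraction of circulating power lost per round trip (assumed)
P_circulating = P_supplied / round_trip_loss
print(f"Circulating power ~ {P_circulating/1e3:.0f} kW")   # ~5 kW for these numbers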
Exploiting the scientific opportunities of the resulting table-top multi-MHz coherent XUV light source in various interdisciplinary applications is the second major part of this project. The developed XUV source will be transportable, which will enable the fast implementation of joint measurements."
Max ERC Funding
1 500 000 €
Duration
Start date: 2012-03-01, End date: 2017-02-28
Project acronym MIAMI
Project Machine Learning-based Market Design
Researcher (PI) Sven SEUKEN
Host Institution (HI) UNIVERSITAT ZURICH
Call Details Starting Grant (StG), PE6, ERC-2018-STG
Summary "Market designers study how to set the ""rules of a marketplace"" such that the market works well. However, markets are getting increasingly complex such that designing good market mechanisms ""by hand"" is often infeasible, in particular when certain design desiderata (such as efficiency, strategyproofness, or fairness) are in conflict with each other. Moreover, human agents are boundedly-rational: already in small domains, they are best modeled as having incomplete preferences, because they may only know a ranking or the values of their top choices. In combinatorial domains, the number of choices grows exponentially, such that it quickly becomes impossible for an agent to report its full valuation, even if it had complete preferences. In this ERC grant proposal, we propose to combine techniques from ""machine learning"" with ""market design"" to address these challenges.
First, we propose to develop a new, automated approach to design mechanisms with the help of machine learning (ML). In contrast to prior ML-based automated mechanism design work, we explicitly aim to train the ML algorithm to exploit regularities in the mechanism design space. Second, we propose to study the "design of machine learning-based mechanisms." These are mechanisms that use machine learning internally to achieve good efficiency and incentives even when agents have incomplete knowledge about their own preferences.
In addition to pushing the scientific boundaries of market design research, this ERC project will also have an immediate impact on practical market design. We will apply our techniques in two different settings: (1) for the design of combinatorial spectrum auctions, a multi-billion dollar domain; and (2) for the design of school choice matching markets, which are used to match millions of students to high schools every year.
"
Max ERC Funding
1 375 000 €
Duration
Start date: 2018-12-01, End date: 2023-11-30
Project acronym microCrysFact
Project Microfluidic Crystal Factories (μ-CrysFact): a breakthrough approach for crystal engineering
Researcher (PI) Jose Puigmartí Luis
Host Institution (HI) EIDGENOESSISCHE TECHNISCHE HOCHSCHULE ZUERICH
Call Details Starting Grant (StG), PE8, ERC-2015-STG
Summary To study and understand the aggregation, nucleation, and/or self-assembly processes of crystalline matter is of crucial importance for research and applications in many disciplines. For example, understanding the formation of crystalline amyloid fibres could lead to advances in the treatment and prevention of both Alzheimer’s and Parkinson’s diseases, whereas controlling the process of crystal formation can play a significant role in obtaining chemicals and materials that are important for industry as well as society as a whole (e.g., drugs, superconductors, polarizers and/or frequency modulators).
Despite the impressive progress made in molecular engineering during the last few decades, the quest for a general tool-box technology to study, control and monitor crystallisation processes as well as to isolate metastable states (dynamic capture) is still incomplete. That is because crystalline assemblies are frequently investigated in their equilibrium form, driving the system to its minimum energy state. This methodology limits the emergence of new chemicals and crystals with advanced functionalities, and thus hampers advances in the field of materials engineering.
µ-CrysFact will develop tool-box technologies where diffusion-limited and kinetically controlled environments will be achieved during crystallisation and where the isolation of non-equilibrium species will be facilitated by pushing crystallisation processes out of equilibrium. In addition, µ-CrysFact’s technologies will be used to localise, integrate and chemically treat crystals with the aim of honing their functionality. This unprecedented approach has the potential to lead to the discovery of new materials with advanced functions and unique properties, thus opening new horizons in materials engineering research.
Max ERC Funding
1 814 128 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym MiniMasonryTesting
Project Seismic Testing of 3D Printed Miniature Masonry in a Geotechnical Centrifuge
Researcher (PI) Michalis VASSILIOU
Host Institution (HI) EIDGENOESSISCHE TECHNISCHE HOCHSCHULE ZUERICH
Call Details Starting Grant (StG), PE8, ERC-2018-STG
Summary Earthquakes are responsible for more than half of the human losses due to natural disasters. Masonry structures have proven to be the most vulnerable both in the developing and in the developed world. Even though masonry is one of the oldest building materials, our understanding of its behavior at the level of the structure (system level) is limited. Therefore, there is a need for extended shake table testing. But shake table tests are expensive, and full-scale system-level testing of large buildings is only possible on a handful of shake tables around the globe – and at a huge cost.
We propose to take advantage of research developments in 3D printing and develop a method to perform system-level testing at a small scale using 3D printers and a geotechnical centrifuge (to preserve similitude). The key is to print materials with controllable, masonry-like behavior. MiniMasonryTesting proposes to control the properties of masonry by controlling the geometry of a 3D printed “meta”-mortar. The method will be developed through typical static masonry tests performed on the 3D printed parts. It will be further validated by comparing shaking table tests (in a centrifuge) of miniature structures to existing results of full-scale tests. The cost of the dynamic tests is expected to be so low that multiple tests can be performed, so that existing numerical methods can be validated in the statistical sense. As a case study, the method will be applied to explore the behavior of a low-cost seismic isolation method that has been proposed for masonry structures in developing countries.
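A sketch of the similitude argument behind centrifuge testing; these are the standard geotechnical scaling laws, and the scale factor is an assumed example.

# Standard centrifuge scaling: a 1/N-scale model spun at N times gravity reproduces
# prototype stress levels, since self-weight stress ~ density * acceleration * length.
# The scale factor below is an illustrative assumption.
N = 40                 # assumed geometric scale factor (model is 1/40 of the prototype)
g = 9.81               # m/s^2
model_acceleration = N * g        # centrifuge acceleration applied to the model
stress_ratio = N * (1.0 / N)      # (acceleration factor) * (length factor) = 1
print(f"Spin the 1/{N} model at {N} g ({model_acceleration:.0f} m/s^2); "
      f"stress ratio model/prototype = {stress_ratio:.0f}")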
With the rapid evolution of 3D printing, it will be possible to scale up the methods developed in MiniMasonryTesting, so that other civil engineering materials can be tested faster and more cheaply than is possible today. This is a game changer in structural testing, as it will enable researchers to test structures at the system level that were previously impossible or prohibitively expensive to test.
Max ERC Funding
1 999 477 €
Duration
Start date: 2019-04-01, End date: 2024-03-31
Project acronym MININEXACT
Project Exact Mining from In-Exact Data
Researcher (PI) Michail Vlachos
Host Institution (HI) IBM RESEARCH GMBH
Call Details Starting Grant (StG), PE6, ERC-2010-StG_20091028
Summary Data exchange and data publishing are inherent components of our interconnected world. Industrial companies outsource datasets to marketing and mining firms in order to support business intelligence; medical institutions exchange collected clinical experiments; academic institutions create repositories and share datasets to promote research collaboration. A common denominator in any data exchange is the 'transformation' of the original data, which usually results in 'distortion' of the data. While accurate and useful information can potentially be distilled from the original data, operations such as anonymization, rights protection and compression result in modified datasets that very seldom retain the mining capacity of their original source. This proposal seeks to address questions such as the following:
- How can we apply lossy compression to datasets and still guarantee that mining operations are not distorted?
- Is it possible to rights-protect datasets and provide assurances that this task shall not impair our ability to distill useful knowledge?
- To what extent can we resolve data anonymization issues and yet retain the mining capacity of the original dataset?
We will examine a fundamental and hard problem in the area of knowledge discovery, which is the delicate balance between data transformation and data utility under mining operations. The problem lies at the confluence of many areas, such as machine and statistical learning, information theory, data representation and optimization. We will focus on studying data transformation methods (compression, anonymization, rights protection) that guarantee the preservation of the salient dataset characteristics, such that data mining operations on the original and transformed datasets are preserved as well as possible. We will investigate how graph-centric approaches, clustering, classification and visualization algorithms can be ported to work under the proposed mining-preservation paradigm. Additional research challenges i
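As one concrete illustration of a transformation that trades fidelity for mining utility, the sketch below applies a Gaussian random projection (in the spirit of the Johnson-Lindenstrauss lemma), so that pairwise distances, and hence distance-based mining such as k-NN or clustering, are approximately preserved after compression. This is a minimal example of the general idea only, not the method proposed in MININEXACT; the function and parameter names are hypothetical.

import numpy as np

def random_projection(X, k, seed=0):
    # Project the n x d data matrix X onto k dimensions with a Gaussian map.
    # With entries drawn from N(0, 1/k), squared distances are preserved in
    # expectation, so distance-based mining behaves similarly on the compressed data.
    rng = np.random.default_rng(seed)
    R = rng.normal(0.0, 1.0 / np.sqrt(k), size=(X.shape[1], k))
    return X @ R

# Toy check: compare one pairwise distance before and after projection.
X = np.random.default_rng(1).normal(size=(100, 512))
Y = random_projection(X, k=64)
print(np.linalg.norm(X[0] - X[1]), np.linalg.norm(Y[0] - Y[1]))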
Max ERC Funding
1 499 999 €
Duration
Start date: 2011-04-01, End date: 2016-03-31
Project acronym MiTopMat
Project Microstructured Topological Materials: A novel route towards topological electronics
Researcher (PI) Philip MOLL
Host Institution (HI) ECOLE POLYTECHNIQUE FEDERALE DE LAUSANNE
Call Details Starting Grant (StG), PE3, ERC-2016-STG
Summary Topological semi-metals such as Cd3As2 or TaAs are characterized by two bands crossing at isolated points in momentum space and a linear electronic dispersion around these crossing points. This linear dispersion can be mapped onto the Dirac- or Weyl-Hamiltonian, describing relativistic massless fermions, and thus relativistic phenomena from high-energy physics may appear in these materials. For example, the chirality, χ=±1, is a conserved quantity for massless fermions, separating the electrons into two distinct chiral species. A new class of topological electronics has been proposed based on chirality imbalance and chiral currents taking the role of charge imbalance and charge currents in electronics. Such devices promise technological advances in speed, energy efficiency, and quantum coherent processes at elevated temperatures.
We will research the basic physical phenomena on which topological electronics is based: 1) The ability to interact electrically with the chiral states in a topological semi-metal is an essential prerequisite for their application. We will investigate whether currents in the Fermi arc surface states can be induced by charge currents and selectively detected by voltage measurements. 2) Weyl materials are more robust against defects and therefore of interest for industrial fabrication. We will experimentally test this topological protection in high-field transport experiments in a wide range of Weyl materials. 3) Recently, topological processes leading to fast, tuneable and efficient voltage inversion were predicted. We will investigate the phenomenon, fabricate and characterize such inverters, and assess their performance. MiTopMat thus aims to build the first prototype of a topological voltage inverter.
These goals are challenging but achievable: MiTopMat’s research plan is based on Focused Ion Beam microfabrication, which we have successfully shown to be a promising route to fabricate chiral devices.
Max ERC Funding
1 836 070 €
Duration
Start date: 2017-12-01, End date: 2022-11-30
Project acronym ModGravTrial
Project Modified Gravity on Trial
Researcher (PI) Lavinia HEISENBERG
Host Institution (HI) EIDGENOESSISCHE TECHNISCHE HOCHSCHULE ZUERICH
Call Details Starting Grant (StG), PE9, ERC-2018-STG
Summary The main goal of this project is to study the fundamental properties of field theories of space-time, their cosmological consequences and observational signatures. My approach aims to use the next generation of cosmological and astrophysical observations to test the validity of General Relativity on scales where it has not been fully tested yet and its resilience against alternative theories with modifications on cosmological scales.
I will first study the implications of the quantum aspects of modified gravity theories beyond the tree level analyses for their viability, consistency and predictability. This will reduce the allowed alternative theories significantly. I will then investigate the physical consequences of these theoretically promising theories for cosmological and astrophysical scenarios. As part of my approach, I will use large galaxy surveys to constrain effects on the dynamics of cosmic structure formation from these modifications of gravity. In addition, I will confront modified theoretical predictions with Planck measurements of the Cosmic Microwave Background, type-Ia supernova data, measurements of the Baryon Acoustic Oscillations and gravitational lensing. Along the way, my group and I will crucially contribute to the development of the necessary analytical and numerical tools to exhaustively analyse such data in the search for modifications of gravity.
In addition, the recent detection of gravitational waves by the LIGO team has opened an exciting new avenue for testing gravitational theories. These new observations will put even more stringent constraints on alternative theories. As part of the proposed research, I will also extensively exploit this new observational channel to test the validity of General Relativity and put new effects of modified gravity on trial. In particular, the propagation speed of tensor perturbations in modified gravity theories will be severely restricted by observations of gravitational waves.
Max ERC Funding
1 500 000 €
Duration
Start date: 2019-01-01, End date: 2023-12-31
Project acronym NaMic
Project Nanowire Atomic Force Microscopy for Real Time Imaging of Nanoscale Biological Processes
Researcher (PI) Georg Ernest Fantner
Host Institution (HI) ECOLE POLYTECHNIQUE FEDERALE DE LAUSANNE
Call Details Starting Grant (StG), PE3, ERC-2012-StG_20111012
Summary The ability to measure structures with nanoscale resolution continues to transform physics, materials science and life science alike. Nevertheless, while there are excellent tools to obtain detailed molecular-level static structure (for example in biology), there are very few tools to develop an understanding of how these structures change dynamically as they fulfill their biological function. New biologically-compatible, high-speed nanoscale characterization technologies are required to perform these measurements. In this project, we will develop a nanowire-based, high-speed atomic force microscope (NW-HS-AFM) capable of imaging the dynamics of molecular processes on living cells. We will use this instrument to study the dynamic pore-formation mechanisms of novel peptide antibiotics. This increase in performance over current AFMs will be achieved through the use of electron-beam-deposited nanogranular tunneling resistors on prefabricated nanowire AFM cantilevers. By combining these cantilevers with our state of the art high-speed AFM technology, we expect to obtain nanoscale-resolution images of protein pores on living cells at rates of tens of milliseconds per image. This capability will open a whole new arena for seeing nanoscale life in action.
Max ERC Funding
1 264 640 €
Duration
Start date: 2012-12-01, End date: 2017-11-30
Project acronym NEWNET
Project New Approaches to Network Design
Researcher (PI) Fabrizio Grandoni
Host Institution (HI) SCUOLA UNIVERSITARIA PROFESSIONALE DELLA SVIZZERA ITALIANA
Call Details Starting Grant (StG), PE6, ERC-2011-StG_20101014
Summary "Networks pervade every aspect of nowadays life. This is one of the reasons why their design, management, and analysis is one of the most active areas of theoretical and empirical research in Computer Science and Operations Research. The main goal of this project is to increase our theoretical understanding of networks, with a special focus on faster exact exponential-time algorithms and more accurate polynomial-time approximation algorithms for NP-hard network design problems. We will consider classic, challenging open problems in the literature, as well as new, exciting problems arising from the applications. These problems will be addressed with the most advanced algorithmic and analytical tools, including our recently developed techniques: iterative randomized rounding, core detouring, and randomized dissection.
A second, ambitious goal of this project is to stimulate the interaction and cross-fertilization between exact and approximation algorithms. This might open new research horizons."
Max ERC Funding
1 122 199 €
Duration
Start date: 2012-01-01, End date: 2016-12-31
Project acronym NMU-LIPIDS
Project Biomimetic Lipid Structures on Nano- and Microfluidic Platforms
Researcher (PI) Petra Stephanie Dittrich
Host Institution (HI) EIDGENOESSISCHE TECHNISCHE HOCHSCHULE ZUERICH
Call Details Starting Grant (StG), PE6, ERC-2007-StG
Summary The project aims at the formation, manipulation, and analysis of three-dimensional lipid membrane structures on micro- and nano-structured platforms. The goal is to develop a novel methodology to design and create simple artificial cells and cell organelles, bio-hybrid cells, and bio-mimicking membrane networks, which could provide an entirely novel tool for cell analysis and promises fascinating prospects for cell manipulation, biotechnology, pharmacy and materials science. The basis of the project is an unconventional concept that involves two current cutting-edge fabrication technologies, i.e., the so-called top-down and bottom-up approaches. The combination of the two approaches, with respect to both engineering methods and biological applications, opens the door to overcoming current limitations in the creation of complex soft-matter objects at micro- and nanometre dimensions. The key method is a recently developed micro-extrusion process. It relies, on the one hand, on the ability of the lipid molecules to self-assemble (“bottom-up”). On the other hand, photolithography processes (“top-down”) are utilized to fabricate microchips in which shape transformation, handling and analysis of the lipid structures are performed. The proposed engineering process will enable, for the first time, precise design of the composition, size and morphology of complex membrane structures. It will provide the requirements to design an artificial cell of reasonable complexity (“bottom-up”). One main emphasis is the creation of unique bio-hybrid systems, in which artificial membrane structures are connected to living cells, or in which natural membranes of cells are integrated within artificial systems (“top-down”). This highly interdisciplinary study will further include fundamental studies of membrane properties, engineering aspects to generate novel soft-matter devices, and the development of analytical methods and lipid sensors based on micro- and nanostructured chips.
Max ERC Funding
1 941 000 €
Duration
Start date: 2008-07-01, End date: 2014-06-30
Project acronym NOBUGS
Project Toward Zero-Defect Software Through Automatic Cooperative Self-Improvement
Researcher (PI) George Candea
Host Institution (HI) ECOLE POLYTECHNIQUE FEDERALE DE LAUSANNE
Call Details Starting Grant (StG), PE6, ERC-2011-StG_20101014
Summary "This proposal advocates a fundamentally new approach to achieving software quality: remove the distinction between software use and software testing -- enable programs to accumulate experience from each one of their executions, and leverage this experience toward self-improvement of the software. My hypothesis is that every program execution has information by-products that, if suitably captured and aggregated, can substantially speed up the process of testing programs and proving them correct. Software is being executed billions of times around the world, with the corresponding information going to waste. At the same time, traditional software testing tries to simulate a small subset of real-world conditions and executions. I propose instead viewing every execution of a program as a test run, and the aggregation of executions across the lifetime of all copies of that program as one gigantic test suite.
I propose the study of techniques and formalisms for automatically recouping the information that is lost during everyday software use, aggregating it, and automatically turning it into tests and proofs; techniques to use these tests and proofs to automatically correct the behavior of programs; and techniques for automatically steering programs into exploring behaviors for which information is lacking. All these techniques will be embodied in a platform, called BeeNet, that implements a massively distributed learning process which turns execution by-products into a collective experience that leads to higher quality software. This is a radical new way of exploiting the vast (but today completely wasted) information that results from program execution.
I will investigate these questions with an integrated approach that combines thorough theoretical studies with practical application to real-world software, employing the perspectives of three different research communities: operating systems, programming languages, and software verification."
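The idea of treating every execution as a test run can be illustrated with a minimal sketch: a decorator records the input/output by-products of ordinary calls so they can later be replayed as regression checks. This is only an illustration of the concept, not the BeeNet platform or its mechanisms; all names here are hypothetical.

import functools

RECORDS = []  # execution by-products accumulated during ordinary use

def record_execution(fn):
    # Wrap a function so that every real-world call also stores a test case.
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        result = fn(*args, **kwargs)
        RECORDS.append((fn.__name__, args, kwargs, result))
        return result
    return wrapper

def replay_as_tests(registry):
    # Re-run every recorded call and check that behaviour is unchanged.
    for name, args, kwargs, expected in RECORDS:
        assert registry[name](*args, **kwargs) == expected

@record_execution
def discount(price, rate):
    return round(price * (1.0 - rate), 2)

discount(100.0, 0.2)                      # ordinary use doubles as a test run
replay_as_tests({"discount": discount})   # aggregated executions later act as a test suite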
Max ERC Funding
1 334 977 €
Duration
Start date: 2012-02-01, End date: 2018-01-31
Project acronym NWScan
Project Bottom-up Nanowires as Scanning Multifunctional Sensors
Researcher (PI) Martino Poggio
Host Institution (HI) UNIVERSITAT BASEL
Call Details Starting Grant (StG), PE3, ERC-2013-StG
Summary Advances in growth and fabrication of semiconductor nanostructures have led to both the production of exquisitely sensitive force transducers and the development of solid-state quantum devices. Force transducers, typically monolithic Si cantilevers, are central to techniques such as AFM and MFM. On the other hand, quantum devices including quantum wells, quantum dots (QDs), and single electron transistors are essential to technologies like lasers and optical detectors, and to experiments on quantum information. These two types of devices have – until now – occupied distinct material systems and have, for the most part, not been combined.
New developments in the growth of inorganic nanowires (NWs), however, are set to change the status quo. Researchers can now grow nanoscale structures from the bottom up with unprecedented mechanical properties. Unlike traditional top-down cantilevers, which are etched or milled out of a larger block of material, bottom-up structures are assembled unit by unit to be almost defect-free on the atomic scale. This near perfection gives NWs a much smaller mechanical dissipation than their top-down counterparts, while their higher resonance frequencies allow them to couple less strongly to common sources of noise. Meanwhile, layer-by-layer growth of NWs is rapidly developing, such that both axial and radial heterostructures have now been realized. Such fine control allows for band-structure engineering and the production of devices including FETs, single photon sources, and QDs. NWs are also attractive hosts for optical emitters, as their geometry favors the efficient extraction of photons.
These properties and the fact that a NW can be integrated as the tip of an SPM make NWs extremely promising devices. We propose to develop the use of NWs as scanning multifunctional sensors. We intend to 1) use NW cantilevers as force transducers in high-resolution scanning force microscopy, and 2) use NW quantum devices as scanning sensors.
Max ERC Funding
1 480 680 €
Duration
Start date: 2013-11-01, End date: 2018-10-31
Project acronym OptApprox
Project Strong Convex Relaxations with Optimal Approximation Guarantees
Researcher (PI) Ola Nils Anders Svensson
Host Institution (HI) ECOLE POLYTECHNIQUE FEDERALE DE LAUSANNE
Call Details Starting Grant (StG), PE6, ERC-2013-StG
Summary One of the most exciting areas of theoretical computer science is to understand the approximability of fundamental optimization problems. A prominent example is the traveling salesman problem, for which a long-standing conjecture states that a certain algorithm gives a better guarantee than current methods can prove. The resolution of this conjecture and of many other fundamental problems is intimately related to an increased understanding of strong convex relaxations.
Although these problems have resisted numerous attempts, recent breakthrough results, in which the PI has played a central role, indicate new research directions with the potential to resolve some of our most exciting open questions. We propose three research directions to revolutionize our understanding of these problems and more generally of the use of convex relaxations in approximation algorithms:
(I) develop new approaches to analyze and harness the power of existing convex relaxations;
(II) understand the power of automatically generated relaxations; and
(III) prove the optimality of algorithms based on convex relaxations.
The proposed research lies at the frontier of approximation algorithms and optimization, with connections to major problems in complexity theory, such as the unique games conjecture. Any progress will be a significant contribution to theoretical computer science and mathematical optimization.
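As background on what an approximation algorithm built on a convex relaxation looks like in practice, the sketch below solves the classic LP relaxation of minimum vertex cover and rounds variables at 1/2, which is guaranteed to return a cover of size at most twice the optimum. This is a standard textbook example chosen purely for illustration, not one of the project's own techniques.

import numpy as np
from scipy.optimize import linprog

def lp_vertex_cover(n, edges):
    # LP relaxation: minimize sum_v x_v subject to x_u + x_v >= 1 per edge, 0 <= x_v <= 1.
    c = np.ones(n)
    A_ub = np.zeros((len(edges), n))
    for i, (u, v) in enumerate(edges):
        A_ub[i, u] = A_ub[i, v] = -1.0   # rewrite x_u + x_v >= 1 as -x_u - x_v <= -1
    res = linprog(c, A_ub=A_ub, b_ub=-np.ones(len(edges)),
                  bounds=[(0, 1)] * n, method="highs")
    # Rounding at 1/2 yields a feasible cover of size at most 2 * LP optimum.
    return res.fun, {v for v in range(n) if res.x[v] >= 0.5}

lp_value, cover = lp_vertex_cover(4, [(0, 1), (1, 2), (2, 3)])
print(lp_value, cover)   # LP optimum and a 2-approximate vertex cover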
Max ERC Funding
1 451 052 €
Duration
Start date: 2014-01-01, End date: 2018-12-31
Project acronym OPTINT
Project OPTINT: Optimization-based Design of Interactive Technologies
Researcher (PI) Otmar Dieter HILLIGES
Host Institution (HI) EIDGENOESSISCHE TECHNISCHE HOCHSCHULE ZUERICH
Call Details Starting Grant (StG), PE6, ERC-2016-STG
Summary As technology moves further away from the desktop setting, it is becoming increasingly clear that conventional interfaces (i.e., mice and keyboards) are no longer adequate means for interaction and that the traditional computing paradigm will be replaced or complemented by new forms of interaction such as wearable computing, head-worn displays, AR and VR.
The main goal of OPTINT is to lay the foundations for the design and implementation of 21st-century interactive technologies. While contemporary user interface (UI) design techniques were developed for the era of the PC, modern user interfaces are much more diverse and have to be designed for challenging contexts such as embedded and wearable computers, augmented and virtual reality (AR/VR), the Internet of Things (IoT) and intelligent robotics. Furthermore, instead of flat, rectangular 2D devices, things will be flexible and of custom shape; instead of being mass-produced, technology of all types will be customized via 3D printing, and end-user-designed interactive objects are becoming a reality. Designing in such contexts requires expertise in a large and diverse set of domains, ranging from hardware-level sensor design all the way to user-experience aspects. These requirements go largely beyond traditional UI design techniques, calling for next-generation tools that can integrate all of them in a unified manner. Embracing these challenges, I argue for a novel approach to the design of interactive devices that leverages optimization algorithms, allowing the designer to focus on user experience instead of having to worry about technical details. OPTINT will deliver an optimization-based framework and a set of easy-to-use tools that allow user-experience (UX) designers and domain experts to develop a broad range of interactive technologies.
Max ERC Funding
1 500 000 €
Duration
Start date: 2017-02-01, End date: 2022-01-31
Project acronym PARATOP
Project New paradigms for correlated quantum matter:Hierarchical topology, Kondo topological metals, and deep learning
Researcher (PI) Titus NEUPERT
Host Institution (HI) UNIVERSITAT ZURICH
Call Details Starting Grant (StG), PE3, ERC-2017-STG
Summary Discovering, classifying and understanding phases of quantum matter is a core goal of condensed matter physics. Next to the notion of symmetry breaking phases, the concept of topological phases of matter is a prevailing theme of recent research. Topological phases are envisioned for various applications due to their universal and robust properties, such as protected conducting boundary modes, and provoke fundamental questions about the nature of many-body quantum states by providing the basis for exotic quasiparticles.
In this ERC research project, I propose several new topological phases and novel numerical approaches for studying and classifying the most sought-after topological phases of matter. Concretely, I propose the concept of three-dimensional hierarchical topological insulators, which, in contrast to the known topological phases, do not possess gapless surfaces but instead host protected gapless edge modes. Moreover, I plan to study topological metals arising in strongly correlated Kondo systems, going beyond the current paradigm of considering topological metals that arise in the absence of electronic correlations. Furthermore, I propose to make the analogous step for topological superconductors, which have been studied as free models in the search for Majorana quasiparticles: for the first time, I want to explore, with numerical techniques, strongly interacting systems that realize the more powerful parafermion quasiparticles. Finally, in a cross-disciplinary and exploratory sub-project, I will employ methods of deep neural networks to classify strongly correlated quantum phases using supervised learning combined with a technique called deep dreaming.
Each of these sub-projects has the potential to make a paradigm-changing contribution to the study of strongly correlated and topological states of quantum matter, and their combination allows us to take advantage of synergy effects and a balance between high-risk and definitely feasible key developments.
Max ERC Funding
1 362 401 €
Duration
Start date: 2018-01-01, End date: 2022-12-31
Project acronym PERDY
Project Perceptually-Driven Optimizations of Graphics Content for Novel Displays
Researcher (PI) Piotr Didyk
Host Institution (HI) UNIVERSITA DELLA SVIZZERA ITALIANA
Call Details Starting Grant (StG), PE6, ERC-2018-STG
Summary Displays play a vital role in many professional and personal activities. They are a crucial interface between a user and the digital world in tasks involving visualization of and interaction with digital data. The abilities of new display technologies regarding the reproduction of important visual cues, such as binocular disparity, accommodation, or motion parallax, outpace the capabilities of methods for optimizing graphics content to match the requirements of particular hardware designs. This leads to poor visual quality and massive computational overhead, which hamper the adoption of novel displays. I argue that there are significant gaps between hardware, computational techniques, and our understanding of human perception, which prevent us from taking full advantage of these technologies.
To overcome these limitations, my team and I will combine hardware, computation, and perception into a unique platform in which the capabilities of displays and the quality requirements are represented in a shared space. The basis for our project will be an in-depth understanding of human perception. Our experiments will focus on three aspects: (1) investigation of perceptual limits across a wide field of view, (2) involving all visual cues, and (3) establishing optimal trade-offs between different quality aspects. We will build efficient computational models that will predict perceived quality and enable perceptual optimizations to drive new content adaptation techniques.
This project will contribute display-specific perceptual optimizations of graphics content to match the requirements of human perception. It will address the key aspects of portable devices such as energy efficiency and visual quality. Our experiments and modeling of human perception will provide crucial insights into new hardware developments. The contributions will be necessary for development and standardization of new, high-quality display devices which will not only improve existing applications but also enable new ones.
Max ERC Funding
1 497 302 €
Duration
Start date: 2019-02-01, End date: 2024-01-31
Project acronym PHONUIT
Project Phononic Circuits: manipulation and coherent control of phonons
Researcher (PI) Ilaria ZARDO
Host Institution (HI) UNIVERSITAT BASEL
Call Details Starting Grant (StG), PE3, ERC-2017-STG
Summary Over the last decades, the power to control photons and electrons has paved the way for extraordinary technological developments in electronic and optoelectronic applications. The same degree of control is still lacking for quantized lattice vibrations, i.e. phonons. Phonons are the carriers of heat and sound. The understanding and ability to manipulate phonons as quantum particles in solids enable the control of coherent phonon transport, which is of fundamental interest and could also be exploited in applications. Logic operations can be realized with the manipulation of phonons, both in their coherent and incoherent form, in order to switch, amplify, and route signals, and to store information. If brought to a mature level, phononic devices can become complementary to conventional electronics, opening new opportunities.
I envision realizing each part of this technology with phonons and bringing these parts together in an integrated circuit on a chip: a phononic integrated circuit. The objectives of the proposal are:
A: the realization of a coherent phonon source and detector;
B: the realization of phonon computation with the use of thermal logic gates;
C: the realization of phonon-based quantum and thermal memories.
To this end it is crucial to engineer nanoscale heterostructures with suitable interfaces, and to engineer the phonon spectrum and the interface thermal resistance. Phonons will be launched, probed and manipulated with a combination of pump-probe experiments and resistive thermal measurements on chip.
The proposed research will be of great relevance for fundamental research as well as for technological applications in the field of sound and thermal management.
Max ERC Funding
1 488 388 €
Duration
Start date: 2018-01-01, End date: 2022-12-31
Project acronym PLANETOGENESIS
Project Building the next generation of planet formation models: protoplanetary disks, internal structure, and formation of planetary systems
Researcher (PI) Yann Alibert
Host Institution (HI) UNIVERSITAET BERN
Call Details Starting Grant (StG), PE9, ERC-2009-StG
Summary The discovery of extra-solar planetary systems with properties so different from those of our own Solar System has overturned our theoretical understanding of how planets and planetary systems form. Indeed, planet formation models have to link observations of two classes of objects: protoplanetary disks, whose structure and early evolution provide the initial conditions of planet formation, and the actually detected planets. The observational knowledge of these two classes of objects will see dramatic improvements in the near future, with three major breakthroughs: 1) high angular resolution observations will tightly constrain the structure and early evolution of protoplanetary disks, 2) direct observation of extrasolar planets will allow us to understand their internal structure as well as their formation process, and 3) detection of very low mass extrasolar planets will constrain the mass function of planets and planetary systems, down to the terrestrial planet regime. The goal of this project is to develop a theoretical understanding of planet formation that quantitatively stands up to these observational confrontations. For this, we will build on the first-generation planet formation models developed while the PI was an assistant at the Physikalisches Institute of the University of Berne. The PI, a PhD student, and a Postdoc will conduct three inter-related sub-projects linked to the three breakthroughs mentioned above: A) improving the disk part of planet formation models, B) determining the internal structure of forming planets, including the effects of accretion shocks and envelope pollution by infalling planetesimals, and calculating their early evolution, and C) building planetary system formation models, including both gas giant and low-mass rocky planets.
Max ERC Funding
1 395 323 €
Duration
Start date: 2010-02-01, End date: 2015-11-30
Project acronym POLYTE
Project Polynomial term structure models
Researcher (PI) Damir Filipovic
Host Institution (HI) ECOLE POLYTECHNIQUE FEDERALE DE LAUSANNE
Call Details Starting Grant (StG), PE1, ERC-2012-StG_20111012
Summary "The term structure of interest rates plays a central role in the functioning of the interbank market. It also represents a key factor for the valuation and management of long term liabilities, such as pensions. The financial crisis has revealed the multivariate risk nature of the term structure, which includes inflation, credit and liquidity risk, resulting in multiple spread adjusted discount curves. This has generated a strong interest in tractable stochastic models for the movements of the term structure that can match all determining risk factors.
We propose a new class of term structure models based on polynomial factor processes, which are defined as jump-diffusions whose generator leaves the space of polynomials of any fixed degree invariant. The moments of their transition distributions are polynomials in the initial state. The coefficients defining this relationship are given as solutions of a system of nested linear ordinary differential equations. As a consequence, polynomial processes yield closed-form polynomial-rational expressions for the term structure of interest rates. Polynomial processes include affine processes, whose transition functions admit an exponential-affine characteristic function. Affine processes are among the most widely used models in finance to date, but come with some severe specification limitations. We propose to overcome these shortcomings by studying polynomial processes and polynomial expansion methods that achieve an efficiency comparable to Fourier methods in the affine case.
In sum, the objectives of this project are threefold. First, we plan to develop a theory for polynomial processes and entirely explore their statistical properties. This fills a gap in the literature on affine processes in particular. Second, we aim to develop polynomial-rational term structure models addressing the new paradigm of multiple spread adjusted discount curves. Third, we plan to implement and estimate these models using real market data."
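The tractability claim can be made concrete with the standard moment formula for polynomial processes, stated here in generic notation as background (the notation is not taken from the proposal). If the generator $\mathcal{G}$ maps the space $\mathrm{Pol}_n$ of polynomials of degree at most $n$ into itself, fix a basis $H(x) = (h_1(x), \dots, h_N(x))^\top$ of $\mathrm{Pol}_n$ and let $G$ be the matrix representation of $\mathcal{G}$ on this basis. Then for any polynomial $p(x) = H(x)^\top \vec{p}$ in $\mathrm{Pol}_n$,

$$\mathbb{E}\bigl[p(X_T) \,\big|\, X_t = x\bigr] = H(x)^\top e^{(T-t)G}\, \vec{p},$$

so conditional moments reduce to the solution $q(s) = e^{sG}\vec{p}$ of the linear ODE system $\dot{q}(s) = G\, q(s)$, $q(0) = \vec{p}$, evaluated at $s = T - t$; no Fourier inversion is required, which is the source of the closed-form polynomial-rational expressions mentioned above.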
Max ERC Funding
995 155 €
Duration
Start date: 2012-12-01, End date: 2017-11-30
Project acronym PORABEL
Project Nanopore integrated nanoelectrodes for biomolecular manipulation and sensing
Researcher (PI) Aleksandra Radenovic
Host Institution (HI) ECOLE POLYTECHNIQUE FEDERALE DE LAUSANNE
Call Details Starting Grant (StG), PE3, ERC-2010-StG_20091028
Summary In this proposal we aim to address several complex biophysical problems at the single-molecule level that have remained elusive due to the lack of an appropriate experimental approach in which one could independently manipulate both interacting biomolecules and, at the same time, measure the strength of their interaction and correlate it with their electronic signature. In particular, we are interested in finding out how a biopolymer finds, enters and translocates through a nanopore. Equally intriguing is the still unresolved mechanism of phage DNA ejection. We will also investigate how exactly proteins recognize their target binding sites on DNA and whether protein-DNA recognition is based on the complementarity of their charge patterns.
To address these biophysical problems, we will develop a novel experimental framework by integrating electrodes into nanopore-based force spectroscopy. The proposed strategy will enable two research directions: single-molecule manipulation and single-molecule detection/sensing, equally suitable for investigating complex biophysical problems and molecular recognition assays.
By exploiting the superior sensing and detection capabilities of our devices, we will investigate the following practical applications: improved nucleotide detection, selective protein detection, and protein charge profiling via nanopore unfolding.
This unique combination of optical manipulation and nanofluidics could lead to new methods of bioanalysis, mechanical characterization, and discrimination between specific and non-specific DNA-protein interactions. This research proposal combines nanofabrication, optics, nano/microfluidics, electronics, computer programming, and biochemistry.
Max ERC Funding
1 439 840 €
Duration
Start date: 2010-10-01, End date: 2015-09-30
Project acronym RAM
Project Regularity theory for area minimizing currents
Researcher (PI) Camillo De Lellis
Host Institution (HI) UNIVERSITAT ZURICH
Call Details Starting Grant (StG), PE1, ERC-2012-StG_20111012
Summary "The Plateau's problem consists in finding the surface of least area spanning a given contour. This question has attracted the attention of many mathematicians in the last two centuries, providing a prototypical problem for several fields of research in mathematics. For hypersurfaces a lot is known about the existence and regularity thanks to the classical works of De Giorgi, Almgren, Fleming, Federer, Simons, Allard, Simon, Schoen and several other authors.
In higher codimension a quite powerful existence theory, the ``theory of currents'', was developed by Federer and Fleming in 1960. The success of this theory relies on its homological flavor and indeed it has found several applications to problems in differential geometry. Many geometric objects which are widely studied in the modern literature are naturally area-minimizing currents: two examples among many are special lagrangians and holomorphic subvarieties. However the understanding of the regularity issues is, compared to the case of hypersurfaces, much poorer. Aside from its intrinsic interest, a good regularity theory is likely to provide more insightful geometric applications. A quite striking example is Taubes' proof of the equivalence between the Gromov and Seiberg-Witten invariants.
A very complicated and far reaching regularity theory has been developed by Almgren thirty years ago in a monumental work of almost 1000 pages. The first part of this project aims at reaching the same conclusions of Almgren with a more flexible and accessible theory. In the second part I wish to go beyond Almgren's work and attack some of the many open questions which still remain in the field."
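For orientation, a minimal illustration of the variational problem in the simplest (codimension-one, non-parametric) setting; this is standard textbook material and not specific to the proposal. A graph u over a domain \Omega spanning the contour prescribed by boundary data g minimizes the area functional

    \mathrm{Area}(u) \;=\; \int_{\Omega} \sqrt{1 + |\nabla u|^{2}}\, dx, \qquad u = g \ \text{on } \partial\Omega,

and any smooth minimizer solves the minimal surface equation \operatorname{div}\!\big( \nabla u / \sqrt{1 + |\nabla u|^{2}} \big) = 0. The theory of currents extends this picture to higher codimension, where graphs no longer suffice and the regularity questions addressed in the project arise.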
Max ERC Funding
919 500 €
Duration
Start date: 2012-09-01, End date: 2017-08-31
Project acronym RandMat
Project Spectral Statistics of Structured Random Matrices
Researcher (PI) Antti Kenneth Viktor KNOWLES
Host Institution (HI) UNIVERSITE DE GENEVE
Call Details Starting Grant (StG), PE1, ERC-2016-STG
Summary The purpose of this proposal is a better mathematical understanding of certain classes of large random matrices. Until very recently, random matrix theory has mainly focused on mean-field models with independent entries. In this proposal I instead consider random matrices that incorporate some nontrivial structure. I focus on two types of structured random matrices that arise naturally in important applications and lead to a rich mathematical behaviour: (1) random graphs with fixed degrees, such as random regular graphs, and (2) random band matrices, which constitute a good model of disordered quantum Hamiltonians.
The goals are strongly motivated by the applications to spectral graph theory and quantum chaos for (1) and to the physics of conductance in disordered media for (2). Specifically, I will work in the following directions. First, derive precise bounds on the locations of the extremal eigenvalues and the spectral gap, ultimately obtaining their limiting distributions. Second, characterize the spectral statistics in the bulk of the spectrum, using both eigenvalue correlation functions on small scales and linear eigenvalue statistics on intermediate mesoscopic scales. Third, prove the delocalization of eigenvectors and derive the distribution of their components. These results will address several of the most important questions about the structured random matrices (1) and (2), such as expansion properties of random graphs, hallmarks of quantum chaos in random regular graphs, crossovers in the eigenvalue statistics of disordered conductors, and quantum diffusion.
To achieve these goals I will combine tools introduced in my previous work, such as local resampling of graphs and subdiagram resummation techniques, and in addition develop novel, robust techniques to address the more challenging goals. I expect the output of this proposal to contribute significantly to the understanding of structured random matrices.
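As a point of reference only, a toy numerical experiment, assuming numpy; the size n, bandwidth W and normalization are illustrative choices and not parameters from the proposal. It samples a symmetric random band matrix of the kind described in (2) and inspects its bulk eigenvalue spacings:

    import numpy as np

    rng = np.random.default_rng(0)
    n, W = 1000, 20                              # matrix size and bandwidth (illustrative)

    # Gaussian entries inside the band |i - j| <= W, zero outside; then symmetrize.
    A = rng.standard_normal((n, n))
    band = np.abs(np.subtract.outer(np.arange(n), np.arange(n))) <= W
    H = np.where(band, A, 0.0)
    H = (H + H.T) / np.sqrt(2.0 * (2 * W + 1))   # crude variance normalization

    eig = np.linalg.eigvalsh(H)                  # spectrum of the symmetric matrix
    bulk = eig[n // 4: 3 * n // 4]               # keep the bulk, discard the spectral edges
    spacings = np.diff(bulk)
    print("mean bulk spacing:", spacings.mean())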
Max ERC Funding
1 257 442 €
Duration
Start date: 2017-01-01, End date: 2021-12-31
Project acronym SCADAPT
Project "Large-scale Adaptive Sensing, Learning and Decision Making: Theory and Applications"
Researcher (PI) Rainer Andreas Krause
Host Institution (HI) EIDGENOESSISCHE TECHNISCHE HOCHSCHULE ZUERICH
Call Details Starting Grant (StG), PE6, ERC-2012-StG_20111012
Summary "We address one of the fundamental challenges of our time: Acting effectively while facing a deluge of data. Massive volumes of data are generated from corporate and public sources every second, in social, scientific and commercial applications. In addition, more and more low level sensor devices are becoming available and accessible, potentially to the benefit of myriads of applications. However, access to the data is limited, due to computational, bandwidth, power and other limitations. Crucially, simply gathering data is not enough: we need to make decisions based on the information we obtain. Thus, one of the key problems is: How can we obtain most decision-relevant information at minimum cost?
Most existing techniques are either heuristics with no guarantees, or do not scale to large problems. We recently showed that many information gathering problems satisfy submodularity, an intuitive diminishing returns condition. Its exploitation allowed us to develop algorithms with strong guarantees and empirical performance. However, existing algorithms are limited: they cannot cope with dynamic phenomena that change over time, are inherently centralized and thus do not scale with modern, distributed computing paradigms. Perhaps most crucially, they have been designed with the focus of gathering data, but not for making decisions based on this data.
We seek to substantially advance large-scale adaptive decision making under partial observability, by grounding it in the novel computational framework of adaptive submodular optimization. We will develop fundamentally new scalable techniques bridging statistical learning, combinatorial optimization, probabilistic inference and decision theory to overcome the limitations of existing methods. In addition to developing novel theory and algorithms, we will demonstrate the performance of our methods on challenging real world interdisciplinary problems in community sensing, information retrieval and computational sustainability."
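As a minimal sketch of the diminishing-returns property mentioned above, here is a textbook greedy routine for monotone submodular maximization under a cardinality constraint; it is not one of the new algorithms proposed here, and the coverage-style objective and variable names are illustrative assumptions:

    # Greedy selection of k elements maximizing a coverage objective; for monotone
    # submodular objectives this classical routine achieves a (1 - 1/e) approximation.
    def greedy_submodular(candidates, coverage, k):
        """candidates: iterable of ids; coverage: dict id -> set of items it covers."""
        chosen, covered = [], set()
        for _ in range(k):
            best, best_gain = None, 0
            for c in candidates:
                if c in chosen:
                    continue
                gain = len(coverage[c] - covered)   # marginal gain shrinks as 'covered' grows
                if gain > best_gain:
                    best, best_gain = c, gain
            if best is None:                        # no remaining element adds value
                break
            chosen.append(best)
            covered |= coverage[best]
        return chosen, covered

    # Hypothetical usage with three candidate sensors:
    cov = {"s1": {1, 2, 3}, "s2": {3, 4}, "s3": {5}}
    print(greedy_submodular(list(cov), cov, k=2))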
Max ERC Funding
1 499 900 €
Duration
Start date: 2012-11-01, End date: 2017-10-31
Project acronym SCALABIM
Project Scalable Bayesian Methods for Machine Learning and Imaging
Researcher (PI) Matthias Seeger
Host Institution (HI) ECOLE POLYTECHNIQUE FEDERALE DE LAUSANNE
Call Details Starting Grant (StG), PE6, ERC-2011-StG_20101014
Summary Machine learning seeks to automate the processing of large, complex datasets by adaptive computing, a core strategy to meet the growing demands of science and applications. Typically, real-world problems are mapped to penalized estimation tasks (e.g., binary classification), which are solved by simple, efficient algorithms. While successful so far, I believe this approach is too limited to realise the potential of adaptive computing. Most of the work, such as data selection, feature construction, model calibration and comparison, still has to be done by hand. Demands for automated decision-making (e.g., tuning data acquisition during an experiment) are not met. Such problems are naturally addressed by Bayesian reasoning about uncertain knowledge, which however remains infeasible in most large-scale settings.
The main goal of this proposal is to unite the strengths of penalized estimation and Bayesian decision-making, exploiting the former's advanced state of the art in order to implement the substantial improvements that come with the latter in large-scale applications. A major focus is on improving magnetic resonance imaging (MRI) by way of new Bayesian technology, driving robust nonlinear reconstruction from less data and optimizing the acquisition through Bayesian experimental design, applications not previously attempted by machine learning. Far beyond the reach of present methodology, these goals demand a novel computational foundation for approximate Bayesian inference through numerical algorithmic reductions.
This project will have high impact on probabilistic machine learning, raising the bar for scalable Bayesian computations. It will help to open up a whole new range of medical imaging applications for machine learning. Moreover, substantial impact on MRI reconstruction research is anticipated. There is strong recent interest in savings through compressive sensing, whose full potential is realised only by way of adaptive technology such as that proposed here.
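To make the link between the two ingredients concrete, a toy sketch assuming a Gaussian likelihood with noise variance sigma2 and a Gaussian prior with variance tau2; all names and sizes are illustrative and not taken from the proposal. The penalized (ridge) estimate coincides with the mean of the corresponding Bayesian posterior, while the posterior additionally provides the uncertainty needed for decisions such as experimental design:

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.standard_normal((50, 5))
    w_true = rng.standard_normal(5)
    y = X @ w_true + 0.1 * rng.standard_normal(50)

    sigma2, tau2 = 0.01, 1.0          # noise and prior variances (illustrative)
    lam = sigma2 / tau2               # equivalent ridge penalty

    # Penalized estimation: ridge regression point estimate.
    w_ridge = np.linalg.solve(X.T @ X + lam * np.eye(5), X.T @ y)

    # Bayesian inference: posterior mean and covariance over the weights.
    post_cov = np.linalg.inv(X.T @ X / sigma2 + np.eye(5) / tau2)
    post_mean = post_cov @ (X.T @ y) / sigma2

    print(np.allclose(w_ridge, post_mean))   # True: identical point estimates
    print(np.diag(post_cov))                 # extra output: per-weight uncertainty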
Max ERC Funding
1 401 697 €
Duration
Start date: 2012-01-01, End date: 2016-12-31
Project acronym SIMCOMICS
Project Simulation of droplets in complex microchannels
Researcher (PI) Francois Gallaire
Host Institution (HI) ECOLE POLYTECHNIQUE FEDERALE DE LAUSANNE
Call Details Starting Grant (StG), PE8, ERC-2011-StG_20101014
Summary In droplet-based microfluidics, the elementary units transporting reagents from one functional site to another (mixer, sensor or analyzer) are droplets, which are carried by an inert wetting fluid. This research project aims at the development of numerical models of flowing droplets in thin, spatially extended microchannels, designed to avoid the exponential complexity of parallelized 1-D networks. We aim at simulating the trajectory of droplets transported by a pressure-driven carrier fluid as they evolve in a surface energy gradient generated by channel depth variations or surface tension inhomogeneities.
To this end, we exploit the remarkable aspect ratio of these microfluidic devices to propose a depth-averaged description of the pancake shaped droplets. The resulting equations, called Brinkman's equations, combine the 2D Stokes equations with 2D Darcy potential-flow-like equations. Their diphasic simulation relies on the adaptation of existing algorithms to this particular free interface problem. Pressure corrections due to the thickness variations of the lubricating thin films will also be included.
Surfactant and heat dynamics will then be added to model thermo- and soluto-capillary forcing. The depth-averaged model will be finally generalized to account for arbitrary depth variations, so as to add dynamics to the quasi-static description of droplets moving along successive minimal surface energy locations.
A specific part of the project is also devoted to the development of experimental expertise: it is indeed essential to the success of the project to conduct fundamental microfluidic experiments in order to validate our new models. While SIMCOMICS aims at shrinking the gap between present computations of droplets flowing in microchannels and the increasing number of application-oriented experimental studies, it both raises fundamental questions and opens promising perspectives for the engineering design of new microcarved microchannels.
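For orientation, a commonly quoted depth-averaged form of the equations referred to above, for a channel of depth h, fluid viscosity \mu, depth-averaged in-plane velocity \bar{\mathbf{u}} and pressure p; the exact prefactors vary between derivations and are quoted here only as an illustrative assumption:

    \nabla_{2D}\, p \;=\; \mu\, \nabla_{2D}^{2}\, \bar{\mathbf{u}} \;-\; \frac{12\,\mu}{h^{2}}\, \bar{\mathbf{u}}, \qquad \nabla_{2D} \cdot \bar{\mathbf{u}} = 0,

where the Laplacian term retains the 2D Stokes character of the flow and the drag term reproduces the Darcy (Hele-Shaw) limit when in-plane gradients are weak.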
Max ERC Funding
1 405 796 €
Duration
Start date: 2012-01-01, End date: 2016-12-31
Project acronym SPARCCLE
Project STRUCTURE PRESERVING APPROXIMATIONS FOR ROBUST COMPUTATION OF CONSERVATION LAWS AND RELATED EQUATIONS
Researcher (PI) Siddhartha Mishra
Host Institution (HI) EIDGENOESSISCHE TECHNISCHE HOCHSCHULE ZUERICH
Call Details Starting Grant (StG), PE1, ERC-2012-StG_20111012
Summary "Many interesting systems in physics and engineering are mathematically modeled by first-order non-linear hyperbolic partial differential equations termed as systems of conservation laws. Examples include the Euler equations of aerodynamics, the shallow water equations of oceanography, multi-phase flows in a porous medium (used in the oil industry), equations of non-linear elasticity and the MHD equations of plasma physics. Numerical methods are the key tools to study these equations and to simulate interesting phenomena such as shock waves.
Despite the intense development of numerical methods for the past three decades and great success in applying these methods to large scale complex physical and engineering simulations, the massive increase in computational power in recent years has exposed the inability of state of the art schemes to simulate very large, multiscale, multiphysics three dimensional problems on complex geometries. In particular, problems with strong shocks that depend explicitly on underlying small scale effects, involve geometric constraints like vorticity and require uncertain inputs such as random initial data and source terms, are beyond the range of existing methods.
The main goal of this project will be to design space-time adaptive \emph{structure preserving} arbitrarily high-order finite volume and discontinuous Galerkin schemes that incorporate correct small scale information and provide for efficient uncertainty quantification. These schemes will tackle emerging grand challenges and dramatically increase the range and scope of numerical simulations for systems modeled by hyperbolic PDEs. Moreover, the schemes will be implemented to ensure optimal performance on emerging massively parallel hardware architecture. The resulting publicly available code can be used by scientists and engineers to study complex systems and design new technologies."
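As a point of reference only, a minimal first-order finite-volume sketch: the classical Lax-Friedrichs scheme for the inviscid Burgers equation on a periodic domain. This is standard textbook material rather than one of the schemes to be developed, and the grid size and CFL number are illustrative assumptions:

    import numpy as np

    N, T, cfl = 200, 0.5, 0.45
    x = np.linspace(0.0, 1.0, N, endpoint=False)
    dx = 1.0 / N
    u = np.sin(2 * np.pi * x)                        # smooth data that steepens into a shock

    def flux(v):
        return 0.5 * v * v                           # Burgers flux f(u) = u^2 / 2

    t = 0.0
    while t < T:
        dt = min(cfl * dx / max(np.abs(u).max(), 1e-12), T - t)
        up, um = np.roll(u, -1), np.roll(u, 1)       # u_{i+1}, u_{i-1}
        # Lax-Friedrichs numerical fluxes at the right and left cell interfaces
        f_r = 0.5 * (flux(u) + flux(up)) - 0.5 * dx / dt * (up - u)
        f_l = 0.5 * (flux(um) + flux(u)) - 0.5 * dx / dt * (u - um)
        u = u - dt / dx * (f_r - f_l)                # conservative update
        t += dt

    print("cell average of u after evolution:", u.mean())   # conserved up to round-off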
Max ERC Funding
1 220 433 €
Duration
Start date: 2012-12-01, End date: 2017-11-30
Project acronym SSX
Project "State Space Exploration: Principles, Algorithms and Applications"
Researcher (PI) Malte Helmert
Host Institution (HI) UNIVERSITAT BASEL
Call Details Starting Grant (StG), PE6, ERC-2013-StG
Summary "State-space search, finding paths in huge, implicitly given graphs, is a fundamental problem in artificial intelligence and other areas of computer science. State-space search algorithms like A*, IDA* and greedy best-first search are major success stories in artificial intelligence, and hundreds of papers based on variations of these algorithms are published every year. Due to this success, the major assumptions of these algorithms are rarely questioned.
We argue that the current generation of state-space search algorithms has three significant deficiencies that impede further progress in the field:
1. They explore a monolithic model of the world rather than applying a factored perspective.
2. They do not learn from mistakes and hence tend to commit the same mistake thousands of times.
3. In the case of satisficing (i.e., suboptimal) search, the design of the major algorithms has been based on ad-hoc intuitions rather than sound theoretical principles.
This proposal targets these three issues. We propose to develop a rigorous theory of factored state-space search, a rigorous theory of learning from information gathered during search, and a decision-theoretic foundation for satisficing search algorithms. Based on these insights we will design and implement new state-space search algorithms addressing the deficiencies of current methods. Finally, we will apply the new algorithms to application domains of state-space search to raise the state of the art in these areas."
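For reference, a compact sketch of the standard A* algorithm named above over an implicitly given graph; this is textbook material, not one of the new algorithms to be developed, and the successors function, heuristic and grid example are illustrative assumptions:

    import heapq
    import itertools

    def astar(start, goal, successors, h):
        """successors(node) yields (neighbor, edge_cost); h is an admissible heuristic."""
        counter = itertools.count()                       # tie-breaker for the heap
        frontier = [(h(start), 0, next(counter), start, None)]
        parents, closed = {}, set()
        while frontier:
            f, g, _, node, parent = heapq.heappop(frontier)
            if node in closed:
                continue
            closed.add(node)
            parents[node] = parent
            if node == goal:                              # reconstruct the path
                path = [node]
                while parents[path[-1]] is not None:
                    path.append(parents[path[-1]])
                return list(reversed(path)), g
            for nbr, cost in successors(node):
                if nbr not in closed:
                    heapq.heappush(frontier, (g + cost + h(nbr), g + cost,
                                              next(counter), nbr, node))
        return None, float("inf")

    # Hypothetical usage on a 5x5 grid with unit moves and a Manhattan heuristic:
    def succ(p):
        x, y = p
        return [((x + dx, y + dy), 1) for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
                if 0 <= x + dx < 5 and 0 <= y + dy < 5]

    print(astar((0, 0), (4, 4), succ, lambda p: abs(4 - p[0]) + abs(4 - p[1])))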
Max ERC Funding
1 499 737 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym SUBLINEAR
Project Sublinear Algorithms for Modern Data Analysis
Researcher (PI) Mikhail KAPRALOV
Host Institution (HI) ECOLE POLYTECHNIQUE FEDERALE DE LAUSANNE
Call Details Starting Grant (StG), PE6, ERC-2017-STG
Summary Designing efficient algorithms for fundamental computational tasks as well as understanding the limits of tractability has been the goal of computer science since its inception. Polynomial runtime has been the de facto notion of efficiency since the introduction of the notion of NP-completeness. As the sizes of modern datasets grow, however, many classical polynomial time (and sometimes even linear time) solutions become prohibitively expensive. This calls for sublinear algorithms, i.e. algorithms whose resource requirements are substantially smaller than the size of the input that they operate on.
We propose to design a toolbox of powerful algorithmic techniques with sublinear resource requirements that will form the theoretical foundation of large data analysis, as well as develop a nuanced runtime, space and communication complexity theory to show optimality of our algorithms. Specifically, we propose to:
1. design an algorithmic toolkit for sublinear graph processing that will form the basis of large scale graph analytics;
2. design a new generation of sublinear algorithms for signal processing that will become the method of choice for a wide range of applications;
3. develop a far-reaching set of techniques for lower bounding runtime, space and communication complexity of sublinear algorithms.
The problems that we propose to solve are among the most fundamental algorithmic questions at the forefront of the rapidly developing algorithmic theory of large data analysis, which has been the focus of an extensive body of work in the research community. The algorithms and complexity-theoretic results that we propose to design will cut to the core of fundamental computational problems and form the theoretical foundation of computing with constrained resources.
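To illustrate the notion of sublinearity in the simplest possible setting, a textbook uniform-sampling estimator; it is not one of the algorithms to be designed, and the accuracy parameters and data are illustrative assumptions. For values in [0, 1], Hoeffding's inequality shows that roughly log(1/delta)/eps^2 samples estimate the mean to within eps with probability 1 - delta, independently of the input size:

    import math
    import random

    def approx_mean(values, eps=0.05, delta=0.01, seed=0):
        """Estimate the mean of values in [0, 1] by uniform sampling (sublinear in len(values))."""
        rng = random.Random(seed)
        m = math.ceil(math.log(2.0 / delta) / (2.0 * eps * eps))   # Hoeffding sample size
        sample = [values[rng.randrange(len(values))] for _ in range(m)]
        return sum(sample) / m

    data = [i % 2 for i in range(1_000_000)]   # true mean 0.5
    print(approx_mean(data))                    # close to 0.5 while touching only ~1000 entries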
Max ERC Funding
1 473 175 €
Duration
Start date: 2018-03-01, End date: 2023-02-28
Project acronym Topo2DEG
Project Topological states in superconducting two-dimensional electron gases
Researcher (PI) Fabrizio Nichele
Host Institution (HI) IBM RESEARCH GMBH
Call Details Starting Grant (StG), PE3, ERC-2018-STG
Summary I will experimentally investigate hybrid superconductor/semiconductor devices for realizing novel topological states of matter, of interest both for fundamental physics and for quantum computing applications. The common denominator of the proposed experiments is a regime where the characteristic energy scales of the system, namely the Fermi energy, the spin-orbit interaction correction, the superconducting gap and the Zeeman splitting, are comparable to each other, resulting in unique and mostly uncharted physical territory. Unlike the most widespread approach of semiconductor nanowires coupled to superconductors, I will employ novel hybrid two-dimensional electron gases (2DEGs) in which the superconductor is grown in situ and matched to the semiconductor lattice. This novel system was mainly developed by the team I supervise during the last two years. Compared to the conventional nanowire-based approach, hybrid 2DEGs are readily available, characterized by very low disorder, and more amenable to complex sample designs. The work will focus on: 1) taking full advantage of the planar geometry to study spatial and non-local properties of individual Majorana wires, as well as branched geometries. These experiments will constitute critical tests to establish whether the commonly observed zero-bias peaks are indeed associated with Majorana modes and will pave the way to complex networks of interacting Majorana wires, a requirement for quantum computing. 2) Studying topological phenomena in multi-terminal Josephson junctions (JJs), with particular emphasis on tuning the superconducting phase difference across electrode pairs. Topological JJs offer a new and possibly advantageous path, not explored to date, to create and manipulate Majorana modes, including the possibility of reaching the topological regime at vanishingly small external magnetic fields, which is useful for applications. The success of the proposal will constitute a key step towards topological quantum computing.
Max ERC Funding
1 999 916 €
Duration
Start date: 2019-03-01, End date: 2024-02-29
Project acronym TRIPLE
Project Three Indirect Probes of Lyman continuum LEakage from galaxies
Researcher (PI) Anne VERHAMME
Host Institution (HI) UNIVERSITE DE GENEVE
Call Details Starting Grant (StG), PE9, ERC-2017-STG
Summary Cosmic reionization corresponds to the period in the history of the Universe during which the predominantly neutral intergalactic medium was ionised by the emergence of the first luminous sources. Young stars in primeval galaxies may be the sources of reionization, if the ionising radiation, called Lyman continuum (LyC), that they produce can escape their interstellar medium: the escape fraction of LyC photons from galaxies is one of the main unknowns of reionization studies. This ERC project will contribute to answering this question by computing, from simulated galaxies, three indirect diagnostics of LyC leakage that were recently reported in the literature, and by comparing the virtual observables with the direct escape of LyC photons from simulated galaxies and with observations. The first diagnostic for LyC leakage relates the escape of the strongly resonant Lyman-alpha radiation from galaxies to the LyC escape. It was proposed by the PI (Verhamme et al. 2015), and recently validated by observations (Verhamme et al. 2016). The second diagnostic proposes that the strength of oxygen line ratios can trace density-bounded interstellar regions. It was the selection criterion for the successful detection of 5 strong Lyman continuum emitters by our team (Izotov 2016a,b). The third diagnostic relates the strength of metallic absorption lines to the porosity of the absorbing interstellar gas in front of the stars. The increasing opacity of the intergalactic medium with redshift renders direct LyC detections impossible during reionisation. Indirect methods are the only probes of LyC leakage in the distant Universe, and the proposed diagnostics will soon become observable at the redshifts of interest with JWST. They have passed the validation tests; it is now urgent to calibrate these indicators on state-of-the-art simulations of galaxy formation. This is the main objective of the proposed project.
Max ERC Funding
1 500 000 €
Duration
Start date: 2018-03-01, End date: 2023-02-28
Project acronym UFO
Project Uncovering the origins of friction
Researcher (PI) Jean-François Molinari
Host Institution (HI) ECOLE POLYTECHNIQUE FEDERALE DE LAUSANNE
Call Details Starting Grant (StG), PE8, ERC-2009-StG
Summary Nanotechnology is a new frontier in research and new tools must be developed. As surface-to-volume ratios become large, engineering at the nanoscale will be dominated by surface science. The study of Contact Mechanics at the nanoscale (nanotribology) needs to fully account for adhesive forces, third-body interactions and deformation mechanisms at contacting asperities. Understanding these factors, as well as the morphological evolution of contact clusters, has the potential to explain the origins of frictional forces and wear. This will guide us in the design of tailor-made lubricants and surface morphologies, which, in turn, will help reduce the high societal cost of wear damage. This ERC Starting Grant proposal describes a plan to establish a world-leading group in Contact Mechanics at length scales ranging from the atomic scale to the macroscopic scales relevant to Civil or Mechanical Engineering structural applications. Our approach will have recourse to molecular dynamics coupled with the finite-element method for an accurate description of atomic interactions at the contact surface and of long-range elastic forces. The project is interdisciplinary, as the deepening of our understanding of Contact Mechanics will necessitate Computer Science developments. A central objective of the research will be the release of an open, 3D, parallel finite-element platform dedicated to contact applications. The PI will assemble a team of Engineers and Computer Scientists to ensure successful and lasting dissemination within the European academic and industrial network. The research will therefore explore the origins of friction, a scientific quest of fundamental importance to many industrial applications, and will also create a stable base for sharing scientific-computing resources.
Max ERC Funding
1 773 000 €
Duration
Start date: 2009-09-01, End date: 2014-08-31
Project acronym UltimateMembranes
Project Energy-efficient membranes for carbon capture by crystal engineering of two-dimensional nanoporous materials
Researcher (PI) Kumar Varoon AGRAWAL
Host Institution (HI) ECOLE POLYTECHNIQUE FEDERALE DE LAUSANNE
Call Details Starting Grant (StG), PE8, ERC-2018-STG
Summary The EU integrated strategic energy technology plan (SET-Plan), in its 2016 progress report, called for urgent measures on carbon capture; however, the high energy penalty and the environmental issues related to the conventional capture process (amine-based scrubbing) have been a major bottleneck. High-performance membranes can reduce the energy penalty of the capture, are environment-friendly (no chemical is used, no waste is generated), can intensify chemical processes, and can be employed for capture in a decentralized fashion. However, a technological breakthrough is needed to realize such chemically and thermally stable, high-performance membranes. This project seeks to develop the ultimate high-performance membranes for H2/CO2 (pre-combustion capture), CO2/N2 (post-combustion capture), and CO2/CH4 separations (natural gas sweetening). Based on calculations, these membranes will yield a gigantic gas permeance (1 and 0.1 million GPU for the H2- and CO2-selective membranes, respectively), 1000- and 10-fold higher than that of the state-of-the-art polymeric and nanoporous membranes, respectively, reducing the capital expenditure per unit performance and the needed membrane area. For this, we introduce three novel concepts, combining top-down and bottom-up crystal engineering approaches to develop size-selective, chemically and thermally stable, nanoporous two-dimensional membranes. First, exfoliated nanoporous 2d nanosheets will be stitched in-plane to synthesize truly-2d membranes. Second, metal-organic frameworks will be confined across a nanoporous 2d matrix to prepare a composite 2d membrane. Third, atom-thick graphene films with tunable, uniform and size-selective nanopores will be crystallized using a novel thermodynamic equilibrium between lattice growth and etching. Overall, the innovative concepts developed here will open up several frontiers in the synthesis of high-performance membranes for a wide range of separation processes.
Max ERC Funding
1 875 000 €
Duration
Start date: 2019-06-01, End date: 2024-05-31
Project acronym USED
Project Ultrafast Spectroscopic Electron Diffraction (USED) of quantum solids and thin films
Researcher (PI) Fabrizio Carbone
Host Institution (HI) ECOLE POLYTECHNIQUE FEDERALE DE LAUSANNE
Call Details Starting Grant (StG), PE3, ERC-2010-StG_20091028
Summary Ultrafast Spectroscopic Electron Diffraction (USED) of quantum solids and thin films
Max ERC Funding
1 464 000 €
Duration
Start date: 2010-11-01, End date: 2015-10-31
Project acronym WINDMIL
Project Smart Monitoring, Inspection and Life-Cycle Assessment of Wind Turbines
Researcher (PI) Eleni Chatzi
Host Institution (HI) EIDGENOESSISCHE TECHNISCHE HOCHSCHULE ZUERICH
Call Details Starting Grant (StG), PE8, ERC-2015-STG
Summary The excessive energy consumption that Europe is faced with calls for sustainable resource management and policy-making. Amongst the renewable sources of the global energy pool, wind energy holds the lead. Nonetheless, wind turbine (WT) facilities come with a number of shortcomings relating to their short life-span and the lack of efficient management schemes. With a number of WTs currently reaching their design span, stakeholders and policy makers are convinced of the necessity of reliable life-cycle assessment methodologies. However, existing tools have not yet caught up with the maturity of WT technology, leaving visual inspection and offline non-destructive evaluation methods as the norm.
This proposal aims to establish a smart framework for the monitoring, inspection and life-cycle assessment of WTs, able to guide WT operators in the management of these assets from cradle to grave. Our project is founded on a minimal intervention principle, coupling easily deployed and affordable sensor technology with state-of-the-art numerical modeling and data processing tools. An integrated approach is proposed comprising: (i) a new monitoring paradigm for WTs relying on fusion of structural response information, (ii) simulation of influential, yet little explored, factors affecting structural response, such as structure-foundation-soil interaction and fatigue, and (iii) a stochastic framework for detecting anomalies on both a short-term (damage) and a long-term (deterioration) scale.
Our end goal is to deliver a “protection-suit” for WTs comprising a hardware (sensor) solution and a modular, readily implementable software package, titled ETH-WINDMIL. The suggested kit aims to completely redefine the status quo in current Supervisory Control And Data Acquisition systems. This pursuit is well founded on background work of the PI within the area of structural monitoring, with a focus on translating the value of information into quantifiable terms and engineering practice.
Max ERC Funding
1 486 224 €
Duration
Start date: 2016-05-01, End date: 2021-04-30