Project acronym ABCTRANSPORT
Project Minimalist multipurpose ATP-binding cassette transporters
Researcher (PI) Dirk Jan Slotboom
Host Institution (HI) RIJKSUNIVERSITEIT GRONINGEN
Call Details Starting Grant (StG), LS1, ERC-2011-StG_20101109
Summary Many Gram-positive (pathogenic) bacteria are dependent on the uptake of vitamins from the environment or from the infected host. We have recently discovered the long-elusive family of membrane protein complexes catalyzing such transport. The vitamin transporters have an unprecedented modular architecture consisting of a single multipurpose energizing module (the Energy Coupling Factor, ECF) and multiple exchangeable membrane proteins responsible for substrate recognition (S-components). The S-components have characteristics of ion-gradient driven transporters (secondary active transporters), whereas the energizing modules are related to ATP-binding cassette (ABC) transporters (primary active transporters).
The aim of the proposal is threefold. First, we will address the question of how properties of primary and secondary transporters are combined in ECF transporters to yield a novel transport mechanism. Second, we will study the fundamental and unresolved question of how protein-protein recognition takes place in the hydrophobic environment of the lipid bilayer. The modular nature of the ECF proteins offers a natural system to study the driving forces underlying membrane protein interaction. Third, we will assess whether ECF transport systems could become targets for antibacterial drugs. ECF transporters are found exclusively in prokaryotes, and their activity is often essential for the viability of Gram-positive pathogens. Thus they could turn out to be an Achilles’ heel for these organisms.
Structural and mechanistic studies (X-ray crystallography, microscopy, spectroscopy and biochemistry) will reveal how the different transport modes are combined in a single protein complex, how transport is energized and catalyzed, and how protein-protein recognition takes place. Microbiological screens will be developed to search for compounds that inhibit prokaryote-specific steps of the mechanism of ECF transporters.
Max ERC Funding
1 500 000 €
Duration
Start date: 2012-01-01, End date: 2017-12-31
Project acronym ABDESIGN
Project Computational design of novel protein function in antibodies
Researcher (PI) Sarel-Jacob Fleishman
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Starting Grant (StG), LS1, ERC-2013-StG
Summary We propose to elucidate the structural design principles of naturally occurring antibody complementarity-determining regions (CDRs) and to computationally design novel antibody functions. Antibodies represent the most versatile known system for molecular recognition. Research has yielded many insights into antibody design principles and promising biotechnological and pharmaceutical applications. Still, our understanding of how CDRs encode specific loop conformations lags far behind our understanding of structure-function relationships in non-immunological scaffolds. Thus, design of antibodies from first principles has not been demonstrated. We propose a computational-experimental strategy to address this challenge. We will: (a) characterize the design principles and sequence elements that rigidify antibody CDRs. Natural antibody loops will be subjected to computational modeling, crystallography, and a combined in vitro evolution and deep-sequencing approach to isolate sequence features that rigidify loop backbones; (b) develop a novel computational-design strategy, which uses the >1000 solved structures of antibodies deposited in structure databases to realistically model CDRs and design them to recognize proteins that have not been co-crystallized with antibodies. For example, we will design novel antibodies targeting insulin, for which clinically useful diagnostics are needed. By accessing much larger sequence/structure spaces than are available to natural immune-system repertoires and experimental methods, computational antibody design could produce higher-specificity and higher-affinity binders, even to challenging targets; and (c) develop new strategies to program conformational change in CDRs, generating, e.g., the first allosteric antibodies. These will allow targeting, in principle, of any molecule, potentially revolutionizing how antibodies are generated for research and medicine, providing new insights on the design principles of protein functional sites.
Max ERC Funding
1 499 930 €
Duration
Start date: 2013-09-01, End date: 2018-08-31
Project acronym ACDC
Project Algorithms and Complexity of Highly Decentralized Computations
Researcher (PI) Fabian Daniel Kuhn
Host Institution (HI) ALBERT-LUDWIGS-UNIVERSITAET FREIBURG
Call Details Starting Grant (StG), PE6, ERC-2013-StG
Summary "Many of today's and tomorrow's computer systems are built on top of large-scale networks such as, e.g., the Internet, the world wide web, wireless ad hoc and sensor networks, or peer-to-peer networks. Driven by technological advances, new kinds of networks and applications have become possible and we can safely assume that this trend is going to continue. Often modern systems are envisioned to consist of a potentially large number of individual components that are organized in a completely decentralized way. There is no central authority that controls the topology of the network, how nodes join or leave the system, or in which way nodes communicate with each other. Also, many future distributed applications will be built using wireless devices that communicate via radio.
The general objective of the proposed project is to improve our understanding of the algorithmic and theoretical foundations of decentralized distributed systems. From an algorithmic point of view, decentralized networks and computations pose a number of fascinating and unique challenges that are not present in sequential or more standard distributed systems. As communication is limited and mostly between nearby nodes, each node of a large network can only maintain a very restricted view of the global state of the system. This is particularly true if the network can change dynamically, either by nodes joining or leaving the system or if the topology changes over time, e.g., because of the mobility of the devices in case of a wireless network. Nevertheless, the nodes of a network need to coordinate in order to achieve some global goal.
In particular, we plan to study algorithms and lower bounds for basic computation and information dissemination tasks in such systems. In addition, we are particularly interested in the complexity of distributed computations in dynamic and wireless networks.
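To make the "restricted local view" constraint concrete, here is a toy sketch of synchronous flooding, the simplest information-dissemination primitive (the example and names are illustrative only, not taken from the project): each node forwards a message only to its direct neighbors, yet the message reaches every reachable node within diameter-many rounds.

```python
def flood(adjacency, source):
    """Synchronous flooding: in each round, every informed node forwards the
    message to its direct neighbors. No node ever sees the global topology,
    yet the message reaches all reachable nodes in at most diameter rounds."""
    informed = {source}
    rounds = 0
    while True:
        # Each node acts on purely local knowledge: its own neighbor list.
        frontier = {v for u in informed for v in adjacency[u]} - informed
        if not frontier:
            return informed, rounds
        informed |= frontier
        rounds += 1

# A small example graph given as adjacency lists.
graph = {1: [2, 3], 2: [1, 4], 3: [1], 4: [2]}
print(flood(graph, 1))  # ({1, 2, 3, 4}, 2)
```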
Max ERC Funding
1 148 000 €
Duration
Start date: 2013-11-01, End date: 2018-10-31
Project acronym ACROSS
Project 3D Reconstruction and Modeling across Different Levels of Abstraction
Researcher (PI) Leif Kobbelt
Host Institution (HI) RHEINISCH-WESTFAELISCHE TECHNISCHE HOCHSCHULE AACHEN
Call Details Advanced Grant (AdG), PE6, ERC-2013-ADG
Summary "Digital 3D models are gaining more and more importance in diverse application fields ranging from computer graphics, multimedia and simulation sciences to engineering, architecture, and medicine. Powerful technologies to digitize the 3D shape of real objects and scenes are becoming available even to consumers. However, the raw geometric data emerging from, e.g., 3D scanning or multi-view stereo often lacks a consistent structure and meta-information which are necessary for the effective deployment of such models in sophisticated down-stream applications like animation, simulation, or CAD/CAM that go beyond mere visualization. Our goal is to develop new fundamental algorithms which transform raw geometric input data into augmented 3D models that are equipped with structural meta information such as feature aligned meshes, patch segmentations, local and global geometric constraints, statistical shape variation data, or even procedural descriptions. Our methodological approach is inspired by the human perceptual system that integrates bottom-up (data-driven) and top-down (model-driven) mechanisms in its hierarchical processing. Similarly we combine algorithms operating on different levels of abstraction into reconstruction and modeling networks. Instead of developing an individual solution for each specific application scenario, we create an eco-system of algorithms for automatic processing and interactive design of highly complex 3D models. A key concept is the information flow across all levels of abstraction in a bottom-up as well as top-down fashion. We not only aim at optimizing geometric representations but in fact at bridging the gap between reconstruction and recognition of geometric objects. The results from this project will make it possible to bring 3D models of real world objects into many highly relevant applications in science, industry, and entertainment, greatly reducing the excessive manual effort that is still necessary today."
Max ERC Funding
2 482 000 €
Duration
Start date: 2014-03-01, End date: 2019-02-28
Project acronym ACUITY
Project Algorithms for coping with uncertainty and intractability
Researcher (PI) Nikhil Bansal
Host Institution (HI) TECHNISCHE UNIVERSITEIT EINDHOVEN
Call Details Consolidator Grant (CoG), PE6, ERC-2013-CoG
Summary The two biggest challenges in solving practical optimization problems are computational intractability, and the presence
of uncertainty: most problems are either NP-hard, or have incomplete input data which
makes an exact computation impossible.
Recently, there has been a huge progress in our understanding of intractability, based on spectacular algorithmic and lower bound techniques. For several problems, especially those with only local constraints, we can design optimum
approximation algorithms that are provably the best possible.
However, typical optimization problems usually involve complex global constraints and are much less understood. The situation is even worse for coping with uncertainty. Most of the algorithms are based on ad-hoc techniques and there is no deeper understanding of what makes various problems easy or hard.
This proposal describes several new directions, together with concrete intermediate goals, that will break important new ground in the theory of approximation and online algorithms. The particular directions we consider are (i) extend the primal dual method to systematically design online algorithms, (ii) build a structural theory of online problems based on work functions, (iii) develop new tools to use the power of strong convex relaxations and (iv) design new algorithmic approaches based on non-constructive proof techniques.
The proposed research is at the
cutting edge of algorithm design, and builds upon the recent success of the PI in resolving several longstanding questions in these areas. Any progress is likely to be a significant contribution to theoretical
computer science and combinatorial optimization.
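A textbook illustration of coping with uncertainty online (a standard example, not drawn from the proposal) is the ski-rental problem: rent for 1 per day or buy once for buy_cost, without knowing how many days one will ski. The break-even strategy sketched below is 2-competitive, i.e., it never pays more than twice what an offline optimum with full knowledge of the future would pay.

```python
def ski_rental(buy_cost: int, days_skied: int) -> int:
    """Break-even strategy: rent until the accumulated rent would reach the
    purchase price, then buy. Pays at most 2x the offline optimum, which
    knows days_skied in advance and simply pays min(days_skied, buy_cost)."""
    total = 0
    for day in range(1, days_skied + 1):
        if day < buy_cost:
            total += 1            # keep renting
        else:
            total += buy_cost     # buy on the break-even day and stop paying
            break
    return total

for days in (3, 10, 50):
    print(days, ski_rental(10, days), min(days, 10))  # algorithm vs. optimum
```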
Max ERC Funding
1 519 285 €
Duration
Start date: 2014-05-01, End date: 2019-04-30
Project acronym ALEXANDRIA
Project "Foundations for Temporal Retrieval, Exploration and Analytics in Web Archives"
Researcher (PI) Wolfgang Nejdl
Host Institution (HI) GOTTFRIED WILHELM LEIBNIZ UNIVERSITAET HANNOVER
Call Details Advanced Grant (AdG), PE6, ERC-2013-ADG
Summary "Significant parts of our cultural heritage are produced on the Web, yet only insufficient opportunities exist for accessing and exploring the past of the Web. The ALEXANDRIA project aims to develop models, tools and techniques necessary to archive and index relevant parts of the Web, and to retrieve and explore this information in a meaningful way. While the easy accessibility to the current Web is a good baseline, optimal access to Web archives requires new models and algorithms for retrieval, exploration, and analytics which go far beyond what is needed to access the current state of the Web. This includes taking into account the unique temporal dimension of Web archives, structured semantic information already available on the Web, as well as social media and network information.
Within ALEXANDRIA, we will significantly advance semantic and time-based indexing for Web archives using human-compiled knowledge available on the Web, to efficiently index, retrieve and explore information about entities and events from the past. In doing so, we will focus on the concurrent evolution of this knowledge and the Web content to be indexed, and take into account diversity and incompleteness of this knowledge. We will further investigate mixed crowd- and machine-based Web analytics to support long-running and collaborative retrieval and analysis processes on Web archives. Usage of implicit human feedback will be essential to provide better indexing through insights during the analysis process and to better focus harvesting of content.
The ALEXANDRIA Testbed will provide an important context for research, exploration and evaluation of the concepts, methods and algorithms developed in this project, and will provide both relevant collections and algorithms that enable further research on and practical application of our research results to existing archives like the Internet Archive, the Internet Memory Foundation and Web archives maintained by European national libraries.
Max ERC Funding
2 493 600 €
Duration
Start date: 2014-03-01, End date: 2019-02-28
Project acronym ALGILE
Project Foundations of Algebraic and Dynamic Data Management Systems
Researcher (PI) Christoph Koch
Host Institution (HI) ECOLE POLYTECHNIQUE FEDERALE DE LAUSANNE
Call Details Starting Grant (StG), PE6, ERC-2011-StG_20101014
Summary "Contemporary database query languages are ultimately founded on logic and feature an additive operation – usually a form of (multi)set union or disjunction – that is asymmetric in that additions or updates do not always have an inverse. This asymmetry puts a greater part of the machinery of abstract algebra for equation solving outside the reach of databases. However, such equation solving would be a key functionality that problems such as query equivalence testing and data integration could be reduced to: In the current scenario of the presence of an asymmetric additive operation they are undecidable. Moreover, query languages with a symmetric additive operation (i.e., which has an inverse and is thus based on ring theory) would open up databases for a large range of new scientific and mathematical applications.
The goal of the proposed project is to reinvent database management systems with a foundation in abstract algebra, and specifically in ring theory. The presence of an additive inverse makes it possible to cleanly define differences between queries. This gives rise to a database analog of differential calculus that leads to radically new incremental and adaptive query evaluation algorithms which substantially outperform state-of-the-art techniques. These algorithms enable a new class of systems which I call Dynamic Data Management Systems. Such systems can maintain continuously fresh query views at extremely high update rates and have important applications in interactive large-scale data analysis. There is a natural connection between differences and updates, motivating the group-theoretic study of updates that will lead to better ways of creating out-of-core data processing algorithms for new storage devices. Basing queries on ring theory leads to a new class of systems, Algebraic Data Management Systems, which herald a convergence of database systems and computer algebra systems.
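The ring idea can be made concrete with a minimal sketch (an illustration under simplified assumptions of my own, not the project's system): representing a relation as a map from tuples to signed multiplicities makes every update invertible, so a materialized aggregate view can be kept fresh by propagating deltas rather than recomputing the query. A deletion is simply the additive inverse of an insertion, which is exactly the symmetry the summary argues logic-based languages lack.

```python
from collections import defaultdict

def apply_delta(multiset, delta):
    """Merge a delta (signed multiplicities) into a multiset; negative
    counts encode deletions, so every update has an inverse."""
    for key, count in delta.items():
        multiset[key] += count
        if multiset[key] == 0:
            del multiset[key]

base = defaultdict(int)   # tuple -> multiplicity (may be negative)
view = defaultdict(int)   # group -> aggregate count, maintained incrementally

def on_update(delta):
    """Propagate a base-table delta to the view without recomputing it."""
    apply_delta(base, delta)
    view_delta = defaultdict(int)
    for (group, _value), count in delta.items():
        view_delta[group] += count
    apply_delta(view, view_delta)

on_update({("a", 1): 2, ("b", 7): 1})   # insertions
on_update({("a", 1): -1})               # deletion = inverse insertion
print(dict(view))                        # {'a': 1, 'b': 1}
```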
Max ERC Funding
1 480 548 €
Duration
Start date: 2012-01-01, End date: 2016-12-31
Project acronym ALUNIF
Project Algorithms and Lower Bounds: A Unified Approach
Researcher (PI) Rahul Santhanam
Host Institution (HI) THE CHANCELLOR, MASTERS AND SCHOLARS OF THE UNIVERSITY OF OXFORD
Call Details Consolidator Grant (CoG), PE6, ERC-2013-CoG
Summary One of the fundamental goals of theoretical computer science is to
understand the possibilities and limits of efficient computation. This
quest has two dimensions. The
theory of algorithms focuses on finding efficient solutions to
problems, while computational complexity theory aims to understand when
and why problems are hard to solve. These two areas have different
philosophies and use different sets of techniques. However, in recent
years there have been indications of deep and mysterious connections
between them.
In this project, we propose to explore and develop the connections between
algorithmic analysis and complexity lower bounds in a systematic way.
On the one hand, we plan to use complexity lower bound techniques as inspiration
to design new and improved algorithms for Satisfiability and other
NP-complete problems, as well as to analyze existing algorithms better.
On the other hand, we plan to strengthen implications yielding circuit
lower bounds from non-trivial algorithms for Satisfiability, and to derive
new circuit lower bounds using these stronger implications.
This project has potential for massive impact in both the areas of algorithms
and computational complexity. Improved algorithms for Satisfiability could lead
to improved SAT solvers, and the new analytical tools would lead to a better
understanding of existing heuristics. Complexity lower bound questions are
fundamental
but notoriously difficult, and new lower bounds would open the way to
unconditionally secure cryptographic protocols and derandomization of
probabilistic algorithms. More broadly, this project aims to initiate greater
dialogue between the two areas, with an exchange of ideas and techniques
which leads to accelerated progress in both, as well as a deeper understanding
of the nature of efficient computation.
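As one concrete example of the kind of non-trivial Satisfiability algorithm referred to above (a classical result, not an output of this project), Schöning's randomized local search solves satisfiable 3-SAT instances in expected time roughly (4/3)^n, exponentially faster than 2^n brute force. A minimal sketch:

```python
import random

def schoening_3sat(clauses, n, tries=1000):
    """Schoening's randomized walk for 3-SAT. Clauses are lists of nonzero
    ints: literal v > 0 means variable v is True, v < 0 means it is False.
    Each try starts from a random assignment and performs 3n greedy-random
    flips; a satisfiable formula succeeds with probability ~ (3/4)^n per try."""
    def satisfied(clause, a):
        return any((lit > 0) == a[abs(lit)] for lit in clause)

    for _ in range(tries):
        a = {v: random.random() < 0.5 for v in range(1, n + 1)}
        for _ in range(3 * n):
            unsat = [c for c in clauses if not satisfied(c, a)]
            if not unsat:
                return a                   # satisfying assignment found
            lit = random.choice(random.choice(unsat))
            a[abs(lit)] = not a[abs(lit)]  # flip a variable of an unsatisfied clause
    return None                            # none found; likely unsatisfiable

# (x1 or x2 or not x3) and (not x1 or x3 or x2)
print(schoening_3sat([[1, 2, -3], [-1, 3, 2]], n=3))
```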
Max ERC Funding
1 274 496 €
Duration
Start date: 2014-03-01, End date: 2019-02-28
Project acronym AMD
Project Algorithmic Mechanism Design: Beyond Truthful Mechanisms
Researcher (PI) Michal Feldman
Host Institution (HI) TEL AVIV UNIVERSITY
Call Details Starting Grant (StG), PE6, ERC-2013-StG
Summary "The first decade of Algorithmic Mechanism Design (AMD) concentrated, very successfully, on the design of truthful mechanisms for the allocation of resources among agents with private preferences.
Truthful mechanisms are ones that incentivize rational users to report their preferences truthfully.
Truthfulness, however, for all its theoretical appeal, suffers from several inherent limitations, mainly its high communication and computation complexities.
It is not surprising, therefore, that practical applications forego truthfulness and use simpler mechanisms instead.
Simplicity in itself, however, is not sufficient, as any meaningful mechanism should also have some notion of fairness; otherwise agents will stop using it over time.
In this project I plan to develop an innovative AMD theoretical framework that will go beyond truthfulness and focus instead on the natural themes of simplicity and fairness, in addition to computational tractability.
One of my primary goals will be the design of simple and fair poly-time mechanisms that perform at near optimal levels with respect to important economic objectives such as social welfare and revenue.
To this end, I will work toward providing precise definitions of simplicity and fairness and quantifying the effects of these restrictions on the performance levels that can be obtained.
A major challenge in the evaluation of non-truthful mechanisms is defining a reasonable behavior model that will enable their evaluation.
The success of this project could have a broad impact on Europe and beyond, as it would guide the design of natural mechanisms for markets of tens of billions of dollars in revenue, such as online advertising, or sales of wireless frequencies.
The timing of this project is ideal, as the AMD field is now sufficiently mature to lead to a breakthrough and at the same time young enough to be receptive to new approaches and themes."
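To make the notion of truthfulness concrete, here is a minimal sketch of the Vickrey (second-price) auction, the textbook truthful mechanism (illustrative only, not part of the proposal): because the winner pays the runner-up's bid rather than their own, reporting one's true value is a dominant strategy.

```python
def vickrey_auction(bids):
    """Second-price sealed-bid auction: the highest bidder wins but pays the
    second-highest bid, so no bidder can gain by misreporting their value."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    price = ranked[1][1] if len(ranked) > 1 else 0
    return winner, price

print(vickrey_auction({"alice": 10, "bob": 7, "carol": 4}))  # ('alice', 7)
```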
Max ERC Funding
1 394 600 €
Duration
Start date: 2013-11-01, End date: 2018-10-31
Project acronym AMPLIFY
Project Amplifying Human Perception Through Interactive Digital Technologies
Researcher (PI) Albrecht Schmidt
Host Institution (HI) LUDWIG-MAXIMILIANS-UNIVERSITAET MUENCHEN
Call Details Consolidator Grant (CoG), PE6, ERC-2015-CoG
Summary Current technical sensor systems offer capabilities that are superior to human perception. Cameras can capture a spectrum that is wider than visible light, high-speed cameras can show movements that are invisible to the human eye, and directional microphones can pick up sounds at long distances. The vision of this project is to lay a foundation for the creation of digital technologies that provide novel sensory experiences and new perceptual capabilities for humans that are natural and intuitive to use. In a first step, the project will assess the feasibility of creating artificial human senses that provide new perceptual channels to the human mind, without increasing the experienced cognitive load. A particular focus is on creating intuitive and natural control mechanisms for amplified senses using eye gaze, muscle activity, and brain signals. Through the creation of a prototype that provides mildly unpleasant stimulations in response to perceived information, the feasibility of implementing an artificial reflex will be experimentally explored. The project will quantify the effectiveness of new senses and artificial perceptual aids compared to the baseline of unaugmented perception. The overall objective is to systematically research, explore, and model new means for increasing the human intake of information in order to lay the foundation for new and improved human senses enabled through digital technologies and to enable artificial reflexes. The ground-breaking contributions of this project are (1) to demonstrate the feasibility of reliably implementing amplified senses and new perceptual capabilities, (2) to prove the possibility of creating an artificial reflex, (3) to provide an example implementation of amplified cognition that is empirically validated, and (4) to develop models, concepts, components, and platforms that will enable and ease the creation of interactive systems that measurably increase human perceptual capabilities.
Max ERC Funding
1 925 250 €
Duration
Start date: 2016-07-01, End date: 2021-06-30