Project acronym 3D-OA-HISTO
Project Development of 3D Histopathological Grading of Osteoarthritis
Researcher (PI) Simo Jaakko Saarakkala
Host Institution (HI) OULUN YLIOPISTO
Call Details Starting Grant (StG), LS7, ERC-2013-StG
Summary "Background: Osteoarthritis (OA) is a common musculoskeletal disease occurring worldwide. Despite extensive research, etiology of OA is still poorly understood. Histopathological grading (HPG) of 2D tissue sections is the gold standard reference method for determination of OA stage. However, traditional 2D-HPG is destructive and based only on subjective visual evaluation. These limitations induce bias to clinical in vitro OA diagnostics and basic research that both rely strongly on HPG.
Objectives: 1) To establish and validate the very first 3D-HPG of OA based on cutting-edge nano/micro-CT (Computed Tomography) technologies in vitro; 2) To use the established method to clarify the beginning phases of OA; and 3) To validate 3D-HPG of OA for in vivo use.
Methods: Several hundred human osteochondral samples from patients undergoing total knee arthroplasty will be collected. The samples will be imaged in vitro with nano/micro-CT and clinical high-end extremity CT devices using specific contrast agents to quantify tissue constituents and structure in 3D over large volumes. From this information, a novel 3D-HPG will be developed with statistical classification algorithms. Finally, the novel 3D-HPG of OA will be applied clinically in vivo.
Significance: This is the very first study to establish 3D-HPG of OA pathology in vitro and in vivo. Furthermore, the developed technique will greatly improve the understanding of the beginning phases of OA. Ultimately, the study will contribute to improving OA patients’ quality of life by slowing disease progression, and to providing powerful tools for developing new OA therapies."
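To make the statistical-classification step concrete, here is a minimal sketch of how CT-derived features might be mapped to histopathological grades. The feature set, grade scale, and data are hypothetical placeholders, not the project's actual pipeline:

```python
# Minimal sketch: learn a mapping from contrast-enhanced CT features to
# histopathological grades. Features, grades, and data are hypothetical.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical per-sample features quantified from nano/micro-CT volumes
# (e.g., surface roughness, mean attenuation, contrast-agent uptake depth).
X = rng.normal(size=(300, 4))
# Hypothetical reference grades (0-3) from conventional 2D histopathology.
y = rng.integers(0, 4, size=300)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)   # agreement with the 2D reference
print(f"mean cross-validated accuracy: {scores.mean():.2f}")
```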
Max ERC Funding
1 500 000 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym A-HERO
Project Anthelmintic Research and Optimization
Researcher (PI) Jennifer Irene Keiser
Host Institution (HI) SCHWEIZERISCHES TROPEN- UND PUBLIC HEALTH-INSTITUT
Call Details Consolidator Grant (CoG), LS7, ERC-2013-CoG
Summary "I propose an ambitious, yet feasible 5-year research project that will fill an important gap in global health. Specifically, I will develop and validate novel approaches for anthelmintic drug discovery and development. My proposal pursues the following five research questions: (i) Is a chip calorimeter suitable for high-throughput screening in anthelmintic drug discovery? (ii) Is combination chemotherapy safe and more efficacious than monotherapy against strongyloidiasis and trichuriasis? (iii) What are the key pharmacokinetic parameters of praziquantel in preschool-aged children and school-aged children infected with Schistosoma mansoni and S. haematobium using a novel and validated technology based on dried blood spotting? (iv) What are the metabolic consequences and clearance of praziquantel treatment in S. mansoni-infected mice and S. mansoni- and S. haematobium-infected children? (v) Which is the ideal compartment to study pharmacokinetic parameters for intestinal nematode infections and does age, nutrition, co-infection and infection intensity influence the efficacy of anthelmintic drugs?
My proposed research is of considerable public health relevance, since it will ultimately result in improved treatments for soil-transmitted helminthiasis and pediatric schistosomiasis. Additionally, by the end of this project, I will have generated comprehensive information on the drug disposition of anthelmintics. A comprehensive database of metabolite profiles following praziquantel treatment will be available. Finally, the proof-of-concept of chip calorimetry in anthelmintic drug discovery will have been established and broadly validated."
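As a rough illustration of research question (iii), the sketch below estimates standard pharmacokinetic parameters (Cmax, AUC, terminal half-life) non-compartmentally from a concentration-time series; the praziquantel values shown are invented for illustration, not study data:

```python
# Non-compartmental PK sketch: Cmax, AUC (trapezoidal rule), and terminal
# half-life from a concentration-time series. All values are invented.
import numpy as np

t = np.array([0.5, 1.0, 2.0, 4.0, 6.0, 8.0, 12.0])        # h post-dose
c = np.array([0.80, 1.90, 1.40, 0.70, 0.35, 0.18, 0.05])   # ug/mL

cmax, tmax = c.max(), t[c.argmax()]
auc = np.sum((c[1:] + c[:-1]) / 2 * np.diff(t))             # AUC over sampling window

# Terminal half-life from a log-linear fit to the last three points.
slope, _ = np.polyfit(t[-3:], np.log(c[-3:]), 1)
t_half = np.log(2) / -slope

print(f"Cmax={cmax} ug/mL at {tmax} h; AUC={auc:.2f} ug*h/mL; t1/2={t_half:.1f} h")
```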
Max ERC Funding
1 927 350 €
Duration
Start date: 2014-05-01, End date: 2019-04-30
Project acronym ACDC
Project Algorithms and Complexity of Highly Decentralized Computations
Researcher (PI) Fabian Daniel Kuhn
Host Institution (HI) ALBERT-LUDWIGS-UNIVERSITAET FREIBURG
Call Details Starting Grant (StG), PE6, ERC-2013-StG
Summary "Many of today's and tomorrow's computer systems are built on top of large-scale networks such as, e.g., the Internet, the world wide web, wireless ad hoc and sensor networks, or peer-to-peer networks. Driven by technological advances, new kinds of networks and applications have become possible and we can safely assume that this trend is going to continue. Often modern systems are envisioned to consist of a potentially large number of individual components that are organized in a completely decentralized way. There is no central authority that controls the topology of the network, how nodes join or leave the system, or in which way nodes communicate with each other. Also, many future distributed applications will be built using wireless devices that communicate via radio.
The general objective of the proposed project is to improve our understanding of the algorithmic and theoretical foundations of decentralized distributed systems. From an algorithmic point of view, decentralized networks and computations pose a number of fascinating and unique challenges that are not present in sequential or more standard distributed systems. As communication is limited and mostly between nearby nodes, each node of a large network can only maintain a very restricted view of the global state of the system. This is particularly true if the network can change dynamically, either by nodes joining or leaving the system or if the topology changes over time, e.g., because of the mobility of the devices in case of a wireless network. Nevertheless, the nodes of a network need to coordinate in order to achieve some global goal.
In particular, we plan to study algorithms and lower bounds for basic computation and information dissemination tasks in such systems. In addition, we are particularly interested in the complexity of distributed computations in dynamic and wireless networks."
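The locality constraint described above (each node sees only a restricted view of the global state) can be illustrated with a toy synchronous message-passing simulation: in r rounds, a node can learn at most its r-hop neighborhood. The ring topology and flooding protocol below are illustrative assumptions, not part of the project:

```python
# Toy synchronous message-passing simulation of the locality constraint:
# after r rounds, a node knows exactly the IDs in its r-hop neighborhood.

def flood(adj, rounds):
    """Each round, every node sends its known ID set to all neighbors."""
    known = {v: {v} for v in adj}           # initially, a node knows only itself
    for _ in range(rounds):
        msgs = {v: set().union(*(known[u] for u in adj[v])) for v in adj}
        for v in adj:
            known[v] |= msgs[v]
    return known

# A ring of 8 nodes: each node's neighbors are one step away.
n = 8
adj = {v: [(v - 1) % n, (v + 1) % n] for v in range(n)}

for r in (1, 2, 3):
    print(r, sorted(flood(adj, r)[0]))      # node 0's view grows one hop per round
```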
Max ERC Funding
1 148 000 €
Duration
Start date: 2013-11-01, End date: 2018-10-31
Project acronym ACOPS
Project Advanced Coherent Ultrafast Laser Pulse Stacking
Researcher (PI) Jens Limpert
Host Institution (HI) FRIEDRICH-SCHILLER-UNIVERSITAT JENA
Call Details Consolidator Grant (CoG), PE2, ERC-2013-CoG
Summary "An important driver of scientific progress has always been the envisioning of applications far beyond existing technological capabilities. Such thinking creates new challenges for physicists, driven by the groundbreaking nature of the anticipated application. In the case of laser physics, one of these applications is laser wake-field particle acceleration and possible future uses thereof, such as in collider experiments, or for medical applications such as cancer treatment. To accelerate electrons and positrons to TeV-energies, a laser architecture is required that allows for the combination of high efficiency, Petawatt peak powers, and Megawatt average powers. Developing such a laser system would be a challenging task that might take decades of aggressive research, development, and, most important, revolutionary approaches and innovative ideas.
The goal of the ACOPS project is to develop a compact, efficient, scalable, and cost-effective high-average and high-peak power ultra-short pulse laser concept.
The proposed approach to this goal relies on the spatially and temporally separated amplification of ultrashort laser pulses in waveguide structures, followed by coherent combination into a single train of pulses with increased average power and pulse energy. This combination can be realized through the coherent addition of the output beams of spatially separated amplifiers, combined with the pulse stacking of temporally separated pulses in passive enhancement cavities, employing a fast-switching element as cavity dumper.
Therefore, the three main tasks are the development of kW-class high-repetition-rate driving lasers, the investigation of non-steady state pulse enhancement in passive cavities, and the development of a suitable dumping element.
If successful, the proposed concept would undoubtedly provide a tool that would allow researchers to surpass the current limits in high-field physics and accelerator science."
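The scaling benefit behind coherent combination and pulse stacking can be illustrated numerically: adding M field replicas in phase concentrates the energy of M pulses into one, boosting peak intensity by M², whereas uncontrolled phases yield only about M on average. The sketch below shows only this scaling, not a model of the proposed amplifier-and-cavity architecture:

```python
# Toy illustration of coherent pulse stacking: M in-phase field replicas
# give an M^2 peak-intensity gain; random phases give only ~M on average.
import numpy as np

rng = np.random.default_rng(1)
M = 8
field = 1.0 + 0j                      # peak field amplitude of one replica

coherent = abs(M * field) ** 2        # all replicas in phase
random_phase = abs(np.sum(np.exp(1j * rng.uniform(0, 2 * np.pi, M)))) ** 2

print(f"coherent peak intensity: {coherent:.0f} (= M^2 = {M**2})")
print(f"random-phase example:    {random_phase:.1f} (~M = {M} on average)")
```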
Max ERC Funding
1 881 040 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym ACROSS
Project 3D Reconstruction and Modeling across Different Levels of Abstraction
Researcher (PI) Leif Kobbelt
Host Institution (HI) RHEINISCH-WESTFAELISCHE TECHNISCHE HOCHSCHULE AACHEN
Call Details Advanced Grant (AdG), PE6, ERC-2013-ADG
Summary "Digital 3D models are gaining more and more importance in diverse application fields ranging from computer graphics, multimedia and simulation sciences to engineering, architecture, and medicine. Powerful technologies to digitize the 3D shape of real objects and scenes are becoming available even to consumers. However, the raw geometric data emerging from, e.g., 3D scanning or multi-view stereo often lacks a consistent structure and meta-information which are necessary for the effective deployment of such models in sophisticated down-stream applications like animation, simulation, or CAD/CAM that go beyond mere visualization. Our goal is to develop new fundamental algorithms which transform raw geometric input data into augmented 3D models that are equipped with structural meta information such as feature aligned meshes, patch segmentations, local and global geometric constraints, statistical shape variation data, or even procedural descriptions. Our methodological approach is inspired by the human perceptual system that integrates bottom-up (data-driven) and top-down (model-driven) mechanisms in its hierarchical processing. Similarly we combine algorithms operating on different levels of abstraction into reconstruction and modeling networks. Instead of developing an individual solution for each specific application scenario, we create an eco-system of algorithms for automatic processing and interactive design of highly complex 3D models. A key concept is the information flow across all levels of abstraction in a bottom-up as well as top-down fashion. We not only aim at optimizing geometric representations but in fact at bridging the gap between reconstruction and recognition of geometric objects. The results from this project will make it possible to bring 3D models of real world objects into many highly relevant applications in science, industry, and entertainment, greatly reducing the excessive manual effort that is still necessary today."
Max ERC Funding
2 482 000 €
Duration
Start date: 2014-03-01, End date: 2019-02-28
Project acronym ACTAR TPC
Project Active Target and Time Projection Chamber
Researcher (PI) Gwen Grinyer
Host Institution (HI) GRAND ACCELERATEUR NATIONAL D'IONS LOURDS
Call Details Starting Grant (StG), PE2, ERC-2013-StG
Summary The active target and time projection chamber (ACTAR TPC) is a novel gas-filled detection system that will permit new studies of the structure and decays of the most exotic nuclei. The use of a gas volume that acts both as a sensitive detection medium and as the reaction target itself (an “active target”) offers considerable advantages over traditional nuclear physics detectors and techniques. In high-energy physics, TPC detectors have found profitable applications, but their use in nuclear physics has been limited. With the ACTAR TPC design, individual detection pad sizes of 2 mm are the smallest ever attempted in either discipline but are a requirement for high-efficiency and high-resolution nuclear spectroscopy. The corresponding large number of electronic channels (16,000 from a surface of only 25×25 cm) requires new developments in high-density electronics and data-acquisition systems that are not yet available in the nuclear physics domain. New experiments in regions of the nuclear chart that cannot presently be contemplated will become feasible with ACTAR TPC.
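A quick consistency check of the quoted numbers: 2 mm pads tiling a 25×25 cm pad plane give 125 × 125 = 15,625 pads, matching the roughly 16,000 electronic channels mentioned in the summary.

```python
# Pad-count check for the numbers in the summary: 2 mm pads on a
# 25 cm x 25 cm pad plane.
side_cm, pad_mm = 25, 2
pads_per_side = side_cm * 10 // pad_mm        # 125 pads per side
print(pads_per_side ** 2)                     # 15625, i.e. ~16,000 channels
```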
Max ERC Funding
1 290 000 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym ACUITY
Project Algorithms for coping with uncertainty and intractability
Researcher (PI) Nikhil Bansal
Host Institution (HI) TECHNISCHE UNIVERSITEIT EINDHOVEN
Call Details Consolidator Grant (CoG), PE6, ERC-2013-CoG
Summary The two biggest challenges in solving practical optimization problems are computational intractability and the presence of uncertainty: most problems are either NP-hard, or have incomplete input data which makes an exact computation impossible.
Recently, there has been huge progress in our understanding of intractability, based on spectacular algorithmic and lower-bound techniques. For several problems, especially those with only local constraints, we can design optimum approximation algorithms that are provably the best possible. However, typical optimization problems usually involve complex global constraints and are much less understood. The situation is even worse for coping with uncertainty. Most of the algorithms are based on ad hoc techniques and there is no deeper understanding of what makes various problems easy or hard.
This proposal describes several new directions, together with concrete intermediate goals, that will break important new ground in the theory of approximation and online algorithms. The particular directions we consider are to (i) extend the primal-dual method to systematically design online algorithms, (ii) build a structural theory of online problems based on work functions, (iii) develop new tools to use the power of strong convex relaxations, and (iv) design new algorithmic approaches based on non-constructive proof techniques.
The proposed research is at the cutting edge of algorithm design, and builds upon the recent success of the PI in resolving several longstanding questions in these areas. Any progress is likely to be a significant contribution to theoretical computer science and combinatorial optimization.
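As a concrete warm-up for the online-algorithms theme in direction (i), the sketch below implements the classic ski-rental break-even rule, whose competitive guarantee (cost always below twice the offline optimum) is a standard first example derived in the online primal-dual framework; it is textbook material, not taken from the proposal:

```python
# Ski rental: rent (cost 1/day) or buy (cost B) without knowing how long
# the season lasts. The break-even rule -- rent until payments reach B,
# then buy -- always costs less than twice the offline optimum.

def break_even_cost(days: int, buy_price: int) -> int:
    """Online cost: rent for buy_price - 1 days, then buy on day buy_price."""
    if days < buy_price:
        return days                      # season ended before we bought
    return (buy_price - 1) + buy_price   # rentals paid, then the purchase

def optimal_cost(days: int, buy_price: int) -> int:
    return min(days, buy_price)          # offline optimum knows `days`

B = 10
for days in (3, 9, 10, 100):
    alg, opt = break_even_cost(days, B), optimal_cost(days, B)
    print(f"days={days}: online={alg}, offline={opt}, ratio={alg / opt:.2f}")
```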
Max ERC Funding
1 519 285 €
Duration
Start date: 2014-05-01, End date: 2019-04-30
Project acronym AdOC
Project Advance Optical Clocks
Researcher (PI) Sebastien André Marcel Bize
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE2, ERC-2013-CoG
Summary "The proposed research program has three main objectives. The first and second objectives are to seek extreme precisions in optical atomic spectroscopy and optical clocks, and to use this quest as a mean of exploration in atomic physics. The third objective is to explore new possibilities that stem from extreme precision. These goals will be pursued via three complementary activities: #1: Search for extreme precisions with an Hg optical lattice clock. #2: Explore and exploit the rich Hg system, which is essentially unexplored in the cold and ultra-cold regime. #3: Identify new applications of clocks with extreme precision to Earth science. Clocks can measure directly the gravitational potential via Einstein’s gravitational redshift, leading to the idea of “clock-based geodesy”.
The first two activities are experimental and build on an existing setup, where we demonstrated the feasibility of an Hg optical lattice clock. Hg is chosen for its potential to surpass competing systems. We will investigate the unexplored physics of the Hg clock. This includes interactions between Hg atoms, lattice-induced light shifts, and sensitivity to external fields, which are specific to the atomic species. Beyond this, we will explore the fundamental limits of the optical lattice scheme. We will exploit other remarkable features of Hg associated with the high atomic number and the diversity of stable isotopes. These features enable tests of fundamental physical laws, ultra-precise measurements of isotope shifts, measurement of collisional properties toward evaporative cooling and quantum gases of Hg, and investigation of forbidden transitions promising for measuring the nuclear anapole moment of Hg.
The third activity is theoretical and is aimed at initiating collaborations with experts in modelling Earth gravity. With this expertise, we will identify the most promising and realistic approaches for clocks and emerging remote comparison methods to contribute to geodesy, hydrology, oceanography, etc."
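The clock-based geodesy idea rests on the standard gravitational redshift relation; the figures below are textbook values near Earth's surface, not project results:

```latex
\[
  \frac{\Delta \nu}{\nu} \;=\; \frac{\Delta U}{c^{2}}
  \;\approx\; \frac{g\,\Delta h}{c^{2}}
  \;\approx\; 1.1 \times 10^{-18} \ \text{per centimetre of height}
\]
```

Hence a clock with fractional frequency uncertainty at the 10⁻¹⁸ level resolves centimetre-scale differences in geodetic height, which is what makes extreme-precision clocks relevant to geodesy, hydrology, and oceanography.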
Max ERC Funding
1 946 432 €
Duration
Start date: 2014-04-01, End date: 2019-03-31
Project acronym ALEXANDRIA
Project "Foundations for Temporal Retrieval, Exploration and Analytics in Web Archives"
Researcher (PI) Wolfgang Nejdl
Host Institution (HI) GOTTFRIED WILHELM LEIBNIZ UNIVERSITAET HANNOVER
Call Details Advanced Grant (AdG), PE6, ERC-2013-ADG
Summary "Significant parts of our cultural heritage are produced on the Web, yet only insufficient opportunities exist for accessing and exploring the past of the Web. The ALEXANDRIA project aims to develop models, tools and techniques necessary to archive and index relevant parts of the Web, and to retrieve and explore this information in a meaningful way. While the easy accessibility to the current Web is a good baseline, optimal access to Web archives requires new models and algorithms for retrieval, exploration, and analytics which go far beyond what is needed to access the current state of the Web. This includes taking into account the unique temporal dimension of Web archives, structured semantic information already available on the Web, as well as social media and network information.
Within ALEXANDRIA, we will significantly advance semantic and time-based indexing for Web archives using human-compiled knowledge available on the Web, to efficiently index, retrieve and explore information about entities and events from the past. In doing so, we will focus on the concurrent evolution of this knowledge and the Web content to be indexed, and take into account diversity and incompleteness of this knowledge. We will further investigate mixed crowd- and machine-based Web analytics to support long- running and collaborative retrieval and analysis processes on Web archives. Usage of implicit human feedback will be essential to provide better indexing through insights during the analysis process and to better focus harvesting of content.
The ALEXANDRIA Testbed will provide an important context for research, exploration and evaluation of the concepts, methods and algorithms developed in this project, and will provide both relevant collections and algorithms that enable further research on and practical application of our research results to existing archives like the Internet Archive, the Internet Memory Foundation and Web archives maintained by European national libraries."
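As a minimal illustration of time-aware indexing for web archives, the sketch below keys each posting by a validity interval so queries can be evaluated as of a point in time; the schema is a hypothetical simplification, not the project's index design:

```python
# Toy temporal inverted index: each posting records when a page version
# contained the term, so searches can be restricted to a point in time.
from collections import defaultdict

index = defaultdict(list)  # term -> [(url, valid_from, valid_to)]

def add_version(url, terms, valid_from, valid_to):
    for term in set(terms):
        index[term].append((url, valid_from, valid_to))

def search(term, at_time):
    """Return URLs whose archived version contained `term` at `at_time`."""
    return [url for url, t0, t1 in index[term] if t0 <= at_time <= t1]

add_version("example.org", ["olympics", "athens"], 2004, 2008)
add_version("example.org", ["olympics", "beijing"], 2008, 2012)
print(search("athens", 2005))   # ['example.org']
print(search("athens", 2010))   # []
```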
Max ERC Funding
2 493 600 €
Duration
Start date: 2014-03-01, End date: 2019-02-28
Project acronym ALUNIF
Project Algorithms and Lower Bounds: A Unified Approach
Researcher (PI) Rahul Santhanam
Host Institution (HI) THE CHANCELLOR, MASTERS AND SCHOLARS OF THE UNIVERSITY OF OXFORD
Call Details Consolidator Grant (CoG), PE6, ERC-2013-CoG
Summary One of the fundamental goals of theoretical computer science is to understand the possibilities and limits of efficient computation. This quest has two dimensions. The theory of algorithms focuses on finding efficient solutions to problems, while computational complexity theory aims to understand when and why problems are hard to solve. These two areas have different philosophies and use different sets of techniques. However, in recent years there have been indications of deep and mysterious connections between them.
In this project, we propose to explore and develop the connections between algorithmic analysis and complexity lower bounds in a systematic way. On the one hand, we plan to use complexity lower bound techniques as inspiration to design new and improved algorithms for Satisfiability and other NP-complete problems, as well as to analyze existing algorithms better. On the other hand, we plan to strengthen implications yielding circuit lower bounds from non-trivial algorithms for Satisfiability, and to derive new circuit lower bounds using these stronger implications.
This project has potential for massive impact in both the areas of algorithms and computational complexity. Improved algorithms for Satisfiability could lead to improved SAT solvers, and the new analytical tools would lead to a better understanding of existing heuristics. Complexity lower bound questions are fundamental but notoriously difficult, and new lower bounds would open the way to unconditionally secure cryptographic protocols and derandomization of probabilistic algorithms. More broadly, this project aims to initiate greater dialogue between the two areas, with an exchange of ideas and techniques which leads to accelerated progress in both, as well as a deeper understanding of the nature of efficient computation.
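To make "non-trivial algorithms for Satisfiability" concrete: these are algorithms that beat the exhaustive 2ⁿ baseline sketched below, and, as the summary notes, such savings can be leveraged into circuit lower bounds. This is a toy sketch of the baseline, not a technique from the proposal:

```python
# The 2^n exhaustive-search baseline that "non-trivial" SAT algorithms
# must beat. Literal +i / -i means variable x_i is true / false.
from itertools import product

def brute_force_sat(n_vars, clauses):
    """Return a satisfying assignment of the CNF, or None if unsatisfiable."""
    for assignment in product([False, True], repeat=n_vars):   # 2^n candidates
        if all(any(assignment[abs(l) - 1] == (l > 0) for l in clause)
               for clause in clauses):
            return assignment
    return None

# (x1 or not x2) and (x2 or x3) and (not x1 or not x3)
print(brute_force_sat(3, [[1, -2], [2, 3], [-1, -3]]))
```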
Max ERC Funding
1 274 496 €
Duration
Start date: 2014-03-01, End date: 2019-02-28