Project acronym ACDC
Project Algorithms and Complexity of Highly Decentralized Computations
Researcher (PI) Fabian Daniel Kuhn
Host Institution (HI) ALBERT-LUDWIGS-UNIVERSITAET FREIBURG
Country Germany
Call Details Starting Grant (StG), PE6, ERC-2013-StG
Summary "Many of today's and tomorrow's computer systems are built on top of large-scale networks such as, e.g., the Internet, the world wide web, wireless ad hoc and sensor networks, or peer-to-peer networks. Driven by technological advances, new kinds of networks and applications have become possible and we can safely assume that this trend is going to continue. Often modern systems are envisioned to consist of a potentially large number of individual components that are organized in a completely decentralized way. There is no central authority that controls the topology of the network, how nodes join or leave the system, or in which way nodes communicate with each other. Also, many future distributed applications will be built using wireless devices that communicate via radio.
The general objective of the proposed project is to improve our understanding of the algorithmic and theoretical foundations of decentralized distributed systems. From an algorithmic point of view, decentralized networks and computations pose a number of fascinating and unique challenges that are not present in sequential or more standard distributed systems. As communication is limited and mostly between nearby nodes, each node of a large network can only maintain a very restricted view of the global state of the system. This is particularly true if the network can change dynamically, either by nodes joining or leaving the system or if the topology changes over time, e.g., because of the mobility of the devices in case of a wireless network. Nevertheless, the nodes of a network need to coordinate in order to achieve some global goal.
In particular, we plan to study algorithms and lower bounds for basic computation and information dissemination tasks in such systems. In addition, we are particularly interested in the complexity of distributed computations in dynamic and wireless networks."
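The most basic information dissemination task in this round-based message-passing setting is flooding a piece of information from one node to all others; the sketch below is a minimal simulation of it (graph and names invented for illustration, not from the proposal), showing why the round complexity is governed by the network diameter.

```python
# Minimal sketch (illustrative only): synchronous flooding of a single message
# through a network, simulated round by round. The number of rounds to inform
# all nodes equals the eccentricity of the source, i.e. O(diameter).

def flood(adjacency, source):
    """Return the number of synchronous rounds until every node is informed."""
    informed = {source}
    rounds = 0
    while len(informed) < len(adjacency):
        # In one round, every informed node sends to all of its neighbours.
        frontier = {v for u in informed for v in adjacency[u]} - informed
        if not frontier:
            raise ValueError("graph is disconnected; flooding cannot finish")
        informed |= frontier
        rounds += 1
    return rounds

# Example: a path on 4 nodes; flooding from one end takes 3 rounds.
path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
print(flood(path, source=0))  # -> 3
```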
Max ERC Funding
1 148 000 €
Duration
Start date: 2013-11-01, End date: 2018-10-31
Project acronym ACOPS
Project Advanced Coherent Ultrafast Laser Pulse Stacking
Researcher (PI) Jens Limpert
Host Institution (HI) FRIEDRICH-SCHILLER-UNIVERSITAT JENA
Country Germany
Call Details Consolidator Grant (CoG), PE2, ERC-2013-CoG
Summary "An important driver of scientific progress has always been the envisioning of applications far beyond existing technological capabilities. Such thinking creates new challenges for physicists, driven by the groundbreaking nature of the anticipated application. In the case of laser physics, one of these applications is laser wake-field particle acceleration and possible future uses thereof, such as in collider experiments, or for medical applications such as cancer treatment. To accelerate electrons and positrons to TeV-energies, a laser architecture is required that allows for the combination of high efficiency, Petawatt peak powers, and Megawatt average powers. Developing such a laser system would be a challenging task that might take decades of aggressive research, development, and, most important, revolutionary approaches and innovative ideas.
The goal of the ACOPS project is to develop a compact, efficient, scalable, and cost-effective high-average and high-peak power ultra-short pulse laser concept.
The proposed approach to this goal relies on the spatially and temporally separated amplification of ultrashort laser pulses in waveguide structures, followed by coherent combination into a single train of pulses with increased average power and pulse energy. This combination can be realized through the coherent addition of the output beams of spatially separated amplifiers, combined with the pulse stacking of temporally separated pulses in passive enhancement cavities, employing a fast-switching element as cavity dumper.
Therefore, the three main tasks are the development of kW-class high-repetition-rate driving lasers, the investigation of non-steady state pulse enhancement in passive cavities, and the development of a suitable dumping element.
If successful, the proposed concept would undoubtedly provide a tool that would allow researchers to surpass the current limits in high-field physics and accelerator science.
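As a rough numerical illustration of the combining step (a sketch under simplified assumptions, not the project's design): coherent combination adds field amplitudes, so N equal pulses stack into one pulse carrying up to N times the single-pulse energy, while residual phase errors between the combined channels reduce the combining efficiency by roughly the phase variance.

```python
# Minimal sketch (illustrative only): combining efficiency of N equal-amplitude
# channels with residual phase errors. Perfect phasing gives 1.0; small rms
# phase jitter sigma (radians) costs roughly sigma**2 of the combined energy.
import numpy as np

def combining_efficiency(phase_errors_rad):
    """Fraction of the total input energy that ends up in the combined pulse."""
    fields = np.exp(1j * np.asarray(phase_errors_rad))  # equal-amplitude fields
    return abs(fields.sum()) ** 2 / (fields.size * (np.abs(fields) ** 2).sum())

n = 8
print(combining_efficiency(np.zeros(n)))              # -> 1.0 (perfect phasing)
rng = np.random.default_rng(1)
print(combining_efficiency(rng.normal(0.0, 0.3, n)))  # ~0.9 for 0.3 rad rms jitter
```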
Max ERC Funding
1 881 040 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym ACROSS
Project 3D Reconstruction and Modeling across Different Levels of Abstraction
Researcher (PI) Leif Kobbelt
Host Institution (HI) RHEINISCH-WESTFAELISCHE TECHNISCHE HOCHSCHULE AACHEN
Country Germany
Call Details Advanced Grant (AdG), PE6, ERC-2013-ADG
Summary "Digital 3D models are gaining more and more importance in diverse application fields ranging from computer graphics, multimedia and simulation sciences to engineering, architecture, and medicine. Powerful technologies to digitize the 3D shape of real objects and scenes are becoming available even to consumers. However, the raw geometric data emerging from, e.g., 3D scanning or multi-view stereo often lacks a consistent structure and meta-information which are necessary for the effective deployment of such models in sophisticated down-stream applications like animation, simulation, or CAD/CAM that go beyond mere visualization. Our goal is to develop new fundamental algorithms which transform raw geometric input data into augmented 3D models that are equipped with structural meta information such as feature aligned meshes, patch segmentations, local and global geometric constraints, statistical shape variation data, or even procedural descriptions. Our methodological approach is inspired by the human perceptual system that integrates bottom-up (data-driven) and top-down (model-driven) mechanisms in its hierarchical processing. Similarly we combine algorithms operating on different levels of abstraction into reconstruction and modeling networks. Instead of developing an individual solution for each specific application scenario, we create an eco-system of algorithms for automatic processing and interactive design of highly complex 3D models. A key concept is the information flow across all levels of abstraction in a bottom-up as well as top-down fashion. We not only aim at optimizing geometric representations but in fact at bridging the gap between reconstruction and recognition of geometric objects. The results from this project will make it possible to bring 3D models of real world objects into many highly relevant applications in science, industry, and entertainment, greatly reducing the excessive manual effort that is still necessary today."
Max ERC Funding
2 482 000 €
Duration
Start date: 2014-03-01, End date: 2019-02-28
Project acronym ALEXANDRIA
Project "Foundations for Temporal Retrieval, Exploration and Analytics in Web Archives"
Researcher (PI) Wolfgang Nejdl
Host Institution (HI) GOTTFRIED WILHELM LEIBNIZ UNIVERSITAET HANNOVER
Country Germany
Call Details Advanced Grant (AdG), PE6, ERC-2013-ADG
Summary "Significant parts of our cultural heritage are produced on the Web, yet only insufficient opportunities exist for accessing and exploring the past of the Web. The ALEXANDRIA project aims to develop models, tools and techniques necessary to archive and index relevant parts of the Web, and to retrieve and explore this information in a meaningful way. While the easy accessibility to the current Web is a good baseline, optimal access to Web archives requires new models and algorithms for retrieval, exploration, and analytics which go far beyond what is needed to access the current state of the Web. This includes taking into account the unique temporal dimension of Web archives, structured semantic information already available on the Web, as well as social media and network information.
Within ALEXANDRIA, we will significantly advance semantic and time-based indexing for Web archives using human-compiled knowledge available on the Web, to efficiently index, retrieve and explore information about entities and events from the past. In doing so, we will focus on the concurrent evolution of this knowledge and the Web content to be indexed, and take into account diversity and incompleteness of this knowledge. We will further investigate mixed crowd- and machine-based Web analytics to support long- running and collaborative retrieval and analysis processes on Web archives. Usage of implicit human feedback will be essential to provide better indexing through insights during the analysis process and to better focus harvesting of content.
The ALEXANDRIA Testbed will provide an important context for research, exploration and evaluation of the concepts, methods and algorithms developed in this project, and will provide both relevant collections and algorithms that enable further research on and practical application of our research results to existing archives like the Internet Archive, the Internet Memory Foundation and Web archives maintained by European national libraries."
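To make the time-based indexing idea concrete, here is a minimal sketch (invented document IDs and years, not the project's system) of an inverted index whose postings carry validity intervals, so that queries can be restricted to a point in archived time:

```python
# Illustrative sketch of time-aware indexing: each posting records the interval
# during which a page version containing the term was live in the archive.
from collections import defaultdict

index = defaultdict(list)  # term -> list of (doc_id, valid_from, valid_to)

def add_version(doc_id, text, valid_from, valid_to):
    for term in set(text.lower().split()):
        index[term].append((doc_id, valid_from, valid_to))

def search(term, at_year):
    """Documents whose indexed version was live in the given year."""
    return [d for d, t0, t1 in index[term.lower()] if t0 <= at_year <= t1]

add_version("page1", "Olympic Games Athens", 2004, 2007)
add_version("page1", "Olympic Games Beijing", 2008, 2011)
print(search("athens", 2005))  # -> ['page1']
print(search("athens", 2010))  # -> []
```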
Max ERC Funding
2 493 600 €
Duration
Start date: 2014-03-01, End date: 2019-02-28
Project acronym ALUNIF
Project Algorithms and Lower Bounds: A Unified Approach
Researcher (PI) Rahul Santhanam
Host Institution (HI) THE CHANCELLOR, MASTERS AND SCHOLARS OF THE UNIVERSITY OF OXFORD
Country United Kingdom
Call Details Consolidator Grant (CoG), PE6, ERC-2013-CoG
Summary One of the fundamental goals of theoretical computer science is to understand the possibilities and limits of efficient computation. This quest has two dimensions. The theory of algorithms focuses on finding efficient solutions to problems, while computational complexity theory aims to understand when and why problems are hard to solve. These two areas have different philosophies and use different sets of techniques. However, in recent years there have been indications of deep and mysterious connections between them.
In this project, we propose to explore and develop the connections between algorithmic analysis and complexity lower bounds in a systematic way. On the one hand, we plan to use complexity lower bound techniques as inspiration to design new and improved algorithms for Satisfiability and other NP-complete problems, as well as to analyze existing algorithms better. On the other hand, we plan to strengthen implications yielding circuit lower bounds from non-trivial algorithms for Satisfiability, and to derive new circuit lower bounds using these stronger implications.
This project has potential for massive impact in both the areas of algorithms and computational complexity. Improved algorithms for Satisfiability could lead to improved SAT solvers, and the new analytical tools would lead to a better understanding of existing heuristics. Complexity lower bound questions are fundamental but notoriously difficult, and new lower bounds would open the way to unconditionally secure cryptographic protocols and derandomization of probabilistic algorithms. More broadly, this project aims to initiate greater dialogue between the two areas, with an exchange of ideas and techniques which leads to accelerated progress in both, as well as a deeper understanding of the nature of efficient computation.
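For context, the Satisfiability algorithms in question improve on exhaustive search over all 2^n assignments; the sketch below (a textbook illustration, not the proposal's techniques) shows the classic DPLL backtracking scheme with unit propagation that underlies practical SAT solvers:

```python
# Illustrative sketch: DPLL backtracking for CNF-SAT with unit propagation.
# A formula is a list of clauses; a clause is a list of non-zero ints where
# -v denotes the negation of variable v (DIMACS-style encoding).

def dpll(clauses, assignment=None):
    assignment = dict(assignment or {})
    changed = True
    while changed:  # unit propagation to a fixed point
        changed = False
        simplified = []
        for clause in clauses:
            lits, satisfied = [], False
            for lit in clause:
                val = assignment.get(abs(lit))
                if val is None:
                    lits.append(lit)
                elif (lit > 0) == val:
                    satisfied = True
                    break
            if satisfied:
                continue
            if not lits:
                return None                 # empty clause: conflict
            if len(lits) == 1:              # unit clause: forced assignment
                assignment[abs(lits[0])] = lits[0] > 0
                changed = True
            simplified.append(lits)
        clauses = simplified
    if not clauses:
        return assignment                   # all clauses satisfied
    var = abs(clauses[0][0])                # branch on an unassigned variable
    for value in (True, False):
        result = dpll(clauses, {**assignment, var: value})
        if result is not None:
            return result
    return None

print(dpll([[1, 2], [-1, 2], [-2, 3]]))  # -> {1: True, 2: True, 3: True}
```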
Max ERC Funding
1 274 496 €
Duration
Start date: 2014-03-01, End date: 2019-02-28
Project acronym ANTI-ATOM
Project Many-body theory of antimatter interactions with atoms, molecules and condensed matter
Researcher (PI) Dermot GREEN
Host Institution (HI) THE QUEEN'S UNIVERSITY OF BELFAST
Country United Kingdom
Call Details Starting Grant (StG), PE2, ERC-2018-STG
Summary The ability of positrons to annihilate with electrons, producing characteristic gamma rays, gives them important uses in medicine via positron-emission tomography (PET), in diagnostics of industrially important materials, and in elucidating astrophysical phenomena. Moreover, the fundamental interactions of positrons and positronium (Ps) with atoms, molecules and condensed matter are currently under intensive study in numerous international laboratories, to illuminate collision phenomena and perform precision tests of fundamental laws.
Proper interpretation and development of these costly and difficult experiments requires accurate calculations of low-energy positron and Ps interactions with normal matter. These systems, however, involve strong correlations, e.g., polarisation of the atom and virtual-Ps formation (where an atomic electron tunnels to the positron): they significantly affect positron- and Ps-atom/molecule interactions, e.g., enhancing annihilation rates by many orders of magnitude, and they make the accurate description of these systems a challenging many-body problem. Current theoretical capability lags severely behind that of experiment. Major theoretical and computational developments are required to bridge the gap.
One powerful method, which accounts for the correlations in a natural, transparent and systematic way, is many-body theory (MBT). Building on my expertise in the field, I propose to develop new MBT to deliver unique and unrivalled capability in the theory and computation of low-energy positron and Ps interactions with atoms, molecules, and condensed matter. The ambitious programme will provide the basic understanding required to interpret and develop the fundamental experiments, antimatter-based materials science techniques, and wider technologies, e.g., PET, and, more broadly, potentially revolutionary and generally applicable computational methodologies that promise to define a new level of high precision in atomic MBT calculations.
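In this flavour of MBT, the central equation is the Dyson equation for the positron quasiparticle wavefunction, with the correlations above encoded in a nonlocal, energy-dependent self-energy; a standard statement (my gloss of the textbook form, not quoted from the proposal) is:

```latex
% Dyson equation for the positron quasiparticle wavefunction \psi_\varepsilon:
% H_0 is the zeroth-order (static-field) Hamiltonian and \Sigma_\varepsilon the
% nonlocal, energy-dependent correlation potential (self-energy).
H_0 \psi_\varepsilon(\mathbf{r})
  + \int \Sigma_\varepsilon(\mathbf{r}, \mathbf{r}')\,
         \psi_\varepsilon(\mathbf{r}')\, \mathrm{d}\mathbf{r}'
  = \varepsilon\, \psi_\varepsilon(\mathbf{r})
```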
Max ERC Funding
1 318 419 €
Duration
Start date: 2019-02-01, End date: 2024-01-31
Project acronym ANTICIPATE
Project Anticipatory Human-Computer Interaction
Researcher (PI) Andreas BULLING
Host Institution (HI) UNIVERSITAET STUTTGART
Country Germany
Call Details Starting Grant (StG), PE6, ERC-2018-STG
Summary Even after three decades of research on human-computer interaction (HCI), current general-purpose user interfaces (UI) still lack the ability to attribute mental states to their users, i.e. they fail to understand users' intentions and needs and to anticipate their actions. This drastically restricts their interactive capabilities.
ANTICIPATE aims to establish the scientific foundations for a new generation of user interfaces that pro-actively adapt to users' future input actions by monitoring their attention and predicting their interaction intentions - thereby significantly improving the naturalness, efficiency, and user experience of the interactions. Realising this vision of anticipatory human-computer interaction requires groundbreaking advances in everyday sensing of user attention from eye and brain activity. We will further pioneer methods to predict entangled user intentions and forecast interactive behaviour with fine temporal granularity during interactions in everyday stationary and mobile settings. Finally, we will develop fundamental interaction paradigms that enable anticipatory UIs to pro-actively adapt to users' attention and intentions in a mindful way. The new capabilities will be demonstrated in four challenging cases: 1) mobile information retrieval, 2) intelligent notification management, 3) autism diagnosis and monitoring, and 4) computer-based training.
Anticipatory human-computer interaction offers a strong complement to existing UI paradigms that only react to user input post-hoc. If successful, ANTICIPATE will deliver the first important building blocks for implementing Theory of Mind in general-purpose UIs. As such, the project has the potential to drastically improve the billions of interactions we perform with computers every day, to trigger a wide range of follow-up research in HCI as well as adjacent areas within and outside computer science, and to act as a key technical enabler for new applications, e.g. in healthcare and education.
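As a toy illustration of intention prediction from attention signals (entirely synthetic features and data, not ANTICIPATE's models): a classifier estimating whether a user is about to act on the on-screen target they are fixating:

```python
# Illustrative sketch: predicting an imminent input action from simple gaze
# features. Both the features and the "ground truth" below are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 500
# Hypothetical features: fixation duration (s), gaze-to-target distance (px).
fixation = rng.uniform(0.05, 1.0, n)
distance = rng.uniform(0.0, 400.0, n)
# Synthetic labels: long fixations near the target precede a click.
will_click = ((fixation > 0.4) & (distance < 120)).astype(int)

X = np.column_stack([fixation, distance])
model = LogisticRegression().fit(X, will_click)
# Probability that the user is about to act on the fixated target:
print(model.predict_proba([[0.6, 50.0]])[0, 1])   # high
print(model.predict_proba([[0.1, 350.0]])[0, 1])  # low
```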
Max ERC Funding
1 499 625 €
Duration
Start date: 2019-02-01, End date: 2024-01-31
Project acronym ANYON
Project Engineering and exploring anyonic quantum gases
Researcher (PI) Christof WEITENBERG
Host Institution (HI) UNIVERSITAET HAMBURG
Country Germany
Call Details Starting Grant (StG), PE2, ERC-2018-STG
Summary This project launches the experimental investigation of anyonic quantum gases. We will study anyons – conjectured particles with a statistical exchange phase anywhere between 0 and π – in different many-body systems. This progress will be enabled by a unique approach that brings together artificial gauge fields and quantum gas microscopes for ultracold atoms.
Specifically, we will implement the 1D anyon Hubbard model via a lattice shaking protocol that imprints density-dependent Peierls phases. By engineering the statistical exchange phase, we can continuously tune between bosons and fermions and explore a statistically-induced quantum phase transition. We will monitor the continuous fermionization via the build-up of Friedel oscillations. Using state-of-the-art cold atom technology, we will thus open the physics of anyons to experimental research and address open questions related to their fractional exclusion statistics.
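The model referred to here is commonly written as bosons hopping with a density-dependent Peierls phase (my gloss of the standard form; sign and ordering conventions for the phase factor vary):

```latex
% 1D anyon-Hubbard Hamiltonian realised with bosons b_j carrying a
% density-dependent Peierls phase. \theta is the statistical angle
% (\theta = 0: bosons, \theta = \pi: pseudofermions); n_j = b_j^\dagger b_j;
% J and U are the hopping amplitude and on-site interaction.
H = -J \sum_j \left( b_j^{\dagger}\, e^{i \theta n_j}\, b_{j+1}
      + \text{h.c.} \right)
    + \frac{U}{2} \sum_j n_j \left( n_j - 1 \right)
```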
Secondly, we will create fractional quantum Hall systems in rapidly rotating microtraps. Using the quantum gas microscope, we will i) control the optical potentials at a level which allows approaching the centrifugal limit and ii) use small atom numbers equal to the inserted angular momentum quantum number. The strongly-correlated ground states such as the Laughlin state can be identified via their characteristic density correlations. Of particular interest are the quasihole excitations, whose predicted anyonic exchange statistics have not been directly observed to date. We will probe and test their statistics via the characteristic counting sequence in the excitation spectrum. Furthermore, we will test ideas to transfer anyonic properties of the excitations to a second tracer species. This approach will enable us to both probe the fractional exclusion statistics of the excitations and to create a 2D anyonic quantum gas.
In the long run, these techniques open a path to also study non-Abelian anyons with ultracold atoms.
Max ERC Funding
1 497 500 €
Duration
Start date: 2019-01-01, End date: 2023-12-31
Project acronym APPLAUSE
Project Adolescent Precursors to Psychiatric Disorders – Learning from Analysis of User-Service Engagement
Researcher (PI) Sara Evans
Host Institution (HI) LONDON SCHOOL OF ECONOMICS AND POLITICAL SCIENCE
Country United Kingdom
Call Details Starting Grant (StG), LS7, ERC-2013-StG
Summary APPLAUSE’s aim is to produce a body of evidence that illustrates how young people with mental health problems currently interact with both formal mental health services and informal social and familial support structures. Careful analysis of data gathered in the UK and Brazil will allow formulation of globally relevant insights into mental health care delivery for young people, which will be presented internationally as a resource for future health care service design.
APPLAUSE will allow the collection of an important data set that does not currently exist in this field, and will look to other disciplines for innovative approaches to data analysis. Whilst standard analysis may allow for snapshots of health service use, innovative life-course methods will allow us to characterise complete patterns of service use across each individual participant's experience of accessing mental health care and social support.
Adolescence is a critical period in mental health development, which has been largely neglected by public health efforts. Psychiatric disorders rank as the primary cause of disability among individuals aged 10-24 years, worldwide. Moreover, many health risk behaviours emerge during adolescence and 70% of adult psychiatric disorders are preceded by mental health problems during adolescent years. However, delays to receiving care for psychiatric disorders, following disorder onset, average more than ten years, and little is known about the factors which impede access to and continuity of care among young people with mental health problems. APPLAUSE will analyse current access models, reports of individual experiences of positive and negative interactions with health care services, and the culturally embedded social factors that impact on such access. Addressing this complex problem from a global perspective will advance the development of a more diverse and innovative set of strategies for improving earlier access to care.
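To illustrate the life-course representation mentioned above (a sketch with invented states and participants, not the project's data or coding scheme): each participant's service engagement becomes a state sequence that can be summarised and compared across the cohort:

```python
# Illustrative sketch: monthly service-engagement state sequences, the basic
# data structure of life-course / sequence analysis. States are hypothetical:
# N = no contact, P = primary care, S = specialist care.
from collections import Counter

participants = {
    "p01": "NNNPPPSSSSNN",
    "p02": "NNNNNNNNPPSS",
    "p03": "PPPSSSSSSSNN",
}

def months_to_specialist(seq):
    """Months from the start of observation until first specialist contact."""
    return seq.index("S") if "S" in seq else None

for pid, seq in participants.items():
    print(pid, months_to_specialist(seq), Counter(seq))
```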
Max ERC Funding
1 499 948 €
Duration
Start date: 2014-01-01, End date: 2018-12-31
Project acronym ARBODYNAMIC
Project Coupling dynamic population immunity profiles and host behaviours to arboviral spread
Researcher (PI) Henrik SALJE
Host Institution (HI) THE CHANCELLOR MASTERS AND SCHOLARS OF THE UNIVERSITY OF CAMBRIDGE
Country United Kingdom
Call Details Starting Grant (StG), LS8, ERC-2018-STG
Summary Arboviruses infect millions of people each year; however, the mechanisms that drive viral emergence and maintenance remain largely unknown. A combination of host factors (e.g., human mobility), mosquito factors (e.g., abundance) and viral factors (e.g., transmissibility) interconnect to drive spread. Further, for endemic arboviruses, complex patterns of population immunity, built up over many years, appear key to the emergence of particular lineages. To disentangle the contribution of these different drivers, we need detailed data from the same pathogen system over a long time period from the same location. In addition, we need new methods that can integrate these different data sources and allow appropriate mechanistic inferences.
In this project, I will use the most globally prevalent arbovirus, dengue virus, as a case study. I will focus on Thailand, where all four dengue serotypes have circulated endemically for decades and excellent long-term data and isolates exist, to address two fundamental questions:
i) How do population-level patterns of immunity evolve over time and what is their impact on strain dynamics? I will use mechanistic models applied to historic serotype-specific case data to reconstruct the evolving immune profile of the population and explore the impact of immunity on viral diversity using sequences from archived isolates from each year over a 50-year period.
ii) How do human behaviors and vector densities interact with immunity to dictate spread? I will work with geolocated full-genome sequences from across Thailand and use detailed data on how people move, their contact patterns, their immunity profiles and mosquito distributions to study competing hypotheses of how arboviruses spread. I will compare the key drivers of dengue spread with those found for outbreaks of Zika and chikungunya.
This proposal addresses fundamental questions about the mechanisms that drive arboviral emergence and spread that will be relevant across disease systems.
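A toy version of the immune-profile reconstruction in question i) (a sketch under crude assumptions — fixed reporting rate, closed population, made-up case counts — not the project's mechanistic models) shows the basic bookkeeping:

```python
# Illustrative sketch: a crude population immunity profile for one dengue
# serotype reconstructed from yearly reported cases, assuming every infection
# confers lasting immunity to that serotype.
import numpy as np

population = 1_000_000
reporting_rate = 0.05      # assumed fraction of infections that get reported
yearly_cases = np.array([800, 1500, 300, 50, 900, 2500])  # made-up counts

infections = yearly_cases / reporting_rate
# Fraction ever infected with (hence immune to) this serotype after each year:
cumulative_immune = np.minimum(np.cumsum(infections) / population, 1.0)
print(np.round(cumulative_immune, 3))
```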
Max ERC Funding
1 499 896 €
Duration
Start date: 2019-01-01, End date: 2023-12-31