Project acronym AMPLIFY
Project Amplifying Human Perception Through Interactive Digital Technologies
Researcher (PI) Albrecht Schmidt
Host Institution (HI) LUDWIG-MAXIMILIANS-UNIVERSITAET MUENCHEN
Call Details Consolidator Grant (CoG), PE6, ERC-2015-CoG
Summary Current technical sensor systems offer capabilities that are superior to human perception. Cameras can capture a spectrum that is wider than visible light, high-speed cameras can show movements that are invisible to the human eye, and directional microphones can pick up sounds at long distances. The vision of this project is to lay a foundation for the creation of digital technologies that provide novel sensory experiences and new perceptual capabilities for humans that are natural and intuitive to use. In a first step, the project will assess the feasibility of creating artificial human senses that provide new perceptual channels to the human mind, without increasing the experienced cognitive load. A particular focus is on creating intuitive and natural control mechanisms for amplified senses using eye gaze, muscle activity, and brain signals. Through the creation of a prototype that provides mildly unpleasant stimulations in response to perceived information, the feasibility of implementing an artificial reflex will be experimentally explored. The project will quantify the effectiveness of new senses and artificial perceptual aids compared to the baseline of unaugmented perception. The overall objective is to systematically research, explore, and model new means for increasing the human intake of information in order to lay the foundation for new and improved human senses enabled through digital technologies and to enable artificial reflexes. The ground-breaking contributions of this project are (1) to demonstrate the feasibility of reliably implementing amplified senses and new perceptual capabilities, (2) to prove the possibility of creating an artificial reflex, (3) to provide an example implementation of amplified cognition that is empirically validated, and (4) to develop models, concepts, components, and platforms that will enable and ease the creation of interactive systems that measurably increase human perceptual capabilities.
Max ERC Funding
1 925 250 €
Duration
Start date: 2016-07-01, End date: 2021-06-30
Project acronym APEG
Project Algorithmic Performance Guarantees: Foundations and Applications
Researcher (PI) Susanne ALBERS
Host Institution (HI) TECHNISCHE UNIVERSITAET MUENCHEN
Call Details Advanced Grant (AdG), PE6, ERC-2015-AdG
Summary Optimization problems are ubiquitous in computer science. Almost every problem involves the optimization of some objective function. However, a major part of these problems cannot be solved to optimality. Therefore, algorithms that achieve provably good performance guarantees are of immense importance. Considerable progress has already been made, but great challenges remain: Some fundamental problems are not well understood. Moreover, for central problems arising in new applications, no solutions are known at all.
The goal of APEG is to significantly advance the state of the art on algorithmic performance guarantees. Specifically, the project has two missions: First, it will develop new algorithmic techniques, breaking new ground in the areas of online algorithms, approximation algorithms and algorithmic game theory. Second, it will apply these techniques to solve fundamental problems that are central in these algorithmic disciplines. APEG will attack long-standing open problems, some of which have been unresolved for several decades. Furthermore, it will formulate and investigate new algorithmic problems that arise in modern applications. The research agenda encompasses a broad spectrum of classical and timely topics including (a) resource allocation in computer systems, (b) data structuring, (c) graph problems, with relations to Internet advertising, (d) complex networks, and (e) massively parallel systems. In addition to basic optimization objectives, the project will also study the new performance metric of energy minimization in computer systems.
Overall, APEG pursues cutting-edge algorithms research, focusing on both foundational problems and applications. Progress on any of these problems promises a breakthrough or a significant contribution.
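The kind of provable performance guarantee APEG targets can be illustrated with a classic textbook case (our example, not part of the proposal): the ski-rental problem. Renting skis costs 1 per day, buying costs B, and the number of skiing days n is unknown in advance. The break-even online strategy (rent for B-1 days, then buy) is 2-competitive: on every input its cost is less than twice the optimal offline cost.

```python
def break_even_cost(n, B):
    """Cost of the online break-even strategy over n days (buy price B)."""
    if n < B:
        return n            # season ended before the break-even point: rented every day
    return (B - 1) + B      # rented B-1 days, then bought

def offline_opt(n, B):
    """Optimal offline cost when n is known in advance: rent or buy outright."""
    return min(n, B)

# Empirically check the guarantee: the competitive ratio stays below 2
# for every buy price B and every season length n in the tested range.
worst = max(break_even_cost(n, B) / offline_opt(n, B)
            for B in range(2, 50) for n in range(1, 200))
print(worst)  # (2B-1)/B for the largest tested B, i.e. just under 2
```

The worst case is achieved when the season ends exactly at day B, giving cost (2B-1) against the offline optimum B; the ratio (2B-1)/B approaches but never reaches 2.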
Max ERC Funding
2 404 250 €
Duration
Start date: 2016-10-01, End date: 2021-09-30
Project acronym ARCA
Project Analysis and Representation of Complex Activities in Videos
Researcher (PI) Juergen Gall
Host Institution (HI) RHEINISCHE FRIEDRICH-WILHELMS-UNIVERSITAT BONN
Call Details Starting Grant (StG), PE6, ERC-2015-STG
Summary The goal of the project is to automatically analyse human activities observed in videos. Any solution to this problem will allow the development of novel applications. It could be used to create short videos that summarize daily activities to support patients suffering from Alzheimer's disease. It could also be used for education, e.g., by providing a video analysis for a trainee in the hospital that shows if the tasks have been correctly executed.
The analysis of complex activities in videos, however, is very challenging since activities vary in temporal duration between minutes and hours, involve interactions with several objects that change their appearance and shape, e.g., food during cooking, and are composed of many sub-activities, which can happen at the same time or in various orders.
While the majority of recent works in action recognition focus on developing better feature-encoding techniques for classifying sub-activities in short video clips of a few seconds, this project moves forward and aims to develop a higher-level representation of complex activities to overcome the limitations of current approaches. This includes the handling of large time variations and the ability to recognize and locate complex activities in videos. To this end, we aim to develop a unified model that provides detailed information about the activities and sub-activities in terms of time and spatial location, as well as the involved pose motion, objects, and their transformations.
Another aspect of the project is to learn a representation from videos that is not tied to a specific source of videos or limited to a specific application. Instead we aim to learn a representation that is invariant to a perspective change, e.g., from a third-person perspective to an egocentric perspective, and can be applied to various modalities like videos or depth data without the need of collecting massive training data for all modalities. In other words, we aim to learn the essence of activities.
Max ERC Funding
1 499 875 €
Duration
Start date: 2016-06-01, End date: 2021-05-31
Project acronym CSP-Infinity
Project Homogeneous Structures, Constraint Satisfaction Problems, and Topological Clones
Researcher (PI) Manuel Bodirsky
Host Institution (HI) TECHNISCHE UNIVERSITAET DRESDEN
Call Details Consolidator Grant (CoG), PE6, ERC-2015-CoG
Summary The complexity of constraint satisfaction problems (CSPs) is a field in rapid development, and involves central questions in graph homomorphisms, finite model theory, reasoning in artificial intelligence, and, last but not least, universal algebra. In previous work, it was shown that a substantial part of the results and tools for the study of the computational complexity of CSPs can be generalised to infinite domains when the constraints are definable over a homogeneous structure. There are many computational problems, in particular in temporal and spatial reasoning, that can be modelled in this way, but not over finite domains. Also in finite model theory and descriptive complexity, CSPs over infinite domains arise systematically as problems in monotone fragments of existential second-order logic.
In this project, we will advance in three directions:
(a) Further develop the universal-algebraic approach for CSPs over homogeneous structures. E.g., provide evidence for a universal-algebraic tractability conjecture for such CSPs.
(b) Apply the universal-algebraic approach. In particular, classify the complexity of all problems in guarded monotone SNP, a logic discovered independently in finite model theory and ontology-based data access.
(c) Investigate the complexity of CSPs over those infinite domains that are most relevant in computer science, namely the integers, the rationals, and the reals. Can we adapt the universal-algebraic approach to this setting?
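A toy instance (our illustration, not taken from the proposal) shows why CSPs over infinite domains such as the rationals can still be tractable: constraints of the form x < y over Q are satisfiable iff the constraint digraph has no directed cycle, a polynomial-time check even though the domain is infinite.

```python
from collections import defaultdict

def satisfiable_over_Q(constraints):
    """constraints: list of (x, y) pairs meaning x < y over the rationals.
    Satisfiable iff the directed constraint graph is acyclic."""
    graph = defaultdict(list)
    nodes = set()
    for x, y in constraints:
        graph[x].append(y)
        nodes.update((x, y))

    WHITE, GRAY, BLACK = 0, 1, 2   # unvisited / on current DFS path / done
    color = {v: WHITE for v in nodes}

    def has_cycle(v):
        color[v] = GRAY
        for w in graph[v]:
            if color[w] == GRAY or (color[w] == WHITE and has_cycle(w)):
                return True
        color[v] = BLACK
        return False

    return not any(color[v] == WHITE and has_cycle(v) for v in nodes)

print(satisfiable_over_Q([("a", "b"), ("b", "c")]))  # True: a < b < c
print(satisfiable_over_Q([("a", "b"), ("b", "a")]))  # False: a < b < a is cyclic
```

The same satisfiability question over a finite domain of size k would additionally fail whenever a chain longer than k is forced, which is one way the infinite-domain setting genuinely differs from the finite one.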
Max ERC Funding
1 416 250 €
Duration
Start date: 2016-10-01, End date: 2021-09-30
Project acronym EPoCH
Project Exploring and Preventing Cryptographic Hardware Backdoors: Protecting the Internet of Things against Next-Generation Attacks
Researcher (PI) Christof PAAR
Host Institution (HI) RUHR-UNIVERSITAET BOCHUM
Call Details Advanced Grant (AdG), PE6, ERC-2015-AdG
Summary The digital landscape is currently undergoing an evolution towards the Internet of Things. The IoT comes with a dramatically increased threat potential, as attacks can endanger human life and can lead to a massive loss of privacy of (European) citizens. A particularly dangerous class of attacks manipulates the cryptographic algorithms in the underlying hardware. Backdoors in the cryptography of IoT devices can lead to system-wide loss of security. This proposal has the ambitious goal to comprehensively understand and counter low-level backdoor attacks. The required research consists of two major modules:
1) The development of an encompassing understanding of how hardware manipulations of cryptographic functions can actually be performed, and what the consequences are for system security. Exploring attacks is fundamental for designing strong countermeasures, analogous to the role of cryptanalysis in cryptology.
2) The development of hardware countermeasures that provide systematic protection against malicious manipulations. In contrast to detection-based methods which dominate the literature, our approach will be pro-active. We will develop solutions for instances of important problems, including hardware reverse engineering and hardware hiding. Little is known about the limits of and optimum approaches to both problems in specific settings.
Beyond prevention of hardware Trojans, the research will have applications in IP protection and will spark research in the theory of computer science community.
Max ERC Funding
2 498 286 €
Duration
Start date: 2016-10-01, End date: 2021-09-30
Project acronym GrInflaGal
Project Gravity, Inflation, and Galaxies: Fundamental Physics with Large-Scale Structure
Researcher (PI) Fabian Schmidt
Host Institution (HI) MAX-PLANCK-GESELLSCHAFT ZUR FORDERUNG DER WISSENSCHAFTEN EV
Call Details Starting Grant (StG), PE9, ERC-2015-STG
Summary Over the past two decades, a data-driven revolution has occurred in our understanding of the origin and evolution of our Universe and the structure within it. During this period, cosmology has evolved from a speculative branch of theoretical physics into a precision science at the intersection of gravity, particle physics, and astrophysics. Despite all we have learned, we still do not understand why the Universe accelerates, and how the structure in the Universe originated. Recent breakthrough research, with leading contributions by the PI of this proposal, has shown that we can make progress on these questions using observations of the large-scale structure and its tracers, galaxies. This opens up a fascinating, new interdisciplinary research field: probing Gravity and Inflation with Galaxies. The goal of the proposed research is to first, probe our theory of gravity, General Relativity, on cosmological scales. Second, it aims to shed light on the origin of the initial seed fluctuations out of which all structure in the Universe formed, by constraining the physics and energy scale of inflation. While seemingly unrelated, the main challenge in both research directions consists in understanding the nonlinear physics of structure formation, which is dominated by gravity on scales larger than a few Mpc. By making progress in this understanding, we can unlock a rich trove of information on fundamental physics from large-scale structure. The research goals will be pursued on all three fronts of analytical theory, numerical simulations, and confrontation with data. With space missions, such as Planck and Euclid, as well as ground-based surveys delivering data sets of unprecedented size and quality at this very moment, the proposed research is especially timely. It will make key contributions towards maximizing the science output of these experiments, deepen our understanding of the laws of physics, and uncover our cosmological origins.
Max ERC Funding
1 330 625 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym INTERCLOUDS
Project Using the Magellanic Clouds to Understand the Interaction of Galaxies
Researcher (PI) Maria-Rosa Cioni
Host Institution (HI) LEIBNIZ-INSTITUT FUR ASTROPHYSIK POTSDAM (AIP)
Call Details Consolidator Grant (CoG), PE9, ERC-2015-CoG
Summary The Magellanic Clouds are the nearest gas-rich dwarf satellites of the Milky Way and represent a typical example of an early phase of a minor merger event, the collision of galaxies that differ in mass by at least a factor of ten. In spite of their important role in supplying material to the Milky Way halo and the numerous investigations made in the last decade, there remain several uncertainties. Their origin is still a matter of debate, their satellite status is unclear, their mass is uncertain, their gravitational centres are undefined, their structure depends strongly on stellar populations and is severely shaped by interactions, their orbital history is only vaguely associated with star-forming events, and their chemical history rests upon limited data. This proposal aims to remedy this lack of knowledge by providing a comprehensive analysis of the stellar content of the Magellanic Clouds and dissecting the substructures that are related to their accretion history and their interaction with the Milky Way. Their internal kinematics and orbital history, establishing their bound/unbound status, will be resolved thanks to the analysis of state-of-the-art proper motions from the VMC survey and the Gaia mission, and the development of sophisticated theoretical models. Multi-wavelength photometric observations from ongoing large-scale projects will be analysed together to characterise the stellar population of the Magellanic Clouds as has never been previously attempted, including the effects of separate structural components. New large-scale spectroscopic survey projects in preparation will resolve metallicity dependencies and complete the full six-dimensional phase-space information (distance, position, and motion). This proposal will have a tremendous impact on our understanding of the consequences of minor mergers, and will offer a firm perspective on the Magellanic Clouds.
Max ERC Funding
1 985 017 €
Duration
Start date: 2016-10-01, End date: 2021-09-30
Project acronym OSARES
Project Output-Sensitive Algorithms for Reactive Synthesis
Researcher (PI) Bernd Erhard Finkbeiner
Host Institution (HI) UNIVERSITAT DES SAARLANDES
Call Details Consolidator Grant (CoG), PE6, ERC-2015-CoG
Summary Reactive synthesis has the potential to revolutionize the development of distributed embedded systems. From a given logical specification, the synthesis algorithm automatically constructs an implementation that is correct-by-design. The vision is that a designer analyzes the design objectives with a synthesis tool, automatically identifies competing or contradictory requirements and obtains an error-free prototype implementation. Coding and testing, the most expensive stages of development, are eliminated from the development process. Recent case studies from robotic control and from hardware design, such as the automatic synthesis of the AMBA AHB bus controller, demonstrate that this vision is in principle feasible. So far, however, synthesis does not scale to large systems. Even if successful, it produces code that is much larger and much more complicated than the code produced by human programmers for the same specification. Our goal is to address both of these fundamental shortcomings at the same time. We will develop output-sensitive synthesis algorithms, i.e. algorithms that, in addition to optimal performance in the size of the specification, also perform optimally in the size and structural complexity of the implementation. Target applications for our algorithms come from both the classic areas of reactive synthesis, such as hardware circuits, and from new and much more challenging application areas such as the distributed control and coordination of autonomous vehicles and manufacturing robots, which are far beyond the reach of the currently available synthesis algorithms.
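A deliberately tiny, hypothetical example (ours, not from the proposal) of what "correct-by-design" means: for the toy specification "every request is granted in the same step", synthesis would produce a transducer, here a one-state Mealy machine, whose output satisfies the specification on every possible input sequence, so no testing of individual traces is needed.

```python
from itertools import product

def synthesized_controller(inputs):
    """One-state Mealy machine: grant exactly when requested."""
    return [r for r in inputs]

def satisfies_spec(inputs, outputs):
    """Check the safety property G(request -> grant) on a finite trace."""
    return all((not r) or g for r, g in zip(inputs, outputs))

# Correct by construction: the specification holds on all 2^8
# boolean input sequences of length 8, not just on sampled traces.
print(all(satisfies_spec(seq, synthesized_controller(seq))
          for seq in product([False, True], repeat=8)))  # True
```

Output-sensitivity in OSARES's sense would mean that the cost of finding such an implementation scales with the size of this one-state machine, rather than with the (potentially enormous) state space derived from the specification.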
Max ERC Funding
1 995 000 €
Duration
Start date: 2016-07-01, End date: 2021-06-30
Project acronym PhysSF
Project Physics of Star Formation and Its Regulation
Researcher (PI) Eva SCHINNERER
Host Institution (HI) MAX-PLANCK-GESELLSCHAFT ZUR FORDERUNG DER WISSENSCHAFTEN EV
Call Details Advanced Grant (AdG), PE9, ERC-2015-AdG
Summary In the past decade we learned when and where stellar mass was built up in galaxies through cosmic time; now we must understand the physical causes in order to answer 'How do galaxies form and evolve?'. This ERC project is designed to greatly advance our understanding of the physics of the star formation (SF) process and its regulation in typical star-forming galaxies. The ERC project consists of two complementary parts: (A) an unparalleled characterization of the SF process in nearby galaxies through full exploitation of the revolutionary capabilities of the latest millimeter interferometers (ALMA) and optical integral field units (MUSE). This study will constrain the key physical parameters for the SF process on scales of only 50 pc - the scale of large HII regions and their predecessors, giant molecular clouds. At this crucial scale, the MUSE-ALMA-HST Survey will provide a characterization of the SF history, stellar/gaseous surface densities, metallicities of stars and gas, the stellar radiation field, extinction, and stellar/gas kinematics, and thus uncover the physical conditions that control and regulate the SF process. Part (B) will place the results of part (A) in a cosmological context by characterizing key galaxy quantities (e.g., gas mass fraction, specific SF rates, gas depletion times) in fully representative galaxy samples after (z<3) and before (z>3) the peak epoch of cosmic star formation density. In addition to providing the critically needed constraints on the conditions that govern the SF process, this ERC project will provide the observational benchmark for state-of-the-art galaxy simulations and models. The PI is internationally recognized as a leader in SF studies in nearby and distant galaxies, and has successfully led large international collaborations that strongly shaped our current understanding of the SF process. Through her track record and access to the required data, the PI is uniquely positioned to successfully lead this ambitious program.
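The key galaxy quantities named in part (B) are simple ratios of observables. As an illustrative sketch only (the masses and rates below are hypothetical example values, not project data), they can be computed as:

```python
# Hypothetical example values, for illustration only:
M_gas = 5.0e9    # total gas mass, solar masses
M_star = 4.0e10  # stellar mass, solar masses
SFR = 2.5        # star formation rate, solar masses per year

t_dep = M_gas / SFR                    # gas depletion time: years of SF left at current rate
sSFR = SFR / M_star                    # specific SF rate; its inverse is a mass-doubling timescale
gas_fraction = M_gas / (M_gas + M_star)  # fraction of baryonic mass still in gas

print(f"depletion time: {t_dep / 1e9:.1f} Gyr")  # 2.0 Gyr for these values
print(f"specific SFR:   {sSFR:.2e} per yr")
print(f"gas fraction:   {gas_fraction:.2f}")
```

Comparing such ratios across samples before and after the peak of cosmic star formation is what lets part (B) place the resolved nearby-galaxy results in context.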
Max ERC Funding
2 495 000 €
Duration
Start date: 2016-10-01, End date: 2021-09-30
Project acronym POWVER
Project Power to the People. Verified.
Researcher (PI) Holger Hermanns
Host Institution (HI) UNIVERSITAT DES SAARLANDES
Call Details Advanced Grant (AdG), PE6, ERC-2015-AdG
Summary Twenty years ago we were able to repair cars at home. Nowadays customer services repair coffee machines. By installing software updates. Soon you will no longer be able to repair your bike.
Embedded software innovations boost our society; they help us tremendously in our daily life. But we do not understand what the software does, regardless of how well educated or smart we are. Proprietary embedded software has become an opaque layer between functionality and user. That layer is thick enough to possibly induce malicious or unintended behaviour. Proprietary embedded software locks us out of the products we own.
We need a shift to open, and hence customisable, embedded software. However, a minor customisation might well have a strong unexpected impact, for instance on the longevity of an embedded battery or on the safety of the battery charging process. We thus need means to detect, quantify, and prevent such implications.
The POWVER project lays the foundations. It provides quantitative verification technology for system-level correctness, safety, dependability, and performability. In this endeavour, POWVER takes up a hard scientific challenge, one in which discrete and continuous, real-time, stochastic, and data- and user-dependent aspects are all deeply intertwined: embedded software for electric power management. Electric power is intricate for software to handle and safety-critical, yet vital for mobile devices and their longevity. Since ever more tools, gadgets, and vehicles run on batteries and use power harvesting, power management is a pivot of the future.
POWVER will demonstrate that quantitative verification of open embedded software is feasible, and can ensure safe and dependable operation of safety-critical devices. A proof of concept will target the field of electric mobility, set up as a blueprint for other battery-powered appliances. As such, POWVER is the nucleus for a radical change in the way embedded software quality is assured in general.
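The flavour of quantitative question POWVER addresses can be sketched with a toy example. The following is not the project's method; it is a minimal statistical-model-checking-style simulation, with an assumed stochastic load model and invented numbers, that estimates the probability a battery-powered task survives its mission:

```python
# Toy sketch: Monte Carlo estimate of P(battery lasts the whole mission).
# The load model (Gaussian per-step drain) and all numbers are illustrative
# assumptions, not POWVER results or techniques.
import random

def battery_run(capacity_mAh=1000.0, steps=100, seed=None):
    """Simulate one mission: each step drains a stochastic load.
    Returns True if the battery never hits empty."""
    rng = random.Random(seed)
    charge = capacity_mAh
    for _ in range(steps):
        load = rng.gauss(9.8, 2.0)   # mAh drained this step (assumed model)
        charge -= max(load, 0.0)
        if charge <= 0.0:
            return False
    return True

def estimate_survival(n_runs=10000, seed=0):
    """Monte Carlo estimate of the mission survival probability."""
    rng = random.Random(seed)
    ok = sum(battery_run(seed=rng.random()) for _ in range(n_runs))
    return ok / n_runs

print(f"estimated survival probability: {estimate_survival():.3f}")
```

A verification tool gives such probabilities with formal guarantees rather than sampling error, which is precisely why quantitative verification matters when a software customisation shifts the load model.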
Max ERC Funding
2 425 000 €
Duration
Start date: 2016-09-01, End date: 2021-08-31