Project acronym APPROXNP
Project Approximation of NP-hard optimization problems
Researcher (PI) Johan Håstad
Host Institution (HI) KUNGLIGA TEKNISKA HOEGSKOLAN
Call Details Advanced Grant (AdG), PE6, ERC-2008-AdG
Summary The proposed project aims to create a center of excellence devoted to understanding the approximability of NP-hard optimization problems. In particular, for central problems like vertex cover, coloring of graphs, and various constraint satisfaction problems, we want to study upper and lower bounds on how well they can be approximated in polynomial time. Many existing strong results are based on what is known as the Unique Games Conjecture (UGC), and a significant part of the project will be devoted to studying this conjecture. We expect that a major step in this process will be to further develop the understanding of Boolean functions on the Boolean hypercube. We anticipate that the tools needed for this will come in the form of harmonic analysis, which in turn will rely on the corresponding results in the analysis of functions over the real numbers.
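As background for readers outside the field (a textbook illustration, not taken from the proposal): the following minimal Python sketch shows what a polynomial-time approximation of an NP-hard problem looks like, using the classic maximal-matching algorithm for vertex cover, one of the problems named above. It always returns a cover at most twice the optimal size.

```python
# Illustrative sketch only (standard textbook algorithm, not project code):
# a 2-approximation for vertex cover via a maximal matching.

def vertex_cover_2approx(edges):
    """edges: iterable of (u, v) pairs -> set of vertices covering every edge."""
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            # Pick both endpoints of an uncovered edge (builds a maximal matching);
            # any optimal cover must contain at least one endpoint per matched edge.
            cover.update((u, v))
    return cover

# Example: a path on four vertices; the optimum cover has size 2, this returns at most 4.
print(vertex_cover_2approx([(1, 2), (2, 3), (3, 4)]))
```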
Max ERC Funding
2 376 000 €
Duration
Start date: 2009-01-01, End date: 2014-12-31
Project acronym ASTRODYN
Project Astrophysical Dynamos
Researcher (PI) Axel Brandenburg
Host Institution (HI) KUNGLIGA TEKNISKA HOEGSKOLAN
Call Details Advanced Grant (AdG), PE9, ERC-2008-AdG
Summary Magnetic fields in stars, planets, accretion discs, and galaxies are believed to be the result of a dynamo process converting kinetic energy into magnetic energy. This work focuses on the solar dynamo, but dynamos in other astrophysical systems will also be addressed. In particular, direct high-resolution three-dimensional simulations are used to understand particular aspects of the solar dynamo and ultimately to simulate the solar dynamo as a whole. Phenomenological approaches will be avoided in favor of obtaining rigorous results. A major problem is catastrophic quenching, i.e. the decline of dynamo effects in inverse proportion to the magnetic Reynolds number, which is huge. Tremendous advances have been made in the last few years since the cause of catastrophic quenching in dynamos has been understood in terms of magnetic helicity evolution. The numerical tools are now in place to allow for magnetic helicity fluxes via coronal mass ejections, thus alleviating catastrophic quenching. This work employs simulations in spherical shells, augmented by Cartesian simulations in special cases. The roles of the near-surface shear layer, the tachocline, and pumping in the bulk of the convection zone are to be clarified. The Pencil Code will be used for most applications. The code is third order in time and sixth order in space and is used for solving the hydromagnetic equations. It is a public domain code developed by roughly 20 scientists worldwide and maintained under a central versioning system at Nordita. Automatic nightly tests of currently 30 applications ensure the integrity of the code. It is used for a wide range of applications and may include the effects of radiation, self-gravity, dust, chemistry, variable ionization, and cosmic rays, in addition to those of magnetohydrodynamics. The code with its infrastructure offers a good opportunity for individuals within a broad group of people to develop new tools that may automatically be useful to others.
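For orientation (a commonly quoted parameterization from the dynamo literature, not a formula stated in the proposal): the "catastrophic quenching" described above is often written as a suppression of the mean-field alpha effect that scales with the magnetic Reynolds number R_m, so that for the huge astrophysical values of R_m the dynamo effect collapses already at weak mean fields.

```latex
% Standard catastrophic alpha-quenching parameterization (background illustration only):
\alpha(\overline{B}) \;=\; \frac{\alpha_0}{1 + R_{\mathrm{m}}\,\overline{B}^{2}/B_{\mathrm{eq}}^{2}},
\qquad B_{\mathrm{eq}} = \text{equipartition field strength},
```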
Max ERC Funding
2 220 000 €
Duration
Start date: 2009-02-01, End date: 2014-01-31
Project acronym CEMYSS
Project Cosmochemical Exploration of the first two Million Years of the Solar System
Researcher (PI) Marc Chaussidon
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE9, ERC-2008-AdG
Summary One of the major outcomes of recent studies on the formation of the Solar System is the recognition of the fundamental importance of processes which took place during the first 10 thousand to 2 or 3 million years of the lifetime of the Sun and its accretion disk. Astrophysical observations at optical to infrared wavelengths of circumstellar disks around young stars have shown the existence in the inner disk of high-temperature processing of the dust. X-ray observations of T-Tauri stars have revealed that they exhibit X-ray flare enhancements by several orders of magnitude. The work we have performed over the last years on the isotopic analysis of either solar wind trapped in lunar soils or of Ca-, Al-rich inclusions and chondrules from primitive chondrites has allowed us to link some of these astrophysical observations around young stars with processes, such as irradiation by energetic particles and UV light, which took place around the T-Tauri Sun. The aim of this project is to make decisive progress in our understanding of the early Solar System through the development of in situ high-precision isotopic measurements by ion microprobe in extra-terrestrial matter. The project will be focused on the exploration of the variations in the isotopic composition of O and Mg and in the concentration of short-lived radioactive nuclides, such as 26Al and 10Be, with half-lives shorter than 1.5 million years. A special emphasis will be put on the search for nuclides with very short half-lives such as 32Si (650 years) and 14C (5730 years), nuclides which have not yet been discovered in meteorites. These new data will bring critical information on, for instance, the astrophysical context for the formation of the Sun and the first solids in the accretion disk, or the timing and the processes by which protoplanets were formed and destroyed close to the Sun during the first 2 million years of the lifetime of the Solar System.
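As a background sketch (standard cosmochemistry, not a formula from the summary): the former presence of an extinct short-lived nuclide such as 26Al is read from meteoritic minerals as an excess of its daughter isotope correlated with the parent/daughter element ratio, the so-called fossil isochron, together with the ordinary decay law.

```latex
% Background illustration: fossil isochron for extinct 26Al and the decay law.
\left(\frac{^{26}\mathrm{Mg}}{^{24}\mathrm{Mg}}\right)_{\!\mathrm{measured}}
= \left(\frac{^{26}\mathrm{Mg}}{^{24}\mathrm{Mg}}\right)_{\!0}
+ \left(\frac{^{26}\mathrm{Al}}{^{27}\mathrm{Al}}\right)_{\!0}
  \left(\frac{^{27}\mathrm{Al}}{^{24}\mathrm{Mg}}\right)_{\!\mathrm{measured}},
\qquad
N(t) = N_0\, e^{-\lambda t}, \quad \lambda = \ln 2 / t_{1/2}.
```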
Max ERC Funding
1 270 419 €
Duration
Start date: 2009-01-01, End date: 2013-12-31
Project acronym ECC SCIENG
Project Error-correcting codes and their applications in Science and Engineering
Researcher (PI) Mohammad Amin Shokrollahi
Host Institution (HI) ECOLE POLYTECHNIQUE FEDERALE DE LAUSANNE
Call Details Advanced Grant (AdG), PE6, ERC-2008-AdG
Summary Error-correcting codes are combinatorial objects which have traditionally been used to enhance the transmission of data on unreliable media. They have experienced phenomenal growth since their birth some fifty years ago. Today, everyday tasks such as listening to a CD, accessing the hard disk of an electronic device, talking on a wireless phone, or downloading files from the Internet are impossible without the use of error-correcting codes. Though traditional communication still occupies center stage in the realm of applied coding theory, emerging applications are changing the rules of the game and calling for a new type of coding theory capable of addressing future needs. These are not limited to physical applications, however. In fact, coding theory is an integral part of solutions offered by researchers outside traditional physical communication to solve fundamental problems of interest, such as the complexity of computation, reliable transfer of bulk data, cryptographic protocols, self-correcting software, signal processing, or even computational biology. While research in the past fifty years has put traditional coding theory on firm theoretical grounds, emerging applications are in need of new tools and methods to design, analyze, and implement coding technologies capable of dealing with future needs. This is the main concern of the present proposal. To strike the right balance between length and impact we have identified five areas of research that span the full spectrum of coding theory, ranging from fundamental theoretical aspects to practical applications. We set out to develop new theoretical and practical models for the design and analysis of codes, and to explore new application areas hitherto untouched. A unique feature of this proposal is our choice of tools, ranging from classical areas of algebra, combinatorics, and probability theory to ideas and methods from theoretical computer science.
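To make "enhance the transmission of data on unreliable media" concrete, here is a minimal illustrative sketch (textbook material, not code from the project) of the Hamming(7,4) code: four data bits are expanded to seven, and any single flipped bit in transit can be located and corrected from the syndrome.

```python
# Illustrative Hamming(7,4) encoder/decoder (standard textbook construction).

def hamming74_encode(d):
    """d: list of 4 data bits -> list of 7 code bits (positions 1..7)."""
    c = [0] * 8                      # 1-indexed for clarity; c[0] unused
    c[3], c[5], c[6], c[7] = d       # data bits go to positions 3, 5, 6, 7
    c[1] = c[3] ^ c[5] ^ c[7]        # parity over positions whose index has bit 0 set
    c[2] = c[3] ^ c[6] ^ c[7]        # parity over positions whose index has bit 1 set
    c[4] = c[5] ^ c[6] ^ c[7]        # parity over positions whose index has bit 2 set
    return c[1:]

def hamming74_decode(r):
    """r: list of 7 received bits -> corrected 4 data bits."""
    r = [0] + list(r)                # back to 1-indexed
    s = ((r[1] ^ r[3] ^ r[5] ^ r[7])
         + 2 * (r[2] ^ r[3] ^ r[6] ^ r[7])
         + 4 * (r[4] ^ r[5] ^ r[6] ^ r[7]))   # syndrome = error position (0 = none)
    if s:
        r[s] ^= 1                    # flip the single corrupted bit
    return [r[3], r[5], r[6], r[7]]

word = [1, 0, 1, 1]
sent = hamming74_encode(word)
sent[2] ^= 1                          # corrupt one bit in transit
assert hamming74_decode(sent) == word
```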
Max ERC Funding
1 959 998 €
Duration
Start date: 2009-04-01, End date: 2013-03-31
Project acronym GRBS
Project Gamma Ray Bursts as a Focal Point of High Energy Astrophysics
Researcher (PI) Tsvi Piran
Host Institution (HI) THE HEBREW UNIVERSITY OF JERUSALEM
Call Details Advanced Grant (AdG), PE9, ERC-2008-AdG
Summary Gamma-Ray Bursts (GRBs), short and intense bursts of gamma-rays originating from random directions in the sky, are the brightest explosions in our Universe. They involve ultra-relativistic motion, huge magnetic fields, the strongest gravitational fields, acceleration of photons, neutrinos and cosmic rays to ultra-high energies, the collapse of massive stars, mergers of neutron star binaries, and the formation of newborn black holes. They are at the focal point of relativistic high-energy astrophysics and serve as the best laboratory for extreme physics. The internal-external shocks model was formulated to explain their inner workings. This model has had impressive successes in interpreting and predicting GRB properties. Still, it has left many fundamental questions unanswered. Furthermore, it has recently been confronted with puzzling Swift observations of the early afterglow, and it is not clear whether it needs minor revisions or a drastic overhaul. I describe here an extensive research program that deals with practically all aspects of GRBs. From a technical point of view, this program involves sophisticated state-of-the-art computations on the one hand, and fundamental theory, phenomenological analysis of observations, and data analysis on the other. My goal is to address both old and new open questions, considering, among other options, the possibility that the current model has to be drastically revised. My long-term goal, beyond understanding the inner workings of GRBs, is to create a unified theory of accretion, acceleration and collimation, and of the emission of high-energy gamma-rays and relativistic particles, that will synergize our understanding of GRBs, AGNs, microquasars, galactic binary black holes, SNRs, and other high-energy astrophysics phenomena. A second hope is to find ways to utilize GRBs to reveal new physics that cannot be explored otherwise.
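For orientation on the internal-external shocks picture (a standard order-of-magnitude relation from the GRB literature, not a result of this proposal; the numerical values are only illustrative): relativistic time compression ties the observed variability timescale to the radius at which internal shocks dissipate the outflow energy.

```latex
% Background illustration: emission radius of internal shocks for bulk Lorentz factor Gamma
% and observed variability timescale delta t (illustrative fiducial values).
R_{\mathrm{int}} \;\approx\; 2\,\Gamma^{2} c\, \delta t
\;\approx\; 6\times 10^{13}\,\mathrm{cm}
\left(\frac{\Gamma}{100}\right)^{2}
\left(\frac{\delta t}{0.1\,\mathrm{s}}\right).
```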
Max ERC Funding
1 933 460 €
Duration
Start date: 2009-01-01, End date: 2014-12-31
Project acronym HIGHZ
Project HIGHZ: Elucidating galaxy formation and evolution from very deep Near-IR imaging
Researcher (PI) Marijn Franx
Host Institution (HI) UNIVERSITEIT LEIDEN
Call Details Advanced Grant (AdG), PE9, ERC-2008-AdG
Summary "Studies of high redshift galaxies require very deep Near-IR imaging. This allows the study of z=2-4 galaxies redward of the Balmer/4000 Angstrom break, and the detection of UV-bright galaxies at z>7. Two new facilities wil revolutionize these studies: the VISTA telescope built for ESO, and the Near-IR channel on WF3 for HST. They will become available at the start of the grant period. We propose to build a group to analyze the imaging data from these facilities. We will make use of the fact that I am Co-PI on the ultra-deep ""ULTRA-VISTA"" survey on the VISTA telescope, and we will analyze public and privately proposed data from WF3. The following science questions will be addressed: (1) what is the origin and evolution of the Hubble sequence out to z=3, (2) what is the evolution of the Luminosity Function of UV bright galaxies between z=6 to z=11, and what galaxies cause re-ionization, (3) how does the mass function of quiescent and star forming galaxies evolve to z=4, and how do the correlation functions of subpopulations evolve as a function of redshift. A crucial component of this proposal is the request for support for a junior faculty position. This person will take on the lead for the highly specialized data processing, and will supervise the analysis of the selection effects, and other crucial components needed for a proper analysis."
Max ERC Funding
1 471 200 €
Duration
Start date: 2009-09-01, End date: 2014-08-31
Project acronym LEAP
Project Large European Array for Pulsars
Researcher (PI) Michael Kramer
Host Institution (HI) THE UNIVERSITY OF MANCHESTER
Call Details Advanced Grant (AdG), PE9, ERC-2008-AdG
Summary In general relativity and other relativistic theories of gravity, space and time are combined to form "space-time", which is curved in the presence of mass. As masses move, for instance the two components of a binary system, ripples in space-time are created that propagate through the Universe, very much like the waves caused by a stone falling into a pond. These "gravitational waves" (GWs) are known to exist from the effect that they have on a system of two orbiting stars. After inferring their existence indirectly, the next great challenge is the direct detection of GWs. While this is the aim of a number of gravitational wave detectors around the world, a detection has not been made. Fortunately, a method exists that allows us today to detect GWs directly, in a frequency range that is much lower than, but complementary to, those covered by ground-based detectors. This method utilises radio astronomical observations of a special type of star known as radio pulsars. We propose an experiment to achieve the ground-breaking goal of GW detection with the help of an innovative approach. At the heart of this approach, named LEAP, lies the goal of combining the collective power of Europe's biggest radio telescopes to form the biggest fully-steerable telescope on Earth, providing a "leap" in our sensitivity to go beyond the threshold that delivers the first direct detection of GWs. While the rewards for a successful detection of GWs are immense, we demonstrate that this is possible by harvesting the experience and resources uniquely available in Europe.
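The "collective power" argument rests on standard radiometer arithmetic (background illustration, not a result of this proposal): when the telescope voltages are added coherently into a tied array, the effective collecting areas add, and the pulsar signal-to-noise ratio grows with the combined area over system temperature and with bandwidth and integration time (n_p is the number of summed polarizations).

```latex
% Background illustration: sensitivity of a coherently phased ("tied") array of dishes.
A_{\mathrm{eff}}^{\mathrm{tied}} \;=\; \sum_i A_{\mathrm{eff},i},
\qquad
\mathrm{S/N} \;\propto\; \frac{A_{\mathrm{eff}}}{T_{\mathrm{sys}}}\,
\sqrt{n_{\mathrm{p}}\,\Delta\nu\, t_{\mathrm{obs}}}.
```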
Max ERC Funding
2 455 285 €
Duration
Start date: 2009-01-01, End date: 2014-09-30
Project acronym LIBPR
Project Liberating Programming
Researcher (PI) David Harel
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Advanced Grant (AdG), PE6, ERC-2008-AdG
Summary We propose to provide the theoretical, algorithmic, and methodological foundations, and to build the supporting tools, to bring about a major, paradigmatic, revolutionary change in the way software and systems are programmed and executed, based on the idea of liberated programming, a sweeping extension of the scenario-based play-in/play-out approach to program design and execution that my group and I have developed around the language of live sequence charts (LSCs). Play-in is a new way of programming software, combining the ideas of showing and teaching, instead of telling, relying on friendly advanced user interfaces, and using intuitive yet formal and expressive visual languages. Play-out is a general name for the technologies of executing played-in programs using powerful tools such as model-checking and synthesis. Our proposed work is divided into four main threads: (1) play-in, the development of new languages and interaction techniques; (2) play-out, the development of new execution technologies; (3) domain-specific adaptations and applications; and (4) integration and tools. The play-in techniques proposed include the translation of system requirements given in natural language into an executable artifact, and the use of novel and dynamic human-machine interaction techniques, relying on visual languages as target languages. The play-out execution methods proposed include the use of model-checking and synthesis algorithms, compilation, and execution environments that learn. Domain-specific applications proposed include web services, tactical simulators, embedded systems, and biological modeling. Finally, we propose to build prototype tools that will allow the evaluation of the new technologies and their dissemination into the academic community and industry.
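As a deliberately tiny, hypothetical sketch of the scenario-based idea behind play-out (loosely inspired by LSC semantics; the class and event names are invented here and this is not the project's tooling): each scenario has a monitored prechart, and once the prechart has been observed the scenario's main chart must be driven by the execution engine, with emitted events able to trigger further scenarios.

```python
# Hypothetical toy "play-out" engine, for illustration only.

class Scenario:
    def __init__(self, name, prechart, main_chart):
        self.name = name
        self.prechart = list(prechart)      # events we only watch for
        self.main_chart = list(main_chart)  # events we must drive once triggered
        self.progress = 0

    def step(self, event, emit):
        """Advance on an observed event; drive the main chart when triggered."""
        if self.progress < len(self.prechart):
            if event == self.prechart[self.progress]:
                self.progress += 1
            if self.progress == len(self.prechart):
                for mandated in self.main_chart:   # naive play-out: fire immediately
                    emit(mandated)
                self.progress = 0

def play_out(scenarios, external_events):
    trace = []
    def emit(e):
        trace.append(e)
        for s in scenarios:                 # emitted events may trigger other scenarios
            s.step(e, emit)
    for e in external_events:
        emit(e)
    return trace

# Hypothetical example: pressing a button must switch the light on.
scenarios = [Scenario("light", prechart=["button_pressed"], main_chart=["light_on"])]
print(play_out(scenarios, ["button_pressed"]))   # ['button_pressed', 'light_on']
```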
Max ERC Funding
2 102 958 €
Duration
Start date: 2009-01-01, End date: 2013-12-31
Project acronym LOFAR-AUGER
Project From Black Holes to Ultra-High Energy Cosmic Rays: Exploring the Extremes of the Universe with Low-Frequency Radio Interferometry
Researcher (PI) Heino Falcke
Host Institution (HI) STICHTING KATHOLIEKE UNIVERSITEIT
Call Details Advanced Grant (AdG), PE9, ERC-2008-AdG
Summary Black holes (BHs) and ultra-high energy cosmic rays (UHECRs) are two extremes of the universe that link particle physics and astrophysics. BHs are the most efficient power generators in the universe, while UHECRs are the most energetic particles ever detected. As we showed previously, a major fraction of the power of BHs is channeled into radio-emitting plasma jets, which are also efficient particle accelerators. Are BHs also responsible for UHECRs? This long-standing question could be answered soon, through the dawn of cosmic ray astronomy. The giant Auger observatory has now shown for the first time that the arrival directions of UHECRs are non-isotropic, potentially pointing back to their sources of origin. BHs turned out to be major suspects, but other sources could also be responsible. To address this conclusively and to establish cosmic ray astronomy as a productive new field in the coming years, we need to increase statistics, expand current observatories, and have complementary all-sky radio surveys available to identify sources, since radio emission traces particle acceleration sites. Here, techniques pioneered by the Low-Frequency Array (LOFAR) promise major advances. First, working on LOFAR, we uncovered a new technique to detect UHECRs with radio antennas and verified it experimentally. The technique promises to increase the number of high-quality events by almost an order of magnitude and provides much improved energy and direction resolution. We now want to implement this technique in Auger, combining LOFAR and Auger know-how. Secondly, LOFAR and soon other SKA pathfinders will significantly improve all-sky radio surveys with high sensitivity, resolution, and image quality. Hence, we will use LOFAR to understand the astrophysics of UHECR source candidates and compile a radio-based catalog thereof. We start with jets from BHs and move later to other sources. Together this will allow us to identify UHECR sources and study them in detail.
Max ERC Funding
3 460 000 €
Duration
Start date: 2009-01-01, End date: 2013-12-31
Project acronym OFAV
Project Open intelligent systems for Future Autonomous Vehicles
Researcher (PI) Alberto Broggi
Host Institution (HI) UNIVERSITA DEGLI STUDI DI PARMA
Call Details Advanced Grant (AdG), PE6, ERC-2008-AdG
Summary The objective of this proposal is the development of an open architecture for future autonomous vehicles, to become a standard platform shared by car makers in the design of next-generation intelligent vehicles. It is based on a 360-degree sensor suite which includes perceptual and decision-making modules, with the ultimate goal of providing the vehicle with autonomous driving capabilities and/or supervising the driver's behavior. The perception module also includes vehicle-to-vehicle and vehicle-to-infrastructure subsystems, to augment the vehicle's sensing capabilities. The research is based on the extensive know-how and experience of the Principal Investigator's group at the University of Parma, which has already marked fundamental milestones worldwide in the field of vehicular robotics. Car manufacturers and automotive suppliers are extremely interested in this research stream, but at the same time are very cautious about investing in long-term and risky research like this. Besides providing clear advantages for the safety of road users, the availability of an open architecture will encourage and make possible the sharing of knowledge between public and private research communities (academia and the automotive industry) and thus speed up the design of a standard platform for future vehicles. Further research steps will be eased, and therefore made more effective, thanks to the common and open architectural layer proposed by this project.
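As a hypothetical sketch of the kind of open, modular layering the proposal argues for (all class and interface names are invented here for illustration; this is not the project's architecture): perception, which fuses sensor frames and V2V/V2I messages, and decision making are separate, replaceable components behind small shared interfaces.

```python
# Hypothetical illustration of an open perception/decision architecture.
from dataclasses import dataclass, field
from typing import List, Protocol

@dataclass
class WorldModel:
    obstacles: List[str] = field(default_factory=list)     # fused 360-degree view
    v2x_messages: List[str] = field(default_factory=list)  # V2V / V2I inputs

class PerceptionModule(Protocol):
    def update(self, raw_sensor_frames: List[str], v2x: List[str]) -> WorldModel: ...

class DecisionModule(Protocol):
    def plan(self, world: WorldModel) -> str: ...

class SimplePerception:
    def update(self, raw_sensor_frames, v2x):
        return WorldModel(obstacles=[f for f in raw_sensor_frames if "obstacle" in f],
                          v2x_messages=list(v2x))

class SimplePlanner:
    def plan(self, world):
        return "brake" if world.obstacles else "keep_lane"

class Vehicle:
    """Composes interchangeable modules behind the shared interfaces above."""
    def __init__(self, perception: PerceptionModule, decision: DecisionModule):
        self.perception, self.decision = perception, decision

    def tick(self, frames, v2x):
        return self.decision.plan(self.perception.update(frames, v2x))

car = Vehicle(SimplePerception(), SimplePlanner())
print(car.tick(["camera: obstacle ahead", "lidar: clear"], ["v2i: green light"]))  # brake
```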
Max ERC Funding
1 751 067 €
Duration
Start date: 2008-12-01, End date: 2013-10-31