Project acronym EARLY EARTH
Project Early Earth evolution: chemical differentiation vs. mantle mixing
Researcher (PI) Maud Boyet
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE8, ERC-2007-StG
Summary Although short-lived chronometers have yielded a precise chronology of early Earth differentiation, the data available on the chemical fractionation associated with these processes are insufficient to model the early Earth's differentiation. 142Nd isotope data suggest that a reservoir enriched in rare earth elements (REE) has existed since 4.53 Ga but has not been sampled since its formation. A key question is whether such a reservoir could remain hidden in the convective mantle for more than 4.5 Gyr. The first goal of this project is to test whether the REE could instead be stored in the core. Information on the mantle composition and the extent of chemical differentiation in the early Earth will also be obtained by measuring the Sm-Nd, Pt-Re-Os and Lu-Hf radiogenic systems in Archean samples. This work will provide valuable information on (1) the redox state of the early Earth, (2) the nature of the precursor material that formed the Earth, (3) the chronology of Earth's differentiation relative to the formation of the Moon, and (4) the reconstruction of a model for terrestrial magma ocean crystallization. This project makes it possible to tackle the topic from a number of angles, using new instrumentation. New approaches and collaborations will be combined to constrain the most realistic model of early Earth evolution.
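Background on the 146Sm-142Nd chronometer underpinning this summary (an illustrative aside, not part of the abstract): 146Sm is an extinct nuclide that alpha-decays to 142Nd with a half-life of roughly 100 Myr, so ingrowth is complete within the first few hundred Myr of Earth history and the present-day ratio records only the early Sm/Nd fractionation:

\[
\left(\frac{^{142}\mathrm{Nd}}{^{144}\mathrm{Nd}}\right)_{\mathrm{today}}
= \left(\frac{^{142}\mathrm{Nd}}{^{144}\mathrm{Nd}}\right)_{0}
+ \left(\frac{^{146}\mathrm{Sm}}{^{144}\mathrm{Sm}}\right)_{0}
\left(\frac{^{144}\mathrm{Sm}}{^{144}\mathrm{Nd}}\right)
\]

A reservoir that fractionated Sm from Nd while 146Sm was still alive therefore carries a permanent 142Nd/144Nd offset, which is what makes the hidden-reservoir hypothesis testable.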
Max ERC Funding
453 286 €
Duration
Start date: 2008-08-01, End date: 2012-11-30
Project acronym ERIKLINDAHLERC2007
Project Multiscale and Distributed Computing Algorithms for Biomolecular Simulation and Efficient Free Energy Calculations
Researcher (PI) Erik Lindahl
Host Institution (HI) KUNGLIGA TEKNISKA HOEGSKOLAN
Call Details Starting Grant (StG), PE4, ERC-2007-StG
Summary The long-term goal of our research is to advance the state of the art in molecular simulation algorithms by 4-5 orders of magnitude, particularly in the context of the GROMACS software we are developing. This is an immense challenge, but with huge potential rewards: it will be an amazing virtual microscope for basic chemistry, polymer and material science research; it could help us understand the molecular basis of diseases such as Creutzfeldt-Jakob; and it would enable rational design rather than random screening for future drugs. To realize it, we will focus on four critical topics: • ALGORITHMS FOR SIMULATION ON GRAPHICS AND OTHER STREAMING PROCESSORS: Graphics cards and the experimental Intel 80-core chip are not only the most powerful processors available, but this type of streaming architecture will power many supercomputers in 3-5 years, and it is thus critical that we design new "streamable" MD algorithms. • MULTISCALE MODELING: We will develop virtual-site-based methods to bridge atomic and mesoscopic dynamics, QM/MM, and mixed explicit/implicit solvent models with water layers around macromolecules. • MULTI-LEVEL PARALLEL & DISTRIBUTED SIMULATION: Distributed computing provides virtually infinite computer power, but has been limited to small systems. We will address this by combining SMP parallelization with Markov state models that partition phase space into transition/local dynamics, enabling distributed simulation of arbitrary systems. • EFFICIENT FREE ENERGY CALCULATIONS: We will design algorithms for multi-conformational parallel sampling, implement Bennett acceptance ratios in GROMACS, develop correction terms for PME lattice sums, and combine standard force fields with polarization/multipoles (e.g. Amoeba). We have a very strong track record of converting methodological advances into applications, and the results will have impact on a wide range of fields, from biomolecules and polymer science to material simulations and nanotechnology.
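The Bennett acceptance ratio named in the free-energy work package is a standard self-consistent estimator; since the abstract only names it, a minimal sketch may help fix ideas (plain Python with NumPy/SciPy rather than GROMACS code; reduced kT units and function names are ours):

import numpy as np
from scipy.special import expit  # expit(x) = 1/(1 + exp(-x))

def bar(du_f, du_r, tol=1e-8, max_iter=1000):
    """Self-consistent Bennett acceptance ratio estimate of the reduced
    free-energy difference df = f1 - f0 (all energies in units of kT).

    du_f: u1(x) - u0(x) evaluated on samples x drawn from state 0.
    du_r: u0(x) - u1(x) evaluated on samples x drawn from state 1.
    """
    n0, n1 = len(du_f), len(du_r)
    m = np.log(n1 / n0)
    df = 0.0
    for _ in range(max_iter):
        c = df + m  # Bennett's optimal shift at the current estimate
        # Fermi-function sums over forward and reverse samples
        s0 = expit(-(du_f - c)).sum()
        s1 = expit(-(du_r + c)).sum()
        df_new = c - np.log(s0 / s1) - m
        if abs(df_new - df) < tol:
            return df_new
        df = df_new
    return df

At the fixed point the two Fermi-weighted sums balance, which is Bennett's minimum-variance condition; for two states differing by a constant offset the iteration recovers that offset exactly in one step.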
Max ERC Funding
992 413 €
Duration
Start date: 2008-09-01, End date: 2013-08-31
Project acronym EXPLOREMAPS
Project Combinatorial methods, from enumerative topology to random discrete structures and compact data representations
Researcher (PI) Gilles Schaeffer
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE1, ERC-2007-StG
Summary "Our aim is to built on recent combinatorial and algorithmic progress to attack a series of deeply connected problems that have independantly surfaced in enumerative topology, statistical physics, and data compression. The relation between these problems lies in the notion of ""combinatorial map"", the natural discrete mathematical abstraction of objects with a 2-dimensional structures (like geographical maps, computer graphics' meshes, or 2d manifolds). A whole new set of properties of these maps has been uncovered in the last few years under the impulsion of the principal investigator. Rougly speaking, we have shown that classical graph exploration algorithms, when correctly applied to maps, lead to remarkable decompositions of the underlying surfaces. Our methods resort to algorithmic and enumerative combinatorics. In statistical physics, these decompositions offer an approach to the intrinsec geometry of discrete 2d quantum gravity: our method is here the first to outperform the celebrated ""topological expansion of matrix integrals"" of Brezin-Itzykson-Parisi-Zuber. Exploring its implications for the continuum limit of these random geometries is our great challenge now. From a computational geometry perspective, our approach yields the first encoding schemes with asymptotically optimal garanteed compression rates for the connectivity of triangular or polygonal meshes. These schemes improve on a long series of heuristically efficient but non optimal algorithms, and open the way to optimally compact data structures. Finally we have deep indications that the properties we have uncovered extend to the realm of ramified coverings of the sphere. Intriguing computations on the fundamental Hurwitz's numbers have been obtained using the ELSV formula, famous for its use by Okounkov et al. to rederive Kontsevich's model. We believe that further combinatorial progress here could allow to bypass the formula and obtaine an elementary explanation of these results."
Max ERC Funding
750 000 €
Duration
Start date: 2008-07-01, End date: 2013-06-30
Project acronym FBRAIN
Project Computational Anatomy of Fetal Brain
Researcher (PI) François Rousseau
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE5, ERC-2007-StG
Summary Studies of brain maturation aim to provide a better understanding of brain development and of the links between brain changes and cognitive development. Such studies are of great interest for aiding diagnosis and for following the clinical course and treatment of illnesses. Several teams have begun to make 3D maps of developing brain structures from children to young adults. However, mapping the development of the fetal and neonatal brain remains an open issue. This project aims at overcoming several theoretical and practical barriers and at going beyond a formal description of brain maturation through the development of a realistic numerical model of brain aging. In this context, magnetic resonance (MR) imaging is a fundamental tool for studying structural brain development across age groups. We will rely on new image processing tools combining morphological information provided by T2-weighted MR images with diffusion information (degree of myelination and fiber orientation) given by diffusion tensor imaging (DTI). The joint analysis of these anatomical features will highlight the generic maturation of the normal fetal brain. We will first rely on mathematical models for the reconstruction of high-resolution 3D MR images in order to extract relevant features of brain maturation. The results of this first step will be used to build statistical atlases and to characterize the neuroanatomical differences between a reference group and the population under investigation. From a methodological point of view, our approach relies on an interdisciplinary research framework combining medical research with neuroimaging, image processing, statistical modelling and computer science. The robust characterization of the anatomical features of the fetal brain and the development of a realistic model of brain maturation from biological concepts will emerge from the strong interactions between these different research fields.
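An illustrative aside on the DTI features the summary relies on (a sketch of the standard formulas, not project code; the example tensor values are hypothetical): the degree of anisotropy and the fiber orientation of a voxel both follow from the eigendecomposition of its diffusion tensor.

import numpy as np

def fa_and_orientation(D):
    """Fractional anisotropy and principal fiber direction of a 3x3
    diffusion tensor D (symmetric, positive definite).

    FA = sqrt(3/2) * |lambda - mean(lambda)| / |lambda|, in [0, 1]:
    0 for isotropic diffusion, approaching 1 for strongly oriented fibers.
    """
    evals, evecs = np.linalg.eigh(D)
    md = evals.mean()  # mean diffusivity
    fa = np.sqrt(1.5) * np.linalg.norm(evals - md) / np.linalg.norm(evals)
    principal = evecs[:, np.argmax(evals)]  # direction of fastest diffusion
    return fa, principal

# A tensor with diffusion mostly along z, as in a coherent fiber bundle
D = np.diag([0.3e-3, 0.3e-3, 1.7e-3])  # mm^2/s, illustrative values
fa, v = fa_and_orientation(D)
print(f"FA = {fa:.2f}, principal direction = {v}")  # FA ~ 0.8 here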
Max ERC Funding
753 393 €
Duration
Start date: 2008-09-01, End date: 2013-08-31
Project acronym GADA
Project Group Actions: Interactions between Dynamical Systems and Arithmetic
Researcher (PI) Emmanuel Breuillard
Host Institution (HI) UNIVERSITE PARIS-SUD
Call Details Starting Grant (StG), PE1, ERC-2007-StG
Summary "Our main goal is to apply the powerful analytical tools that are now emerging from areas of more ""applicable"" parts of mathematics such as ergodic theory, random walks, harmonic analysis and additive combinatorics to some longstanding open problems in more theoretical parts of mathematics such as group theory and number theory. The recent work of Green and Tao about arithmetic progressions of prime numbers, or Margulis' celebrated solution of the Oppenheim Conjecture about integer values of quadratic forms are examples of the growing interpenetration of such seemingly unrelated fields. We have in mind an explicit set of problems: a uniform Tits alternative, the equidistribution of dense subgroups, the Andre-Oort conjecture, the spectral gap conjecture, the Lehmer problem. All these questions involve group theory in various forms (discrete subgroups of Lie groups, representation theory and spectral theory, locally symmetric spaces and Shimura varieties, dynamics on homogeneous spaces of arithmetic origin, Cayley graphs of large finite groups, etc) and have also a number theoretic flavor. Their striking common feature is that each of them enjoys some intimate relationship, whether by the foreseen methods to tackle it or by its consequences, with ergodic theory on the one hand and harmonic analysis and combinatorics on the other. We believe that the new methods being currently developed in those fields will bring crucial insights to the problems at hand. This proposed research builds on previous results obtained by the author and addresses some of the most challenging open problems in the field."
Max ERC Funding
750 000 €
Duration
Start date: 2008-12-01, End date: 2013-11-30
Project acronym GAMMARAYBINARIES
Project Exploring the gamma-ray sky: binaries, microquasars and their impact on understanding particle acceleration, relativistic winds and accretion/ejection phenomena in cosmic sources
Researcher (PI) Guillaume Dubus
Host Institution (HI) UNIVERSITE JOSEPH FOURIER GRENOBLE 1
Call Details Starting Grant (StG), PE7, ERC-2007-StG
Summary The most energetic photons in the universe are produced by poorly known processes, typically in the vicinity of neutron stars or black holes. The past couple of years have seen the number of known sources of very high energy gamma-ray radiation increase from a handful to almost 50, thanks to the European collaborations HESS and MAGIC. Many of those sources are pulsar wind nebulae, supernova remnants or active galactic nuclei. HESS and MAGIC have also discovered gamma-ray emission from binary systems, finding that some emit most of their radiation at the highest energies. Expectations are running high for the December launch of the GLAST space telescope, which will provide daily all-sky coverage in high energy gamma-rays at a sensitivity its predecessor needed years to reach. I propose to explore the exciting observational opportunities in high energy gamma-ray astronomy with an emphasis on non-thermal emission from compact binary sources. Binary systems are intriguing new laboratories for understanding how particle acceleration works in cosmic sources. The physics of gamma-ray emitting binary systems is related to that of pulsar wind nebulae and active galactic nuclei. High energy gamma-ray emission is the result of non-thermal, out-of-equilibrium processes that challenge our intuitions built upon everyday phenomena. The particles are billions of times more energetic than X-rays and can reach energies greater than those in particle accelerators. Binary systems offer a novel, constrained environment in which to study how the cosmic rays that pervade our Galaxy are accelerated and how non-thermal emission is related to the formation of relativistic jets from black holes (accretion/ejection). The study requires a combination of skills: multiwavelength observations, interdisciplinary experience with gamma-ray observational techniques originating from particle physics, and theoretical know-how in accretion and high energy phenomena.
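A back-of-the-envelope aside on the energy claim (our illustration, with round numbers): if the gamma rays arise from inverse Compton scattering in the Thomson regime, an electron with Lorentz factor gamma boosts a stellar photon of energy epsilon to roughly

\[
E_\gamma \simeq \tfrac{4}{3}\,\gamma^{2}\,\varepsilon
\qquad\Rightarrow\qquad
\gamma \sim 10^{6} \ \text{for } \varepsilon \sim 1\ \mathrm{eV},\ E_\gamma \sim 1\ \mathrm{TeV},
\]

i.e. electrons of roughly TeV energies are needed to turn eV starlight into TeV photons (Klein-Nishina corrections moderate this at the highest energies).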
Max ERC Funding
794 752 €
Duration
Start date: 2008-07-01, End date: 2013-06-30
Project acronym GLOBALVISION
Project Global Optimization Methods in Computer Vision, Pattern Recognition and Medical Imaging
Researcher (PI) Fredrik Kahl
Host Institution (HI) LUNDS UNIVERSITET
Call Details Starting Grant (StG), PE5, ERC-2007-StG
Summary Computer vision concerns itself with understanding the real world through the analysis of images. Typical problems are object recognition, medical image segmentation, geometric reconstruction problems and the navigation of autonomous vehicles. Such problems often lead to complicated optimization problems with a mixture of discrete and continuous variables, or even infinite-dimensional variables in terms of curves and surfaces. Today, the state of the art in solving these problems generally relies on heuristic methods that generate only local optima of varying quality. During the last few years, work by the applicant, co-workers, and others has opened new possibilities, and this research project builds on that work. In this project we will focus on developing new global optimization methods for computing high-quality solutions for a broad class of problems. A guiding principle will be to relax the original, complicated problem to an approximate, simpler one to which globally optimal solutions can more easily be computed. Technically, this relaxed problem is often convex. A crucial point in this approach is to estimate the quality of the exact solution of the approximate problem compared to the (unknown) global optimum of the original problem. Preliminary results have been well received by the research community and we now wish to extend this work to more difficult and more general problem settings, resulting in a thorough re-examination of algorithms used widely in different and trans-disciplinary fields. This project is to be considered a basic research project with relevance to industry. The expected outcome is new knowledge spread to a wide community through scientific papers published in international journals and conferences as well as publicly available software.
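A toy instance of the strategy described above (our illustration; weighted vertex cover with made-up weights): relax the binary problem to a convex, here linear, program, solve it globally, and use the relaxed optimum to certify the quality of a rounded feasible solution.

import numpy as np
from scipy.optimize import linprog

# Weighted vertex cover: pick vertices minimizing total weight so that
# every edge has at least one chosen endpoint. The exact {0,1} problem is
# NP-hard; relaxing x in {0,1} to x in [0,1] gives a linear program whose
# optimum lower-bounds the true one.
weights = np.array([3.0, 2.0, 4.0, 1.0])
edges = [(0, 1), (1, 2), (2, 3), (0, 3)]

# Encode x_u + x_v >= 1 as -x_u - x_v <= -1 for linprog.
A_ub = np.zeros((len(edges), len(weights)))
for i, (u, v) in enumerate(edges):
    A_ub[i, u] = A_ub[i, v] = -1.0
b_ub = -np.ones(len(edges))

res = linprog(weights, A_ub=A_ub, b_ub=b_ub, bounds=[(0, 1)] * len(weights))
x_frac = res.x                      # globally optimal *relaxed* solution
cover = x_frac >= 0.5               # rounding yields a feasible binary cover
bound = res.fun                     # certified lower bound on the optimum
cost = weights[cover].sum()
print(f"LP bound {bound:.2f} <= optimum <= rounded cost {cost:.2f}")
# For vertex cover, the rounded cost is provably at most twice the bound,
# mirroring the "estimate the quality of the relaxation" step above.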
Max ERC Funding
1 440 000 €
Duration
Start date: 2008-07-01, End date: 2013-06-30
Project acronym GOODSHAPE
Project Numerical geometric abstraction: from bits to equations
Researcher (PI) Bruno Eric Emmanuel Levy
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE ENINFORMATIQUE ET AUTOMATIQUE
Call Details Starting Grant (StG), PE5, ERC-2007-StG
Summary "3D geometric objects play a central role in many industrial processes (modeling, scientific visualisation, numerical simulation). However, since the raw output of acquisition mechanisms cannot be used directly in these processes, converting a real object into its numerical counterpart still involves a great deal of user intervention. Geometry Processing is a recently emerged, highly competitive scientific domain that studies this type of problems. The author of this proposal contributed to this domain at its origin, and developped several parameterization algorithms, that construct a ""geometric coordinate system"" attached to the object. This facilitates converting from one representation to another. For instance, it is possible to convert a mesh model into a piecewise bi-cubic surface (much easier to manipulate in Computer Aided Design packages). In a certain sense, this retreives an ""equation"" of the geometry. One can also say that this constructs an *abstraction* of the geometry. Once the geometry is abstracted, re-instancing it into alternative representations is made easier. In this project, we propose to attack the problem from a new angle, and climb one more level of abstraction. In more general terms, a geometric coordinates system corresponds to a *function basis*. Thus, we consider the more general problem of constructing a *dynamic function basis* attached to the object. This abstract forms makes the meaningful parameters appear, and provides the user with new ""knobs"" to interact with the geometry. The formalism that we use combines aspects from finite element modeling, differential geometry, spectral geometry, topology and numerical optimization. We plan to develop applications for processing and optimimizing the representation of both static 3D objets, animated 3D objets, images and videos."
Max ERC Funding
1 100 000 €
Duration
Start date: 2008-08-01, End date: 2013-07-31
Project acronym GOSSPLE
Project GOSSPLE: A Radically New Approach to Navigating the Digital Information Universe
Researcher (PI) Anne-Marie Kermarrec
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE ENINFORMATIQUE ET AUTOMATIQUE
Call Details Starting Grant (StG), PE5, ERC-2007-StG
Summary Over the past decade, distributed computing has experienced a dramatic scale shift with respect to size, geographical spread and volume of data. Meanwhile, the Internet has moved into homes, creating tremendous opportunities to exploit the huge amount of resources at the edge of the network. Search engines that navigate this universe are astonishingly powerful and rely on sophisticated tools to scan and index the network. However, the network contains far more than just the pages such systems can index. There is tremendous potential in leveraging these new kinds of information to empower individuals in ways that Internet search will never be able to offer. This is striking evidence that navigating the Internet goes beyond traditional search engines, and that complementary and different means to navigate the digital world are now required. The objective of GOSSPLE is to provide an innovative and fully decentralized approach to navigating the digital information universe by placing users' affinities and preferences at the heart of the search process. GOSSPLE will turn the network into a self-organizing federation of overlapping sub-networks, capturing on the fly the interactions and affinities observed in real life and fully leveraging the huge resource potential available on edge nodes. GOSSPLE will provide a set of fully decentralized algorithms to efficiently search, dynamically index and asynchronously disseminate information to interested users based on their preferences and (implicit) recommendations. Building upon the peer-to-peer communication paradigm and harnessing the power of gossip-based algorithms, GOSSPLE will yield a disruptive way of programming distributed collaborative applications. Our goal is ambitious: to establish the GOSSPLE approach as a fully decentralized, collaborative and scalable, yet complementary, alternative to traditional search engines, fully exploiting the capabilities of the digital universe.
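A toy, single-process simulation of the kind of gossip-based, similarity-driven overlay construction the summary describes (all parameters and the cosine-similarity profile model are our illustrative choices, not GOSSPLE's actual protocol):

import random
import numpy as np

K = 5        # view size: each node tracks its K most similar peers
N = 100      # number of simulated nodes
DIMS = 8     # size of the interest profile vector

rng = np.random.default_rng(0)
profiles = rng.random((N, DIMS))

def similarity(a, b):
    # Cosine similarity between two interest profiles
    return profiles[a] @ profiles[b] / (
        np.linalg.norm(profiles[a]) * np.linalg.norm(profiles[b]))

# Start from random views, as a real deployment would after bootstrap.
views = {n: random.sample([m for m in range(N) if m != n], K) for n in range(N)}

def gossip_round():
    for n in range(N):
        peer = random.choice(views[n])
        # Exchange views, then keep only the K most similar candidates:
        # similar nodes cluster together without any central index.
        for node, other in ((n, peer), (peer, n)):
            candidates = set(views[node]) | set(views[other]) | {other}
            candidates.discard(node)
            views[node] = sorted(candidates,
                                 key=lambda c: similarity(node, c),
                                 reverse=True)[:K]

for _ in range(30):
    gossip_round()
# views[n] now approximates node n's K nearest neighbours by interest.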
Max ERC Funding
1 250 000 €
Duration
Start date: 2008-09-01, End date: 2013-08-31
Project acronym ICEPROXY
Project Novel Lipid Biomarkers from Polar Ice: Climatic and Ecological Applications
Researcher (PI) Guillaume Masse
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE8, ERC-2007-StG
Summary It is widely acknowledged that polar sea ice plays a critical role in global climate change. As such, sea ice reconstructions are of paramount importance in establishing the climatic evolution of the geological past. In the current project, well characterised organic chemicals (biomarkers) from microalgae will be used as proxy indicators of current and past sea ice in the Arctic and Antarctic regions. These biomarkers, so-called highly branched isoprenoids (HBIs), possess a number of characteristics that make them attractive as sea ice proxies. Firstly, some HBIs are unique to sea ice diatoms, so their presence in polar sediments can be directly correlated with the previous occurrence of sea ice. Secondly, they are relatively resistant to degradation, which extends their usefulness in the geological record. Thirdly, their relative abundance makes them straightforward to measure with a high degree of geological resolution. One component of this project will consist of performing regional calibrations of the proxies: concentrations of selected biomarkers in recent Arctic and Antarctic sediments will be correlated with the sea ice abundances determined using satellite technology over the last 30 years. Successful calibration of the proxies will then enable reconstructions of past sea ice extent at unprecedentedly high resolution. Sediment cores will be obtained from key locations across both the Arctic and Antarctic regions, and the resulting data will be used for climate modelling studies. As a complement to these physico-chemical studies of sea ice, a second component of the project will investigate the use of these biomarkers for studying sea ice-biota interactions; by examining the transfer of these chemicals through food chains, new tools for determining the consequences of future climate change on polar ecosystems will be established.
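A schematic of the calibration step (our sketch with hypothetical numbers throughout; a real transfer function would be built from the project's sediment and satellite data and need not be linear):

import numpy as np

# Hypothetical calibration data (illustrative values, not project data):
# HBI biomarker concentration in surface sediments (ng per g of sediment)
# paired with mean satellite-derived sea-ice concentration (%) at each site.
hbi = np.array([0.5, 1.2, 2.1, 3.3, 4.0, 5.2, 6.1])
sea_ice = np.array([5.0, 14.0, 26.0, 41.0, 48.0, 63.0, 75.0])

# Least-squares calibration: sea_ice ~ slope * hbi + intercept
slope, intercept = np.polyfit(hbi, sea_ice, deg=1)

def reconstruct(hbi_downcore):
    """Apply the modern calibration to down-core biomarker measurements
    to estimate past sea-ice concentration."""
    return slope * np.asarray(hbi_downcore) + intercept

print(reconstruct([1.0, 3.0, 5.5]))  # estimated past sea-ice cover, %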
Max ERC Funding
1 888 594 €
Duration
Start date: 2008-10-01, End date: 2013-09-30