Project acronym ADDECCO
Project Adaptive Schemes for Deterministic and Stochastic Flow Problems
Researcher (PI) Remi Abgrall
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE EN INFORMATIQUE ET AUTOMATIQUE
Call Details Advanced Grant (AdG), PE1, ERC-2008-AdG
Summary The numerical simulation of complex compressible flow problems is still a challenge nowadays, even for simple models. In our opinion, the most important hard points that currently need to be tackled and solved are how to obtain stable, scalable, very accurate schemes on complex geometries that are easy to code and maintain. The method should easily handle mesh refinement, even near the boundary, where the most interesting engineering quantities have to be evaluated. Unsteady uncertainties in the model, for example in the geometry or the boundary conditions, should be represented efficiently. This proposal's goal is to design, develop and evaluate solutions to each of the above problems. Our work program will lead to significant breakthroughs for flow simulations. More specifically, we propose to work on three connected problems: 1) A class of very high order numerical schemes able to deal easily with the geometry of boundaries while still solving steep problems. The geometry is generally defined by CAD tools. The output is used to generate a mesh, which is then used by the scheme. Hence, any mesh refinement process is disconnected from the CAD, a situation that prevents the spread of mesh adaptation techniques in industry! 2) A class of very high order numerical schemes which can use possibly solution-dependent basis functions in order to lower the number of degrees of freedom, for example to compute boundary layers accurately at low resolution. 3) A general non-intrusive technique for handling uncertainties, able to deal with irregular probability density functions (pdfs) and with pdfs that may evolve in time, for example through an optimisation loop. The curse of dimensionality will be dealt with thanks to Harten's multiresolution method combined with sparse grid methods. Currently, and to our knowledge, no scheme has all of these properties. This research program will have an impact on numerical schemes and industrial applications.
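The claim that sparse grids tame the curse of dimensionality can be made concrete by counting nodes. A minimal sketch (function names are ours, not the project's): a Smolyak-type sparse grid built on nested 1-D levels keeps only the level combinations whose sum is at most n, so its node count grows far more slowly with the dimension d than the (2^n + 1)^d nodes of the full tensor grid.

```python
from itertools import product

def new_points_1d(level):
    """Points newly added by a nested 1-D grid at this level (1, 2, 2, 4, 8, ...)."""
    if level == 0:
        return 1
    if level == 1:
        return 2
    return 2 ** (level - 1)

def sparse_grid_size(d, n):
    """Node count of a level-n Smolyak sparse grid in d dimensions."""
    total = 0
    for levels in product(range(n + 1), repeat=d):
        if sum(levels) <= n:           # keep only |levels|_1 <= n
            count = 1
            for l in levels:
                count *= new_points_1d(l)
            total += count
    return total

def full_grid_size(d, n):
    """Node count of the full tensor grid at the same 1-D resolution."""
    return (2 ** n + 1) ** d
```

For d = 1 both grids coincide, while already at d = 5, n = 5 the full tensor grid needs roughly 39 million nodes and the sparse grid orders of magnitude fewer.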
Max ERC Funding
1 432 769 €
Duration
Start date: 2008-12-01, End date: 2013-11-30
Project acronym BIOMOL. SIMULATION
Project Development of multi-scale molecular models, force fields and computer software for biomolecular simulation
Researcher (PI) Willem Frederik Van Gunsteren
Host Institution (HI) EIDGENOESSISCHE TECHNISCHE HOCHSCHULE ZUERICH
Call Details Advanced Grant (AdG), PE4, ERC-2008-AdG
Summary During the past decades the PI has helped shape the research field of computer simulation of biomolecular systems at the atomic level. He carried out one of the first molecular dynamics (MD) simulations of proteins, and has since contributed many methodological improvements and developed one of the major atomic-level force fields for simulations of proteins, carbohydrates, nucleotides and lipids. Methodology and force field have been implemented in a set of programs called GROMOS (GROningen MOlecular Simulation package), which is currently used in hundreds of academic and industrial research groups from over 50 countries on all continents. It is proposed to develop a next generation of molecular models, force fields, multi-scaling simulation methodology and software for biomolecular simulation that is at least an order of magnitude more accurate in terms of energetics and, through the use of coarse-grained molecular models, 1000 times more efficient than the currently available software and models.
Max ERC Funding
1 320 000 €
Duration
Start date: 2008-11-01, End date: 2014-09-30
Project acronym BIOMOLECULAR_COMP
Project Biomolecular computers
Researcher (PI) Ehud Shapiro
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Advanced Grant (AdG), LS9, ERC-2008-AdG
Summary Autonomous programmable computing devices made of biological molecules hold the promise of interacting with the biological environment in future biological and medical applications. Our laboratory's long-term objective is to develop a 'Doctor in a cell': a molecular-sized device that can roam the body, equipped with medical knowledge. It would diagnose a disease by analyzing the data available in its biochemical environment based on the encoded medical knowledge, and treat it by releasing the appropriate drug molecule in situ. This kind of device might, in the future, be delivered to all cells in a specific tissue, organ or the whole organism, and cure or kill only those cells diagnosed with a disease. Our laboratory has embarked on the attempt to design and build these molecular computing devices and lay the foundation for their future biomedical applications. Several important milestones have already been accomplished towards the realization of the Doctor in a cell vision. The subject of this proposal is the construction of autonomous biomolecular computers that could be delivered into a living cell, interact with endogenous biomolecules known to indicate diseases, logically analyze them, make a diagnostic decision and couple it to the production of an active biomolecule capable of influencing cell fate.
Max ERC Funding
2 125 980 €
Duration
Start date: 2009-01-01, End date: 2013-10-31
Project acronym BSMOXFORD
Project Physics Beyond the Standard Model at the LHC and with Atom Interferometers
Researcher (PI) Savas Dimopoulos
Host Institution (HI) EUROPEAN ORGANIZATION FOR NUCLEAR RESEARCH
Call Details Advanced Grant (AdG), PE2, ERC-2008-AdG
Summary Elementary particle physics is entering a spectacular new era in which experiments at the Large Hadron Collider (LHC) at CERN will soon start probing some of the deepest questions in physics, such as: Why is gravity so weak? Do elementary particles have substructure? What is the origin of mass? Are there new dimensions? Can we produce black holes in the lab? Could there be other universes with different physical laws? While the LHC pushes the energy frontier, the unprecedented precision of atom interferometry has pointed me to a new tool for fundamental physics. These experiments, based on the quantum interference of atoms, can test General Relativity on the surface of the Earth, detect gravity waves, and test short-distance gravity, charge quantization, and quantum mechanics with unprecedented precision in the next decade. This ERC Advanced Grant proposal is aimed at setting up a world-leading European center for the development of a deeper theory of fundamental physics. The next 10 years are the optimal time for such studies to benefit from the wealth of new data that will emerge from the LHC, astrophysical observations and atom interferometry. This is a once-in-a-generation opportunity for making ground-breaking progress, and will open up many new research horizons.
Max ERC Funding
2 200 000 €
Duration
Start date: 2009-05-01, End date: 2014-04-30
Project acronym CCC
Project Context, Content, and Compositionality
Researcher (PI) François Recanati
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), SH4, ERC-2008-AdG
Summary Over the past fifteen years, I have argued that the effects of context on content go well beyond what is standardly acknowledged in semantics. This view is sometimes referred to as Contextualism or (more technically) Truth-Conditional Pragmatics (TCP). The key idea is that the effects of context on content need not be traceable to the linguistic material in the uttered sentence. Some effects are due to the linguistic material (e.g. to context-sensitive words or morphemes which trigger the search for contextual values), but others result from top-down or free pragmatic processes that take place not because the linguistic material demands it, but because the literal meaning of the sentence requires adjustment or elaboration ('modulation') in order to determine a contextually admissible content for the speaker's utterance. In the literature, one often finds arguments to the effect that, if Contextualism is right, then systematic semantics becomes impossible. More precisely, the claim that is often made is that TCP is incompatible with the Principle of Compositionality, upon which any systematic semantics must be based. The aim of this project is to defend Contextualism/TCP by demonstrating that it is not incompatible with the project of constructing a systematic, compositional semantics for natural language. This demonstration is of importance given the current predicament in the philosophy of language. We are, as it were, caught in a dilemma: formal semanticists provide compelling arguments that natural language must be compositional, but contextualists offer no less compelling arguments to the effect that 'sense modulation is essential to speech, because we use a (more or less) fixed stock of lexemes to talk about an indefinite variety of things, situations, and experiences' (Recanati 2004: 131). What are we to do, if modulation is incompatible with compositionality? Our aim is to show that it is not, and thereby to dissolve the alleged dilemma.
Max ERC Funding
1 144 706 €
Duration
Start date: 2009-01-01, End date: 2013-12-31
Project acronym CD8 T CELLS
Project Development and differentiation of CD8 T lymphocytes
Researcher (PI) Benedita Rocha
Host Institution (HI) INSTITUT NATIONAL DE LA SANTE ET DE LA RECHERCHE MEDICALE
Call Details Advanced Grant (AdG), LS6, ERC-2008-AdG
Summary CD8 T lymphocytes have a fundamental role in ensuring the control of different types of intracellular pathogens, including bacteria, parasites and most viruses. This control may fail for several reasons. Current aggressive anti-cancer therapies (or, rarely, certain congenital immune deficiencies) induce CD8 depletion. After bone-marrow transplantation, long periods are required to ensure T cell reconstitution, particularly in the adult. This long lag is due to the time required for hematopoietic precursors to generate T lymphocytes and to thymus insufficiency in the adult. However, even when CD8 T cells are present, CD8 immune responses are not always adequate. Certain chronic infections, such as HIV, induce CD8 dysfunction, and it is yet unclear how to generate efficient CD8 memory responses conferring adequate protection. To address these questions this project aims: 1) To find strategies ensuring the rapid reconstitution of the peripheral and gut CD8 T cell compartments a) by studying the mechanisms involved in HSC division and T cell commitment; b) by isolating and characterizing progenitors we previously described that are T cell committed and capable of accelerated CD8 reconstitution; c) by developing new strategies that may allow stable thymus transplantation and continuous thymic T cell generation. 2) To determine the mechanisms associated with efficient CD8 memory generation a) by evaluating the cellular modifications that ensure the efficient division and the remarkable accumulation and survival of CD8 T cells during adequate immune responses, as compared to inefficient responses; b) by studying CD8 differentiation into effector and memory cells in both conditions. These studies will use original experimental mouse models we developed in the laboratory, which allow us to address each of these aims. Besides state-of-the-art methods, they will also apply unique, very advanced approaches that we introduced and are the sole laboratory to perform.
Max ERC Funding
1 969 644 €
Duration
Start date: 2009-02-01, End date: 2014-05-31
Project acronym CEMYSS
Project Cosmochemical Exploration of the first two Million Years of the Solar System
Researcher (PI) Marc Chaussidon
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE9, ERC-2008-AdG
Summary One of the major outcomes of recent studies on the formation of the Solar System is the recognition of the fundamental importance of processes which took place during the first 10,000 years to 2 or 3 million years of the lifetime of the Sun and its accretion disk. Astrophysical observations at optical to infrared wavelengths of circumstellar disks around young stars have shown the existence in the inner disk of high-temperature processing of the dust. X-ray observations of T-Tauri stars have revealed that they exhibit X-ray flare enhancements of several orders of magnitude. The work we have performed over the last years on the isotopic analysis of solar wind trapped in lunar soils and of Ca-, Al-rich inclusions and chondrules from primitive chondrites has allowed us to link some of these astrophysical observations around young stars with processes, such as irradiation by energetic particles and UV light, which took place around the T-Tauri Sun. The aim of this project is to make decisive progress in our understanding of the early solar system through the development of in situ high-precision isotopic measurements by ion microprobe in extra-terrestrial matter. The project will focus on the exploration of variations in the isotopic composition of O and Mg and in the concentration of short-lived radioactive nuclides, such as 26Al and 10Be, with half-lives shorter than 1.5 million years. A special emphasis will be put on the search for nuclides with very short half-lives such as 32Si (650 years) and 14C (5730 years), nuclides which have never yet been detected in meteorites. These new data will bring critical information on, for instance, the astrophysical context for the formation of the Sun and the first solids in the accretion disk, and the timing and the processes by which protoplanets were formed and destroyed close to the Sun during the first 2 million years of the lifetime of the Solar System.
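Short-lived radionuclides such as 26Al act as chronometers: given an initial and a measured isotope ratio, the elapsed time follows directly from the decay law t = ln(R0/R)/λ. A minimal sketch; the 26Al half-life of 0.717 Myr and the canonical initial 26Al/27Al ratio of 5e-5 used below are commonly cited literature values, not figures taken from this text:

```python
import math

def relative_age_myr(ratio_initial, ratio_measured, half_life_myr=0.717):
    """Time elapsed, in Myr, for a short-lived radionuclide ratio to decay
    from ratio_initial down to ratio_measured: t = ln(R0/R) / lambda."""
    decay_const = math.log(2) / half_life_myr        # lambda = ln(2) / t_half
    return math.log(ratio_initial / ratio_measured) / decay_const

# A ratio that has dropped by a factor of 4 corresponds to exactly two half-lives.
age = relative_age_myr(5e-5, 1.25e-5)
```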
Max ERC Funding
1 270 419 €
Duration
Start date: 2009-01-01, End date: 2013-12-31
Project acronym CENDUP
Project Decoding the mechanisms of centrosome duplication
Researcher (PI) Pierre Gönczy
Host Institution (HI) ECOLE POLYTECHNIQUE FEDERALE DE LAUSANNE
Call Details Advanced Grant (AdG), LS3, ERC-2008-AdG
Summary Centrosome duplication entails the formation of a single procentriole next to each centriole once per cell cycle. The mechanisms governing procentriole formation are poorly understood and constitute a fundamental open question in cell biology. We will launch an innovative multidisciplinary research program to gain significant insight into these mechanisms using C. elegans and human cells. This research program is also expected to have a significant impact by contributing important novel assays to the field. Six specific aims will be pursued: 1) SAS-6 as a ZYG-1 substrate: mechanisms of procentriole formation in C. elegans. We will test in vivo the consequence of SAS-6 phosphorylation by ZYG-1. 2) Biochemical and structural analysis of SAS-6-containing macromolecular complexes (SAMACs). We will isolate and characterize SAMACs from C. elegans embryos and human cells, and analyze their structure using single-particle electron microscopy. 3) Novel cell-free assay for procentriole formation in human cells. We will develop such an assay and use it to test whether SAMACs can direct procentriole formation and whether candidate proteins are needed at centrioles or in the cytoplasm. 4) Mapping interactions between centriolar proteins in live human cells. We will use chemical methods developed by our collaborators to probe interactions between HsSAS-6 and centriolar proteins in a time- and space-resolved manner. 5) Functional genomic and chemical genetic screens in human cells. We will conduct high-throughput fluorescence-based screens in human cells to identify novel genes required for procentriole formation and small-molecule inhibitors of this process. 6) Mechanisms underlying differential centriolar maintenance in the germline. In C. elegans, we will characterize how the sas-1 locus is required for centriole maintenance during spermatogenesis, as well as analyze centriole elimination during oogenesis and identify components needed for this process.
Max ERC Funding
2 004 155 €
Duration
Start date: 2009-04-01, End date: 2014-03-31
Project acronym CLEAN-ICE
Project Detailed chemical kinetic models for cleaner internal combustion engines
Researcher (PI) Frederique Battin-Leclerc
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE8, ERC-2008-AdG
Summary The key objective of this project is to promote cleaner and more efficient combustion technologies through the development of theoretically grounded and more accurate chemical models. This is motivated by the fact that the current models developed for the combustion of constituents of gasoline, kerosene and diesel fuels do a reasonable job of predicting auto-ignition and flame propagation parameters, and the formation of the main regulated pollutants. However, their success rate deteriorates sharply in the prediction of the formation of minor products (alkenes, dienes, aromatics, aldehydes) and soot nano-particles, which have a deleterious impact on both the environment and human health. At the same time, despite an increasing emphasis on shifting from hydrocarbon fossil fuels to bio-fuels (particularly bioethanol and biodiesel), there is a great lack of chemical models for the combustion of oxygenated reactants. The main scientific focus will therefore be to enlarge and deepen the understanding of the reaction mechanisms and pathways associated with the combustion of an increased range of fuels (hydrocarbons and oxygenated compounds) and to elucidate the formation of a large number of hazardous minor pollutants. The core of the project is to describe more accurately, at a fundamental level, the reactive chemistry of minor pollutants within extensively validated detailed mechanisms, not only for traditional fuels but also for innovative surrogates, describing the complex chemistry of new environmentally important bio-fuels. At the level of individual reactions, rate constants, generalized rate-constant classes and molecular data will be enhanced by using techniques based on quantum mechanics and statistical mechanics. Experimental data for validation will be obtained in well-defined laboratory reactors by using analytical methods of increased accuracy.
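Rate-constant classes in detailed kinetic mechanisms are typically expressed in the modified Arrhenius form k(T) = A·T^n·exp(-Ea/RT). A minimal sketch; the numerical parameters below are purely illustrative placeholders, not values from this project:

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def arrhenius(T, A, n, Ea):
    """Modified Arrhenius rate constant k(T) = A * T**n * exp(-Ea / (R*T)).

    T in K, A in the units of k, n dimensionless, Ea in J/mol.
    """
    return A * T ** n * math.exp(-Ea / (R * T))

# Hypothetical parameters: with a 150 kJ/mol barrier the rate
# rises steeply between 800 K and 1000 K.
k_800 = arrhenius(800.0, 1e13, 0.0, 150e3)
k_1000 = arrhenius(1000.0, 1e13, 0.0, 150e3)
```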
Max ERC Funding
1 869 450 €
Duration
Start date: 2008-12-01, End date: 2013-11-30
Project acronym COLLMOT
Project Complex structure and dynamics of collective motion
Researcher (PI) Tamás Vicsek
Host Institution (HI) EOTVOS LORAND TUDOMANYEGYETEM
Call Details Advanced Grant (AdG), PE3, ERC-2008-AdG
Summary Collective behaviour is a widespread phenomenon in nature and technology, making it a very important subject to study in various contexts. The main goal of our multidisciplinary research is to identify and document new unifying principles describing the essential aspects of collective motion, one of the most relevant and spectacular manifestations of collective behaviour. We shall carry out novel types of experiments, design models that are both simple and realistic enough to reproduce the observations, and develop concepts for a better interpretation of the complexity of systems consisting of many organisms and of non-living objects such as interacting robots. We plan to study systems ranging from cultures of migrating tissue cells through flocks of birds to collectively moving devices. The interrelation of these systems will be considered in order to deepen the understanding of the main patterns of group motion in both living and non-living systems by learning about similar phenomena in the two domains of nature. Thus, we plan to understand the essential ingredients of the flocking of birds by building collectively moving unmanned aerial vehicles, while, in turn, high-resolution spatiotemporal GPS data from pigeon flocks will be used to draw conclusions about the best designs for swarms of robots. In particular, we shall build a set of vehicles that will be capable, for the first time, of exhibiting flocking behaviour in three-dimensional space. The methods we shall adopt range from approaches used in statistical physics and network theory to various new techniques in cell biology and collective robotics. All this will build on numerous prior results (both ours and others') published in leading interdisciplinary journals. The planned research has the potential to lead to ground-breaking results with significant implications in various fields of science and technology.
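The kind of "simple and realistic enough" flocking model the summary refers to can be sketched as a minimal 2D Vicsek-style model: each agent aligns with the mean heading of its neighbours, plus noise, and moves at constant speed in a periodic box. This is a generic textbook sketch under assumed parameter values, not the project's actual model or code.

```python
import numpy as np

def vicsek_step(pos, theta, L=10.0, v=0.3, r=1.0, eta=0.2, rng=None):
    """One update of a minimal 2D Vicsek-style flocking model.

    Each agent takes the circular mean heading of all agents within
    radius r (itself included), adds angular noise of amplitude eta,
    then moves at constant speed v in a periodic box of side L.
    All parameter values are illustrative assumptions.
    """
    rng = rng or np.random.default_rng()
    n = len(pos)
    new_theta = np.empty(n)
    for i in range(n):
        d = pos - pos[i]
        d -= L * np.round(d / L)          # minimum-image distances (periodic box)
        nb = (d ** 2).sum(axis=1) < r ** 2
        new_theta[i] = np.arctan2(np.sin(theta[nb]).mean(),
                                  np.cos(theta[nb]).mean())
    new_theta += eta * (rng.random(n) - 0.5) * 2 * np.pi
    vel = v * np.column_stack((np.cos(new_theta), np.sin(new_theta)))
    return (pos + vel) % L, new_theta

# Run 50 agents for 100 steps at low noise and measure the polar order
# parameter phi (magnitude of the mean heading vector, between 0 and 1).
rng = np.random.default_rng(0)
pos = rng.random((50, 2)) * 10.0
theta = rng.random(50) * 2 * np.pi
for _ in range(100):
    pos, theta = vicsek_step(pos, theta, eta=0.1, rng=rng)
phi = np.hypot(np.cos(theta).mean(), np.sin(theta).mean())
```

Translating such alignment rules into controllers for real vehicles, with sensing delays and 3D dynamics, is exactly where the project's robotic flocking work would go beyond this idealized sketch.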
Max ERC Funding
1 248 000 €
Duration
Start date: 2009-03-01, End date: 2015-02-28