Project acronym 321
Project from Cubic To Linear complexity in computational electromagnetics
Researcher (PI) Francesco Paolo ANDRIULLI
Host Institution (HI) POLITECNICO DI TORINO
Call Details Consolidator Grant (CoG), PE7, ERC-2016-COG
Summary Computational Electromagnetics (CEM) is the scientific field at the origin of all new modeling and simulation tools required by the constantly arising design challenges of emerging and future technologies in applied electromagnetics. As in many other technological fields, however, the trend in emerging electromagnetic engineering is towards miniaturized, higher-density and multi-scale scenarios. Computationally speaking, this translates into a steep increase in the number of degrees of freedom. Given that the design cost (the cost of a multi-right-hand-side problem dominated by matrix inversion) can scale as badly as cubically with these degrees of freedom, this fact, as many have pointed out, will severely compromise the practical impact of CEM on future and emerging technologies.
For this reason, the CEM scientific community has been looking for years for an FFT-like paradigm shift: a dynamic fast direct solver providing a design cost that would scale only linearly with the degrees of freedom. Such a fast solver is considered today a Holy Grail of the discipline.
The Grand Challenge of 321 will be to tackle this Holy Grail in Computational Electromagnetics by investigating a dynamic Fast Direct Solver for Maxwell Problems that would run in a linear-instead-of-cubic complexity for an arbitrary number and configuration of degrees of freedom.
The failure of all previous attempts will be overcome by a game-changing transformation of the classical CEM problem that will leverage a recent breakthrough of the PI. Starting from this, the project will investigate an entirely new paradigm for high-impact algorithms to achieve this grand challenge.
The FFT’s quadratic-to-linear paradigm shift shows how groundbreaking computational complexity reductions can be for applications. The cubic-to-linear paradigm shift at which the 321 project aims will have a similarly disruptive impact on electromagnetic science and technology.
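The cubic-versus-linear scaling argument can be illustrated with a toy cost model (a sketch for illustration only, not taken from the proposal; the per-degree-of-freedom constant is an assumption):

```python
# Toy cost model (illustrative assumption, not from the 321 proposal):
# a dense direct solve of an N x N system costs O(N^3) operations,
# while the sought fast direct solver would cost O(N).

def dense_solve_flops(n):
    """Approximate flop count of LU-factorising an n x n dense matrix."""
    return (2.0 / 3.0) * n ** 3

def fast_direct_solver_flops(n, c=1e4):
    """Hypothetical linear-complexity cost model; c is an assumed
    per-degree-of-freedom constant."""
    return c * n

# Doubling the degrees of freedom multiplies the dense cost by 8,
# but the linear-complexity cost only by 2.
for n in (10_000, 20_000, 40_000):
    ratio = dense_solve_flops(n) / fast_direct_solver_flops(n)
    print(f"N={n}: dense/linear cost ratio = {ratio:.0f}")
```

As the loop shows, the gap between the two cost models widens rapidly with the number of degrees of freedom, which is why a multi-right-hand-side design loop dominated by dense inversion becomes impractical at scale.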
Max ERC Funding
2 000 000 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym AB-SWITCH
Project Evaluation of commercial potential of a low-cost kit based on DNA-nanoswitches for the single-step measurement of diagnostic antibodies
Researcher (PI) Francesco RICCI
Host Institution (HI) UNIVERSITA DEGLI STUDI DI ROMA TOR VERGATA
Call Details Proof of Concept (PoC), ERC-2016-PoC
Summary "Antibodies are among the most widely monitored class of diagnostic biomarkers. Immunoassays market now covers about 1/3 of the global market of in-vitro diagnostics (about $50 billion). However, current methods for the detection of diagnostic antibodies are either qualitative or require cumbersome, resource-intensive laboratory procedures that need hours to provide clinicians with diagnostic information. A new method for fast and low-cost detection of antibodies will have a strong economic impact in the market of in-vitro diagnostics and Immunoassays.
During our ERC Starting Grant project "Nature Nanodevices" we have developed a novel diagnostic technology for the detection of clinically relevant antibodies in serum and other body fluids. The platform (here named Ab-switch) supports the fluorescent detection of diagnostic antibodies (for example, HIV diagnostic antibodies) in a rapid (<3 minutes), single-step and low-cost fashion.
The goal of this Proof of Concept project is to bring our promising platform to the diagnostic market and exploit its innovative features for commercial purposes. We will focus our initial efforts on the development of rapid kits for the detection of antibodies diagnostic of HIV. We will 1) Fully characterize the Ab-switch product in terms of analytical performance (i.e. sensitivity, specificity, stability, etc.) with direct comparison against other commercial kits; 2) Prepare a manufacturing plan for producing/testing the Ab-switch; 3) Establish an IP strategy for patent filing and maintenance; 4) Determine a business and commercialization plan.
Max ERC Funding
150 000 €
Duration
Start date: 2017-02-01, End date: 2018-07-31
Project acronym ABACUS
Project Ab-initio adiabatic-connection curves for density-functional analysis and construction
Researcher (PI) Trygve Ulf Helgaker
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Advanced Grant (AdG), PE4, ERC-2010-AdG_20100224
Summary Quantum chemistry provides two approaches to molecular electronic-structure calculations: the systematically refinable but expensive many-body wave-function methods and the inexpensive but not systematically refinable Kohn–Sham method of density-functional theory (DFT). The accuracy of Kohn–Sham calculations is determined by the quality of the exchange–correlation functional, from which the effects of exchange and correlation among the electrons are extracted using the density rather than the wave function. However, the exact exchange–correlation functional is unknown; instead, many approximate forms have been developed, by fitting to experimental data or by satisfying exact relations. Here, a new approach to density-functional analysis and construction is proposed, based on the Lieb variation principle, usually regarded as conceptually important but impracticable. By invoking the Lieb principle, it becomes possible to approach the development of approximate functionals in a novel manner, guided directly by the behaviour of the exact functional, accurately calculated for a wide variety of chemical systems. In particular, this principle will be used to calculate ab-initio adiabatic-connection curves, studying the exchange–correlation functional for a fixed density as the electronic interactions are turned on from zero to one. Pilot calculations have indicated the feasibility of this approach in simple cases; here, a comprehensive set of adiabatic-connection curves will be generated and utilized for the calibration, construction, and analysis of density functionals, the objective being to produce improved functionals for Kohn–Sham calculations by modelling or fitting such curves. The ABACUS approach will be particularly important in cases where little experimental information is available, for example for understanding and modelling the behaviour of the exchange–correlation functional in electromagnetic fields.
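In standard density-functional notation (the abstract does not spell the formula out), the adiabatic connection referred to above expresses the exchange–correlation energy as an integral over the interaction strength λ at fixed density:

```latex
% Adiabatic connection at fixed density \rho:
E_{\mathrm{xc}}[\rho] \;=\; \int_0^1 W_\lambda[\rho]\,\mathrm{d}\lambda,
\qquad
W_\lambda[\rho] \;=\; \langle \Psi_\lambda[\rho] \,|\, \hat{W} \,|\, \Psi_\lambda[\rho] \rangle \;-\; J[\rho],
```

where Ψ_λ[ρ] minimizes T̂ + λŴ subject to the constraint that it yields the density ρ, Ŵ is the electron–electron repulsion operator, and J[ρ] is the classical Coulomb (Hartree) energy. The λ = 0 and λ = 1 endpoints correspond to the non-interacting Kohn–Sham system and the fully interacting system, which is the "turned on from zero to one" scaling the abstract describes.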
Max ERC Funding
2 017 932 €
Duration
Start date: 2011-03-01, End date: 2016-02-29
Project acronym AEROSPACEPHYS
Project Multiphysics models and simulations for reacting and plasma flows applied to the space exploration program
Researcher (PI) Thierry Edouard Bertrand Magin
Host Institution (HI) INSTITUT VON KARMAN DE DYNAMIQUE DES FLUIDES
Call Details Starting Grant (StG), PE8, ERC-2010-StG_20091028
Summary Space exploration is one of the boldest and most exciting endeavors that humanity has undertaken, and it holds enormous promise for the future. Our next challenges in space exploration include bringing samples back to Earth by means of robotic missions and continuing the manned exploration program, which aims at sending human beings to Mars and bringing them home safely. Inaccurate prediction of the heat flux to the surface of the spacecraft heat shield can be fatal for the crew or for the success of a robotic mission. This quantity is estimated during the design phase. An accurate prediction is a particularly complex task, requiring the modelling of the following phenomena, each a potential “mission killer”: 1) Radiation of the plasma in the shock layer, 2) Complex surface chemistry on the thermal protection material, 3) Flow transition from laminar to turbulent. Our poor understanding of the coupled mechanisms of radiation, ablation, and transition leads to difficulties in flux prediction. To avoid failure and ensure the safety of the astronauts and payload, engineers resort to “safety factors” to determine the thickness of the heat shield, at the expense of the mass of embarked payload. Thinking out of the box and basic research are thus necessary to advance the models that will better define the environment and requirements for the design and safe operation of tomorrow’s space vehicles and planetary probes for manned space exploration. The three basic ingredients for predictive science are: 1) Physico-chemical models, 2) Computational methods, 3) Experimental data. We propose to follow a complementary approach for prediction.
The proposed research aims at: “Integrating new advanced physico-chemical models and computational methods, based on a multidisciplinary approach developed together with physicists, chemists, and applied mathematicians, to create a top-notch multiphysics and multiscale numerical platform for simulations of planetary atmosphere entries, crucial to the new challenges of the manned space exploration program. Experimental data will also be used for validation, following state-of-the-art uncertainty quantification methods.”
Max ERC Funding
1 494 892 €
Duration
Start date: 2010-09-01, End date: 2015-08-31
Project acronym AgeConsolidate
Project The Missing Link of Episodic Memory Decline in Aging: The Role of Inefficient Systems Consolidation
Researcher (PI) Anders Martin FJELL
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Consolidator Grant (CoG), SH4, ERC-2016-COG
Summary Which brain mechanisms are responsible for the fate of the memories we make as we age, whether they wither or stay, and in what form? Episodic memory function does decline with age. While this decline can have multiple causes, research has focused almost entirely on encoding and retrieval processes, largely ignoring a third critical process – consolidation. The objective of AgeConsolidate is to provide this missing link, by combining novel experimental cognitive paradigms with neuroimaging in a longitudinal, large-scale attempt to directly test how age-related changes in consolidation processes in the brain impact episodic memory decline. The ambitious aims of the present proposal are two-fold:
(1) Use recent advances in memory consolidation theory to achieve an elaborate model of episodic memory deficits in aging
(2) Use aging as a model to uncover how structural and functional brain changes affect episodic memory consolidation in general
The novelty of the project lies in the synthesis of recent methodological advances and theoretical models for episodic memory consolidation to explain age-related decline, by employing a unique combination of a range of different techniques and approaches. This is ground-breaking, in that it aims at taking our understanding of the brain processes underlying episodic memory decline in aging to a new level, while at the same time advancing our theoretical understanding of how episodic memories are consolidated in the human brain. To obtain this outcome, I will test the main hypothesis of the project: Brain processes of episodic memory consolidation are less effective in older adults, and this can account for a significant portion of the episodic memory decline in aging. This will be answered by six secondary hypotheses, with 1-3 experiments or tasks designated to address each hypothesis, focusing on functional and structural MRI, positron emission tomography data and sleep experiments to target consolidation from different angles.
Max ERC Funding
1 999 482 €
Duration
Start date: 2017-05-01, End date: 2022-04-30
Project acronym AISENS
Project New generation of high sensitive atom interferometers
Researcher (PI) Marco Fattori
Host Institution (HI) CONSIGLIO NAZIONALE DELLE RICERCHE
Call Details Starting Grant (StG), PE2, ERC-2010-StG_20091028
Summary Interferometers are fundamental tools for the study of the laws of nature and for the precise measurement and control of the physical world. In the last century, scientific and technological progress has proceeded in parallel with a constant improvement of interferometric performance. For this reason, the challenge of conceiving and realizing new generations of interferometers with broader ranges of operation and higher sensitivities remains open and timely.
Although the introduction of laser devices deeply improved the way interferometric measurements with light are developed and performed, the atomic matter-wave analogue, i.e. the Bose-Einstein condensate (BEC), has not yet triggered any revolution in precision interferometry. However, thanks to recent improvements in the control of the quantum properties of ultra-cold atomic gases, and to new original ideas on the creation and manipulation of quantum-entangled particles, the field of atom interferometry is now mature for a big step forward.
The system I want to realize is a Mach-Zehnder spatial interferometer operating with trapped BECs. Undesired decoherence sources will be suppressed by implementing BECs with tunable interactions in ultra-stable optical potentials. Entangled states will be used to improve the sensitivity of the sensor beyond the standard quantum limit to ideally reach the ultimate, Heisenberg, limit set by quantum mechanics. The resulting apparatus will show unprecedented spatial resolution and will overcome state-of-the-art interferometers with cold (non condensed) atomic gases.
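The sensitivity gain at stake here can be quantified by the two textbook phase-uncertainty limits mentioned above (standard results of quantum metrology, not specific to this proposal):

```latex
% Phase-estimation uncertainty for an interferometer with N atoms:
\Delta\phi_{\mathrm{SQL}} = \frac{1}{\sqrt{N}} \quad \text{(standard quantum limit, uncorrelated atoms)},
\qquad
\Delta\phi_{\mathrm{H}} = \frac{1}{N} \quad \text{(Heisenberg limit, entangled states)}.
```

For a condensate of N ≈ 10⁶ atoms, reaching the Heisenberg limit would thus improve the phase resolution by a factor of up to √N ≈ 10³ over the standard quantum limit.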
A successful completion of this project will lead to a new generation of interferometers for the immediate application to local inertial measurements with unprecedented resolution. In addition, we expect to develop experimental capabilities which might find application well beyond quantum interferometry and crucially contribute to the broader emerging field of quantum-enhanced technologies.
Max ERC Funding
1 068 000 €
Duration
Start date: 2011-01-01, End date: 2015-12-31
Project acronym AlchemEast
Project Alchemy in the Making: From ancient Babylonia via Graeco-Roman Egypt into the Byzantine, Syriac and Arabic traditions (1500 BCE - 1000 AD)
Researcher (PI) Matteo MARTELLI
Host Institution (HI) ALMA MATER STUDIORUM - UNIVERSITA DI BOLOGNA
Call Details Consolidator Grant (CoG), SH5, ERC-2016-COG
Summary The AlchemEast project is devoted to the study of alchemical theory and practice as it appeared and developed in distinct, albeit contiguous (both chronologically and geographically) areas: Graeco-Roman Egypt, Byzantium, and the Near East, from Ancient Babylonian times to the early Islamic Period. This project combines innovative textual investigations with experimental replications of ancient alchemical procedures. It uses sets of historically and philologically informed laboratory replications in order to reconstruct the actual practice of ancient alchemists, and it studies the texts and literary forms in which this practice was conceptualized and transmitted. It proposes new models for textual criticism in order to capture the fluidity of the transmission of ancient alchemical writings. AlchemEast is designed to carry out a comparative investigation of cuneiform tablets as well as a vast corpus of Greek, Syriac and Arabic writings. It will overcome the old, pejorative paradigm that dismissed ancient alchemy as a "pseudo-science", by proposing a new theoretical framework for comprehending the entirety of ancient alchemical practices and theories. Alongside established forms of scholarly output, such as critical editions of key texts, AlchemEast will provide an integrative, longue durée perspective on the many different phases of ancient alchemy. It will thus offer a radically new vision of this discipline as a dynamic and diversified art that developed across different technical and scholastic traditions. This new representation will allow us to connect ancient alchemy with medieval and early modern alchemy and thus fully reintegrate ancient alchemy in the history of pre-modern alchemy as well as in the history of ancient science more broadly.
Max ERC Funding
1 997 000 €
Duration
Start date: 2017-12-01, End date: 2022-11-30
Project acronym ALUFIX
Project Friction stir processing based local damage mitigation and healing in aluminium alloys
Researcher (PI) Aude SIMAR
Host Institution (HI) UNIVERSITE CATHOLIQUE DE LOUVAIN
Call Details Starting Grant (StG), PE8, ERC-2016-STG
Summary ALUFIX proposes an original strategy for the development of aluminium-based materials involving damage mitigation and extrinsic self-healing concepts exploiting the new opportunities of the solid-state friction stir process. Friction stir processing locally extrudes and drags material from the front to the back and around the tool pin. It involves short duration at moderate temperatures (typically 80% of the melting temperature), fast cooling rates and large plastic deformations leading to far out-of-equilibrium microstructures. The idea is that commercial aluminium alloys can be locally improved and healed in regions of stress concentration where damage is likely to occur. Self-healing in metal-based materials is still in its infancy and existing strategies can hardly be extended to applications. Friction stir processing can enhance the damage and fatigue resistance of aluminium alloys by microstructure homogenisation and refinement. In parallel, friction stir processing can be used to integrate secondary phases in an aluminium matrix. In the ALUFIX project, healing phases will thus be integrated in aluminium in addition to refining and homogenising the microstructure. The “local stress management strategy” favours crack closure and crack deviation at the sub-millimetre scale thanks to a controlled residual stress field. The “transient liquid healing agent” strategy involves the in-situ generation of an out-of-equilibrium compositionally graded microstructure at the aluminium/healing agent interface capable of liquid-phase healing after a thermal treatment. Along the road, a variety of new scientific questions concerning the damage mechanisms will have to be addressed.
Max ERC Funding
1 497 447 €
Duration
Start date: 2017-01-01, End date: 2021-12-31
Project acronym ANISOTROPIC UNIVERSE
Project The anisotropic universe -- a reality or fluke?
Researcher (PI) Hans Kristian Kamfjord Eriksen
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Starting Grant (StG), PE9, ERC-2010-StG_20091028
Summary During the last decade, a strikingly successful cosmological concordance model has been established. With only six free parameters, nearly all observables, comprising millions of data points, may be fitted with outstanding precision. However, in this beautiful picture a few "blemishes" have turned up, apparently not consistent with the standard model: While the model predicts that the universe is isotropic (i.e., looks the same in all directions) and homogeneous (i.e., the statistical properties are the same everywhere), subtle hints of the contrary are now seen. For instance, peculiar preferred directions and correlations are observed in the cosmic microwave background; some studies considering nearby galaxies suggest the existence of anomalous large-scale cosmic flows; a study of distant quasars hints toward unexpected large-scale correlations. All of these reports are individually highly intriguing, and together they hint toward a more complicated and interesting universe than previously imagined -- but none of the reports can be considered decisive. One major obstacle in many cases has been the relatively poor data quality.
This is currently about to change, as the next generation of new and far more powerful experiments is coming online. Of special interest to me are Planck, an ESA-funded CMB satellite currently taking data; QUIET, a ground-based CMB polarization experiment located in Chile; and various large-scale structure (LSS) data sets, such as the SDSS and 2dF surveys, and in the future Euclid, a proposed galaxy survey satellite also funded by ESA. By combining the world's best data from both CMB and LSS measurements, I will in the proposed project attempt to settle this question: Is our universe really anisotropic? Or are these recent claims only the results of systematic errors or statistical flukes? If the claims turn out to hold against this tide of new and high-quality data, then cosmology as a whole may need to be re-written.
Max ERC Funding
1 500 000 €
Duration
Start date: 2011-01-01, End date: 2015-12-31
Project acronym ANOREP
Project Targeting the reproductive biology of the malaria mosquito Anopheles gambiae: from laboratory studies to field applications
Researcher (PI) Flaminia Catteruccia
Host Institution (HI) UNIVERSITA DEGLI STUDI DI PERUGIA
Call Details Starting Grant (StG), LS2, ERC-2010-StG_20091118
Summary Anopheles gambiae mosquitoes are the major vectors of malaria, a disease with devastating consequences for human health. Novel methods for controlling the natural vector populations are urgently needed, given the evolution of insecticide resistance in mosquitoes and the lack of novel insecticides. Understanding the processes underlying mosquito biology may help to roll back malaria. In this proposal, we will target mosquito reproduction, a major determinant of the An. gambiae vectorial capacity. This will be achieved at two levels: (i) fundamental research, to provide a deeper knowledge of the processes regulating reproduction in this species, and (ii) applied research, to identify novel targets and to develop innovative approaches for the control of natural populations. We will focus our analysis on three major players of mosquito reproduction: male accessory glands (MAGs), sperm, and spermatheca, in both laboratory and field settings. We will then translate this information into the identification of inhibitors of mosquito fertility. The experimental activities will be divided across three objectives. In Objective 1, we will unravel the role of the MAGs in shaping mosquito fertility and behaviour, by performing a combination of transcriptional and functional studies that will reveal the multifaceted activities of these tissues. In Objective 2 we will instead focus on the identification of the male and female factors responsible for sperm viability and function. Results obtained in both objectives will be validated in field mosquitoes. In Objective 3, we will perform screens aimed at the identification of inhibitors of mosquito reproductive success. This study will reveal as yet unknown molecular mechanisms underlying reproductive success in mosquitoes, considerably increasing our knowledge beyond the state of the art and critically contributing innovative tools and ideas to the fight against malaria.
Max ERC Funding
1 500 000 €
Duration
Start date: 2011-01-01, End date: 2015-12-31