Project acronym A2C2
Project Atmospheric flow Analogues and Climate Change
Researcher (PI) Pascal Yiou
Host Institution (HI) COMMISSARIAT A L ENERGIE ATOMIQUE ET AUX ENERGIES ALTERNATIVES
Call Details Advanced Grant (AdG), PE10, ERC-2013-ADG
Summary "The A2C2 project treats two major challenges in climate and atmospheric research: the time dependence of the climate attractor to external forcings (solar, volcanic eruptions and anthropogenic), and the attribution of extreme climate events occurring in the northern extra-tropics. The main difficulties are the limited climate information, the computer cost of model simulations, and mathematical assumptions that are hardly verified and often overlooked in the literature.
A2C2 proposes a practical framework to overcome those three difficulties, linking the theory of dynamical systems and statistics. We will generalize the methodology of flow analogues to multiple databases in order to obtain probabilistic descriptions of analogue decompositions.
The project is divided into three work packages (WP). WP1 embeds the analogue method in the theory of dynamical systems in order to provide a metric of attractor deformation in time. The key methodological step is to detect trends or persistent outliers in the dates and scores of analogues when the system is subject to time-varying forcings. This is done with idealized models and full-scale climate models in which the forcings (anthropogenic and natural) are known.
A2C2 creates an open-source toolkit to compute flow analogues from a wide array of databases (WP2). WP3 treats the two scientific challenges with the analogue method and multiple model ensembles, hence allowing uncertainty estimates under realistic mathematical hypotheses. The flow-analogue methodology allows a systematic and quasi-real-time analysis of extreme events, which is currently beyond the reach of conventional climate modeling approaches.
The major breakthrough of A2C2 is to bridge the gap between operational needs (the immediate analysis of climate events) and the understanding of long-term climate change. A2C2 opens new research horizons for the exploitation of ensembles of simulations and reliable estimates of uncertainty.
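A minimal sketch of a flow-analogue search as described above, assuming a reanalysis archive stored as a (days x gridpoints) array. The root-mean-square metric, the number of analogues and the 30-day exclusion window are illustrative choices, not specifics of the A2C2 toolkit.

```python
import numpy as np

def find_analogues(archive, dates, target, target_date, k=20, exclusion_days=30):
    """Return the k dates whose circulation fields best resemble `target`.

    archive : (n_days, n_gridpoints) array of, e.g., sea-level-pressure maps
    dates   : (n_days,) array of day indices
    target  : (n_gridpoints,) field whose analogues we seek
    """
    # Root-mean-square distance between the target field and every archived day
    dist = np.sqrt(np.mean((archive - target) ** 2, axis=1))
    # Exclude days too close in time to the target (they are trivially similar)
    dist[np.abs(dates - target_date) < exclusion_days] = np.inf
    best = np.argsort(dist)[:k]
    return dates[best], dist[best]

# Toy usage: a 40-year daily archive on a 10x10 grid
rng = np.random.default_rng(0)
archive = rng.standard_normal((40 * 365, 100))
dates = np.arange(40 * 365)
analogue_dates, scores = find_analogues(archive, dates, archive[5000], 5000)
# Trends in `analogue_dates` (do analogues cluster in certain years?) and in
# `scores` are the kind of diagnostics WP1 applies to detect attractor change.
```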
Max ERC Funding
1 491 457 €
Duration
Start date: 2014-03-01, End date: 2019-02-28
Project acronym ACCLIMATE
Project Elucidating the Causes and Effects of Atlantic Circulation Changes through Model-Data Integration
Researcher (PI) Claire Waelbroeck
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE10, ERC-2013-ADG
Summary Rapid changes in ocean circulation and climate have been observed in marine sediment and ice cores, notably over the last 60 thousand years (ky), highlighting the non-linear character of the climate system and underlining the possibility of rapid climate shifts in response to anthropogenic greenhouse gas forcing.
To date, these rapid changes in climate and ocean circulation are still not fully explained. Two main obstacles prevent going beyond the current state of knowledge:
- Paleoclimatic proxy data are by nature only indirect indicators of the climatic variables, and thus cannot be directly compared with model outputs;
- A 4-D (latitude, longitude, water depth, time) reconstruction of Atlantic water masses over the past 40 ky is lacking: previous studies have generated isolated records with disparate timescales which do not allow the causes of circulation changes to be identified.
Overcoming these two major limitations will lead to major breakthroughs in climate research. Concretely, I will create the first database of Atlantic deep-sea records covering the last 40 ky, and extract the full climatic information from these records through an innovative model-data integration scheme using an isotopic proxy forward-modeling approach. The novelty and exceptional potential of this scheme is twofold: (i) it avoids hypotheses on proxy interpretation and hence eliminates or strongly reduces errors in the interpretation of paleoclimatic records; (ii) it produces states of the climate system that best explain the observations over the last 40 ky, while being consistent with the model physics.
Expected results include:
• The elucidation of the mechanisms explaining rapid changes in ocean circulation and climate over the last 40 ky,
• Improved climate model physics and parameterizations,
• The first projections of future climate changes obtained with a model able to reproduce the highly non-linear behavior of the climate system observed over the last 40 ky.
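A minimal sketch of the proxy forward-modeling idea described above: candidate model states are mapped into proxy space and scored against observations, rather than inverting the proxies. The linear delta-18-O operator and the Gaussian error model are illustrative assumptions, not the project's actual scheme.

```python
import numpy as np

def forward_d18o(temperature, seawater_d18o):
    """Map model variables to a synthetic calcite delta-18-O value.

    Slope (~ -0.23 permil per degree C) is an illustrative paleothermometry value.
    """
    return seawater_d18o - 0.23 * temperature + 3.3

def log_likelihood(proxy_obs, temperature, seawater_d18o, sigma=0.2):
    """Gaussian misfit between observed and forward-modelled proxy values."""
    synthetic = forward_d18o(temperature, seawater_d18o)
    return -0.5 * np.sum(((proxy_obs - synthetic) / sigma) ** 2)

# Toy usage: rank two candidate model states against a set of core measurements
proxy_obs = np.array([2.9, 3.1, 3.0])
state_a = log_likelihood(proxy_obs, temperature=2.0, seawater_d18o=0.1)
state_b = log_likelihood(proxy_obs, temperature=6.0, seawater_d18o=0.1)
best = "A" if state_a > state_b else "B"  # state most consistent with the data
```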
Max ERC Funding
3 000 000 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym Actanthrope
Project Computational Foundations of Anthropomorphic Action
Researcher (PI) Jean Paul Laumond
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE7, ERC-2013-ADG
Summary Actanthrope intends to promote a neuro-robotics perspective to explore original models of anthropomorphic action. The project targets contributions to humanoid robot autonomy (for rescue and service robotics), to advanced human body simulation (for applications in ergonomics), and to a new theory of embodied intelligence (by promoting a motion-based semiotics of human action).
Actions take place in physical space while they originate in the (robot or human) sensory-motor space. Geometry is the core abstraction that links these spaces. Considering that the structure of actions inherits from that of the body, the underlying intuition is that actions can be segmented within discrete sub-spaces lying in the entire continuous posture space. Such sub-spaces are viewed as symbols bridging deliberative reasoning and reactive control. Actanthrope argues that geometric approaches to motion segmentation and generation are promising and innovative routes to explore embodied intelligence:
- Motion segmentation: what are the sub-manifolds that define the structure of a given action?
- Motion generation: among all the solution paths within a given sub-manifold, what is the underlying law that makes the selection?
In robotics, these questions relate to the competition between abstract symbol manipulation and physical signal processing. In computational neuroscience, they refer to the quest for motion invariants. The ambition of the project is to promote a dual perspective: exploring the computational foundations of human action to make better robots, while simultaneously doing better robotics to better understand human action.
A unique “Anthropomorphic Action Factory” supports the methodology. It aims at attracting to a single lab researchers with complementary know-how and solid mathematical backgrounds. All of them will benefit from unique equipment, while being stimulated by four challenges dealing with locomotion and manipulation actions.
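A minimal sketch of the geometric intuition above: postures lying near a low-dimensional linear subspace of the full posture space are grouped into one motion segment, and a new segment starts when that subspace stops explaining the data. The SVD-based subspace fit and the error threshold are illustrative stand-ins for the sub-manifold models the project actually investigates.

```python
import numpy as np

def segment_motion(postures, dim=3, window=30, threshold=0.1):
    """Split a (frames x joints) posture trajectory into quasi-linear segments."""
    cuts, start = [0], 0
    for t in range(window, len(postures), window):
        seg = postures[start:t]
        seg = seg - seg.mean(axis=0)
        # Fit a dim-dimensional subspace to the running segment via SVD
        _, s, _ = np.linalg.svd(seg, full_matrices=False)
        residual = s[dim:].sum() / max(s.sum(), 1e-12)
        if residual > threshold:   # subspace no longer explains the motion
            cuts.append(t)
            start = t
    return cuts

# Toy usage: 200 frames of a 20-joint "motion" that switches regime halfway
rng = np.random.default_rng(1)
a = rng.standard_normal((100, 3)) @ rng.standard_normal((3, 20))
b = rng.standard_normal((100, 3)) @ rng.standard_normal((3, 20))
print(segment_motion(np.vstack([a, b])))  # detects a cut near frame 100
```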
Max ERC Funding
2 500 000 €
Duration
Start date: 2014-01-01, End date: 2018-12-31
Project acronym BREAD
Project Breaking the curse of dimensionality: numerical challenges in high dimensional analysis and simulation
Researcher (PI) Albert Cohen
Host Institution (HI) UNIVERSITE PIERRE ET MARIE CURIE - PARIS 6
Call Details Advanced Grant (AdG), PE1, ERC-2013-ADG
Summary "This project is concerned with problems that involve a very large number of variables, and whose efficient numerical treatment is challenged by the so-called curse of dimensionality, meaning that computational complexity increases exponentially in the variable dimension.
The PI intend to establish in his host institution a scientific leadership on the mathematical understanding and numerical treatment of these problems, and to contribute to the development of this area of research through international collaborations, organization of workshops and research schools, and training of postdocs and PhD students.
High dimensional problems are ubiquitous in an increasing number of areas of scientific computing, among which statistical or active learning theory, parametric and stochastic partial differential equations, parameter optimization in numerical codes. There is a high demand from the industrial world of efficient numerical methods for treating such problems.
The practical success of various numerical algorithms, that have been developed in recent years in these application areas, is often limited to moderate dimensional setting.
In addition, these developments tend to be, as a rule, rather problem specific and not always founded on a solid mathematical analysis.
The central scientific objectives of this project are therefore: (i) to identify fundamental mathematical principles behind overcoming the curse of dimensionality, (ii) to understand how these principles enter in relevant instances of the above applications, and (iii) based on the these principles beyond particular problem classes, to develop broadly applicable numerical strategies that benefit from such mechanisms.
The performances of these strategies should be provably independent of the variable dimension, and in that sense break the curse of dimensionality. They will be tested on both synthetic benchmark tests and real world problems coming from the afore-mentioned applications."
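A small illustration of the curse of dimensionality discussed above: integrating a smooth function on a full tensor grid requires n**d points, while plain Monte Carlo converges at a dimension-independent O(1/sqrt(N)) rate. The test integrand is an arbitrary choice for demonstration.

```python
import numpy as np

def f(x):
    return np.exp(-np.sum(x ** 2, axis=-1))  # smooth integrand on [0,1]^d

def tensor_grid_points(n, d):
    return n ** d  # exponential blow-up in the dimension d

def monte_carlo(d, n_samples=100_000, seed=0):
    x = np.random.default_rng(seed).random((n_samples, d))
    return f(x).mean()  # error ~ 1/sqrt(n_samples), independent of d

for d in (2, 10, 100):
    print(d, tensor_grid_points(10, d), monte_carlo(d))
# With 10 points per axis, d = 100 would need 10**100 grid points -- hence the
# search for exploitable structure (sparsity, low rank, anisotropy) that the
# project aims to identify.
```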
Max ERC Funding
1 848 000 €
Duration
Start date: 2014-01-01, End date: 2018-12-31
Project acronym CLOUDMAP
Project Cloud Computing via Homomorphic Encryption and Multilinear Maps
Researcher (PI) Jean-Sebastien Coron
Host Institution (HI) UNIVERSITE DU LUXEMBOURG
Call Details Advanced Grant (AdG), PE6, ERC-2017-ADG
Summary The past thirty years have seen cryptography move from arcane to commonplace: the Internet, mobile phones, banking systems, etc. Homomorphic cryptography now offers the tantalizing goal of being able to process sensitive information in encrypted form, without needing to compromise on the privacy and security of the citizens and organizations that provide the input data. More recently, cryptographic multilinear maps have revolutionized cryptography with the emergence of indistinguishability obfuscation (iO), which in theory can be used to realize numerous advanced cryptographic functionalities that previously seemed beyond reach. However, the security of multilinear maps is still poorly understood, many iO schemes have been broken, and all current constructions of iO are impractical.
The goal of the CLOUDMAP project is to make these advanced cryptographic tasks usable in practice, so that citizens do not have to compromise on the privacy and security of their input data. This goal can only be achieved by considering the mathematical foundations of these primitives, working "from first principles", rather than focusing on premature optimizations. To achieve this goal, our first objective will be to better understand the security of the underlying primitives of multilinear maps and iO schemes. Our second objective will be to develop new approaches to significantly improve their efficiency. Our third objective will be to build applications of multilinear maps and iO that can be implemented in practice.
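A toy additively homomorphic scheme (textbook Paillier with tiny primes) illustrating the "compute on encrypted data" property that homomorphic cryptography provides. This is a pedagogical sketch only: the parameters are insecure, and this is not the construction studied by CLOUDMAP.

```python
import math, random

p, q = 293, 433                      # demo primes; real keys use ~1536-bit primes
n, n2 = p * q, (p * q) ** 2
lam = math.lcm(p - 1, q - 1)
g = n + 1

def L(x):
    return (x - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)  # decryption constant

def encrypt(m):
    r = random.randrange(1, n)       # fresh randomness per ciphertext
    while math.gcd(r, n) != 1:       # r must be invertible mod n
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

c1, c2 = encrypt(12), encrypt(30)
c_sum = (c1 * c2) % n2               # multiply ciphertexts ...
assert decrypt(c_sum) == 42          # ... to add plaintexts, without decrypting
```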
Max ERC Funding
2 491 266 €
Duration
Start date: 2018-10-01, End date: 2023-09-30
Project acronym CONCERTO
Project Intensity mapping of the atomic carbon CII line: the promise of a new observational probe of dusty star-formation in post-reionization and reionization epoch
Researcher (PI) Guilaine LAGACHE
Host Institution (HI) UNIVERSITE D'AIX MARSEILLE
Call Details Advanced Grant (AdG), PE9, ERC-2017-ADG
Summary I propose to construct a spectrometer to map in 3-D the intensity due to line emission, a technique known as intensity mapping. Instead of detecting individual galaxies, this emerging technique measures the signal fluctuations produced by the combined emission of the galaxy population over large regions of the sky in a wide frequency (i.e. redshift) band, and thus increases sensitivity to faint sources. Capitalizing on a recent technology breakthrough, our intensity mapping experiment will measure the 3-D fluctuations of the [CII] line at redshifts 4.5<z<8.5. [CII] is one of the most valuable star-formation tracers at high redshift. My project will answer the outstanding questions of whether dusty star formation contributes to early galaxy evolution, and whether dusty galaxies play an important role in shaping cosmic reionization.
My team will first build, test, and finally install the instrument on the APEX antenna, following an agreement with the APEX partners. The spectrometer will be based on the state-of-the-art development of new millimetre-wave arrays using Kinetic Inductance Detectors. Spectra (200-360 GHz) will be obtained by a fast Martin-Puplett interferometer. We will then observe a few square degrees with CONCERTO, offering a straightforward alternative for probing star formation and dust build-up in the early Universe. Finally, CONCERTO will set to music the various cosmic evolution probes: cross-correlation of the signals will be used in particular to capture the topology of the end of the reionization era.
CONCERTO will be one of only two instruments in the world able to perform intensity mapping of the [CII] line in the short term. The methodology is extremely promising, as it targets an unexplored observable touching on some of the fundamental processes that built the early Universe. In the flourishing of new ideas in the intensity-mapping field, CONCERTO lies at the forefront.
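A minimal sketch of the intensity-mapping measurement described above: rather than cataloguing galaxies, one measures the power spectrum of the aggregate brightness fluctuations in each frequency channel. Map size, pixel scale and binning are illustrative choices, not CONCERTO's survey parameters.

```python
import numpy as np

def fluctuation_power_spectrum(sky_map, pixel_deg, n_bins=20):
    """Azimuthally averaged 2-D power spectrum of an intensity map."""
    delta = sky_map - sky_map.mean()            # brightness fluctuations
    power = np.abs(np.fft.fftshift(np.fft.fft2(delta))) ** 2
    ny, nx = sky_map.shape
    fy = np.fft.fftshift(np.fft.fftfreq(ny, d=pixel_deg))
    fx = np.fft.fftshift(np.fft.fftfreq(nx, d=pixel_deg))
    k = np.hypot(*np.meshgrid(fy, fx, indexing="ij"))  # radial wavenumber
    bins = np.linspace(0, k.max(), n_bins + 1)
    idx = np.digitize(k.ravel(), bins)
    spectrum = [power.ravel()[idx == i].mean() if np.any(idx == i) else 0.0
                for i in range(1, n_bins + 1)]
    return 0.5 * (bins[1:] + bins[:-1]), np.array(spectrum)

# Toy usage: one 2-deg x 2-deg channel map of combined unresolved emission
rng = np.random.default_rng(2)
channel_map = rng.standard_normal((128, 128))
k_centers, pk = fluctuation_power_spectrum(channel_map, pixel_deg=2 / 128)
```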
Max ERC Funding
3 499 942 €
Duration
Start date: 2019-01-01, End date: 2023-12-31
Project acronym COXINEL
Project COherent Xray source INferred from Electrons accelerated by Laser
Researcher (PI) Marie-Emmanuelle Couprie
Host Institution (HI) SYNCHROTRON SOLEIL SOCIETE CIVILE
Call Details Advanced Grant (AdG), PE7, ERC-2013-ADG
Summary "Since the first laser discovery in 1960 and the first Free Electron Laser (FEL) in 1977, Linac based fourth generation light sources provide intense coherent fs pulses in the X-ray range for multidisciplinary investigations of matter. In parallel, Laser Wakefield Accelerator (LWFA) by using intense laser beams interacting with cm long plasmas can now provide high quality electron beams of very short bunches (few fs) with high peak currents (few kA). The so-called 5th generation light source aims at reducing the size and the cost of these FELs by replacing the linac by LWFA. Indeed, spontaneous emission from LWFA has already been observed, but the presently still rather large energy spread (1 %) and divergence (mrad) prevent from the FEL amplification. In 2012, two novel schemes in the transport proposed in the community, including my SOLEIL group, predict a laser gain increase by 3 or 4 orders of magnitudes. COXINEL aims at demonstrating the first lasing of an LWFA FEL and its detailed study in close interaction with future potential users. The key concept relies on an innovative electron beam longitudinal and transverse manipulation in the transport towards an undulator: a ""demixing"" chicane sorts the electrons in energy and reduces the spread from 1 % to a slice one of 0.1%, and the transverse density is maintained constant all along the undulator (supermatching). Simulations based on the performance of the 60 TW laser of the Laboratoire d’Optique Appliquée and existing undulators from SOLEIL suggest that the conditions for lasing are fulfilled. The SOLEIL environment also possesses the engineering fabrication capability for the actual realization of these theoretical ideas, with original undulators and innovative variable permanent compact magnets for the transport. COXINEL will enable to master in Europe advanced schemes scalable to shorter wavelengths and pulses, paving the way towards FEL light sources on laboratory size, for fs time resolved experiments."
Max ERC Funding
2 500 000 €
Duration
Start date: 2014-01-01, End date: 2018-12-31
Project acronym CryptoCloud
Project Cryptography for the Cloud
Researcher (PI) David Daniel Rene Pointcheval
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE6, ERC-2013-ADG
Summary Many companies have already started migrating to the Cloud, and many individuals share their personal information on social networks. Unfortunately, in the current access model, the provider first authenticates the client and grants or denies access according to the client's rights in the access-control list. The provider therefore not only has total access to the data, but also knows which data are accessed, by whom, and how: privacy, which includes secrecy of data (confidentiality), identities (anonymity), and requests (obliviousness), should be enforced.
The Cloud industry introduces a new implicit trust requirement: nobody has any idea where or how their data are stored and manipulated, yet everybody is expected to blindly trust the providers. Privacy-compliant procedures cannot be left to the responsibility of the provider: however trustworthy the provider may be, any system or human vulnerability can be exploited against privacy. This is too great a threat to tolerate. Control over the distribution of data and the secrecy of actions must be given back to the users. This requires promoting privacy as a global security notion.
A new generation of secure multi-party computation protocols is required to protect everybody in an appropriate way, with privacy and efficiency: interactive protocols will be the core approach to provide privacy in practical systems.
Privacy for the Cloud will have a huge societal impact since it will revolutionize the trust model: users will be able to make safe use of outsourced storage, namely for personal, financial and medical data, without having to worry about failures or attacks of the server. It will also have a strong economic impact, conferring a competitive advantage on Cloud providers implementing these tools.
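A minimal sketch of the multi-party idea invoked above, using textbook additive secret sharing over a prime field: no single server learns a user's value, yet the servers can jointly compute (here, sum) the hidden inputs. This is an illustration of the general paradigm, not one of CryptoCloud's actual protocols.

```python
import random

P = 2 ** 61 - 1  # a prime modulus for the shares

def share(secret, n_parties=3):
    """Split `secret` into n additive shares that sum to it mod P."""
    shares = [random.randrange(P) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % P)
    return shares

def reconstruct(shares):
    return sum(shares) % P

# Two users share their inputs among three non-colluding servers
alice, bob = share(25), share(17)
# Each server adds the shares it holds -- it never sees 25 or 17
server_sums = [(a + b) % P for a, b in zip(alice, bob)]
assert reconstruct(server_sums) == 42   # the sum is revealed, the inputs are not
```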
Max ERC Funding
2 168 261 €
Duration
Start date: 2014-06-01, End date: 2019-05-31
Project acronym DAMIC-M
Project Unveiling the Hidden: A Search for Light Dark Matter with CCDs
Researcher (PI) Paolo PRIVITERA
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE2, ERC-2017-ADG
Summary Dark matter (DM) is a ubiquitous yet invisible presence in our universe. It dictated how galaxies formed in the first place, and now moves stars around them at puzzling speeds. The DM mass in the universe is known to be five times that of ordinary matter; yet its true nature remains elusive.
Weakly interacting massive particles (WIMPs), relics from the early universe, are a compelling explanation chased by sensitive experiments in deep underground laboratories. However, searches for heavy WIMPs (≈100 times the proton mass), the most theoretically natural candidates, have so far been unsuccessful. Nor has evidence for such heavy particles been found at the CERN Large Hadron Collider. Alternative scenarios are now under scrutiny, such as the existence of a hidden sector of lighter DM particles that, unlike WIMPs, also interact with electrons.
DAMIC-M (Dark Matter In CCDs at Modane) will search beyond the heavy WIMP paradigm by detecting nuclear recoils and electrons induced by light DM in charge-coupled devices (CCDs). The 0.5 kg detector will be installed at the Laboratoire Souterrain de Modane, France. In this novel and unconventional use of CCDs, which are commonly employed for digital imaging in astronomical telescopes, the ionization charge will be detected in the most massive CCDs ever built with exquisite spatial resolution (15 μm x 15 μm pixel). The crucial innovation in these devices is the non-destructive, repetitive measurement of the pixel charge, which results in the high-resolution detection of a single electron and unprecedented sensitivity to light DM (≈ eV energies are enough to free an electron in silicon). By counting individual charges in a detector with extremely low leakage current – a combination unmatched by any other DM experiment – DAMIC-M will take a leap forward of several orders of magnitude in the exploration of the hidden sector, a jump that may be rewarded by serendipitous discovery.
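A minimal simulation of the non-destructive repeated readout described above: averaging N independent samples of the same pixel charge reduces the read noise by 1/sqrt(N), which is how sub-electron resolution is reached. The 2 e- single-sample noise and N = 400 are illustrative numbers only.

```python
import numpy as np

rng = np.random.default_rng(3)
true_charge_e = 1.0        # a single electron in the pixel
single_read_noise_e = 2.0  # per-sample read noise, in electrons
n_samples = 400            # non-destructive reads of the same pixel

reads = true_charge_e + single_read_noise_e * rng.standard_normal(n_samples)
estimate = reads.mean()                                      # averaged reads
effective_noise = single_read_noise_e / np.sqrt(n_samples)   # = 0.1 e- here

print(f"estimated charge: {estimate:.2f} e-, "
      f"effective noise: {effective_noise:.2f} e-")
```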
Max ERC Funding
3 349 563 €
Duration
Start date: 2018-09-01, End date: 2023-08-31
Project acronym DiluteParaWater
Project Long-Lived Nuclear Magnetization in Dilute Para-Water
Researcher (PI) Geoffrey Bodenhausen
Host Institution (HI) ECOLE NORMALE SUPERIEURE
Call Details Advanced Grant (AdG), PE4, ERC-2013-ADG
Summary The magnetization of hydrogen nuclei in H2O constitutes the basis of most applications of magnetic resonance imaging (MRI). Only ortho-water, where the two proton spins are in states that are symmetric with respect to permutation, features NMR-allowed transitions. Para-water is analogous to para-hydrogen, where the two proton spins are anti-symmetric with respect to permutation. The objective of this proposal is to render para-H2O accessible to observation. Several strategies will be developed for its preparation and observation in the solid, liquid and gas phases, with yields up to 33%. For water diluted in acetonitrile at room temperature, we found that Tortho(H2O) = 6 s. Based on experiments on H2C groups, where Tpara/Tortho > 37, we conservatively estimate that Tpara/Tortho > 10 for H2O, so that we expect Tpara = 60 s. Dilution in aprotic solvents inhibits the exchange of protons and extends the lifetimes t(H2O) of water molecules from ca. 1 ms in pure water to 10 s and beyond, so that proton exchange does not hamper the use of para-water. The ratio Tpara/Tortho of H2O depends on temperature, viscosity, paramagnetic agents, etc., which affect intra- and inter-molecular dipole-dipole interactions, chemical shift anisotropy, and spin rotation. In cases where proton exchange significantly shortens the lifetime of para-H2O, we shall prepare and observe para-ethanol and aqueous solutions of para-glycine, which cannot suffer from proton exchange and offer perspectives similar to those of para-water. In conventional MRI, contrast stems mostly from spatial variations of T1 and T2. By monitoring the ratio Tpara/Tortho as a function of spatial coordinates, it will be possible to obtain a novel type of contrast. In suitable phantoms and porous media, para-water will allow us to characterize slow transport phenomena such as flow, diffusion, and electrophoretic mobility. The study of transport phenomena will become possible over longer time intervals, lower velocities or greater distances.
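A worked version of the lifetime estimate in the summary, assuming simple exponential relaxation: with T_ortho = 6 s measured in acetonitrile and the conservative ratio T_para / T_ortho > 10 (inferred from H2C groups, where the ratio exceeds 37), the para-water magnetization should survive for a minute or more.

```python
import numpy as np

T_ortho = 6.0               # s, measured for H2O diluted in acetonitrile
ratio_lower_bound = 10.0    # conservative, vs > 37 observed for H2C groups
T_para = ratio_lower_bound * T_ortho   # >= 60 s

t = np.linspace(0, 120, 5)  # s
for ti, o, p in zip(t, np.exp(-t / T_ortho), np.exp(-t / T_para)):
    print(f"t = {ti:5.1f} s  ortho {o:.3f}  para {p:.3f}")
# After two minutes the ortho signal is gone (~2e-9) while ~14% of the para
# signal remains -- the long-lived window exploited for slow transport studies.
```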
Max ERC Funding
2 500 000 €
Duration
Start date: 2014-01-01, End date: 2018-12-31