Project acronym 4DPHOTON
Project Beyond Light Imaging: High-Rate Single-Photon Detection in Four Dimensions
Researcher (PI) Massimiliano FIORINI
Host Institution (HI) ISTITUTO NAZIONALE DI FISICA NUCLEARE
Call Details Consolidator Grant (CoG), PE2, ERC-2018-COG
Summary The goal of the 4DPHOTON project is the development and construction of a photon imaging detector with unprecedented performance. The proposed device will be capable of detecting fluxes of single photons of up to one billion photons per second over areas of several square centimetres, and will measure, for each photon, position and time simultaneously with resolutions better than ten microns and a few tens of picoseconds, respectively. These figures of merit will open up many important applications, allowing significant advances in particle physics, the life sciences, and other emerging fields where excellent timing and position resolutions are simultaneously required.
Our goal will be achieved through an application-specific integrated circuit in 65 nm complementary metal-oxide-semiconductor (CMOS) technology, which will deliver a timing resolution of a few tens of picoseconds at the pixel level over a few hundred thousand individually active pixel channels, allowing very high photon rates to be detected and the corresponding information to be digitized and transferred to a processing unit.
As a result of the 4DPHOTON project we will remove the constraints that many light-imaging applications face due to the lack of precise single-photon information in four dimensions (4D): the three spatial coordinates and time, measured simultaneously. In particular, we will prove the performance of this detector in the field of particle physics by reconstructing Cherenkov photon rings with a timing resolution of ten picoseconds. With its excellent granularity, timing resolution, rate capability and compactness, this detector will represent a new paradigm for the realisation of future Ring Imaging Cherenkov detectors, capable of achieving high-efficiency particle identification in environments with very high particle multiplicities by exploiting time-association of the photon hits.
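As general background to the Cherenkov-ring application above (a textbook relation and an order-of-magnitude estimate of my own, not project specifications), the angle reconstructed from the photon rings and the average per-pixel photon rate follow from simple arithmetic:

```latex
% Standard Cherenkov relation: a charged particle with velocity \beta = v/c in a
% medium of refractive index n emits photons on a cone of half-angle \theta_c.
\[
  \cos\theta_c \;=\; \frac{1}{n\beta}
\]
% Illustrative rate estimate under assumed round numbers: 10^9 photons/s spread
% over ~3 x 10^5 pixels is only a few thousand photons per second per pixel,
% which is what makes per-pixel time stamping at high flux viable.
\[
  \frac{10^{9}\ \text{photons/s}}{3\times 10^{5}\ \text{pixels}}
  \;\approx\; 3\times 10^{3}\ \text{photons/s per pixel}
\]
```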
Max ERC Funding
1 975 000 €
Duration
Start date: 2019-12-01, End date: 2024-11-30
Project acronym B Massive
Project Binary massive black hole astrophysics
Researcher (PI) Alberto SESANA
Host Institution (HI) UNIVERSITA' DEGLI STUDI DI MILANO-BICOCCA
Call Details Consolidator Grant (CoG), PE9, ERC-2018-COG
Summary Massive black hole binaries (MBHBs) are the most extreme, fascinating, yet elusive astrophysical objects in the Universe. Establishing their existence observationally will be a milestone for contemporary astronomy, providing a fundamental missing piece in the puzzle of galaxy formation, piercing through the (hydro)dynamical processes that shape dense galactic nuclei from parsec scales down to the event horizon, and probing gravity in extreme conditions.
We can both see and listen to MBHBs. Remarkably, besides arguably being among the brightest variable objects shining in the Cosmos, MBHBs are also the loudest gravitational wave (GW) sources in the Universe. As such, we shall take advantage of both types of messenger – photons and gravitons – that they are sending to us, which can now be probed by all-sky time-domain surveys and by radio pulsar timing arrays (PTAs), respectively.
B MASSIVE leverages a unique, comprehensive approach combining theoretical astrophysics, radio and gravitational-wave astronomy, time-domain surveys and state-of-the-art data analysis techniques to: i) observationally prove the existence of MBHBs, ii) understand and constrain their astrophysics and dynamics, and iii) enable and bring closer in time the direct detection of GWs with PTAs.
As a European PTA (EPTA) executive committee member and former International PTA (IPTA) chair, I am a driving force in the development of pulsar timing science worldwide, and the project will build on the profound knowledge, broad vision and wide collaboration network that established me as a world leader in the field of MBHB and GW astrophysics. B MASSIVE is extremely timely: a pulsar timing data set of unprecedented quality is being assembled by the EPTA/IPTA, and time-domain astronomy surveys are at their dawn. In the long term, B MASSIVE will be a fundamental milestone establishing European leadership in the cutting-edge field of MBHB astrophysics in the era of LSST, SKA and LISA.
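For orientation (a standard relation, not a result of B MASSIVE), the gravitational waves emitted by a circular MBHB lie at twice the orbital frequency, which places year-scale orbits squarely in the nanohertz band probed by PTAs:

```latex
% GW frequency of a circular binary with orbital period P:
\[
  f_{\mathrm{GW}} \;=\; \frac{2}{P},
  \qquad
  P = 1\ \text{yr} \;\Rightarrow\;
  f_{\mathrm{GW}} \approx \frac{2}{3.16\times 10^{7}\ \text{s}} \approx 63\ \text{nHz}
\]
```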
Max ERC Funding
1 532 750 €
Duration
Start date: 2019-09-01, End date: 2024-08-31
Project acronym bECOMiNG
Project spontaneous Evolution and Clonal heterOgeneity in MoNoclonal Gammopathies: from mechanisms of progression to clinical management
Researcher (PI) Niccolo Bolli
Host Institution (HI) UNIVERSITA DEGLI STUDI DI MILANO
Call Details Consolidator Grant (CoG), LS7, ERC-2018-COG
Summary As an onco-hematologist with strong expertise in genomics, I have contributed significantly to the understanding of multiple myeloma (MM) heterogeneity and its evolution over time, driven by genotypic and phenotypic features carried by different subpopulations of cells. MM is preceded by prevalent, asymptomatic stages that may evolve with variable frequency, which is not accurately captured by current clinical prognostic scores. Supported by preliminary data, my hypothesis is that the same heterogeneity is present early in the disease course, and that identifying the biological determinants of evolution at this stage will allow better prediction of its evolutionary trajectory, if not its control. In this proposal I will therefore make a sharp change from conventional approaches and move to early stages of MM, using unique retrospective sample cohorts and ambitious prospective sampling. To identify clonal MM cells in the elderly before a monoclonal gammopathy can be detected, I will collect bone marrow (BM) from hundreds of hip replacement specimens and analyze archived peripheral blood samples from thousands of healthy individuals with years of annotated clinical follow-up. This will identify early genomic alterations that are permissive to disease initiation/evolution and may serve as biomarkers for clinical screening. Through innovative, integrated single-cell genotyping and phenotyping of hundreds of asymptomatic MMs, I will functionally dissect heterogeneity and characterize the BM microenvironment to look for determinants of disease progression. Correlation with clinical outcome and minimally invasive serial sampling of circulating cell-free DNA will identify candidate biological markers to better predict evolution. Lastly, aggressive modelling of candidate early lesions and modifier screens will offer a list of vulnerabilities that could be exploited for rational therapies. These methodologies will deliver a paradigm for the use of molecularly driven precision medicine in cancer.
Max ERC Funding
1 998 781 €
Duration
Start date: 2019-03-01, End date: 2024-02-29
Project acronym BrightEyes
Project Multi-Parameter Live-Cell Observation of Biomolecular Processes with Single-Photon Detector Array
Researcher (PI) Giuseppe Vicidomini
Host Institution (HI) FONDAZIONE ISTITUTO ITALIANO DI TECNOLOGIA
Call Details Consolidator Grant (CoG), PE7, ERC-2018-COG
Summary Fluorescence single-molecule (SM) detection techniques have the potential to provide insights into the complex functions, structures and interactions of individual, specifically labelled biomolecules. However, current SM techniques work properly only when the biomolecule is observed in controlled environments, e.g., immobilized on a glass surface. Observation of biomolecular processes in living (multi)cellular environments – which is fundamental for sound biological conclusions – always comes at a price, such as invasiveness, limitations in the accessible information, and constraints on the spatial and temporal scales.
The overall objective of the BrightEyes project is to overcome the above limitations by creating a novel SM approach compatible with state-of-the-art biomolecule-labelling protocols, able to track a biomolecule deep inside (multi)cellular environments – with temporal resolution on the microsecond scale and a tracking range of hundreds of micrometres – and simultaneously observe its structural changes and its nano- and micro-environments.
Specifically, by exploiting a novel single-photon detector array, the BrightEyes project will implement an optical system able to continuously (i) track in real time the biomolecule of interest, from which to decode its dynamics and interactions; (ii) measure the fluorescence spectroscopy properties of the nano-environment, such as lifetime, photon-pair correlation and intensity, from which to extract the biochemical properties of the nano-environment, the structural properties of the biomolecule – via SM-FRET and anti-bunching – and the interactions of the biomolecule with other biomolecular species – via STED-FCS; and (iii) visualize the sub-cellular structures within the micro-environment with sub-diffraction spatial resolution – via STED and image scanning microscopy.
This unique paradigm will enable unprecedented studies of biomolecular behaviour, interactions and self-organization under near-physiological conditions.
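As standard background for the SM-FRET readout mentioned above (a textbook relation, not a BrightEyes result), the FRET efficiency depends steeply on the donor–acceptor distance, which is what makes it a molecular-scale ruler for structural changes:

```latex
% Textbook FRET efficiency: r is the donor-acceptor distance and R_0 the
% Foerster radius (typically a few nanometres).
\[
  E_{\mathrm{FRET}} \;=\; \frac{1}{1 + \left(r/R_0\right)^{6}}
\]
```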
Max ERC Funding
1 861 250 €
Duration
Start date: 2019-09-01, End date: 2024-08-31
Project acronym CancerADAPT
Project Targeting the adaptive capacity of prostate cancer through the manipulation of transcriptional and metabolic traits
Researcher (PI) Arkaitz CARRACEDO PEREZ
Host Institution (HI) ASOCIACION CENTRO DE INVESTIGACION COOPERATIVA EN BIOCIENCIAS
Call Details Consolidator Grant (CoG), LS4, ERC-2018-COG
Summary The composition and molecular features of tumours vary during the course of the disease, and the selection pressure imposed by the environment is a central component in this process. Evolutionary principles have been exploited to explain the genomic aberrations in cancer. However, the phenotypic changes underlying disease progression remain poorly understood. In the past years, I have contributed to identifying and characterising the therapeutic implications of metabolic alterations that are intrinsic to primary tumours or metastases. In CancerADAPT I postulate that cancer cells rely on adaptive transcriptional and metabolic mechanisms [converging on a Metabolic Phenotype] in order to establish themselves rapidly in new microenvironments along disease progression. I aim to predict the molecular cues that govern the adaptive properties of prostate cancer (PCa), one of the most commonly diagnosed cancers in men and a major source of cancer-related deaths. I will exploit single-cell RNA-Seq, spatial transcriptomics and multiregional OMICs in order to identify the transcriptional and metabolic diversity within tumours and along disease progression. I will complement experimental strategies with computational analyses that identify and classify the predicted adaptation strategies of PCa cells in response to variations in the tumour microenvironment. Metabolic phenotypes postulated to sustain PCa adaptability will be functionally and mechanistically deconstructed. We will identify therapeutic strategies emanating from these results through in silico methodologies and small-molecule high-throughput screening, and evaluate their potential to hamper the adaptability of tumour cells in vitro and in vivo in two specific settings: metastasis and therapy response. CancerADAPT will generate fundamental understanding of how cancer cells adapt in our organism, in turn leading to therapeutic strategies that increase the efficacy of current treatments.
Max ERC Funding
1 999 882 €
Duration
Start date: 2019-11-01, End date: 2024-10-31
Project acronym DIAPASoN
Project Differential Program Semantics
Researcher (PI) Ugo DAL LAGO
Host Institution (HI) ALMA MATER STUDIORUM - UNIVERSITA DI BOLOGNA
Call Details Consolidator Grant (CoG), PE6, ERC-2018-COG
Summary Traditionally, program semantics has centered on the notion of program identity, that is to say program equivalence: a program is identified with its meaning, and programs are considered equal only if their meanings are the same. This view has been extremely fruitful in the past, allowing for a deep understanding of highly interactive forms of computation as embodied by higher-order or concurrent programs. The byproducts of all this lie everywhere in computer science, from programming language design to verification methodologies. The emphasis on equality — as opposed to differences — is, however, not in line with the way programs are written and structured in modern complex software systems. Subtasks are delegated to pieces of code which behave as expected only up to a certain probability of error, and only if the environment in which they operate makes this possible deviation irrelevant. These aspects were almost neglected by the program semantics community until recently, and still play a marginal role. DIAPASoN's goal is to study differences between programs as a constitutive and informative concept, rather than merely through relations between programs. This will be accomplished by generalizing four major frameworks of program semantics, traditionally used for giving semantics to programs, comparing them, proving properties of them, and controlling their usage of resources: logical relations, bisimulation, game semantics, and linear logic.
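To make the shift from equivalence to difference concrete, here is a minimal illustrative sketch of my own (not taken from the project): two implementations of the same task that are not equal but only close, together with a crude sampled 'distance' between their outputs in place of a yes/no equivalence test. All names are hypothetical.

```python
import random

def exact_count(xs):
    """Reference program: exact count of items above a threshold."""
    return sum(1 for x in xs if x > 0.5)

def sampled_count(xs, k=50):
    """Approximate program: estimates the same count from k random samples,
    so it agrees with exact_count only up to a statistical error."""
    if not xs:
        return 0
    hits = sum(1 for _ in range(k) if random.choice(xs) > 0.5)
    return round(hits * len(xs) / k)

def empirical_difference(p, q, data, trials=1000):
    """An observable 'distance' between two programs: the average absolute
    gap between their outputs, rather than a Boolean equivalence."""
    return sum(abs(p(data) - q(data)) for _ in range(trials)) / trials

data = [random.random() for _ in range(200)]
print(empirical_difference(exact_count, sampled_count, data))  # small, but not zero
```

The point of the sketch is only that the interesting object here is the quantitative gap between the two programs, which a Boolean notion of equivalence discards.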
Max ERC Funding
959 562 €
Duration
Start date: 2019-03-01, End date: 2024-02-29
Project acronym ECHO
Project Extending Coherence for Hardware-Driven Optimizations in Multicore Architectures
Researcher (PI) Alberto ROS BARDISA
Host Institution (HI) UNIVERSIDAD DE MURCIA
Call Details Consolidator Grant (CoG), PE6, ERC-2018-COG
Summary Multicore processors are present nowadays in most digital devices, from smartphones to high-performance servers. The increasing computational power of these processors is essential for enabling many important emerging application domains such as big data, media, medical, or scientific modeling. A fundamental technique to improve performance is speculation, which consists of executing work before it is known whether it is actually needed. In hardware, speculation significantly increases energy consumption by performing unnecessary operations, while speculation in software (e.g., in compilers) is not the default, thus preventing performance optimizations. Since performance in current multicores is limited by their power budget, it is imperative to make multicores as energy-efficient as possible in order to increase performance even further.
In a multicore architecture, the cache coherence protocol is an essential component, since its unique but challenging role is to offer a simple and unified view of the memory hierarchy. This project envisions that extending the role of the coherence protocol to simplify other system components will be the key to overcoming the performance and energy limitations of current multicores. In particular, ECHO proposes to add simple but effective extensions to the cache coherence protocol in order to (i) reduce and even eliminate misspeculations at the processing cores and synchronization mechanisms, and (ii) enable speculative optimizations at compile time. The goal of this innovative approach is to improve the performance and energy efficiency of future multicore architectures. To accomplish the objectives proposed in this project, I will build on my 14 years of expertise in cache coherence, documented in over 40 high-impact publications.
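For readers unfamiliar with cache coherence, the sketch below encodes the classic MESI protocol for a single cache line as a tiny state machine. It is textbook background only, not the extended protocol ECHO proposes, and the event names are my own simplification.

```python
# Minimal MESI cache-coherence state machine for one cache line.
# States: M (Modified), E (Exclusive), S (Shared), I (Invalid).
MESI = {
    # (state, event) -> next state
    ("I", "local_read_shared"): "S",     # another cache already holds the line
    ("I", "local_read_exclusive"): "E",  # no other cache holds the line
    ("I", "local_write"): "M",
    ("E", "local_write"): "M",
    ("E", "remote_read"): "S",
    ("E", "remote_write"): "I",
    ("S", "local_write"): "M",           # requires invalidating the other copies
    ("S", "remote_write"): "I",
    ("M", "remote_read"): "S",           # triggers a write-back to memory
    ("M", "remote_write"): "I",          # triggers a write-back to memory
}

def next_state(state: str, event: str) -> str:
    """Return the next MESI state; events not listed (e.g., local reads in
    E/S/M, remote reads in S/I) leave the state unchanged."""
    return MESI.get((state, event), state)

# Example: a line read exclusively, then written locally, then read by another core.
s = "I"
for ev in ["local_read_exclusive", "local_write", "remote_read"]:
    s = next_state(s, ev)
    print(ev, "->", s)  # prints E, then M, then S
```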
Max ERC Funding
1 999 955 €
Duration
Start date: 2019-09-01, End date: 2024-08-31
Project acronym ENFORCE
Project ENgineering FrustratiOn in aRtificial Colloidal icEs:degeneracy, exotic lattices and 3D states
Researcher (PI) Pietro TIERNO
Host Institution (HI) UNIVERSITAT DE BARCELONA
Call Details Consolidator Grant (CoG), PE3, ERC-2018-COG
Summary Geometric frustration, namely the impossibility of satisfying competing interactions on a lattice, has recently become a topic of considerable interest, as it engenders emergent, fundamentally new phenomena and holds the exciting promise of delivering a new class of nanoscale devices based on the motion of magnetic charges.
With ENFORCE, I propose to realize two- and three-dimensional artificial colloidal ices and investigate the fascinating many-body physics of geometric frustration in these mesoscopic structures. I will use these soft matter systems to engineer novel frustrated states through independent control of the single-particle positions, lattice topology and collective magnetic coupling. The three project work packages (WPs) will present increasing levels of complexity, challenge and ambition:
(i) In WP1, I will demonstrate a way to restore the residual entropy in the square ice, a fundamental long-standing problem in the field. Furthermore, I will miniaturize the square and honeycomb geometries and investigate the dynamics of thermally excited topological defects and the formation of grain boundaries.
(ii) In WP2, I will decimate both lattices and realize mixed-coordination geometries, where the similarity between the colloidal and spin ice systems breaks down. I will then develop a novel annealing protocol based on simultaneous system visualization and magnetic actuation control.
(iii) In WP3, I will realize a three-dimensional artificial colloidal ice, in which interacting ferromagnetic inclusions will be located in the voids of an inverse opal and arranged to form the FCC or pyrochlore lattices. External fields will be used to align, bias and stir these magnetic inclusions while monitoring their orientation and dynamics in situ via laser scanning confocal microscopy.
ENFORCE will exploit the accessible time and length scales of the colloidal ice to shed new light on the exciting and interdisciplinary field of geometric frustration.
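As background on the residual entropy targeted in WP1 (Lieb's classic exact result for the two-dimensional square-ice/six-vertex model, not an ENFORCE result), the number of ice-rule-obeying configurations grows exponentially with system size, leaving a nonzero entropy per vertex:

```latex
\[
  W = \left(\tfrac{4}{3}\right)^{3/2} \approx 1.54,
  \qquad
  \frac{S}{N} = k_B \ln W \approx 0.43\, k_B
\]
```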
Max ERC Funding
1 850 298 €
Duration
Start date: 2020-01-01, End date: 2024-12-31
Project acronym FACETS
Project Face Aesthetics in Contemporary E-Technological Societies
Researcher (PI) Massimo LEONE
Host Institution (HI) UNIVERSITA DEGLI STUDI DI TORINO
Call Details Consolidator Grant (CoG), SH5, ERC-2018-COG
Summary FACETS studies the meaning of the face in contemporary visual cultures. There are two complementary research foci: widespread practices of face exhibition in social networks like Facebook, Instagram, Snapchat, and Tinder; and minority practices of occultation, including the mask in anti-establishment political activism (e.g., Anonymous) and in anti-surveillance artistic provocation (e.g., Leonardo Selvaggio). Arguably, the meaning of the human face is currently changing on a global scale: through the invention and diffusion of new visual technologies (e.g., digital photography, visual filters, as well as software for automatic face recognition); through the creation and establishment of novel genres of face representation (e.g., the selfie); and through new approaches to face perception, reading, and memorization (e.g., the ‘scrolling’ of faces on Tinder). Cognitions, emotions, and actions that people attach to the interaction with one’s and others’ faces might soon be undergoing dramatic shifts. In FACETS, an interdisciplinary but focused approach combines visual history, semiotics, phenomenology, and visual anthropology with face perception studies and the collection, analysis, and social contextualization of big data, so as to study the cultural and technological causes of these changes and their effects in terms of alterations in self-perception and communicative interaction. In the tension between, on the one hand, political and economic agencies pressing for increasing disclosure, detection, and marketing of the human face (for reasons of security and control, for commercial or bureaucratic purposes) and, on the other hand, the counter-trends of face occultation (writers and artists like Banksy, Ferrante, Sia, or Christopher Sievey / Frank Sidebottom choosing not to reveal their faces), the visual syntax, the semantics, and the pragmatics of the human face are rapidly evolving. FACETS carries out an innovative, cross-disciplinary survey of this phenomenon.
Max ERC Funding
1 997 803 €
Duration
Start date: 2019-06-01, End date: 2024-05-31
Project acronym FeMiT
Project Ferrites-by-design for Millimeter-wave and Terahertz Technologies
Researcher (PI) Martí GICH
Host Institution (HI) AGENCIA ESTATAL CONSEJO SUPERIOR DEINVESTIGACIONES CIENTIFICAS
Call Details Consolidator Grant (CoG), PE8, ERC-2018-COG
Summary Robust disruptive materials will be essential for the “wireless everywhere” to become a reality. This is because we need a paradigm shift in mobile communications to meet the challenges of such an ambitious evolution. In particular, some of these emerging technologies will trigger the replacement of the magnetic microwave ferrites in use today. This will notably occur with the forecast shift to high-frequency mm-wave and THz bands and in novel antennas that can simultaneously transmit and receive data on the same frequency. In both cases, operating with state-of-the-art ferrites would require large external magnetic fields, incompatible with future needs for smaller, power-efficient devices.
To overcome these issues, we target ferrites featuring the so far unmet combinations of low magnetic loss and large values of magnetocrystalline anisotropy, magnetostriction or magnetoelectric coupling.
The objective of FeMiT is to develop a novel family of orthorhombic ferrites based on ε-Fe2O3, a room-temperature multiferroic with large magnetocrystalline anisotropy. These properties and unique structural features make it an excellent platform to develop the sought-after functional materials for future compact and energy-efficient wireless devices.
In the first part of FeMiT we will explore the limits and diversity of this new family by exploiting rational chemical substitutions, high pressures and strain engineering. Soft chemistry and physical deposition methods will both be considered at this stage.
The second part of FeMiT entails a characterization of functional properties and selection of the best candidates to be integrated into composite and epitaxial films suitable for application. The expected outcomes will provide proof-of-concept self-biased or voltage-controlled signal-processing devices with low losses in the mm-wave to THz bands, with high potential impact on the development of future wireless technologies.
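As a rough illustration of why large magnetocrystalline anisotropy enables self-biased mm-wave operation (textbook ferromagnetic-resonance reasoning with assumed numbers, not FeMiT data), the zero-field resonance frequency scales with the anisotropy field:

```latex
% Kittel-type estimate for a uniaxial magnet at zero applied field, with
% gamma/2pi ~ 28 GHz/T; an anisotropy field of order 2 T (an assumed,
% illustrative value) already places the resonance in the mm-wave band
% without any external bias magnet.
\[
  f_{\mathrm{res}} \;\approx\; \frac{\gamma}{2\pi}\,\mu_0 H_A
  \;\approx\; 28\ \frac{\text{GHz}}{\text{T}} \times 2\ \text{T}
  \;\approx\; 56\ \text{GHz}
\]
```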
Max ERC Funding
1 989 967 €
Duration
Start date: 2019-05-01, End date: 2024-04-30
Project acronym GEMS
Project General Embedding Models for Spectroscopy
Researcher (PI) Chiara CAPPELLI
Host Institution (HI) SCUOLA NORMALE SUPERIORE
Call Details Consolidator Grant (CoG), PE4, ERC-2018-COG
Summary Recently, there has been a paradigmatic shift in experimental molecular spectroscopy, with new methods focusing on the study of molecules embedded within complex supramolecular/nanostructured aggregates. In the past, molecular spectroscopy has benefitted from the synergistic development of accurate and cost-effective computational protocols for the simulation of a wide variety of spectroscopies. These methods, however, have been limited to isolated molecules or systems in solution, and are therefore inadequate to describe the spectroscopy of complex nanostructured systems. The aim of GEMS is to bridge this gap, and to provide a coherent theoretical description and cost-effective computational tools for the simulation of spectra of molecules interacting with metal nanoparticles, metal nanoaggregates and graphene sheets.
To this end, I will develop a novel frequency-dependent multilayer Quantum Mechanical (QM)/Molecular Mechanics (MM) embedding approach, general enough to be extendable to spectroscopic signals by using the machinery of quantum chemistry and able to treat any kind of plasmonic external environment within the same theoretical framework, introducing its specificities through an accurate modelling and parametrization of the classical portion. The model will be interfaced with widely used computational chemistry software packages, so as to maximize its use by the scientific community, and especially by non-specialists.
As pilot applications, GEMS will study the Surface-Enhanced Raman (SERS) spectra of systems that have found applications in the biosensor field, SERS of organic molecules in subnanometre junctions, enhanced infrared (IR) spectra of oligopeptides adsorbed on graphene, Graphene-Enhanced Raman Scattering (GERS) of organic dyes, and the transmission of stereochemical response from a chiral analyte to an achiral molecule in the vicinity of a plasmon resonance of an achiral metallic nanostructure, as measured by Raman Optical Activity (ROA).
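For orientation, the generic additive QM/MM energy partition that embedding schemes of this kind start from is shown below (a textbook form, not the specific frequency-dependent model GEMS will derive):

```latex
\[
  E_{\mathrm{tot}} \;=\; E_{\mathrm{QM}} \;+\; E_{\mathrm{MM}} \;+\; E_{\mathrm{QM/MM}}
\]
% E_{QM/MM} collects the coupling terms (electrostatics, polarization,
% dispersion/repulsion); in a frequency-dependent embedding the classical
% response entering this term also depends on the field frequency \omega.
```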
Max ERC Funding
1 609 500 €
Duration
Start date: 2019-06-01, End date: 2024-05-31
Project acronym GRAMS
Project GRavity from Astrophysical to Microscopic Scales
Researcher (PI) Enrico BARAUSSE
Host Institution (HI) SCUOLA INTERNAZIONALE SUPERIORE DI STUDI AVANZATI DI TRIESTE
Call Details Consolidator Grant (CoG), PE9, ERC-2018-COG
Summary General Relativity (GR) describes gravity on a huge range of scales, field strengths and velocities. However, despite its successes, GR has been showing its age. Cosmological data support the existence of a Dark Sector, but may also be interpreted as a breakdown of our understanding of gravity. Also, GR is intrinsically incompatible with quantum field theory, and should be replaced, at high energies, by a (still unknown) quantum theory of gravity.
This deadlock may be the prelude to a paradigm change in our understanding of gravity, possibly triggered by the direct observations of neutron stars and black holes by gravitational-wave interferometers. The recent LIGO/Virgo observations, and in particular the coincident detection of electromagnetic and gravitational signals from neutron-star binaries, have already made a huge impact on our theoretical understanding of gravity by severely constraining several extensions of GR.
GRAMS is a high-risk/high-gain project seeking to push the implications of these observations even further, by exploring whether the existing LIGO/Virgo data, and in particular the absence of non-perturbative deviations from GR in those data, are consistent with gravitational theories built to reproduce the large-scale behaviour of the Universe (i.e. the existence of Dark Energy and/or Dark Matter) while at the same time passing local tests of gravity thanks to non-perturbative screening mechanisms. I will prove that the very act of screening local scales makes gravitational emission in these theories much more involved than in GR, and also intrinsically unlikely to yield results in agreement with existing (and future) gravitational-wave observations. This would be a huge step forward for our understanding of cosmology, as it would rule out a modified-gravity origin for the Dark Sector. Even if this conjecture is incorrect, GRAMS will provide the first-ever numerical-relativity simulations of compact binaries in gravitational theories of interest for cosmology.
Max ERC Funding
1 993 920 €
Duration
Start date: 2019-04-01, End date: 2024-03-31
Project acronym ImmunoStem
Project Dissecting and Overcoming Innate Immune Barriers for Therapeutically Efficient Hematopoietic Stem Cell Gene Engineering
Researcher (PI) Anna Christina Kajaste-Rudnitski
Host Institution (HI) OSPEDALE SAN RAFFAELE SRL
Call Details Consolidator Grant (CoG), LS7, ERC-2018-COG
Summary The low gene manipulation efficiency of human hematopoietic stem cells (HSC) remains a major hurdle for sustainable and broad clinical application of innovative therapies for a wide range of disorders. Indeed, high vector doses and prolonged ex vivo culture are still required for clinically relevant levels of gene transfer even with the most established lentiviral vector-based delivery platforms.
Current and emerging gene transfer and editing technologies expose HSC to components potentially recognized by host antiviral factors and nucleic acid sensors that likely restrict their genetic engineering and contribute to the broad individual variability in clinical outcomes observed in recent gene therapy trials. Nevertheless, the specific effectors are yet to be identified in HSC. We have recently identified an antiviral factor that potently blocks gene transfer in HSC and have discovered small molecules that efficiently counteract it. This is the first example of how manipulating a single host factor can significantly impact gene transfer efficiencies in HSC, but it likely represents only the tip of the iceberg of the innate sensing mechanisms potentially hampering genetic manipulation of this primitive cell compartment.
This proposal aims to identify the antiviral factors and innate sensing pathways that prevent efficient modification of HSC and to mitigate their effects using methods developed through a thorough understanding of their mechanisms of action. My approach builds on the innovative concept that understanding the crosstalk between HSC and viral vectors will instruct us on which immune sensors and effectors to avoid and how, with direct implications for all gene engineering technologies. Successful completion of this project will deliver broadly exportable novel paradigms of innate pathogen recognition that will allow ground-breaking progress in the development of cutting-edge cell and gene therapies and to fight infectious and autoimmune diseases.
Max ERC Funding
1 994 375 €
Duration
Start date: 2019-03-01, End date: 2024-02-29
Project acronym INITIUM
Project an Innovative Negative Ion TIme projection chamber for Underground dark Matter searches
Researcher (PI) Elisabetta BARACCHINI
Host Institution (HI) GRAN SASSO SCIENCE INSTITUTE
Call Details Consolidator Grant (CoG), PE2, ERC-2018-COG
Summary INITIUM: an Innovative Negative Ion TIme projection chamber for Underground dark Matter searches. INITIUM's goal is to boost the advancement of gaseous Time Projection Chamber (TPC) detectors in the field of Dark Matter (DM) searches, one of the most compelling issues of today's fundamental physics. I believe this approach to be superior because of its active neutron/electron discrimination, its directional and fiducialization capability down to low energies, and its versatility in terms of target material. Thanks to recent advances in Micro Pattern Gas Detector amplification and improved readout techniques, TPCs are nowadays mature enough to aim at developing a ton-scale experiment. INITIUM focuses on the development and operation of the first 1 m^3 Negative Ion TPC with Gas Electron Multiplier amplification and optical readout with CMOS-based cameras and PMTs for directional DM searches at the Laboratori Nazionali del Gran Sasso (LNGS). INITIUM will put significant new constraints on a region of the DM WIMP-nucleon scattering parameter space still unexplored to this day, with a remarkable sensitivity down to 10^-42 - 10^-43 cm^2 for spin-independent coupling in the 1-10 GeV WIMP mass region. As a by-product, INITIUM will also precisely and simultaneously measure the environmental fast and thermal neutron flux at LNGS, supplying crucial information for any present and future experiment at this location. Consequently, I will demonstrate the proof of principle and scalability of the INITIUM approach towards the development of a ton-scale detector in the context of CYGNUS, an international collaboration (of which I am one of the Spokespersons and PIs) recently formed with the aim of establishing a Galactic Directional Recoil Observatory that can test the DM hypothesis beyond the Neutrino Floor and measure the coherent scatter of galactic neutrinos, generating a significant long-term impact on detection techniques for rare-event searches.
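For orientation only (standard elastic-scattering kinematics with assumed, illustrative numbers, not INITIUM specifications), the nuclear recoils relevant to low-mass WIMP searches sit at the keV scale, which is why low thresholds and directionality at low energy matter:

```latex
% Maximum recoil energy for elastic WIMP-nucleus scattering, with mu the
% WIMP-nucleus reduced mass and v the WIMP speed:
\[
  E_R^{\max} \;=\; \frac{2\,\mu^{2} v^{2}}{m_N},
  \qquad
  \mu \;=\; \frac{m_\chi m_N}{m_\chi + m_N}
\]
% Assumed numbers: m_chi = 10 GeV, a fluorine nucleus (m_N ~ 17.7 GeV) and
% v ~ 230 km/s give mu ~ 6.4 GeV and E_R^max of order 3 keV.
```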
Max ERC Funding
1 995 719 €
Duration
Start date: 2019-03-01, End date: 2024-02-29
Project acronym InOutBioLight
Project Advanced biohybrid lighting and photovoltaic devices
Researcher (PI) Rubén Darío COSTA
Host Institution (HI) FUNDACION IMDEA MATERIALES
Call Details Consolidator Grant (CoG), PE5, ERC-2018-COG
Summary InOutBioLight aims to design multifunctional rubbers with enhanced mechanical, thermal, color-converting, and light-guiding features towards advanced biohybrid lighting and photovoltaic technologies. The latter are at the forefront of EU efforts for low-cost production and efficient consumption of electricity, a critical issue for sustainable development.
In this context, the use of biomolecules as functional components in lighting and photovoltaic devices is still a challenge, as they quickly denature under storage and device operation conditions. This paradigm has changed with the use of an innovative rubber-like material, in which the biofunctionality is preserved over long periods. As a proof-of-concept, color down-converting rubbers based on fluorescent proteins were used to design the first biohybrid white light-emitting diode (bio-HWLED). To develop a new generation of biohybrid devices, InOutBioLight will address the following critical issues: i) the nature of the protein-matrix stabilization, ii) how to enhance the thermal/mechanical features, iii) how to design multifunctional rubbers, iv) how to mimic natural patterns for light-guiding, and v) how to expand the technological use of the rubber approach.
To achieve these goals, InOutBioLight involves comprehensive spectroscopic, microscopic, and mechanical studies to investigate the protein-matrix interaction using new polymer matrices, additives, and protein-based nanoparticles. In addition, the mechanical, thermal, and light-coupling features will be enhanced using structural biocompounds and reproducing biomorphic patterns. As such, InOutBioLight offers three major advances: i) a thorough scientific basis for the rubber approach, ii) a significant thrust of the emerging bio-HWLEDs, and iii) innovative breakthroughs beyond state-of-the-art biohybrid solar cells.
Max ERC Funding
1 999 188 €
Duration
Start date: 2020-01-01, End date: 2024-12-31
Project acronym LArcHer
Project Breaking barriers between Science and Heritage approaches to Levantine Rock Art through Archaeology, Heritage Science and IT
Researcher (PI) Ines DOMINGO SANZ
Host Institution (HI) UNIVERSITAT DE BARCELONA
Call Details Consolidator Grant (CoG), SH6, ERC-2018-COG
Summary The LArcHer project aims to pioneer a new and more comprehensive way of understanding one of Europe’s most extraordinary bodies of prehistoric art, awarded UNESCO World Heritage status in 1998: Levantine rock art (LRA). The ground-breaking nature of the project lies in combining a multidisciplinary (Archaeology, Heritage Science and IT) and multiscale approach (from microanalysis to landscape perspectives) to gain a holistic view of this art. It also aims at closing existing gaps between the science and heritage mainstreams, to better understand the values and threats affecting this tradition and bring about a change in the way we understand, care for, use and manage this millenary legacy. LArcHer's aims are: a) to use cross-disciplinary knowledge and methods to redefine LRA (i.e. new dating techniques to refine chronology, new analytical methods to understand the creative process); b) to use LRA as a proxy to raise new questions of global interest on the evolution of creative thinking and human cognition (i.e. the timing and driving forces behind the birth of anthropocentrism and visual narratives in the history of prehistoric art); c) to develop new research agendas to set complementary goals between science and heritage and define best practices for open-air rock art conservation and management.
Spread across Mediterranean Iberia, LRA is the only European body of figurative art dominated by humans engaged in dynamic narratives of hunting, violence, warfare, dances and so forth. These scenes are a unique window onto past social dynamics, human behaviour and cultural practices. As such, it is the only body of European rock art with the potential to answer some of the new questions raised by LArcHer.
Key to LArcHer are the systematic recording and analysis of the art through 3D digital technologies, data management and storage systems, GIS, physicochemical analysis of pigments and bedrock, and comparative analysis with other major bodies of art with equivalent developments.
Max ERC Funding
1 991 178 €
Duration
Start date: 2019-10-01, End date: 2024-09-30
Project acronym MAGNESIA
Project The impact of highly magnetic neutron stars in the explosive and transient Universe
Researcher (PI) Nanda Rea
Host Institution (HI) AGENCIA ESTATAL CONSEJO SUPERIOR DE INVESTIGACIONES CIENTIFICAS
Call Details Consolidator Grant (CoG), PE9, ERC-2018-COG
Summary The gravitational wave window is now open. It is therefore imperative to build quantitative models of neutron stars that use all the available tracers to constrain fundamental physics at the highest densities and magnetic fields. The most magnetic neutron stars, the magnetars, have recently been suggested to power a large variety of explosive and transient events. Their enormous rotational power at birth, and the magnetic energy they can release via large flares, place magnetars at the heart of the (as yet) hand-wavy interpretations of gamma-ray bursts, the early phases of double neutron star mergers, super-luminous supernovae, hypernovae, fast radio bursts, and ultra-luminous X-ray sources. However, despite knowing about 30 magnetars, we lack a census of how many we should expect within the pulsar population, nor do we have robust constraints on their flaring rates. The recent discovery of transient magnetars, and of magnetar-like flares from sources with measured low dipolar magnetic fields and from typical radio pulsars, clearly shows that the magnetar census in our Galaxy is largely under-estimated. This hampers our understanding not only of the pulsar and magnetar populations, but also of their possible relation to many of the Universe’s explosive events. MAGNESIA will infer a sound Magnetar Census via an innovative approach that will build the first Pulsar Population Synthesis model able to cope with constraints/limits from multi-band observations, while taking into account 3D magnetic field evolution models and flaring rates for neutron stars. Combining expertise in multi-band observations, numerical modeling, nuclear physics, and computation, MAGNESIA will solve the physics, the observational systematic errors, and the computational challenges that inhibited previous works, to finally constrain the spin period and magnetic field distribution at birth of the neutron star population.
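As a purely illustrative aside (not part of the original summary), the population-synthesis logic described above can be sketched in a few lines of Python. The log-normal birth distributions, the constant-field dipole spin-down law and the "detectability" cut below are all assumptions made for the example; MAGNESIA's actual model incorporates 3D magnetic field evolution, flaring rates and multi-band selection effects.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed (illustrative) birth distributions: log-normal in spin period and dipolar field.
n = 100_000
P0 = rng.lognormal(mean=np.log(0.3), sigma=0.6, size=n)    # birth period [s]
B  = rng.lognormal(mean=np.log(3e13), sigma=1.0, size=n)   # dipolar field [G]
age = rng.uniform(0.0, 1e7, size=n) * 3.15e7               # uniform ages up to 10 Myr, in seconds

# Constant-field magnetic dipole spin-down: P^2(t) = P0^2 + 2 (B / 3.2e19)^2 t
P = np.sqrt(P0**2 + 2.0 * (B / 3.2e19)**2 * age)
Pdot = (B / 3.2e19)**2 / P

# Toy "detectability" cut standing in for the multi-band selection effects
# that a real population synthesis must model in detail.
detectable = (P < 12.0) & (Pdot > 1e-17)

print(f"magnetar-like fraction of synthetic population (B > 1e14 G): {np.mean(B > 1e14):.3f}")
print(f"'detectable' fraction under the toy cut:                     {np.mean(detectable):.3f}")
```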
Max ERC Funding
2 263 148 €
Duration
Start date: 2019-06-01, End date: 2024-05-31
Project acronym MarsFirstWater
Project The physicochemical nature of water on early Mars
Researcher (PI) Alberto Gonzalez Fairen
Host Institution (HI) AGENCIA ESTATAL CONSEJO SUPERIOR DE INVESTIGACIONES CIENTIFICAS
Call Details Consolidator Grant (CoG), PE9, ERC-2018-COG
Summary Concepts of large bodies of glacial ice and liquid standing water, a robust hydrological cycle, and a rich Martian history of climate change are part of the current consensus model for early Mars. However, questions still poorly constrained include: a precise understanding of the inventory of water during the first billion years of Mars history and its early evolution on both global and local scales; whether liquid or solid H2O dominated, for what duration of time and where the water resided; what were the host-rock weathering rates and patterns and the physicochemical parameters defining such interactions; what specific landforms and mineralogies were generated during those periods; and what implications all these processes had on the possible inception of life on Mars. These fundamental questions represent large uncertainties and knowledge gaps. Therefore, a quantitative understanding of the basic characteristics of water on early Mars is very much needed and is the focus of this proposal.
This application outlines a plan for my research in the next five years, and explains how I propose to fully characterize the aqueous environments of early Mars through a quantitative and truly interdisciplinary investigation. Spacecraft mission-derived datasets will be consistently used to test hypotheses through paleogeomorphological reconstructions, geochemical modeling, mineralogical studies, and astrobiological investigations. The derived results will produce hard constraints on the physical evolution, chemical alteration and habitability of surface and near-surface aqueous environments on early Mars. The planned investigations will benefit from the combination of working with first-hand data from ongoing Mars missions and with the state-of-the-art laboratory tools at the host institution. The final expected result will be a complete understanding of the physicochemical nature of water on early Mars, also opening new paths for the astrobiological exploration of the planet.
Max ERC Funding
1 998 368 €
Duration
Start date: 2019-06-01, End date: 2024-05-31
Project acronym MATRIX
Project Novel mitochondria-targeted therapies for cancer treatment-induced cardiotoxicity
Researcher (PI) Borja Ibáñez Cabeza
Host Institution (HI) CENTRO NACIONAL DE INVESTIGACIONES CARDIOVASCULARES CARLOS III (F.S.P.)
Call Details Consolidator Grant (CoG), LS7, ERC-2018-COG
Summary Cardiac toxicity is one of the most frequent serious side effects of cancer therapy, affecting up to 30% of treated patients. Cancer treatment-induced cardiotoxicity (CTiCT) can result in severe heart failure. The trade-off between cancer and chronic heart failure is an immense personal burden with physical and psychological consequences. Current therapies for CTiCT are suboptimal, featuring poor early detection algorithms and nonspecific heart failure treatments. Based on our recently published results and additional preliminary data presented here, we propose that CTiCT is associated with altered mitochondrial dynamics, triggering a cardiomyocyte metabolic reprogramming. MATRIX represents a holistic approach to tackling mitochondrial dysfunction in CTiCT. Our hypothesis is that reverting metabolic reprogramming by shifting mitochondrial substrate utilization could represent a new paradigm in the treatment of early-stage CTiCT. By refining a novel imaging-based algorithm recently developed in our group, we will achieve very early detection of myocardial damage in patients treated with commonly prescribed cancer therapies, long before clinically used parameters become abnormal. Such early detection, not available currently, is crucial for implementation of early therapies. We also hypothesize that in end-stage CTiCT, mitochondrial dysfunction has passed a no-return point, and the failing heart will only be rescued by a strategy to replenish the myocardium with fresh healthy mitochondria. This will be achieved with a radical new therapeutic option: in-vivo mitochondrial transplantation. The MATRIX project has broad translational potential, including a new therapeutic approach to a clinically relevant condition, the development of technology for early diagnosis, and advances in knowledge of basic disease mechanisms.
Max ERC Funding
1 999 375 €
Duration
Start date: 2019-09-01, End date: 2024-08-31
Project acronym METAmorphoses
Project Shapeshifting Metasurfaces for Chemically Selective Augmented Reality
Researcher (PI) Antonio AMBROSIO
Host Institution (HI) FONDAZIONE ISTITUTO ITALIANO DI TECNOLOGIA
Call Details Consolidator Grant (CoG), PE8, ERC-2018-COG
Summary I propose to realize the first shapeshifting optical metasurface that changes its functionality on demand and adapts to changing external conditions. The metasurface may work as a chemically selective lens that transmits only the spectral fingerprint of a specific molecule in the mid-IR wavelength range. The same metasurface can later be turned into an adaptive lens for focusing and detection under the skin. For such an ambitious goal, a radically new approach is needed.
I will realize shapeshifting metasurfaces made of a polymer containing photo-switchable molecules. The surface of such polymers undergoes a morphology re-organization (surface structuring) when illuminated by an external visible light pattern. The polymer will be structured with visible light and the resulting metasurfaces will work in the mid-IR. I will use state-of-the-art optical nano-imaging techniques to investigate the surface structuring phenomenon at the nanoscale in order to achieve full control of the mechanism.
Since the polymer surface can continuously be adjusted with the illuminating visible light, it will be possible to shift from one encoded optical functionality to a completely different one. Once optimized, this completely out-of-the-box approach will be complemented by a feedback mechanism that allows self-adjustment of the polymeric metasurface to changing external conditions. This will open endless possibilities in many fields, from medical imaging to security and quality control.
The proposed approach is unprecedented, yet it is perfectly in line with my research activities, as it results from merging different techniques that I master into a new research field.
My approach is also inexpensive relative to the usual nano-fabrication techniques and immediately compatible with high-volume production, providing a viable technology platform for lightweight eyewear technology that reflects the views of key industrial players in the field.
Max ERC Funding
2 745 000 €
Duration
Start date: 2019-10-01, End date: 2024-09-30
Project acronym MOF-reactors
Project Metal-Organic Frameworks as Chemical Reactors for the Synthesis of Well-Defined Sub-Nanometer Metal Clusters
Researcher (PI) Emilio PARDO
Host Institution (HI) UNIVERSITAT DE VALENCIA
Call Details Consolidator Grant (CoG), PE5, ERC-2018-COG
Summary Humankind's advancement is connected to the use and development of metal forms. Recent works have unveiled exceptional properties –such as luminescence, biocompatibility, antitumoral activity or a superlative catalytic activity– for small aggregations of metal atoms, the so-called sub-nanometer metal clusters (SNMCs). Despite this importance, the gram-scale synthesis of structurally and electronically well-defined SNMCs is still far from being a reality.
The present proposal addresses this weakness and aims at making a breakthrough step-change in the use of metal-organic frameworks (MOFs) as chemical reactors for the in-situ synthesis of stable ligand-free SNMCs with such unique properties. This challenging synthetic strategy, which is supported by striking published and unpublished preliminary results, has solid foundations. Firstly, the design and large-scale preparation of cheap and novel families of highly robust and crystalline MOFs with tailor-made functional channels to be used as chemical reactors. Secondly, the application of solid-state post-synthetic methods to drive the multigram-scale preparation of unique ligand-free homo- and heterometallic SNMCs, which are, in the best-case scenario, very difficult to obtain and stabilise outside the channels. Last but not least, single-crystal X-ray diffraction will be used as the definitive tool for the characterisation, at the atomic level, of such ultrasmall species, offering unprecedented snapshots of their real structures and formation mechanisms.
The ultimate goal will be to upscale this synthetic strategy, aiming at the large-scale fabrication of SNMCs; their industrial application will then be evaluated. A successful achievement of all the aforementioned objectives of this ground-breaking project would open new routes for the use of MOFs as chemical reactors to manufacture, at competitive prices, MOF-driven, structurally and electronically well-defined, ligand-free SNMCs on a multigram scale.
Max ERC Funding
1 886 000 €
Duration
Start date: 2019-03-01, End date: 2024-02-29
Project acronym NBEB-SSP
Project Nonparametric Bayes and empirical Bayes for species sampling problems: classical questions, new directions and related issues
Researcher (PI) Stefano FAVARO
Host Institution (HI) UNIVERSITA DEGLI STUDI DI TORINO
Call Details Consolidator Grant (CoG), PE1, ERC-2018-COG
Summary Consider a population of individuals belonging to different species with unknown proportions. Given an initial (observable) random sample from the population, how do we estimate the number of species in the population, or the probability of discovering a new species in one additional sample, or the number of hitherto unseen species that would be observed in additional unobservable samples? These are archetypal examples of a broad class of statistical problems referred to as species sampling problems (SSPs), namely: statistical problems in which the objects of inference are functionals involving the unknown species proportions and/or the species frequency counts induced by observable and unobservable samples from the population. SSPs first appeared in ecology, and their importance has grown considerably in recent years, driven by challenging applications in a wide range of leading scientific disciplines, e.g., biosciences and physical sciences, engineering sciences, machine learning, theoretical computer science and information theory.
The objective of this project is the introduction and thorough investigation of new nonparametric Bayes and empirical Bayes methods for SSPs. The proposed advances will include: i) addressing challenging methodological open problems in classical SSPs under the nonparametric empirical Bayes framework, which is arguably the most developed (and currently the most implemented by practitioners) framework to deal with classical SSPs; ii) fully exploiting and developing the potential of tools from mathematical analysis, combinatorial probability and Bayesian nonparametric statistics to set forth a coherent modern approach to classical SSPs, and then investigating the interplay between this approach and its empirical counterpart; iii) extending the scope of the above studies to more challenging SSPs, and classes of generalized SSPs, that have emerged recently in the fields of biosciences and physical sciences, machine learning and information theory.
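As an illustrative aside (not part of the original summary), two classical frequentist baselines for these archetypal questions are the Good-Turing estimate of the probability that the next observation is a new species and the Chao1 lower bound on total species richness. The Python sketch below only makes the SSP questions concrete; it does not represent the nonparametric Bayes or empirical Bayes methodology the project will develop.

```python
from collections import Counter

def species_sampling_baselines(sample):
    """Classical baseline answers to two archetypal SSP questions,
    given one observable sample of species labels."""
    counts = Counter(sample)                           # species -> frequency
    n = sum(counts.values())                           # sample size
    f1 = sum(1 for c in counts.values() if c == 1)     # singletons
    f2 = sum(1 for c in counts.values() if c == 2)     # doubletons
    s_obs = len(counts)                                # observed species richness

    # Good-Turing: probability that the (n+1)-th draw reveals a new species.
    p_new = f1 / n

    # Chao1 (bias-corrected form): lower bound on the total number of species.
    s_chao1 = s_obs + f1 * (f1 - 1) / (2 * (f2 + 1))

    return p_new, s_chao1

# Toy usage: labels could be ecological species, genes, or word types.
sample = list("aaabbcddddefg")
p_new, s_chao1 = species_sampling_baselines(sample)
print(f"P(new species on next draw) ~ {p_new:.2f}, estimated richness >= {s_chao1:.1f}")
```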
Max ERC Funding
982 930 €
Duration
Start date: 2019-03-01, End date: 2024-02-29
Project acronym PAIDEIA
Project PlAsmon InduceD hot Electron extraction with doped semiconductors for Infrared solAr energy
Researcher (PI) Francesco SCOTOGNELLA
Host Institution (HI) POLITECNICO DI MILANO
Call Details Consolidator Grant (CoG), PE8, ERC-2018-COG
Summary Earth is inhabited by an energy-hungry human society. The Sun, with a global radiation at ground level of more than 1 kW/m^2, is our largest source of energy. However, 45% of the total radiation is in the near infrared (NIR) and is not absorbed by most photovoltaic materials.
PAIDEIA focuses on two main concepts aimed at enhancing the capacity of solar energy conversion:
i) plasmon assisted hot carriers extraction from NIR plasmonic materials;
ii) linewidth narrowing in plasmonic nanoparticle films that enhances the lifetime of hot carriers and, thus, boosts the efficiency of light driven carrier extraction.
Instead of metals, which operate mostly in the visible region, we will make use of doped semiconductor nanocrystals (DSNCs) as hot electron extraction materials possessing a plasmonic response tunable in the range 800 nm – 4000 nm. Three different innovative architectures will be used for improved device performance: i) improved Schottky junctions (DSNC/wide band gap semiconductor nanocomposites); ii) ultrathin devices (DSNCs/2D quantum materials); iii) maximized interface DSNC/semiconductor bulk hetero-Schottky junctions.
By combining both concepts in advanced architectures we aim to produce a solar cell device that functions in the NIR with efficiencies of up to 10%. A tandem solar cell that combines the conventional power conversion efficiency, up to ~1100 nm, of a commercial Si solar cell (~20%) with the new PAIDEIA-based device is expected to reach a total power conversion efficiency of 30% by extending the range of converted wavelengths to the full spectral range delivered by the Sun. PAIDEIA has a deeply fundamental character, impacting several areas in the fields of nanophysics, nanochemistry and materials processing while, at the same time, having a high impact on the study of solar energy conversion. Finally, PAIDEIA will provide answers to fundamental questions regarding the physical behaviour of plasmonic/semiconductor interfaces.
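As an illustrative aside (not in the original summary), the tandem arithmetic above can be made concrete with a deliberately idealised additive approximation in Python. Treating the sub-cell contributions as simply additive, with the 20% and 10% figures taken as given, is an assumption made for the example; a real tandem design requires spectral splitting and detailed-balance analysis.

```python
# Idealised additive approximation of the tandem efficiency quoted in the summary.
eta_si  = 0.20   # commercial Si sub-cell, converting photons up to ~1100 nm (given figure)
eta_nir = 0.10   # target efficiency of the NIR hot-electron device (given figure)

# Assumption: the NIR device only harvests light the Si cell does not use,
# so in this toy picture the two contributions simply add.
eta_tandem = eta_si + eta_nir
print(f"idealised tandem power conversion efficiency: {eta_tandem:.0%}")
```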
Max ERC Funding
1 815 445 €
Duration
Start date: 2019-04-01, End date: 2024-03-31
Project acronym PRO-TOOLKITS
Project Programmable nucleic acid toolkits for cell-free diagnostics and genetically encoded biosensing
Researcher (PI) francesco RICCI
Host Institution (HI) UNIVERSITA DEGLI STUDI DI ROMA TOR VERGATA
Call Details Consolidator Grant (CoG), PE4, ERC-2018-COG
Summary WHY: The biological complexity of tumours and the large diversity of diagnostic biomarkers call for the development of innovative analytical tools that can detect multiple targets in a sensitive, specific and low-cost way and allow real-time monitoring of disease pathways and therapeutic effects. Providing such transformative tools requires creative thinking, an innovative approach and the exploration of new research avenues that span different disciplines.
WHAT: The goal of the PRO-TOOLKITS project is to address this need by developing innovative cell-free point of care diagnostic kits and genetically encodable biosensing tools.
HOW: I have oriented my independent career as a P.I. towards the design and development of synthetic nucleic acid-based nanodevices and nanomachines. With the help of an ERC Starting Grant I made ground-breaking contributions in the field of nucleic acid nanotechnology. Motivated by these advancements, I propose to challenge my know-how and expertise to explore new research avenues that will open exciting possibilities in biosensing applications. The key, ground-breaking IDEA underlying this project is to take advantage of my expertise and harness the advantageous features of synthetic RNA modules that can direct the expression of proteins in controlled in-vitro cell-free systems and can also be genetically encoded in living organisms to function inside cells. I will develop rationally designed programmable nucleic acid modules that respond to a wide range of molecular markers and environmental stimuli through innovative nature-inspired mechanisms and that can be orthogonally wired to provide cell-free diagnostic kits and genetically encoded live-cell biosensing tools. The project will provide transformative approaches, methods and tools that will represent a genuine breakthrough in the fields of in-vitro diagnostics, biosensing and synthetic biology.
Max ERC Funding
1 999 375 €
Duration
Start date: 2019-10-01, End date: 2024-09-30
Project acronym ReadCalibration
Project Phonemic representations in speech perception and production: Recalibration by reading acquisition
Researcher (PI) Clara, Dominique, Sylvie Martin
Host Institution (HI) BCBL BASQUE CENTER ON COGNITION BRAIN AND LANGUAGE
Call Details Consolidator Grant (CoG), SH4, ERC-2018-COG
Summary The main goal of this project is to demonstrate that reading acquisition (RA) drastically reshapes our phonemic inventory, and to investigate the time-course and fine-grained properties of this recalibration. The main innovative and ground-breaking aspect of this project is the merging of two research fields, (1) reading acquisition and (2) phonemic recalibration, together with a deep and extensive exploration of the (3) perception-production link, which results in a new research line that pushes the boundaries of our understanding of the complex interactions between auditory and visual language perception and production.
We will demonstrate that phonemic representations (PRs) become more stable (less dispersed) during the process of learning to read, and that this recalibration varies according to the grapheme-phoneme conversion rules of the reading system. We will explore such recalibration by means of the first cross-linguistic longitudinal study examining the position and dispersion of PRs, both in perception and production of phonemes and words. Secondly, we will explore how recalibration develops when RA is impaired as is the case in dyslexic children –informing the research field on (4) dyslexia– and when pre-reading PRs are unstable as is the case in deaf children with cochlear implants –informing the research field on (5) deafness. Finally, the research will also be extended to PR recalibration during RA in a second language –informing the research on (6) bilingualism.
This proposal provides the first systematic investigation of phonemic recalibration during literacy acquisition, and will provide important insight for pragmatic research and theoretical accounts of language perception and production and phonemic recalibration. This project will also have major implications for the clinical field (theories and remediation of dyslexia and deafness) and for social policies and education (bilingualism, spoken and written language teaching).
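As a purely illustrative aside (not part of the original summary), the notions of "position" and "dispersion" of a phonemic representation can be made concrete in an acoustic space. The Python sketch below computes the centroid and the mean distance-to-centroid of hypothetical vowel tokens in F1/F2 formant space; this is one plausible operationalisation chosen for illustration, not necessarily the measure the project will use, and the formant values are invented.

```python
import numpy as np

def position_and_dispersion(tokens):
    """Centroid (position) and mean distance to centroid (dispersion)
    of a set of phoneme tokens represented as acoustic feature vectors."""
    tokens = np.asarray(tokens, dtype=float)
    centroid = tokens.mean(axis=0)
    dispersion = np.linalg.norm(tokens - centroid, axis=1).mean()
    return centroid, dispersion

# Hypothetical F1/F2 values (Hz) for productions of the vowel /i/ by one child.
pre_reading  = [[310, 2350], [360, 2200], [290, 2500], [340, 2280]]
post_reading = [[315, 2320], [325, 2300], [305, 2360], [320, 2330]]

for label, data in [("pre-reading", pre_reading), ("post-reading", post_reading)]:
    c, d = position_and_dispersion(data)
    print(f"{label}: centroid F1/F2 = {c.round(0)}, dispersion = {d:.1f} Hz")
```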
Max ERC Funding
1 875 000 €
Duration
Start date: 2019-10-01, End date: 2024-09-30
Project acronym RememberEx
Project Human Subcortical-Cortical Circuit Dynamics for Remembering the Exceptional
Researcher (PI) Bryan STRANGE
Host Institution (HI) UNIVERSIDAD POLITECNICA DE MADRID
Call Details Consolidator Grant (CoG), LS5, ERC-2018-COG
Summary Our memory system is optimised for remembering the exceptional over the mundane. We remember better those events that violate predictions generated by the prevailing context, particularly because of surprise or emotional impact. Understanding how we form and retrieve long-term memories for important or salient events is critical for combating the rapidly growing incidence of pathologies associated with memory dysfunction, which carry a huge socio-economic burden. Human lesion and non-invasive functional imaging data, motivated by findings from animal models, have identified subcortical structures that are critical for upregulating hippocampal function during salient event memory. However, mechanistic understanding of these processes in humans remains scarce, and requires better experimental approaches such as direct intracranial recordings from, and focal electrical stimulation of, these subcortical structures.
This project will characterise human subcortico-cortical neuronal circuit dynamics associated with enhanced episodic memory for salient stimuli by studying direct recordings from human hippocampus, amygdala, nucleus accumbens, ventral midbrain and cortex. Within this framework, I will elucidate the electrophysiological mechanisms underlying amygdala-hippocampal-cortical coupling that lead to better memory for emotional stimuli, extend the hippocampal role in detecting unpredicted stimuli to define its role in orchestrating cortical dynamics in unpredictable contexts, and discover the neuronal response profile of the human mesolimbic dopamine system during salient stimulus encoding. The predicted results, based on my own preliminary data, will offer several conceptual breakthroughs, particularly regarding hippocampal function and the role of dopaminergic ventral midbrain in memory. The knowledge gained from this project is a fundamental requirement for designing therapeutic interventions for patients with memory deficits and other neuropsychiatric disorders.
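As an illustrative aside (not part of the original summary), one standard way to quantify coupling between two simultaneously recorded regions is spectral coherence. The sketch below, in Python with NumPy/SciPy, applies scipy.signal.coherence to two synthetic signals sharing a theta-band component; it is a generic methods illustration under assumed parameters, not the project's analysis pipeline.

```python
import numpy as np
from scipy.signal import coherence

rng = np.random.default_rng(1)
fs = 1000.0                                  # sampling rate [Hz]
t = np.arange(0, 10, 1 / fs)                 # 10 s of synthetic data

# Shared 6 Hz (theta-band) rhythm plus independent noise in each "region".
theta = np.sin(2 * np.pi * 6 * t)
hippocampus = theta + 0.8 * rng.standard_normal(t.size)
amygdala    = 0.7 * theta + 0.8 * rng.standard_normal(t.size)

# Magnitude-squared coherence as a function of frequency.
f, cxy = coherence(hippocampus, amygdala, fs=fs, nperseg=2048)
theta_band = (f >= 4) & (f <= 8)
print(f"mean theta-band (4-8 Hz) coherence: {cxy[theta_band].mean():.2f}")
```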
Max ERC Funding
2 000 000 €
Duration
Start date: 2019-05-01, End date: 2024-04-30
Project acronym SLING
Project Efficient algorithms for sustainable machine learning
Researcher (PI) Lorenzo ROSASCO
Host Institution (HI) UNIVERSITA DEGLI STUDI DI GENOVA
Call Details Consolidator Grant (CoG), PE6, ERC-2018-COG
Summary This project will develop and integrate the latest optimization and statistical advances into a new generation of resource-efficient algorithms for large-scale machine learning. State-of-the-art machine learning methods provide impressive results, opening new perspectives for science, technology, and society. However, they rely on massive computational resources to process huge manually annotated data-sets. The corresponding costs in terms of energy consumption and human efforts are not sustainable.
This project builds on the idea that improving efficiency is a key to scale the ambitions and applicability of machine learning. Achieving efficiency requires overcoming the traditional boundaries between statistics and computations, to develop new theory and algorithms.
Within a multidisciplinary approach, we will establish a new regularization theory of efficient machine learning. We will develop models that incorporate budgeted computations, and numerical solutions with resources tailored to the statistical accuracy allowed by the data. Theoretical advances will provide the foundations for novel and sound algorithmic solutions. Close collaborations in diverse applied fields will ensure that our research results and solutions are apt and immediately applicable to real-world scenarios.
The new algorithms developed in the project will help boost the possibilities of Artificial Intelligence, modeling and decision making in a world of data of ever-increasing size and complexity.
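As an illustrative aside (not part of the original summary), one concrete instance of trading computation for statistical accuracy is iterative regularization via early stopping, where the number of gradient iterations plays the role of the regularization parameter. The minimal NumPy sketch below illustrates that general idea only, on synthetic data with assumed sizes, and is not one of the project's algorithms.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic over-parameterised regression problem: y = X w* + noise.
n, d = 120, 100
X = rng.standard_normal((n, d))
w_true = rng.standard_normal(d)
y = X @ w_true + 2.0 * rng.standard_normal(n)

# Hold out part of the data to decide when to stop.
X_tr, y_tr, X_val, y_val = X[:80], y[:80], X[80:], y[80:]

w = np.zeros(d)
step = 1.0 / np.linalg.norm(X_tr, 2) ** 2        # safe gradient step size
best_err, best_iter = np.inf, 0
for it in range(1, 3001):
    w -= step * X_tr.T @ (X_tr @ w - y_tr)       # one gradient step on the squared loss
    val_err = np.mean((X_val @ w - y_val) ** 2)
    if val_err < best_err:
        best_err, best_iter = val_err, it

# The iteration count acts as an implicit regularization parameter: stopping
# early matches the computational budget to the accuracy the data support.
print(f"best validation error {best_err:.2f} reached after {best_iter} iterations")
```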
Max ERC Funding
1 977 500 €
Duration
Start date: 2019-11-01, End date: 2024-10-31
Project acronym SUBSILIENCE
Project Subsistence and human resilience to sudden climatic events in Europe during MIS3
Researcher (PI) ANA B. MARIN-ARROYO
Host Institution (HI) UNIVERSIDAD DE CANTABRIA
Call Details Consolidator Grant (CoG), SH6, ERC-2018-COG
Summary Climate has long been proposed as a possible trigger factor for the extinction of Neanderthals and the rapid colonization of Europe by Anatomically Modern Humans (AMH). Abrupt and acute climatic oscillations, as recorded in polar ice sheets, are particularly threatening because they can push ecosystems towards catastrophic outcomes. Under such conditions, the survival of a species critically depends on its adaptive skills. Understanding the exact role that these episodes may have played in the Middle to Upper Palaeolithic transition is therefore essential to unravel the real causes of Neanderthal demise and AMH success. To do this, SUBSILIENCE will identify the subsistence strategies adopted by both human species in response to those climatic changes at 20 key archaeological sites located across the southern European peninsulas. By applying zooarchaeological and taphonomic analyses, the behavioural flexibility and resilience of each human species will be assessed. In addition, to enable effective testing, local terrestrial climatic and environmental conditions will be accurately reconstructed using stable isotopes from the animals consumed, producing a unique, continuous and properly dated environmental framework that improves existing knowledge. Finally, to explore the problem further, an innovative procedure to estimate prey abundance, ecology and human behaviour, involving the estimation of the ecosystem carrying capacity, will be developed. This multidisciplinary and novel approach will provide, for the first time, accurate answers to questions concerning a) which particular subsistence patterns (if any) favoured AMH over Neanderthals while coping with the changing environment and b) the extent to which climatic oscillations affected Neanderthal extinction. In this, it will be of relevance to the study of prehistory on a pan-European scale.
Max ERC Funding
2 000 000 €
Duration
Start date: 2019-06-01, End date: 2024-05-31
Project acronym TACITROOTS
Project The Accademia del Cimento in Florence: tracing the roots of the European scientific enterprise
Researcher (PI) Giulia Giannini
Host Institution (HI) UNIVERSITA DEGLI STUDI DI MILANO
Call Details Consolidator Grant (CoG), SH6, ERC-2018-COG
Summary The Accademia del Cimento (Florence) was the first European society to put experimentation at the core of scientific activity and to be supported by a public power. It lasted only ten years (1657-1667), the same years that saw the establishment of societies of greater fame and longevity, such as the Royal Society and the Académie Royale des Sciences.
The copious records – most still unpublished – left by its members cast new light on the process by which scientific societies were established in Europe, on the emergence of a shared scientific discourse, and on its normalisation. They also clarify the “viral” aspect of some experiments as well as the dynamics of competition, imitation and (self-)censorship from which these institutional and scientific endeavours originated.
This project analyses, for the first time in its entirety, the extensive corpus of unpublished documents (ca. 15,000 papers), descriptions of experiments and thousands of epistolary exchanges between members of the Cimento and scholars throughout Europe. By looking at the sources as a whole, it aims to connect systematically the strictly experimental, theoretical and philosophical aspects of the Accademia with its intellectual history. The research will thus analyse the hundreds of experiments designed and conducted by the members, without losing sight of the specific context in which they were produced and disseminated. It will reassess the Cimento’s contribution to the development of a scientific lexicon and the materiality of its work, which relied heavily on the design and use of scientific instruments.
The substantial body of correspondence – so far neglected by historians – will uncover members’ aims and philosophical concerns, (self-)censorship mechanisms, as well as the Cimento’s ties with scholars involved in the founding of other famous academies.
This research will thus provide new insights into the origin of scientific institutions in the Early Modern period.
Max ERC Funding
1 708 206 €
Duration
Start date: 2019-06-01, End date: 2024-05-31
Project acronym TechChange
Project Technological Change: New Sources, Consequences, and Impact Mitigation
Researcher (PI) Philipp Albert Theodor Kircher
Host Institution (HI) EUROPEAN UNIVERSITY INSTITUTE
Call Details Consolidator Grant (CoG), SH1, ERC-2018-COG
Summary Technological change in information technology has the potential to transform firms’ production processes. Some processes, such as automation, affect workers directly, while others, such as the introduction of automated workflow and control tools, simply allow firms to grow larger. We call the latter Quantity-Biased Technological Change (QBTC).
The existing literature has done relatively little to understand the effects of technological progress that changes the size and structure of firms and thereby, indirectly, the workforce. We aim to study this type of technological change, and in particular how it affects the workforce in terms of employment and wage inequality.
We aim to explore this intuitive idea in a model with highly heterogeneous firms and workers and decreasing returns to firm size. When technology enables firms to manage more workers, productive firms are less limited by size considerations and tend to expand, affecting the marginal product of workers. A preliminary calibration that aims to explain changes in the firm-size, wage and profit distributions in Germany over the last 15 years shows evidence of the quantitative importance of this channel and of its interaction with other effects, such as Skill-Biased Technological Change (SBTC). Yet many obstacles remain: accounting for worker heterogeneity within the firm, exploring issues of market power, micro-founding the source of QBTC, and possibly linking it to new innovations such as the rise of artificial intelligence in firm management.
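The expansion channel described above can be pictured with a deliberately stylised calculation: a firm with productivity z and output z·l^α paying wage w chooses employment l* = (αz/w)^(1/(1−α)), so raising the effective span-of-control parameter α (a stand-in for better management technology) makes more productive firms expand disproportionately. The sketch below is a hypothetical toy example, not the project's calibrated model; all parameter values are invented.

```python
import numpy as np

WAGE = 1.0
ALPHA_BEFORE = 0.6  # hypothetical returns-to-scale / span-of-control parameter
ALPHA_AFTER = 0.7   # QBTC: management technology relaxes the scale constraint

def optimal_employment(z, alpha, wage=WAGE):
    """Profit-maximising employment for output z * l**alpha at wage w,
    from the first-order condition alpha * z * l**(alpha - 1) = w."""
    return (alpha * z / wage) ** (1.0 / (1.0 - alpha))

for z in (1.0, 2.0, 4.0):  # illustrative firm productivities
    before = optimal_employment(z, ALPHA_BEFORE)
    after = optimal_employment(z, ALPHA_AFTER)
    print(f"z={z:3.1f}  size before={before:7.2f}  after={after:7.2f}  growth x{after / before:.2f}")
```

In this stylised setting the most productive firm grows by far the most, which is exactly the kind of reallocation toward large, productive firms that the project proposes to quantify and confront with the data.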
In addition, we propose to revisit past periods of technological change and related policy interventions to gain insights for the future. To achieve this, we discuss how to analyze their past impact on regions with different industrial and occupational compositions. Finally, we aim to explore a novel methodology to identify which, and how many, workers will be affected.
Max ERC Funding
1 268 900 €
Duration
Start date: 2019-05-01, End date: 2024-04-30
Project acronym TRADITION
Project Long-term coastal adaptation, food security and poverty alleviation in Latin America
Researcher (PI) Andre Carlo COLONESE
Host Institution (HI) UNIVERSITAT AUTONOMA DE BARCELONA
Call Details Consolidator Grant (CoG), SH6, ERC-2018-COG
Summary TRADITION aims to understand the long-term trajectory of human interaction with coastal resources and its legacy for present-day small-scale fisheries in Latin America. Founded on traditional knowledge rooted in the past, small-scale fisheries are a crucial source of food and livelihood for millions of people worldwide, and play a pivotal role in poverty eradication in developing countries. A thorough recognition of the cultural and socio-economic significance of Latin American fisheries requires a temporal component that only archaeology and history can provide. TRADITION will investigate a 4000-year record of coastal exploitation in one of the world's most threatened tropical environments: the Atlantic forest of Brazil. We will draw together archaeological, palaeoecological, historical and ethnographic records to address fundamental questions that bear upon our current understanding of the development of small-scale fisheries in this region. How did coastal economies adapt to the spread of agriculture? What was the impact of past climate and environmental changes on coastal populations? What was the impact of European colonisation of the Americas on the development of small-scale fisheries? What was the role of historical institutions and regulations in the negotiation between traditional and modern practices in small-scale fisheries? How have historical practices and events shaped current small-scale coastal communities, and can this knowledge benefit current management strategies? The answers will help us understand how coastal economies responded to unprecedented societal and environmental changes by adapting their subsistence practices, technology and culture, while contributing to the foundation of coastal societies in Latin America.
Max ERC Funding
1 877 107 €
Duration
Start date: 2019-09-01, End date: 2024-08-31
Project acronym URBAG
Project Integrated System Analysis of Urban Vegetation and Agriculture
Researcher (PI) Gara Villalba Méndez
Host Institution (HI) UNIVERSITAT AUTONOMA DE BARCELONA
Call Details Consolidator Grant (CoG), SH2, ERC-2018-COG
Summary This research aims to determine how urban green infrastructures can contribute most efficiently to urban sustainability. It will evaluate which combinations of urban and peri-urban agriculture and green spaces perform best in terms of local and global environmental impact.
For this purpose, I will use a novel and comprehensive analysis that integrates the life-cycle impacts of the resources required for green infrastructures with an understanding of how green infrastructures interact with the urban atmosphere. This comprehensive approach makes it possible to capture the urban metabolism and optimize the food-energy-water nexus. In previous work, these impacts have only been studied individually.
The analysis will consist of 1) a geo-referenced land-use model to optimize urban and peri-urban food production in terms of nutrients, water and energy, considering urban morphology and determining life-cycle impacts, and 2) a spatially and temporally resolved framework for the quantitative analysis and simulation of green infrastructures, to determine their direct and indirect effects on the urban and regional atmosphere. The research will be implemented in two selected cities with different profiles, Barcelona and Oslo. The study aims to gather substantial quantitative evidence on green infrastructures and sustainability, helping to close the gap left by previous work.
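At its simplest, the geo-referenced land-use model in point 1) can be read as a constrained allocation problem: distribute a limited area of urban and peri-urban land across candidate crops so as to maximise food production under water and nutrient budgets. The sketch below is a hypothetical toy linear programme, not the project's model; crops, yields and budgets are invented for illustration.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical per-hectare coefficients for three candidate crops (illustrative only).
crops = ["leafy greens", "tomatoes", "root vegetables"]
yield_t = np.array([25.0, 60.0, 35.0])         # tonnes of food per hectare
water_m3 = np.array([3000.0, 6000.0, 2500.0])  # irrigation water per hectare (m3)
nitrogen_kg = np.array([120.0, 180.0, 90.0])   # nitrogen input per hectare (kg)

LAND_HA = 40.0            # peri-urban land available (ha)
WATER_BUDGET = 150_000.0  # reclaimed water per season (m3)
N_BUDGET = 5_000.0        # nitrogen available, e.g. from composted organic waste (kg)

# linprog minimises, so negate the yields to maximise total food production
result = linprog(
    c=-yield_t,
    A_ub=np.vstack([np.ones(3), water_m3, nitrogen_kg]),
    b_ub=np.array([LAND_HA, WATER_BUDGET, N_BUDGET]),
    bounds=[(0, None)] * 3,
    method="highs",
)

for name, hectares in zip(crops, result.x):
    print(f"{name:15s} {hectares:6.1f} ha")
print(f"total food: {-result.fun:.0f} t per season")
```

The project's model adds the dimensions this toy version omits, notably geo-referencing, urban morphology and the full life-cycle impacts of the inputs.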
This project and the envisaged guide, Green Infrastructures: A Guide for City Planners and Policy Makers, are timely and urgent. Many cities are implementing green infrastructures despite having little quantitative and comprehensive knowledge of which infrastructure strategies are more effective in promoting food production, air quality and temperature regulation while reducing environmental impact. The intended guide will contain evidence-based guidance and tools to create green infrastructure strategies, help meet sustainability targets, and promote wider and more diffuse social benefits.
Max ERC Funding
1 893 754 €
Duration
Start date: 2019-09-01, End date: 2024-08-31
Project acronym ViroPedTher
Project Oncolytic viruses for the treatment of pediatric brain tumors: An integrated clinical and lab approach
Researcher (PI) marta ALONSO-ROLDAN
Host Institution (HI) UNIVERSIDAD DE NAVARRA
Call Details Consolidator Grant (CoG), LS7, ERC-2018-COG
Summary The overarching goal of my lab is to improve the prognosis of patients with high-risk pediatric brain tumors. To this end, I propose to integrate clinical and lab-based research to develop tumor-targeted oncolytic adenoviruses with the capacity to elicit a therapeutic immune response in those tumors. Our research will use novel and relevant models to accomplish the experimental aims. We have previously worked with Delta-24-RGD (DNX-2401), a replication-competent adenovirus that has been translated to the clinical setting. In 2017, propelled by my team, the first phase I clinical trial of DNX-2401 for newly diagnosed Diffuse Intrinsic Pontine Glioma (DIPG; a lethal pediatric brain tumor) was opened. Preliminary results from the first trials revealed that intratumoral injection of the virus instigated an initial phase of oncolysis followed by a delayed inflammatory response that ultimately resulted in complete regression in a subset of patients, without associated toxicities. I hypothesize that enhancing the immune component of DNX-2401-based therapy will result in the complete regression of the vast majority of pediatric brain tumors. In our specific approach, we propose to understand the immune microenvironment of DIPGs and the response to viral therapy in the context of the trial. Moreover, that knowledge will inform the design of Delta-24-based adenoviruses that recruit lymphocytes to the tumor and carry different types of ligands to activate tumor-infiltrating lymphocytes. I expect that this innovative combinatorial treatment will efficiently challenge the profound and inherent tumor immunosuppression and, in turn, elicit a robust anti-tumor immune response, resulting in a significant improvement in the prognosis and quality of life of patients with pediatric brain tumors. This project has the potential to produce a vertical advance in the field of pediatric oncology.
Max ERC Funding
2 000 000 €
Duration
Start date: 2019-03-01, End date: 2024-02-29