Project acronym 15CBOOKTRADE
Project The 15th-century Book Trade: An Evidence-based Assessment and Visualization of the Distribution, Sale, and Reception of Books in the Renaissance
Researcher (PI) Cristina Dondi
Host Institution (HI) THE CHANCELLOR, MASTERS AND SCHOLARS OF THE UNIVERSITY OF OXFORD
Call Details Consolidator Grant (CoG), SH6, ERC-2013-CoG
Summary The idea that underpins this project is to use the material evidence from thousands of surviving 15th-c. books, as well as unique documentary evidence — the unpublished ledger of a Venetian bookseller in the 1480s which records the sale of 25,000 printed books with their prices — to address four fundamental questions relating to the introduction of printing in the West which have so far eluded scholarship, partly because of a lack of evidence, partly because of a lack of effective tools to deal with existing evidence. The book trade differs from other trades operating in the medieval and early modern periods in that the goods traded survive in considerable numbers. Not only do they survive, but many of them bear stratified evidence of their history in the form of marks of ownership, prices, manuscript annotations, binding and decoration styles. A British Academy pilot project conceived by the PI produced a now internationally used database which gathers together this kind of evidence for thousands of surviving 15th-c. printed books. For the first time, this makes it possible to track the circulation of books, their trade routes and later collecting, across Europe and the USA, and throughout the centuries. The objectives of this project are to examine (1) the distribution and trade routes, national and international, of 15th-c. printed books, along with the identity of the buyers and users (private, institutional, religious, lay, female, male, and by profession) and their reading practices; (2) the books' contemporary market value; (3) the transmission and dissemination of the texts they contain, their survival and their loss (rebalancing potentially skewed scholarship); and (4) the circulation and re-use of the illustrations they contain. Finally, the project will experiment with the application of scientific visualization techniques to represent, geographically and chronologically, the movement of 15th-c. printed books and of the texts they contain.
Max ERC Funding
1 999 172 €
Duration
Start date: 2014-04-01, End date: 2019-03-31
Project acronym 2D-USD
Project Ultrasonic Spray Deposition: Enabling new 2D based technologies
Researcher (PI) Valeria NICOLOSI
Host Institution (HI) THE PROVOST, FELLOWS, FOUNDATION SCHOLARS & THE OTHER MEMBERS OF BOARD OF THE COLLEGE OF THE HOLY & UNDIVIDED TRINITY OF QUEEN ELIZABETH NEAR DUBLIN
Call Details Proof of Concept (PoC), PC1, ERC-2013-PoC
Summary This proposal will determine the technical and economic viability of scaling up ultra-thin film deposition processes for exfoliated single atomic layers.
The PI has developed methods to produce exfoliated nanosheets from a range of layered materials such as graphene, transition metal chalcogenides and transition metal oxides. These 2D materials have immediate and far-reaching potential in several high-impact technological applications such as microelectronics, composites and energy harvesting and storage.
2DNanoCaps (ERC ref: 278516) has already demonstrated that lab-scale ultra-thin graphene-based supercapacitor electrodes for energy storage deliver unusually high power performance and extremely long device lifetime (100% capacitance retention over 5,000 charge-discharge cycles at the high scan rate of 10,000 mV/s). This performance is remarkable: an order of magnitude better than similar systems produced with more conventional methods, which cause material restacking and aggregation. 2D nanosheets also offer the unique possibility of manufacturing conductive, robust, thin, easily assembled electrodes and solid electrolytes to realize highly flexible, all-solid-state supercapacitors. This opportunity is particularly relevant from an industrial point of view, especially given the flammability of the electrolytes currently used in commercial energy-storage devices.
In order to develop and exploit any of the applications listed above, it will be imperative to develop deposition methods and techniques capable of obtaining industrial-scale “sheet-like” coverage, where flake re-aggregation is avoided.
We believe our combination of unique material properties and a cost-effective, robust and production-scalable ultra-thin deposition process will enable us to compete for significant global market opportunities in the energy-storage space.
Max ERC Funding
148 021 €
Duration
Start date: 2014-01-01, End date: 2014-12-31
Project acronym 2DIR SPECTROMETER
Project A step-change in sensitivity for two dimensional laser infrared spectroscopy
Researcher (PI) Jasper VAN THOR
Host Institution (HI) IMPERIAL COLLEGE OF SCIENCE TECHNOLOGY AND MEDICINE
Call Details Proof of Concept (PoC), PC1, ERC-2013-PoC
Summary "Here, we propose a novel design for a significantly improved detector for the emerging field of coherent two-dimensional infrared (2DIR) spectroscopy, an optical analogue of nuclear magnetic resonance (NMR) spectroscopy. 2DIR is a rapidly growing, cutting-edge technique with applications in subjects as diverse as energy sciences, biophysics, biomedical research and physical chemistry. Currently, the single most important technical problem generally agreed to limit applications of the methodology is the sensitivity with which the signals are measured. While working on multiple stabilisation techniques during the ERC-funded research, it was realised that a straightforward design alteration of the infrared detector would improve the sensitivity very significantly, theoretically by more than one order of magnitude. Here, the technical principles are explained, and a plan is presented for commercialising the instrument in collaboration with the current market leader, Infrared System Development Corp. (ISDC). We apply for funding to develop the prototype."
Max ERC Funding
149 999 €
Duration
Start date: 2013-11-01, End date: 2014-10-31
Project acronym 3D-OA-HISTO
Project Development of 3D Histopathological Grading of Osteoarthritis
Researcher (PI) Simo Jaakko Saarakkala
Host Institution (HI) OULUN YLIOPISTO
Call Details Starting Grant (StG), LS7, ERC-2013-StG
Summary "Background: Osteoarthritis (OA) is a common musculoskeletal disease occurring worldwide. Despite extensive research, the etiology of OA is still poorly understood. Histopathological grading (HPG) of 2D tissue sections is the gold-standard reference method for determining OA stage. However, traditional 2D-HPG is destructive and based only on subjective visual evaluation. These limitations introduce bias into clinical in vitro OA diagnostics and basic research, both of which rely strongly on HPG.
Objectives: 1) To establish and validate the very first 3D-HPG of OA based on cutting-edge nano/micro-CT (Computed Tomography) technologies in vitro; 2) To use the established method to clarify the beginning phases of OA; and 3) To validate 3D-HPG of OA for in vivo use.
Methods: Several hundred human osteochondral samples from patients undergoing total knee arthroplasty will be collected. The samples will be imaged in vitro with nano/micro-CT and clinical high-end extremity CT devices using specific contrast agents to quantify tissue constituents and structure in 3D over large volumes. From this information, a novel 3D-HPG will be developed using statistical classification algorithms. Finally, the developed novel 3D-HPG of OA will be applied clinically in vivo.
Significance: This is the very first study to establish 3D-HPG of OA pathology in vitro and in vivo. Furthermore, the developed technique will greatly improve the understanding of the beginning phases of OA. Ultimately, the study will contribute to improving OA patients’ quality of life by slowing disease progression and by providing powerful tools to develop new OA therapies."
Max ERC Funding
1 500 000 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym 3DICE
Project 3D Interstellar Chemo-physical Evolution
Researcher (PI) Valentine Wakelam
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE9, ERC-2013-StG
Summary At the end of their lives, stars spread their inner material into the diffuse interstellar medium. This diffuse medium becomes locally denser and forms dark clouds (also called dense or molecular clouds) whose innermost parts are shielded from the external UV field by dust, allowing molecules to grow and become more complex. Gravitational collapse occurs inside these dense clouds, forming protostars and their surrounding disks, and eventually planetary systems like (or unlike) our solar system. The formation and evolution of molecules, minerals, ices and organics from the diffuse medium to planetary bodies, and their alteration or preservation throughout this cosmic chemical history, set the initial conditions for building planets, atmospheres and possibly the first bricks of life. The current view of interstellar chemistry is based on fragmentary studies of the key observable steps of this sequence. The objective of this proposal is to follow the fractionation of the elements between the gas phase and interstellar grains, from the most diffuse medium to protoplanetary disks, in order to constrain the chemical composition of the material from which planets are formed. The potential outcome of this project is a consistent and more accurate description of the chemical evolution of interstellar matter. To achieve this objective, I will improve our chemical model by adding new grain-surface processes relevant under diffuse-medium conditions. This upgraded gas-grain model will be coupled to 3D dynamical models of the formation of dense clouds from the diffuse medium and of protoplanetary disks from dense clouds. The computed chemical composition will also be used with 3D radiative transfer codes to study chemical tracers of the physics of protoplanetary disk formation. The robustness of the model predictions will be studied with sensitivity analyses. Finally, model results will be confronted with observations to address some of the current challenges.
Max ERC Funding
1 166 231 €
Duration
Start date: 2013-09-01, End date: 2018-08-31
Project acronym 3DNANOMECH
Project Three-dimensional molecular resolution mapping of soft matter-liquid interfaces
Researcher (PI) Ricardo Garcia
Host Institution (HI) AGENCIA ESTATAL CONSEJO SUPERIOR DEINVESTIGACIONES CIENTIFICAS
Call Details Advanced Grant (AdG), PE4, ERC-2013-ADG
Summary Optical, electron and probe microscopes are enabling tools for discoveries and knowledge generation in nanoscale science and technology. High-resolution (nanoscale or molecular), noninvasive and label-free imaging of three-dimensional soft matter-liquid interfaces has not been achieved by any microscopy method.
Force microscopy (AFM) is considered the second most relevant advance in materials science since 1960. Despite its impressive range of applications, the technique has some key limitations. Force microscopy lacks three-dimensional depth: what lies above the surface or in the subsurface is not readily characterized.
3DNanoMech proposes to design, build and operate a high-speed, force-based method for the three-dimensional characterization of soft matter-liquid interfaces (3D AFM). The microscope will combine a detection method based on force perturbations, adaptive algorithms, high-speed piezo actuators and quantitatively oriented multifrequency approaches. The development of the microscope cannot be separated from its applications: imaging error-free DNA repair and understanding the relationship between the nanomechanical properties and the malignancy of cancer cells. Those problems encompass the different spatial (molecular, nano and mesoscopic) and time (millisecond to second) scales of the instrument.
In short, 3DNanoMech aims to image, map and measure soft matter surfaces and interfaces in liquid with piconewton, millisecond and angstrom resolution. The long-term vision of 3DNanoMech is to replace models or computer animations of biomolecular-liquid interfaces with real-time, molecular-resolution maps of properties and processes.
Max ERC Funding
2 499 928 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym 4D-GENOME
Project Dynamics of human genome architecture in stable and transient gene expression changes
Researcher (PI) Thomas Graf
Host Institution (HI) FUNDACIO CENTRE DE REGULACIO GENOMICA
Call Details Synergy Grants (SyG), SYG6, ERC-2013-SyG
Summary The classical view of genomes as linear sequences has been replaced by a vision of nuclear organization that is both dynamic and complex, with chromosomes and genes non-randomly positioned in the nucleus. Process compartmentalization and the spatial location of genes modulate the transcriptional output of genomes. However, how the interplay between genome structure and gene regulation is established and maintained is still unclear. The aim of this project is to explore whether the genome 3D structure acts as an information source for modulating transcription in response to external stimuli. With a genuine interdisciplinary team effort, we will study the conformation of the genome at various integrated levels, from the nucleosome fiber to the distribution of chromosome territories in the nuclear space. We will generate high-resolution 3D models of the spatial organization of the genomes of distinct eukaryotic cell types in interphase to identify differences in the chromatin landscape. We will follow the time course of structural changes in response to cues that affect gene expression either permanently or transiently. We will analyze the changes in genome structure during the stable trans-differentiation of immortalized B cells to macrophages and during the transient hormonal responses of differentiated cells. We plan to establish novel functional strategies, based on targeted and high-throughput reporter assays, to assess the relevance of the spatial environment on gene regulation. Using sophisticated modeling and computational approaches, we will combine high-resolution data from chromosome interactions, super-resolution images and omics information. Our long-term plan is to implement a 3D browser for the comprehensive mapping of chromatin properties and genomic features, to better understand how external signals are integrated at the genomic, epigenetic and structural level to orchestrate changes in gene expression that are cell specific and dynamic.
Max ERC Funding
12 272 645 €
Duration
Start date: 2014-06-01, End date: 2019-05-31
Project acronym a SMILE
Project analyse Soluble + Membrane complexes with Improved LILBID Experiments
Researcher (PI) Nina Morgner
Host Institution (HI) JOHANN WOLFGANG GOETHE-UNIVERSITATFRANKFURT AM MAIN
Call Details Starting Grant (StG), PE4, ERC-2013-StG
Summary Crucial processes within cells depend on specific non-covalent interactions which mediate the assembly of proteins and other biomolecules. Deriving structural information to understand the function of these complex systems is the primary goal of Structural Biology.
In this application, the recently developed LILBID method (Laser Induced Liquid Bead Ion Desorption) will be optimized for investigation of macromolecular complexes with a mass accuracy two orders of magnitude better than in 1st generation spectrometers.
Controlled disassembly of multiprotein complexes during the mass spectrometric analysis, while keeping the 3D structure intact, will allow for the determination of complex stoichiometry and the connectivity of the constituent proteins. Methods for such controlled disassembly will be developed in two separate units of the proposed LILBID spectrometer, a collision chamber and a laser dissociation chamber, enabling gas-phase dissociation of protein complexes and removal of excess water/buffer molecules. As a third unit, a chamber allowing determination of ion mobility (IM) will be integrated to determine collisional cross sections (CCS). From CCS, unique information regarding the spatial arrangement of proteins in complexes or subcomplexes will then be obtainable from LILBID.
The proposed design of the new spectrometer will offer fundamentally new possibilities for the investigation of non-covalent RNA, soluble and membrane protein complexes, as well as broadening the applicability of non-covalent MS towards supercomplexes.
Max ERC Funding
1 264 477 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym A-BINGOS
Project Accreting binary populations in Nearby Galaxies: Observations and Simulations
Researcher (PI) Andreas Zezas
Host Institution (HI) IDRYMA TECHNOLOGIAS KAI EREVNAS
Call Details Consolidator Grant (CoG), PE9, ERC-2013-CoG
Summary "High-energy observations of our Galaxy offer a good, albeit not complete, picture of the X-ray source populations, in particular the accreting binary sources. The recent ability to study accreting binaries in nearby galaxies has shown that we would be short-sighted if we restricted ourselves to our Galaxy or to a few nearby ones. I propose an ambitious project that involves a comprehensive study of all the galaxies within 10 Mpc for which we can study their X-ray sources and stellar populations in detail. The study will combine data from a unique suite of observatories (Chandra, XMM-Newton, HST, Spitzer) with state-of-the-art theoretical modelling of binary systems. I propose a novel approach that links the accreting binary populations to their parent stellar populations and surpasses current studies of X-ray binary populations, in both scale and scope, by: (a) combining methods and results from several different areas of astrophysics (compact objects, binary systems, stellar populations, galaxy evolution); (b) using data from almost the whole electromagnetic spectrum (infrared to X-ray bands); (c) identifying and studying the different sub-populations of accreting binaries; and (d) performing direct comparison between observations and theoretical predictions over a broad parameter space. The project: (a) will answer the long-standing question of the formation efficiency of accreting binaries in different environments; and (b) will constrain their evolutionary paths. As by-products, the project will provide eagerly awaited input to the fields of gravitational-wave sources, γ-ray bursts, and X-ray emitting galaxies at cosmological distances, and it will produce a heritage multi-wavelength dataset and library of models for future studies of galaxies and accreting binaries."
Max ERC Funding
1 242 000 €
Duration
Start date: 2014-04-01, End date: 2019-03-31
Project acronym A-HERO
Project Anthelmintic Research and Optimization
Researcher (PI) Jennifer Irene Keiser
Host Institution (HI) SCHWEIZERISCHES TROPEN- UND PUBLIC HEALTH-INSTITUT
Call Details Consolidator Grant (CoG), LS7, ERC-2013-CoG
Summary "I propose an ambitious, yet feasible 5-year research project that will fill an important gap in global health. Specifically, I will develop and validate novel approaches for anthelmintic drug discovery and development. My proposal pursues the following five research questions: (i) Is a chip calorimeter suitable for high-throughput screening in anthelmintic drug discovery? (ii) Is combination chemotherapy safe and more efficacious than monotherapy against strongyloidiasis and trichuriasis? (iii) What are the key pharmacokinetic parameters of praziquantel in preschool-aged children and school-aged children infected with Schistosoma mansoni and S. haematobium, using a novel and validated technology based on dried blood spotting? (iv) What are the metabolic consequences and clearance of praziquantel treatment in S. mansoni-infected mice and S. mansoni- and S. haematobium-infected children? (v) Which is the ideal compartment to study pharmacokinetic parameters for intestinal nematode infections, and do age, nutrition, co-infection and infection intensity influence the efficacy of anthelmintic drugs?
My proposed research is of considerable public health relevance since it will ultimately result in improved treatments for soil-transmitted helminthiasis and pediatric schistosomiasis. Additionally, by the end of this project, I will have generated comprehensive information on the drug disposition of anthelmintics. A comprehensive database of metabolite profiles following praziquantel treatment will be available. Finally, the proof-of-concept of chip calorimetry in anthelmintic drug discovery will have been established and broadly validated."
Max ERC Funding
1 927 350 €
Duration
Start date: 2014-05-01, End date: 2019-04-30
Project acronym A2C2
Project Atmospheric flow Analogues and Climate Change
Researcher (PI) Pascal Yiou
Host Institution (HI) COMMISSARIAT A L ENERGIE ATOMIQUE ET AUX ENERGIES ALTERNATIVES
Call Details Advanced Grant (AdG), PE10, ERC-2013-ADG
Summary "The A2C2 project treats two major challenges in climate and atmospheric research: the time-dependent response of the climate attractor to external forcings (solar, volcanic eruptions and anthropogenic), and the attribution of extreme climate events occurring in the northern extra-tropics. The main difficulties are the limited climate information, the computational cost of model simulations, and mathematical assumptions that are rarely verified and often overlooked in the literature.
A2C2 proposes a practical framework to overcome those three difficulties, linking the theory of dynamical systems and statistics. We will generalize the methodology of flow analogues to multiple databases in order to obtain probabilistic descriptions of analogue decompositions.
The project is divided into three workpackages (WP). WP1 embeds the analogue method in the theory of dynamical systems in order to provide a metric of attractor deformation in time. The important methodological step is to detect trends or persistent outliers in the dates and scores of analogues when the system is subject to time-varying forcings. This is done with idealized models and full-size climate models in which the forcings (anthropogenic and natural) are known.
A2C2 creates an open source toolkit to compute flow analogues from a wide array of databases (WP2). WP3 treats the two scientific challenges with the analogue method and multiple model ensembles, hence allowing uncertainty estimates under realistic mathematical hypotheses. The flow analogue methodology allows a systematic and quasi real-time analysis of extreme events, which is currently out of the reach of conventional climate modeling approaches.
The major breakthrough of A2C2 is to bridge the gap between operational needs (the immediate analysis of climate events) and the understanding of long-term climate change. A2C2 opens new research horizons for the exploitation of ensembles of simulations and reliable estimates of uncertainty."
Max ERC Funding
1 491 457 €
Duration
Start date: 2014-03-01, End date: 2019-02-28
Project acronym AAREA
Project The Archaeology of Agricultural Resilience in Eastern Africa
Researcher (PI) Daryl Stump
Host Institution (HI) UNIVERSITY OF YORK
Call Details Starting Grant (StG), SH6, ERC-2013-StG
Summary "The twin concepts of sustainability and conservation that are so pivotal within current debates regarding economic development and biodiversity protection both contain an inherent temporal dimension, since both refer to the need to balance short-term gains with long-term resource maintenance. Proponents of resilience theory and of development based on ‘indigenous knowledge’ have thus argued for the necessity of including archaeological, historical and palaeoenvironmental components within development project design. Indeed, some have argued that archaeology should lead these interdisciplinary projects on the grounds that it provides the necessary time depth and bridges the social and natural sciences. The project proposed here accepts this logic and endorses this renewed contemporary relevance of archaeological research. However, it also needs to be admitted that moving beyond critiques of the misuse of historical data presents significant hurdles. When presenting results outside the discipline, for example, archaeological projects tend to downplay the poor archaeological visibility of certain agricultural practices, and computer models designed to test sustainability struggle to adequately account for local cultural preferences. This field will therefore not progress unless there is a frank appraisal of archaeology’s strengths and weaknesses. This project will provide this assessment by employing a range of established and groundbreaking archaeological and modelling techniques to examine the development of two East African agricultural systems: one at the abandoned site of Engaruka in Tanzania, commonly seen as an example of resource mismanagement and ecological collapse; and another at the current agricultural landscape in Konso, Ethiopia, described by the UN FAO as one of a select few African “lessons from the past”.
The project thus aims to assess the sustainability of these systems, but will also assess the role archaeology can play in such debates worldwide."
Max ERC Funding
1 196 701 €
Duration
Start date: 2014-02-01, End date: 2018-01-31
Project acronym ABDESIGN
Project Computational design of novel protein function in antibodies
Researcher (PI) Sarel-Jacob Fleishman
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Starting Grant (StG), LS1, ERC-2013-StG
Summary We propose to elucidate the structural design principles of naturally occurring antibody complementarity-determining regions (CDRs) and to computationally design novel antibody functions. Antibodies represent the most versatile known system for molecular recognition. Research has yielded many insights into antibody design principles and promising biotechnological and pharmaceutical applications. Still, our understanding of how CDRs encode specific loop conformations lags far behind our understanding of structure-function relationships in non-immunological scaffolds. Thus, design of antibodies from first principles has not been demonstrated. We propose a computational-experimental strategy to address this challenge. We will: (a) characterize the design principles and sequence elements that rigidify antibody CDRs. Natural antibody loops will be subjected to computational modeling, crystallography, and a combined in vitro evolution and deep-sequencing approach to isolate sequence features that rigidify loop backbones; (b) develop a novel computational-design strategy, which uses the >1000 solved structures of antibodies deposited in structure databases to realistically model CDRs and design them to recognize proteins that have not been co-crystallized with antibodies. For example, we will design novel antibodies targeting insulin, for which clinically useful diagnostics are needed. By accessing much larger sequence/structure spaces than are available to natural immune-system repertoires and experimental methods, computational antibody design could produce higher-specificity and higher-affinity binders, even to challenging targets; and (c) develop new strategies to program conformational change in CDRs, generating, e.g., the first allosteric antibodies. 
These will allow targeting, in principle, of any molecule, potentially revolutionizing how antibodies are generated for research and medicine, providing new insights on the design principles of protein functional sites.
Max ERC Funding
1 499 930 €
Duration
Start date: 2013-09-01, End date: 2018-08-31
Project acronym ABEL
Project "Alpha-helical Barrels: Exploring, Understanding and Exploiting a New Class of Protein Structure"
Researcher (PI) Derek Neil Woolfson
Host Institution (HI) UNIVERSITY OF BRISTOL
Call Details Advanced Grant (AdG), LS9, ERC-2013-ADG
Summary "Recently, through de novo peptide design, we have discovered and presented a new protein structure. This is an all-parallel, 6-helix bundle with a continuous central channel of 0.5–0.6 nm diameter. We posit that this is one of a broader class of protein structures that we call the alpha-helical barrels. Here, in three Work Packages, we propose to explore these structures and to develop protein functions within them. First, through a combination of computer-aided design, peptide synthesis and thorough biophysical characterization, we will examine the extents and limits of the alpha-helical-barrel structures. Whilst this is curiosity-driven research, it also has practical consequences for the studies that will follow; that is, alpha-helical barrels made from increasing numbers of helices have channels or pores that increase in a predictable way. Second, we will use rational and empirical design approaches to engineer a range of functions within these cavities, including binding capabilities and enzyme-like activities. Finally, and taking the programme into another ambitious area, we will use the alpha-helical barrels to template other folds that are otherwise difficult to design and engineer, notably beta-barrels that insert into membranes to render ion-channel and sensor functions."
Max ERC Funding
2 467 844 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym ABSENS
Project Exploring the diagnostics market for simple and fast point-of-care antibody detection
Researcher (PI) M MERKX
Host Institution (HI) TECHNISCHE UNIVERSITEIT EINDHOVEN
Call Details Proof of Concept (PoC), PC1, ERC-2013-PoC
Summary Antibody detection assays are used in many fields of biomedicine including the diagnosis of infectious diseases, autoimmune diseases and allergies. Current analytical techniques for antibody detection come with intrinsic limitations such as the requirement for multiple time-consuming incubation steps, multiple reagents, and/or sophisticated equipment. Supported by an ERC consolidator grant we have developed a highly modular sensor concept for antibody-responsive reporter enzymes (AbSens) that addresses many of these challenges. Key advantages include the ability to monitor antibodies directly in solution, easy read-out based on a simple color reaction, adaptability to target any antibody of interest, and high affinity and specificity. We believe that this generic sensor platform could find applications in low-cost point-of-care diagnostics, clinical research, and the development of therapeutic antibodies.
The goal of AbSens is to identify those opportunities in the huge market of antibody-based diagnostics where our sensor platform provides unique advantages over existing technologies, both in terms of analytical performance and economics.
To enable the next step towards commercialization, the analytical performance of our technology will be compared to current gold standards using relevant clinical samples, in collaboration with commercial parties and clinicians. Other commercially important parameters are the long-term stability of the assay components and the development of a yeast-based production system to lower the cost of enzyme production. Based on an in-depth market analysis and the feedback we receive from external stakeholders on the performance of our technology, a realistic strategy will be developed for further commercialization. In anticipation of exploring the commercialization of our AbSens technology, we filed a US provisional patent application in Sept. 2012 on the key underlying technology, which was recently continued via the PCT route.
Max ERC Funding
150 000 €
Duration
Start date: 2014-09-01, End date: 2015-08-31
Project acronym ACCLIMATE
Project Elucidating the Causes and Effects of Atlantic Circulation Changes through Model-Data Integration
Researcher (PI) Claire Waelbroeck
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE10, ERC-2013-ADG
Summary Rapid changes in ocean circulation and climate have been observed in marine sediment and ice cores, notably over the last 60 thousand years (ky), highlighting the non-linear character of the climate system and underlining the possibility of rapid climate shifts in response to anthropogenic greenhouse gas forcing.
To date, these rapid changes in climate and ocean circulation are still not fully explained. Two main obstacles prevent going beyond the current state of knowledge:
- Paleoclimatic proxy data are in essence only indirect indicators of the climatic variables, and thus cannot be directly compared with model outputs;
- A 4-D (latitude, longitude, water depth, time) reconstruction of Atlantic water masses over the past 40 ky is lacking: previous studies have generated isolated records with disparate timescales which do not allow the causes of circulation changes to be identified.
Overcoming these two major limitations will lead to major breakthroughs in climate research. Concretely, I will create the first database of Atlantic deep-sea records over the last 40 ky, and extract full climatic information from these records through an innovative model-data integration scheme using an isotopic proxy forward modeling approach. The novelty and exceptional potential of this scheme is twofold: (i) it avoids hypotheses on proxy interpretation and hence suppresses or strongly reduces the errors of interpretation of paleoclimatic records; (ii) it produces states of the climate system that best explain the observations over the last 40 ky, while being consistent with the model physics.
Expected results include:
• The elucidation of the mechanisms explaining rapid changes in ocean circulation and climate over the last 40 ky,
• Improved climate model physics and parameterizations,
• The first projections of future climate changes obtained with a model able to reproduce the highly non-linear behavior of the climate system observed over the last 40 ky.
Max ERC Funding
3 000 000 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym ACDC
Project Algorithms and Complexity of Highly Decentralized Computations
Researcher (PI) Fabian Daniel Kuhn
Host Institution (HI) ALBERT-LUDWIGS-UNIVERSITAET FREIBURG
Call Details Starting Grant (StG), PE6, ERC-2013-StG
Summary "Many of today's and tomorrow's computer systems are built on top of large-scale networks such as, e.g., the Internet, the world wide web, wireless ad hoc and sensor networks, or peer-to-peer networks. Driven by technological advances, new kinds of networks and applications have become possible and we can safely assume that this trend is going to continue. Often modern systems are envisioned to consist of a potentially large number of individual components that are organized in a completely decentralized way. There is no central authority that controls the topology of the network, how nodes join or leave the system, or in which way nodes communicate with each other. Also, many future distributed applications will be built using wireless devices that communicate via radio.
The general objective of the proposed project is to improve our understanding of the algorithmic and theoretical foundations of decentralized distributed systems. From an algorithmic point of view, decentralized networks and computations pose a number of fascinating and unique challenges that are not present in sequential or more standard distributed systems. As communication is limited and mostly between nearby nodes, each node of a large network can only maintain a very restricted view of the global state of the system. This is particularly true if the network can change dynamically, either by nodes joining or leaving the system or if the topology changes over time, e.g., because of the mobility of the devices in case of a wireless network. Nevertheless, the nodes of a network need to coordinate in order to achieve some global goal.
In particular, we plan to study algorithms and lower bounds for basic computation and information dissemination tasks in such systems. In addition, we are particularly interested in the complexity of distributed computations in dynamic and wireless networks."
Max ERC Funding
1 148 000 €
Duration
Start date: 2013-11-01, End date: 2018-10-31
Project acronym ACOM
Project Commercial feasibility of microbial therapy
Researcher (PI) Willem Meindert DE VOS
Host Institution (HI) WAGENINGEN UNIVERSITY
Call Details Proof of Concept (PoC), PC1, ERC-2013-PoC
Summary Our body is colonized by complex microbial communities (our microbiome) that are most abundant in the intestinal tract, where they contribute significantly to our health and disease. It has been established that aberrations in our microbiome are of particular importance in obesity, type 2 diabetes and metabolic syndrome, rapidly growing diseases with a drug market volume of over $5 billion per year. We have discovered in the ERC project Microbes Inside that a particular bacterium is able to modify the intestinal microbiome and may be used to develop a new approach to treat these and other metabolic diseases. The Proof of Concept project ACOM aims to confirm the commercial and technological feasibility of this approach, consolidate and expand our IP position, and develop a product development plan. These form the elements of a business plan that is expected to result in establishing a spin-out company (ACOM).
Max ERC Funding
142 000 €
Duration
Start date: 2014-06-01, End date: 2015-05-31
Project acronym ACOPS
Project Advanced Coherent Ultrafast Laser Pulse Stacking
Researcher (PI) Jens Limpert
Host Institution (HI) FRIEDRICH-SCHILLER-UNIVERSITAT JENA
Call Details Consolidator Grant (CoG), PE2, ERC-2013-CoG
Summary "An important driver of scientific progress has always been the envisioning of applications far beyond existing technological capabilities. Such thinking creates new challenges for physicists, driven by the groundbreaking nature of the anticipated application. In the case of laser physics, one of these applications is laser wake-field particle acceleration and possible future uses thereof, such as in collider experiments, or for medical applications such as cancer treatment. To accelerate electrons and positrons to TeV energies, a laser architecture is required that allows for the combination of high efficiency, Petawatt peak powers, and Megawatt average powers. Developing such a laser system would be a challenging task that might take decades of aggressive research, development, and, most importantly, revolutionary approaches and innovative ideas.
The goal of the ACOPS project is to develop a compact, efficient, scalable, and cost-effective high-average and high-peak power ultra-short pulse laser concept.
The proposed approach to this goal relies on the spatially and temporally separated amplification of ultrashort laser pulses in waveguide structures, followed by coherent combination into a single train of pulses with increased average power and pulse energy. This combination can be realized through the coherent addition of the output beams of spatially separated amplifiers, combined with the pulse stacking of temporally separated pulses in passive enhancement cavities, employing a fast-switching element as cavity dumper.
Therefore, the three main tasks are the development of kW-class high-repetition-rate driving lasers, the investigation of non-steady state pulse enhancement in passive cavities, and the development of a suitable dumping element.
If successful, the proposed concept would undoubtedly provide a tool that would allow researchers to surpass the current limits in high-field physics and accelerator science."
Max ERC Funding
1 881 040 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym ACQDIV
Project Acquisition processes in maximally diverse languages: Min(d)ing the ambient language
Researcher (PI) Sabine Erika Stoll
Host Institution (HI) UNIVERSITAT ZURICH
Call Details Consolidator Grant (CoG), SH4, ERC-2013-CoG
Summary "Children learn any language that they grow up with, adapting to any of the ca. 7000 languages of the world, no matter how divergent or complex their structures are. What cognitive processes make this extreme flexibility possible? This is one of the most burning questions in cognitive science and the ACQDIV project aims at answering it by testing and refining the following leading hypothesis: Language acquisition is flexible and adaptive to any kind of language because it relies on a small set of universal cognitive processes that variably target different structures at different times during acquisition in every language. The project aims at establishing the precise set of processes and at determining the conditions of variation across maximally diverse languages. This project focuses on three processes: (i) distributional learning, (ii) generalization-based learning and (iii) interaction-based learning. To investigate these processes I will work with a sample of five clusters of languages including longitudinal data of two languages each. The clusters were determined by a clustering algorithm seeking the structurally most divergent languages in a typological database. The languages are: Cluster 1: Slavey and Cree, Cluster 2: Indonesian and Yucatec, Cluster 3: Inuktitut and Chintang, Cluster 4: Sesotho and Russian, Cluster 5: Japanese and Turkish. For all languages, corpora are available, except for Slavey where fieldwork is planned. The leading hypothesis will be tested against the acquisition of aspect and negation in each language of the sample and also against the two structures in each language that are most salient and challenging in them (e.g. complex morphology in Chintang).
I will examine these patterns across the sample with respect to repetitiveness effects, applying data-mining methods and systematically comparing child-directed and child-surrounding speech."
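The abstract mentions a clustering algorithm that seeks the structurally most divergent languages in a typological database, but does not specify it. As a hedged illustration of the general idea only, a farthest-first (max-min diversity) selection over binary typological feature vectors could look like this; the function names and the choice of Hamming distance are assumptions, not the project's actual method.

```python
def hamming(a, b):
    """Number of typological features on which two profiles differ."""
    return sum(x != y for x, y in zip(a, b))

def farthest_first(profiles, dist, k):
    """Greedy max-min diversity selection (farthest-first traversal).

    Start from the first profile, then repeatedly add the profile whose
    minimum distance to the already-chosen set is largest, so the chosen
    set spreads out as far as possible in feature space.
    """
    chosen = [profiles[0]]
    while len(chosen) < k:
        best = max((p for p in profiles if p not in chosen),
                   key=lambda p: min(dist(p, c) for c in chosen))
        chosen.append(best)
    return chosen
```

For example, among the profiles (0,0,0), (0,0,1), (1,1,0) and (1,1,1), selecting two representatives picks (0,0,0) and (1,1,1), the pair differing on all three features.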
Max ERC Funding
1 998 438 €
Duration
Start date: 2014-09-01, End date: 2019-08-31
Project acronym ACRCC
Project Understanding the atmospheric circulation response to climate change
Researcher (PI) Theodore Shepherd
Host Institution (HI) THE UNIVERSITY OF READING
Call Details Advanced Grant (AdG), PE10, ERC-2013-ADG
Summary Computer models based on known physical laws are our primary tool for predicting climate change. Yet state-of-the-art models exhibit a disturbingly wide range of predictions of future climate change, especially when examined at the regional scale, and this range has not decreased as the models have become more comprehensive. The reasons for this are not understood. This represents a basic challenge to our fundamental understanding of climate.
The divergence of model projections is presumably related to systematic model errors in the large-scale fluxes of heat, moisture and momentum that control regional aspects of climate. That these errors stubbornly persist in spite of increases in the spatial resolution of the models suggests that they are associated with errors in the representation of unresolved processes, whose effects must be parameterised.
Most attention in climate science has hitherto focused on the thermodynamic aspects of climate. Dynamical aspects, which involve the atmospheric circulation, have received much less attention. However regional climate, including persistent climate regimes and extremes, is strongly controlled by atmospheric circulation patterns, which exhibit chaotic variability and whose representation in climate models depends sensitively on parameterised processes. Moreover the dynamical aspects of model projections are much less robust than the thermodynamic ones. There are good reasons to believe that model bias, the divergence of model projections, and chaotic variability are somehow related, although the relationships are not well understood. This calls for studying them together.
My proposed research will focus on this problem, addressing these three aspects of the atmospheric circulation response to climate change in parallel: (i) diagnosing the sources of model error; (ii) elucidating the relationship between model error and the spread in model projections; (iii) understanding the physical mechanisms of atmospheric variability.
Max ERC Funding
2 489 151 €
Duration
Start date: 2014-03-01, End date: 2020-02-29
Project acronym ACROSS
Project 3D Reconstruction and Modeling across Different Levels of Abstraction
Researcher (PI) Leif Kobbelt
Host Institution (HI) RHEINISCH-WESTFAELISCHE TECHNISCHE HOCHSCHULE AACHEN
Call Details Advanced Grant (AdG), PE6, ERC-2013-ADG
Summary "Digital 3D models are gaining more and more importance in diverse application fields ranging from computer graphics, multimedia and simulation sciences to engineering, architecture, and medicine. Powerful technologies to digitize the 3D shape of real objects and scenes are becoming available even to consumers. However, the raw geometric data emerging from, e.g., 3D scanning or multi-view stereo often lacks a consistent structure and meta-information which are necessary for the effective deployment of such models in sophisticated downstream applications like animation, simulation, or CAD/CAM that go beyond mere visualization. Our goal is to develop new fundamental algorithms which transform raw geometric input data into augmented 3D models that are equipped with structural meta-information such as feature-aligned meshes, patch segmentations, local and global geometric constraints, statistical shape variation data, or even procedural descriptions. Our methodological approach is inspired by the human perceptual system that integrates bottom-up (data-driven) and top-down (model-driven) mechanisms in its hierarchical processing. Similarly we combine algorithms operating on different levels of abstraction into reconstruction and modeling networks. Instead of developing an individual solution for each specific application scenario, we create an ecosystem of algorithms for automatic processing and interactive design of highly complex 3D models. A key concept is the information flow across all levels of abstraction in a bottom-up as well as top-down fashion. We not only aim at optimizing geometric representations but in fact at bridging the gap between reconstruction and recognition of geometric objects. The results from this project will make it possible to bring 3D models of real world objects into many highly relevant applications in science, industry, and entertainment, greatly reducing the excessive manual effort that is still necessary today."
Max ERC Funding
2 482 000 €
Duration
Start date: 2014-03-01, End date: 2019-02-28
Project acronym Actanthrope
Project Computational Foundations of Anthropomorphic Action
Researcher (PI) Jean Paul Laumond
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE7, ERC-2013-ADG
Summary Actanthrope intends to promote a neuro-robotics perspective to explore original models of anthropomorphic action. The project targets contributions to humanoid robot autonomy (for rescue and service robotics), to advanced human body simulation (for applications in ergonomics), and to a new theory of embodied intelligence (by promoting a motion-based semiotics of the human action).
Actions take place in the physical space while they originate in the –robot or human– sensory-motor space. Geometry is the core abstraction that makes the link between these spaces. Considering that the structure of actions inherits from that of the body, the underlying intuition is that actions can be segmented within discrete sub-spaces lying in the entire continuous posture space. Such sub-spaces are viewed as symbols bridging deliberative reasoning and reactive control. Actanthrope argues for geometric approaches to motion segmentation and generation as promising and innovative routes to exploring embodied intelligence:
- Motion segmentation: what are the sub-manifolds that define the structure of a given action?
- Motion generation: among all the solution paths within a given sub-manifold, what is the underlying law that makes the selection?
In Robotics these questions are related to the competition between abstract symbol manipulation and physical signal processing. In Computational Neuroscience the questions refer to the quest of motion invariants. The ambition of the project is to promote a dual perspective: exploring the computational foundations of human action to make better robots, while simultaneously doing better robotics to better understand human action.
A unique “Anthropomorphic Action Factory” supports the methodology. It aims at attracting to a single lab researchers with complementary know-how and solid mathematical backgrounds. All of them will benefit from unique equipment, while being stimulated by four challenges dealing with locomotion and manipulation actions.
Max ERC Funding
2 500 000 €
Duration
Start date: 2014-01-01, End date: 2018-12-31
Project acronym ACTAR TPC
Project Active Target and Time Projection Chamber
Researcher (PI) Gwen Grinyer
Host Institution (HI) GRAND ACCELERATEUR NATIONAL D'IONS LOURDS
Call Details Starting Grant (StG), PE2, ERC-2013-StG
Summary The active target and time projection chamber (ACTAR TPC) is a novel gas-filled detection system that will permit new studies into the structure and decays of the most exotic nuclei. The use of a gas volume that acts as a sensitive detection medium and as the reaction target itself (an “active target”) offers considerable advantages over traditional nuclear physics detectors and techniques. In high-energy physics, TPC detectors have found profitable applications but their use in nuclear physics has been limited. With the ACTAR TPC design, individual detection pad sizes of 2 mm are the smallest ever attempted in either discipline but are a requirement for high-efficiency and high-resolution nuclear spectroscopy. The corresponding large number of electronic channels (16000 from a surface of only 25×25 cm) requires new developments in high-density electronics and data-acquisition systems that are not yet available in the nuclear physics domain. New experiments in regions of the nuclear chart that cannot be presently contemplated will become feasible with ACTAR TPC.
Max ERC Funding
1 290 000 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym ACUITY
Project Algorithms for coping with uncertainty and intractability
Researcher (PI) Nikhil Bansal
Host Institution (HI) TECHNISCHE UNIVERSITEIT EINDHOVEN
Call Details Consolidator Grant (CoG), PE6, ERC-2013-CoG
Summary The two biggest challenges in solving practical optimization problems are computational intractability and the presence of uncertainty: most problems are either NP-hard, or have incomplete input data which makes an exact computation impossible.
Recently, there has been huge progress in our understanding of intractability, based on spectacular algorithmic and lower-bound techniques. For several problems, especially those with only local constraints, we can design optimum approximation algorithms that are provably the best possible.
However, typical optimization problems usually involve complex global constraints and are much less understood. The situation is even worse for coping with uncertainty. Most of the algorithms are based on ad-hoc techniques and there is no deeper understanding of what makes various problems easy or hard.
This proposal describes several new directions, together with concrete intermediate goals, that will break important new ground in the theory of approximation and online algorithms. The particular directions we consider are (i) extend the primal-dual method to systematically design online algorithms, (ii) build a structural theory of online problems based on work functions, (iii) develop new tools to use the power of strong convex relaxations and (iv) design new algorithmic approaches based on non-constructive proof techniques.
The proposed research is at the cutting edge of algorithm design, and builds upon the recent success of the PI in resolving several longstanding questions in these areas. Any progress is likely to be a significant contribution to theoretical computer science and combinatorial optimization.
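As a concrete illustration of what an online algorithm coping with uncertainty looks like, here is a sketch of the classical break-even strategy for the ski-rental problem. It illustrates the genre, not any technique specific to this proposal; the function name and parameters are assumptions.

```python
def break_even_cost(buy_price, ski_days, rent_price=1):
    """Online ski-rental via the break-even rule.

    Each day, without knowing ski_days in advance, rent as long as the
    total rent paid stays below the purchase price; once renting again
    would reach it, buy. The total cost is guaranteed to be at most
    twice the offline optimum min(ski_days * rent_price, buy_price).
    """
    cost = 0
    rented_days = 0
    for _ in range(ski_days):
        if (rented_days + 1) * rent_price >= buy_price:
            return cost + buy_price  # buying now beats renting again
        cost += rent_price
        rented_days += 1
    return cost
```

With a buy price of 10 and unit rent, a skier who stops after 5 days pays 5 (optimal), while one who skis 100 days pays 9 days of rent plus the purchase, 19 in total, versus the offline optimum of 10: within the factor-2 guarantee.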
Max ERC Funding
1 519 285 €
Duration
Start date: 2014-05-01, End date: 2019-04-30
Project acronym ADAPT
Project Life in a cold climate: the adaptation of cereals to new environments and the establishment of agriculture in Europe
Researcher (PI) Terence Austen Brown
Host Institution (HI) THE UNIVERSITY OF MANCHESTER
Call Details Advanced Grant (AdG), SH6, ERC-2013-ADG
Summary "This project explores the concept of agricultural spread as analogous to enforced climate change and asks how cereals adapted to new environments when agriculture was introduced into Europe. Archaeologists have long recognized that the ecological pressures placed on crops would have had an impact on the spread and subsequent development of agriculture, but previously there has been no means of directly assessing the scale and nature of this impact. Recent work that I have directed has shown how such a study could be carried out, and the purpose of this project is to exploit these breakthroughs with the goal of assessing the influence of environmental adaptation on the spread of agriculture, its adoption as the primary subsistence strategy, and the subsequent establishment of farming in different parts of Europe. This will correct the current imbalance between our understanding of the human and environmental dimensions to the domestication of Europe. I will use methods from population genomics to identify loci within the barley and wheat genomes that have undergone selection since the beginning of cereal cultivation in Europe. I will then use ecological modelling to identify those loci whose patterns of selection are associated with ecogeographical variables and hence represent adaptations to local environmental conditions. I will assign dates to the periods when adaptations occurred by sequencing ancient DNA from archaeobotanical assemblages and by computer methods that enable the temporal order of adaptations to be deduced. 
I will then synthesise the information on environmental adaptations with dating evidence for the spread of agriculture in Europe, which reveals pauses that might be linked to environmental adaptation, with demographic data that indicate regions where Neolithic populations declined, possibly due to inadequate crop productivity, and with an archaeobotanical database showing changes in the prevalence of individual cereals in different regions."
Max ERC Funding
2 492 964 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym ADaPt
Project Adaptation, Dispersals and Phenotype: understanding the roles of climate,
natural selection and energetics in shaping global hunter-gatherer adaptability
Researcher (PI) Jay Stock
Host Institution (HI) THE CHANCELLOR MASTERS AND SCHOLARS OF THE UNIVERSITY OF CAMBRIDGE
Call Details Consolidator Grant (CoG), SH6, ERC-2013-CoG
Summary Relative to other species, humans are characterised by considerable biological diversity despite genetic homogeneity. This diversity is reflected in skeletal variation, but we lack sufficient understanding of the underlying mechanisms to adequately interpret the archaeological record. The proposed research will address problems in our current understanding of the origins of human variation in the past by: 1) documenting and interpreting the pattern of global hunter-gatherer variation relative to genetic phylogenies and climatic variation; 2) testing the relationship between environmental and skeletal variation among genetically related hunter-gatherers from different environments; 3) examining the adaptability of living humans to different environments through the study of energetic expenditure and life-history trade-offs associated with locomotion; and 4) investigating the relationship between muscle and skeletal variation associated with locomotion in diverse environments. This will be achieved by linking (a) detailed study of the global pattern of hunter-gatherer variation in the Late Pleistocene and Holocene with (b) ground-breaking experimental research that tests the relationship between energetic stress, muscle function, and bone variation in living humans. The first component tests the correspondence between skeletal variation and both genetic and climatic history, to infer the mechanisms driving variation. The second component integrates this skeletal variation with experimental studies of living humans to, for the first time, directly test the adaptive implications of skeletal variation observed in the past. ADaPt will provide the first links between prehistoric hunter-gatherer variation and the evolutionary parameters of life history and energetics that may have shaped our success as a species. It will lead to the breakthroughs necessary to interpret variation in the archaeological record relative to human dispersals and adaptation in the past.
Max ERC Funding
1 911 485 €
Duration
Start date: 2014-07-01, End date: 2019-06-30
Project acronym ADHESWITCHES
Project Adhesion switches in cancer and development: from in vivo to synthetic biology
Researcher (PI) Mari Johanna Ivaska
Host Institution (HI) TURUN YLIOPISTO
Call Details Consolidator Grant (CoG), LS3, ERC-2013-CoG
Summary Integrins are transmembrane cell adhesion receptors controlling cell proliferation and migration. Our objective is to gain fundamentally novel mechanistic insight into the emerging roles of integrins in cancer and to generate a road map of the integrin-dependent pathways critical in mammary gland development and integrin signalling, thus opening new targets for therapeutic intervention. We will combine an in vivo-based translational approach with cell and molecular biological studies aiming to identify entirely novel concepts in integrin function, using cutting-edge techniques and synthetic-biology tools.
The specific objectives are:
1) Integrin inactivation in branching morphogenesis and cancer invasion. Integrins regulate mammary gland development and cancer invasion but the role of integrin inactivating proteins in these processes is currently completely unknown. We will investigate this using genetically modified mice, ex-vivo organoid models and human tissues with the aim to identify beneficial combinational treatments against cancer invasion.
2) Endosomal adhesomes – cross-talk between integrin activity and integrin “inside-in signaling”. We hypothesize that endocytosed active integrins engage in specialized endosomal signaling that governs cell survival especially in cancer. RNAi cell arrays, super-resolution STED imaging and endosomal proteomics will be used to investigate integrin signaling in endosomes.
3) Spatio-temporal co-ordination of adhesion and endocytosis. Several cytosolic proteins compete for integrin binding to regulate activation, endocytosis and recycling. Photoactivatable protein-traps and predefined matrix micropatterns will be employed to mechanistically dissect the spatio-temporal dynamics and hierarchy of their recruitment.
We will employ innovative and unconventional techniques to address three major unanswered questions in the field and significantly advance our understanding of integrin function in development and cancer.
Max ERC Funding
1 887 910 €
Duration
Start date: 2014-05-01, End date: 2019-04-30
Project acronym AdOC
Project Advanced Optical Clocks
Researcher (PI) Sebastien André Marcel Bize
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE2, ERC-2013-CoG
Summary "The proposed research program has three main objectives. The first and second objectives are to seek extreme precision in optical atomic spectroscopy and optical clocks, and to use this quest as a means of exploration in atomic physics. The third objective is to explore new possibilities that stem from extreme precision. These goals will be pursued via three complementary activities: #1: Search for extreme precision with an Hg optical lattice clock. #2: Explore and exploit the rich Hg system, which is essentially unexplored in the cold and ultra-cold regime. #3: Identify new applications of clocks with extreme precision to Earth science. Clocks can directly measure the gravitational potential via Einstein’s gravitational redshift, leading to the idea of “clock-based geodesy”.
The first two activities are experimental and build on an existing setup with which we demonstrated the feasibility of an Hg optical lattice clock. Hg is chosen for its potential to surpass competing systems. We will investigate the unexplored physics of the Hg clock. This includes interactions between Hg atoms, lattice-induced light shifts, and the sensitivity to external fields that is specific to the atomic species. Beyond this, we will explore the fundamental limits of the optical lattice scheme. We will exploit other remarkable features of Hg associated with its high atomic number and diversity of stable isotopes. These features enable tests of fundamental physical laws, ultra-precise measurements of isotope shifts, measurement of collisional properties toward evaporative cooling and quantum gases of Hg, and investigation of forbidden transitions promising for measuring the nuclear anapole moment of Hg.
The third activity is theoretical and is aimed at initiating collaborations with experts in modelling Earth gravity. With this expertise, we will identify the most promising and realistic approaches for clocks and emerging remote comparison methods to contribute to geodesy, hydrology, oceanography, etc."
Max ERC Funding
1 946 432 €
Duration
Start date: 2014-04-01, End date: 2019-03-31
Project acronym ADOS
Project AMPA Receptor Dynamic Organization and Synaptic transmission in health and disease
Researcher (PI) Daniel Georges Gustave Choquet
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), LS5, ERC-2013-ADG
Summary AMPA glutamate receptors (AMPAR) play key roles in information processing by the brain, as they mediate nearly all fast excitatory synaptic transmission. Their spatio-temporal organization in the postsynapse with respect to presynaptic glutamate release sites is a key determinant of synaptic transmission. The activity-dependent regulation of AMPAR organization is at the heart of the synaptic plasticity processes underlying learning and memory. Dysfunction of synaptic transmission - hence of AMPAR organization - is likely at the origin of a number of brain diseases.
Building on discoveries made during my past ERC grant, our new ground-breaking objective is to uncover the mechanisms that link synaptic transmission with the dynamic organization of AMPAR and associated proteins. To this end, we have assembled a team of neurobiologists, computer scientists and chemists with a track record of collaboration. We will combine physiology and cellular and molecular neurobiology with the development of novel quantitative imaging and biomolecular tools to probe the molecular dynamics that regulate synaptic transmission.
Live high content 3D SuperResolution Light Imaging (SRLI) combined with electron microscopy will allow unprecedented visualization of AMPAR organization in synapses at the scale of individual subunits up to the level of intact tissue. Simultaneous SRLI and electrophysiology will elucidate the intricate relations between dynamic AMPAR organization, trafficking and synaptic transmission. Novel peptide- and small protein-based probes used as protein-protein interaction reporters and modulators will be developed to image and directly interfere with synapse organization.
We will identify new processes that are fundamental to activity dependent modifications of synaptic transmission. We will apply the above findings to understand the causes of early cognitive deficits in models of neurodegenerative disorders and open new avenues of research for innovative therapies.
Max ERC Funding
2 491 157 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym ADREEM
Project Adding Another Dimension – Arrays of 3D Bio-Responsive Materials
Researcher (PI) Mark Bradley
Host Institution (HI) THE UNIVERSITY OF EDINBURGH
Call Details Advanced Grant (AdG), LS9, ERC-2013-ADG
Summary This proposal is focused on the areas of chemical medicine and chemical biology, with the key drivers being the discovery and development of new materials that have practical functionality and application. The project will enable the fabrication of thousands of three-dimensional “smart polymers” that will allow: (i) the precise and controlled release of drugs upon the addition of a small-molecule trigger or in response to disease; (ii) the discovery of materials that control and manipulate cells, with the identification of scaffolds that provide the necessary biochemical cues for directing cell fate and driving tissue regeneration; and (iii) the development of new classes of “smart polymers” able, in real time, to sense and report bacterial contamination. The newly discovered materials will find multiple biomedical applications in regenerative medicine and biotechnology, ranging from 3D cell culture, bone repair and niche stabilisation to bacterial sensing/removal, while offering a new paradigm in drug delivery with biomarker-triggered drug release.
Max ERC Funding
2 310 884 €
Duration
Start date: 2014-11-01, End date: 2019-10-31
Project acronym AEROBIC
Project Assessing the Effects of Rising O2 on Biogeochemical Cycles: Integrated Laboratory Experiments and Numerical Simulations
Researcher (PI) Itay Halevy
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Starting Grant (StG), PE10, ERC-2013-StG
Summary The rise of atmospheric O2 ~2,500 million years ago is one of the most profound transitions in Earth's history. Yet, despite its central role in shaping Earth's surface environment, the cause for the rise of O2 remains poorly understood. Tight coupling between the O2 cycle and the biogeochemical cycles of redox-active elements, such as C, Fe and S, implies radical changes in these cycles before, during and after the rise of O2. These changes, too, are incompletely understood, but have left valuable information encoded in the geological record. This information has been qualitatively interpreted, leaving many aspects of the rise of O2, including its causes and constraints on ocean chemistry before and after it, topics of ongoing research and debate. Here, I outline a research program to address this fundamental question in geochemical Earth systems evolution. The inherently interdisciplinary program uniquely integrates laboratory experiments, numerical models, geological observations, and geochemical analyses. Laboratory experiments and geological observations will constrain unknown parameters of the early biogeochemical cycles, and, in combination with field studies, will validate and refine the use of paleoenvironmental proxies. The insight gained will be used to develop detailed models of the coupled biogeochemical cycles, which will themselves be used to quantitatively understand the events surrounding the rise of O2, and to illuminate the dynamics of elemental cycles in the early oceans.
This program is expected to yield novel, quantitative insight into these important events in Earth history and to have a major impact on our understanding of early ocean chemistry and the rise of O2. An ERC Starting Grant will enable me to use the excellent experimental and computational facilities at my disposal, to access the outstanding human resource at the Weizmann Institute of Science, and to address one of the major open questions in modern geochemistry.
Max ERC Funding
1 472 690 €
Duration
Start date: 2013-09-01, End date: 2018-08-31
Project acronym AEROCAT
Project Non-ordered nanoparticle superstructures – aerogels as efficient (electro-)catalysts
Researcher (PI) Alexander Eychmüller
Host Institution (HI) TECHNISCHE UNIVERSITAET DRESDEN
Call Details Advanced Grant (AdG), PE5, ERC-2013-ADG
Summary "AEROCAT aims at the elucidation of the potential of nanoparticle derived aerogels in catalytic applications. The materials will be produced from a variety of nanoparticles available in colloidal solutions, amongst which are metals and metal oxides. The evolving aerogels are extremely light, highly porous solids and have been demonstrated to exhibit in many cases the important properties of the nanosized objects they consist of instead of simply those of the respective bulk solids. The resulting aerogel materials will be characterized with respect to their morphology and composition and their resulting (electro-)catalytic properties examined in the light of the inherent electronic nature of the nanosized constituents. Using the knowledge gained within the project the aerogel materials will be further re-processed in order to exploit their full potential relevant to catalysis and electrocatalysis.
From the vast variety of possible applications of nanoparticle-based hydro- and aerogels, such as thermoelectrics, LEDs, pollutant clearance and sensing, we choose our strictly focused approach:
(i) because of the paramount importance of catalysis for the chemical industry;
(ii) because we have successfully studied ethanol electrooxidation on a Pd-nanoparticle aerogel;
(iii) because we have patented bimetallic aerogels for the oxygen reduction reaction in fuel cells;
(iv) and because we have obtained first and extremely promising results on the semi-hydrogenation of acetylene on a mixed Pd/ZnO-nanoparticle aerogel.
With this we are at the forefront of a research field whose impact can hardly be overestimated. We should quickly explore its potential and rapidly transfer the knowledge gained into pre-industrial testing."
Max ERC Funding
2 194 000 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym AFMIDMOA
Project "Applying Fundamental Mathematics in Discrete Mathematics, Optimization, and Algorithmics"
Researcher (PI) Alexander Schrijver
Host Institution (HI) UNIVERSITEIT VAN AMSTERDAM
Call Details Advanced Grant (AdG), PE1, ERC-2013-ADG
Summary "This proposal aims at strengthening the connections between more fundamentally oriented areas of mathematics like algebra, geometry, analysis, and topology, and the more application-oriented and more recently emerged disciplines of discrete mathematics, optimization, and algorithmics.
The overall goal of the project is to obtain, with methods from fundamental mathematics, new effective tools to unravel the complexity of structures like graphs, networks, codes, knots, polynomials, and tensors, and to get a grip on such complex structures by new efficient characterizations, sharper bounds, and faster algorithms.
In the last few years, there have been several new developments where methods from representation theory, invariant theory, algebraic geometry, measure theory, functional analysis, and topology found new applications in discrete mathematics and optimization, both theoretically and algorithmically. Among the typical application areas are networks, coding, routing, timetabling, statistical and quantum physics, and computer science.
The project focuses in particular on:
A. Understanding partition functions with invariant theory and algebraic geometry
B. Graph limits, regularity, Hilbert spaces, and low rank approximation of polynomials
C. Reducing complexity in optimization by exploiting symmetry with representation theory
D. Reducing complexity in discrete optimization by homotopy and cohomology
These research modules are interconnected by themes like symmetry, regularity, and complexity, and by common methods from algebra, analysis, geometry, and topology."
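In its simplest graph setting, the "partition function" of module A is a sum over all vertex colourings of a product of edge weights; with weight 1 on edges whose endpoints differ and 0 otherwise, it counts proper q-colourings. A brute-force toy illustration (not the project's own methods):

```python
from itertools import product

# Graph partition function in its simplest form: sum over all colourings
# of a product of edge weights that are 1 when the endpoints get different
# colours and 0 when they agree -- i.e. the number of proper q-colourings.
def partition_function(vertices, edges, q):
    total = 0
    for colouring in product(range(q), repeat=len(vertices)):
        colour = dict(zip(vertices, colouring))
        if all(colour[u] != colour[v] for u, v in edges):
            total += 1
    return total

# Triangle graph: q*(q-1)*(q-2) proper colourings.
triangle = (["a", "b", "c"], [("a", "b"), ("b", "c"), ("a", "c")])
partition_function(*triangle, q=3)  # -> 6
```

The brute force is exponential in the number of vertices; taming exactly this kind of complexity with invariant theory and algebraic geometry is what module A is about.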
Max ERC Funding
2 001 598 €
Duration
Start date: 2014-01-01, End date: 2018-12-31
Project acronym AGESPACE
Project SPATIAL NAVIGATION – A UNIQUE WINDOW INTO MECHANISMS OF COGNITIVE AGEING
Researcher (PI) Thomas Wolbers
Host Institution (HI) DEUTSCHES ZENTRUM FUR NEURODEGENERATIVE ERKRANKUNGEN EV
Call Details Starting Grant (StG), SH4, ERC-2013-StG
Summary "By 2040, the European population aged over 60 will rise to 290 million, with the number estimated to have dementia reaching 15.9 million. These dramatic demographic changes will pose huge challenges to health care systems; hence, a detailed understanding of age-related cognitive and neurobiological changes is essential for helping elderly populations maintain independence. However, while existing research into cognitive ageing has carefully characterised developmental trajectories of functions such as memory and processing speed, one key cognitive ability that is particularly relevant to everyday functioning has received very little attention: in surveys, elderly people often report substantial declines in navigational abilities, such as problems with finding one’s way in a novel environment. Such deficits severely restrict the mobility of elderly people and affect physical activity and social participation, but the underlying behavioural and neuronal mechanisms are poorly understood.
In this proposal, I will take a new approach to cognitive ageing that will bridge the gap between animal neurobiology and human cognitive neuroscience. With support from the ERC, I will create a team that will characterise the mechanisms mediating age-related changes in navigational processing in humans. The project will focus on three structures that perform key computations for spatial navigation, form a closely interconnected triadic network, and are particularly sensitive to the ageing process. Crucially, the team will employ an interdisciplinary methodological approach that combines mathematical modelling, brain imaging and innovative data analysis techniques with novel virtual environment technology, which allows for rigorous testing of predictions derived from animal findings. Finally, the proposal also incorporates a translational project aimed at improving spatial mnemonic functioning with a behavioural intervention, which provides a direct test of functional relevance and societal impact."
Max ERC Funding
1 318 990 €
Duration
Start date: 2014-01-01, End date: 2018-12-31
Project acronym ALEM
Project ADDITIONAL LOSSES IN ELECTRICAL MACHINES
Researcher (PI) Matti Antero Arkkio
Host Institution (HI) AALTO KORKEAKOULUSAATIO SR
Call Details Advanced Grant (AdG), PE8, ERC-2013-ADG
Summary "Electrical motors consume about 40 % of the electrical energy produced in the European Union. About 90 % of this energy is converted to mechanical work. However, 0.5-2.5 % of it goes to so-called additional load losses whose exact origins are unknown. Our ambitious aim is to reveal the origins of these losses, build numerical tools for modeling them and optimize electrical motors to minimize the losses.
As the hypothesis of the research, we assume that the additional losses mainly result from the deterioration of the core materials during the manufacturing process of the machine. By calorimetric measurements, we have found that the core losses of electrical machines may be twice as large as comprehensive loss models predict. The electrical steel sheets are punched, welded together and shrink-fitted to the frame. This causes residual strains in the core sheets, deteriorating their magnetic characteristics. The cutting burrs make galvanic contacts between the sheets and form paths for inter-lamination currents. Another potential source of additional losses is the circulating currents between the parallel strands of random-wound armature windings. The stochastic nature of these potential sources of additional losses adds further challenge to the research.
We shall develop a physical loss model that couples the mechanical strains and electromagnetic losses in electrical steel sheets and apply the new model for comprehensive loss analysis of electrical machines. The stochastic variables related to the core losses and circulating-current losses will be discretized together with the temporal and spatial discretization of the electromechanical field variables. The numerical stochastic loss model will be used to search for such machine constructions that are insensitive to the manufacturing defects. We shall validate the new numerical loss models by electromechanical and calorimetric measurements."
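The headline percentages can be put in proportion with a back-of-the-envelope calculation; the EU consumption figure below is an illustrative placeholder, not a value from the proposal:

```python
# Rough scale of the additional losses described above: motors take ~40%
# of EU electricity, and 0.5-2.5% of that motor energy is lost as
# unexplained additional load losses.
EU_ELECTRICITY_TWH = 2800.0       # illustrative annual EU consumption, TWh
MOTOR_SHARE = 0.40                # fraction consumed by electrical motors
ADDITIONAL_LOSS = (0.005, 0.025)  # additional-loss fraction of motor energy

motor_energy = EU_ELECTRICITY_TWH * MOTOR_SHARE
low, high = (motor_energy * f for f in ADDITIONAL_LOSS)
print(f"Additional losses: {low:.0f}-{high:.0f} TWh per year")
# With these assumed inputs: 6-28 TWh per year.
```

Even at the low end of the range, the unexplained losses amount to several terawatt-hours per year, which is why pinning down their origin matters.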
Max ERC Funding
2 489 949 €
Duration
Start date: 2014-03-01, End date: 2019-02-28
Project acronym ALERT
Project ALERT - The Apertif-LOFAR Exploration of the Radio Transient Sky
Researcher (PI) Albert Van Leeuwen
Host Institution (HI) STICHTING ASTRON, NETHERLANDS INSTITUTE FOR RADIO ASTRONOMY
Call Details Consolidator Grant (CoG), PE9, ERC-2013-CoG
Summary "In our largely unchanging radio Universe, a highly dynamic component was recently discovered: flashes of bright radio emission that last only milliseconds but appear all over the sky. Some of these radio bursts can be traced to intermittently pulsating neutron stars. Other bursts, however, apparently originate far outside our Galaxy. Due to great observational challenges, the evolution of the neutron stars is not understood, while, more importantly, the nature of the extragalactic bursts remains an outright mystery.
My overall aim is to understand the physics that drives both kinds of brief and luminous bursts.
My primary goal is to identify the highly compact astrophysical explosions powering the extragalactic bursts. My previous surveys are the state of the art in fast-transient detection; I will now increase by a factor of 10 this exploration volume. In real-time I will provide arcsec positions, 10,000-fold more accurate than currently possible, to localize such extragalactic bursts for the first time and understand their origin.
My secondary goal is to unravel the unexplained evolution of intermittently pulsating neutron stars (building on e.g., my recent papers in Science, 2013), by doubling their number and modeling their population.
To achieve these goals, I will carry out a highly innovative survey: the Apertif-LOFAR Exploration of the Radio Transient Sky. ALERT is over an order of magnitude more sensitive than all current state-of-the-art fast-transient surveys.
Through its novel, extremely wide field of view, Westerbork/Apertif will detect many tens of extragalactic bursts. Through real-time triggers to LOFAR I will next provide the precise localisation that is essential for radio, optical and high-energy follow-up to, for the first time, shed light on the physics and objects driving these bursts – evaporating primordial black holes; explosions in host galaxies; or, the unknown?"
Max ERC Funding
1 999 823 €
Duration
Start date: 2014-12-01, End date: 2019-11-30
Project acronym ALEXANDRIA
Project "Foundations for Temporal Retrieval, Exploration and Analytics in Web Archives"
Researcher (PI) Wolfgang Nejdl
Host Institution (HI) GOTTFRIED WILHELM LEIBNIZ UNIVERSITAET HANNOVER
Call Details Advanced Grant (AdG), PE6, ERC-2013-ADG
Summary "Significant parts of our cultural heritage are produced on the Web, yet only insufficient opportunities exist for accessing and exploring the past of the Web. The ALEXANDRIA project aims to develop models, tools and techniques necessary to archive and index relevant parts of the Web, and to retrieve and explore this information in a meaningful way. While the easy accessibility to the current Web is a good baseline, optimal access to Web archives requires new models and algorithms for retrieval, exploration, and analytics which go far beyond what is needed to access the current state of the Web. This includes taking into account the unique temporal dimension of Web archives, structured semantic information already available on the Web, as well as social media and network information.
Within ALEXANDRIA, we will significantly advance semantic and time-based indexing for Web archives using human-compiled knowledge available on the Web, to efficiently index, retrieve and explore information about entities and events from the past. In doing so, we will focus on the concurrent evolution of this knowledge and the Web content to be indexed, and take into account the diversity and incompleteness of this knowledge. We will further investigate mixed crowd- and machine-based Web analytics to support long-running and collaborative retrieval and analysis processes on Web archives. The use of implicit human feedback will be essential to provide better indexing through insights gained during the analysis process and to better focus the harvesting of content.
The ALEXANDRIA Testbed will provide an important context for research, exploration and evaluation of the concepts, methods and algorithms developed in this project, and will provide both relevant collections and algorithms that enable further research on and practical application of our research results to existing archives like the Internet Archive, the Internet Memory Foundation and Web archives maintained by European national libraries."
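One ingredient of the time-based indexing described above can be pictured as an inverted index whose postings carry validity intervals, so that queries are restricted to a crawl-time range. A toy sketch only; the class and method names are invented for illustration and the real system is far richer:

```python
from collections import defaultdict

# Toy temporal inverted index: each posting records the interval
# (first_seen, last_seen) during which a term appeared in an archived
# page, so a query can be restricted to the archive's time axis.
class TemporalIndex:
    def __init__(self):
        self.postings = defaultdict(list)  # term -> [(doc, start, end)]

    def add(self, doc, term, start, end):
        self.postings[term].append((doc, start, end))

    def search(self, term, t_from, t_to):
        """Documents containing `term` at some point within [t_from, t_to]."""
        return sorted({doc for doc, s, e in self.postings[term]
                       if s <= t_to and e >= t_from})

idx = TemporalIndex()
idx.add("page1", "olympics", 2004, 2006)
idx.add("page2", "olympics", 2010, 2014)
idx.search("olympics", 2005, 2008)  # -> ["page1"]
```

The interval-overlap test (`s <= t_to and e >= t_from`) is what distinguishes this from a plain inverted index: the same query over a different time window returns a different slice of the archive.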
Max ERC Funding
2 493 600 €
Duration
Start date: 2014-03-01, End date: 2019-02-28
Project acronym ALUNIF
Project Algorithms and Lower Bounds: A Unified Approach
Researcher (PI) Rahul Santhanam
Host Institution (HI) THE CHANCELLOR, MASTERS AND SCHOLARS OF THE UNIVERSITY OF OXFORD
Call Details Consolidator Grant (CoG), PE6, ERC-2013-CoG
Summary One of the fundamental goals of theoretical computer science is to understand the possibilities and limits of efficient computation. This quest has two dimensions. The theory of algorithms focuses on finding efficient solutions to problems, while computational complexity theory aims to understand when and why problems are hard to solve. These two areas have different philosophies and use different sets of techniques. However, in recent years there have been indications of deep and mysterious connections between them.
In this project, we propose to explore and develop the connections between algorithmic analysis and complexity lower bounds in a systematic way. On the one hand, we plan to use complexity lower bound techniques as inspiration to design new and improved algorithms for Satisfiability and other NP-complete problems, as well as to analyze existing algorithms better. On the other hand, we plan to strengthen implications yielding circuit lower bounds from non-trivial algorithms for Satisfiability, and to derive new circuit lower bounds using these stronger implications.
This project has potential for massive impact in both the areas of algorithms and computational complexity. Improved algorithms for Satisfiability could lead to improved SAT solvers, and the new analytical tools would lead to a better understanding of existing heuristics. Complexity lower bound questions are fundamental but notoriously difficult, and new lower bounds would open the way to unconditionally secure cryptographic protocols and derandomization of probabilistic algorithms. More broadly, this project aims to initiate greater dialogue between the two areas, with an exchange of ideas and techniques which leads to accelerated progress in both, as well as a deeper understanding of the nature of efficient computation.
Max ERC Funding
1 274 496 €
Duration
Start date: 2014-03-01, End date: 2019-02-28
Project acronym AMAIZE
Project Atlas of leaf growth regulatory networks in MAIZE
Researcher (PI) Dirk Gustaaf Inzé
Host Institution (HI) VIB
Call Details Advanced Grant (AdG), LS9, ERC-2013-ADG
Summary "Understanding how organisms regulate size is one of the most fascinating open questions in biology. The aim of the AMAIZE project is to unravel how the growth of maize leaves is controlled. Maize leaf development offers great opportunities to study the dynamics of growth regulatory networks, essentially because leaf development is a linear system, with cell division at the leaf base followed by cell expansion and maturation. Furthermore, the growth zone is relatively large, allowing easy access to tissues at different positions. Four different perturbations of maize leaf size will be analyzed with cellular resolution: wild-type and plants having larger leaves (as a consequence of GA20OX1 overexpression), both grown under either well-watered or mild drought conditions. Firstly, a 3D cellular map of the growth zone of the fourth leaf will be made. RNA-seq of three different tissues (adaxial and abaxial epidermis; mesophyll) obtained by laser dissection at intervals of 2.5 mm along the growth zone will allow for high-resolution analysis of the transcriptome. Additionally, the composition of fifty selected growth regulatory protein complexes and the DNA targets of transcription factors will be determined at intervals of 5 mm along the growth zone. Computational methods will be used to construct comprehensive integrative maps of the cellular and molecular processes occurring along the growth zone. Finally, selected regulatory nodes of the growth regulatory networks will be further functionally analyzed using a transactivation system in maize.
AMAIZE opens up new perspectives for the identification of optimal growth regulatory networks that can be selected for by advanced breeding, or for which more robust variants (e.g. with reduced susceptibility to drought) can be obtained through genetic engineering. The ability to improve the growth of maize and, by analogy, other cereals could have a high impact on providing food security."
Max ERC Funding
2 418 429 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym AMD
Project Algorithmic Mechanism Design: Beyond Truthful Mechanisms
Researcher (PI) Michal Feldman
Host Institution (HI) TEL AVIV UNIVERSITY
Call Details Starting Grant (StG), PE6, ERC-2013-StG
Summary "The first decade of Algorithmic Mechanism Design (AMD) concentrated, very successfully, on the design of truthful mechanisms for the allocation of resources among agents with private preferences.
Truthful mechanisms are ones that incentivize rational users to report their preferences truthfully.
Truthfulness, however, for all its theoretical appeal, suffers from several inherent limitations, mainly its high communication and computation complexities.
It is not surprising, therefore, that practical applications forego truthfulness and use simpler mechanisms instead.
Simplicity in itself, however, is not sufficient, as any meaningful mechanism should also have some notion of fairness; otherwise agents will stop using it over time.
In this project I plan to develop an innovative AMD theoretical framework that will go beyond truthfulness and focus instead on the natural themes of simplicity and fairness, in addition to computational tractability.
One of my primary goals will be the design of simple and fair poly-time mechanisms that perform at near optimal levels with respect to important economic objectives such as social welfare and revenue.
To this end, I will work toward providing precise definitions of simplicity and fairness and quantifying the effects of these restrictions on the performance levels that can be obtained.
A major challenge in the evaluation of non-truthful mechanisms is defining a reasonable behavior model that will enable their evaluation.
The success of this project could have a broad impact on Europe and beyond, as it would guide the design of natural mechanisms for markets of tens of billions of dollars in revenue, such as online advertising, or sales of wireless frequencies.
The timing of this project is ideal, as the AMD field is now sufficiently mature to lead to a breakthrough and at the same time young enough to be receptive to new approaches and themes."
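The notion of truthfulness discussed above is classically illustrated by the single-item second-price (Vickrey) auction, in which bidding one's true value is a dominant strategy. A minimal sketch of that textbook mechanism (not one of the project's own constructions):

```python
# Single-item second-price (Vickrey) auction: the highest bidder wins
# but pays the second-highest bid. Under this rule, reporting one's true
# value is a dominant strategy -- the textbook truthful mechanism.

def vickrey_auction(bids):
    """bids: dict mapping bidder -> reported value. Returns (winner, price)."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner, _ = ranked[0]
    price = ranked[1][1] if len(ranked) > 1 else 0.0
    return winner, price

def utility(true_value, bids, bidder):
    """Quasi-linear utility of `bidder` whose true value is `true_value`."""
    winner, price = vickrey_auction(bids)
    return true_value - price if winner == bidder else 0.0

# Truth-telling is optimal: no deviation improves the bidder's utility.
others = {"b": 7.0, "c": 5.0}
v_a = 9.0
truthful = utility(v_a, {**others, "a": v_a}, "a")
for deviation in [4.0, 6.0, 8.0, 10.0, 20.0]:
    assert utility(v_a, {**others, "a": deviation}, "a") <= truthful
```

The "simpler mechanisms used in practice" that the project targets (e.g. first-price rules) drop exactly this property, which is why a behavior model for non-truthful bidding becomes the central evaluation challenge.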
Max ERC Funding
1 394 600 €
Duration
Start date: 2013-11-01, End date: 2018-10-31
Project acronym AngioBone
Project Angiogenic growth, specialization, ageing and regeneration
of bone vessels
Researcher (PI) Ralf Heinrich Adams
Host Institution (HI) WESTFAELISCHE WILHELMS-UNIVERSITAET MUENSTER
Call Details Advanced Grant (AdG), LS3, ERC-2013-ADG
Summary The skeleton and the sinusoidal vasculature form a functional unit with great relevance in health, regeneration, and disease. Currently, fundamental aspects of sinusoidal vessel growth, specialization, arteriovenous organization and the consequences for tissue perfusion, or the changes occurring during ageing remain unknown. Our preliminary data indicate that key principles of bone vascularization and the role of molecular regulators are highly distinct from other organs. I therefore propose to use a powerful combination of mouse genetics, fate mapping, transcriptional profiling, computational biology, confocal and two-photon microscopy, micro-CT and PET imaging, biochemistry and cell biology to characterize the growth, differentiation, dynamics, and ageing of the bone vasculature. In addition to established angiogenic pathways, the role of highly promising novel candidate regulators will be investigated in endothelial cells and perivascular osteoprogenitors with sophisticated inducible and cell type-specific genetic methods in the mouse. Complementing these powerful in vivo approaches, 3D co-cultures generated by cell printing technologies will provide insight into the communication between different cell types. The dynamics of sinusoidal vessel growth and regeneration will be monitored by two-photon imaging in the skull. Finally, I will explore the architectural, cellular and molecular changes and the role of capillary endothelial subpopulations in the sinusoidal vasculature of ageing and osteoporotic mice.
Technological advancements, such as new transgenic strains, mutant models or cell printing approaches, are important aspects of this proposal. AngioBone will provide a first conceptual framework for normal and deregulated function of the bone sinusoidal vasculature. It will also break new ground by analyzing the role of blood vessels in ageing and identifying novel strategies for tissue engineering and, potentially, the prevention/treatment of osteoporosis.
Max ERC Funding
2 478 750 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym ANOBEST
Project Structure function and pharmacology of calcium-activated chloride channels: Anoctamins and Bestrophins
Researcher (PI) Raimund Dutzler
Host Institution (HI) UNIVERSITAT ZURICH
Call Details Advanced Grant (AdG), LS1, ERC-2013-ADG
Summary Calcium-activated chloride channels (CaCCs) play key roles in a range of physiological processes such as the control of membrane excitability, photoreception and epithelial secretion. Although the importance of these channels has been recognized for more than 30 years, their molecular identity remained obscure. The recent discovery of two protein families encoding CaCCs, Anoctamins and Bestrophins, was a scientific breakthrough that has provided first insight into two novel ion channel architectures. Within this proposal we aim to determine the first high-resolution structures of members of both families and study their functional behavior by an interdisciplinary approach combining biochemistry, X-ray crystallography and electrophysiology. The structural investigation of eukaryotic membrane proteins is extremely challenging and will require us to investigate large numbers of candidates to single out family members with superior biochemical properties. During the last year we have made substantial progress in this direction. By screening numerous eukaryotic Anoctamins and prokaryotic Bestrophins we have identified well-behaved proteins for both families, which were successfully scaled up and purified. Additional family members will be identified within the course of the project. For these stable proteins we plan to grow crystals diffracting to high resolution and to proceed with structure determination. With first structural information in hand we will perform detailed functional studies using electrophysiology and complementary biophysical techniques to gain mechanistic insight into ion permeation and gating. As the pharmacology of both families is still in its infancy, we will in later stages also engage in the identification and characterization of inhibitors and activators of Anoctamins and Bestrophins to open up a field that may ultimately lead to the discovery of novel therapeutic strategies targeting calcium-activated chloride channels.
Max ERC Funding
2 176 000 €
Duration
Start date: 2014-02-01, End date: 2020-01-31
Project acronym ANTIVIRNA
Project Structural and mechanistic studies of RNA-guided and RNA-targeting antiviral defense pathways
Researcher (PI) Martin Jinek
Host Institution (HI) UNIVERSITAT ZURICH
Call Details Starting Grant (StG), LS1, ERC-2013-StG
Summary The evolutionary pressures exerted by viruses on their host cells constitute a major force that drives the evolution of cellular antiviral mechanisms. The proposed research is motivated by our interest in the roles of protein-RNA interactions in both prokaryotic and eukaryotic antiviral pathways and will proceed in two directions. The first project stems from our current work on the CRISPR pathway, a recently discovered RNA-guided adaptive defense mechanism in bacteria and archaea that silences mobile genetic elements such as viruses (bacteriophages) and plasmids. CRISPR systems rely on short RNAs (crRNAs) that associate with CRISPR-associated (Cas) proteins and function as sequence-specific guides in the detection and destruction of invading nucleic acids. To obtain molecular insights into the mechanisms of crRNA-guided interference, we will pursue structural and functional studies of DNA-targeting ribonucleoprotein complexes from type II and III CRISPR systems. Our work will shed light on the function of these systems in microbial pathogenesis and provide a framework for the informed engineering of RNA-guided gene targeting technologies. The second proposed research direction centres on RNA-targeting antiviral strategies employed by the human innate immune system. Here, our work will focus on structural studies of major interferon-induced effector proteins, initially examining the allosteric activation mechanism of RNase L and subsequently focusing on other antiviral nucleases and RNA helicases, as well as mechanisms by which RNA viruses evade the innate immune response of the host. In our investigations, we plan to approach these questions using an integrated strategy combining structural biology, biochemistry and biophysics with cell-based functional studies. Together, our studies will provide fundamental molecular insights into RNA-centred antiviral mechanisms and their impact on human health and disease.
Max ERC Funding
1 467 180 €
Duration
Start date: 2013-11-01, End date: 2018-10-31
Project acronym AOC
Project Adversary-Oriented Computing
Researcher (PI) Rachid Guerraoui
Host Institution (HI) ECOLE POLYTECHNIQUE FEDERALE DE LAUSANNE
Call Details Advanced Grant (AdG), PE6, ERC-2013-ADG
Summary "Recent technological evolutions, including the cloud, the multicore, the social and the mobile ones, are turning computing ubiquitously distributed. Yet, building high-assurance distributed programs is notoriously challenging. One of the main reasons is that these systems usually seek to achieve several goals at the same time. In short, they need to be efficient, responding effectively in various average-case conditions, as well as reliable, behaving correctly in severe, worst-case conditions. As a consequence, they typically intermingle different strategies: each to cope with some specific condition, e.g., with or without node failures, message losses, time-outs, contention, cache misses,
over-sizing, malicious attacks, etc. The resulting programs end up hard to design, prove, verify, implement, test and debug. Not surprisingly, there is anecdotal evidence of the fragility of the most celebrated distributed systems.
The goal of this project is to contribute to building high-assurance distributed programs by introducing a new dimension for separating and isolating their concerns, as well as a new scheme for composing and reusing them in a modular manner. In short, the project will explore the inherent power and limitations of a novel paradigm, Adversary-Oriented Computing (AOC). Sub-programs, each implementing a specific strategy to cope with a given adversary, modelling a specific working condition, are designed, proved, verified, implemented, tested and debugged independently. They are then composed, possibly dynamically, as black-boxes within the same global program. The AOC project is ambitious and it seeks to fundamentally revisit the way distributed algorithms are designed and distributed systems are implemented. The gain expected in comparison with today's approaches is substantial, and I believe it will be proportional to the degree of difficulty of the distributed problem at hand."
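The composition scheme described above can be sketched in a few lines (a hypothetical illustration, not the project's design; all class and function names are invented for this sketch): each sub-program targets one adversary, i.e. one working condition, and is developed and verified independently; a composer then selects among them at run time, treating each as a black box.

```python
# Hypothetical sketch of adversary-oriented composition: one strategy
# per adversary model, composed dynamically as black boxes.

class FastPath:
    """Strategy tuned for the benign, failure-free condition."""
    def handles(self, ctx):
        return not ctx.get("failures", False)
    def run(self, ctx):
        return "fast-path result"

class RobustPath:
    """Strategy that tolerates node failures and message losses;
    applicable in the worst case, hence always a valid fallback."""
    def handles(self, ctx):
        return True
    def run(self, ctx):
        return "robust-path result"

def compose(strategies, ctx):
    """Select the first sub-program whose adversary model matches the
    observed condition; each strategy remains a black box."""
    for s in strategies:
        if s.handles(ctx):
            return s.run(ctx)
    raise RuntimeError("no strategy covers this adversary")

# A benign run uses the efficient strategy; under failures the
# composer falls back to the robust one.
assert compose([FastPath(), RobustPath()], {}) == "fast-path result"
assert compose([FastPath(), RobustPath()], {"failures": True}) == "robust-path result"
```

The point of the sketch is the separation of concerns: each strategy can be designed, proved and tested against its own adversary in isolation, and composition adds no coupling between them.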
Max ERC Funding
2 147 012 €
Duration
Start date: 2014-06-01, End date: 2019-05-31
Project acronym APARTHEID-STOPS
Project Apartheid -- The Global Itinerary: South African Cultural Formations in Transnational Circulation, 1948-1990
Researcher (PI) Louise Bethlehem
Host Institution (HI) THE HEBREW UNIVERSITY OF JERUSALEM
Call Details Consolidator Grant (CoG), SH5, ERC-2013-CoG
Summary This proposal proceeds from an anomaly. Apartheid routinely breached the separation that it names. Whereas the South African regime was deeply isolationist in international terms, new research links it to the Cold War and decolonization. Yet this trend does not consider sufficiently that the global contest over the meaning of apartheid and resistance to it occurs on the terrain of culture. My project argues that studying the global circulation of South African cultural formations in the apartheid era provides novel historiographic leverage over Western liberalism during the Cold War. It recasts apartheid as an apparatus of transnational cultural production, turning existing historiography inside out. This study seeks:
• To provide the first systematic account of the deterritorialization of “apartheid”—as political signifier and as apparatus generating circuits of transnational cultural production.
• To analyze these itinerant cultural formations across media and national borders, articulating new intersections.
• To map the itineraries of major South African exiles, where exile is taken to be a system of interlinked circuits of affiliation and cultural production.
• To revise the historiography of states other than South Africa through the lens of deterritorialized apartheid-era formations at their respective destinations.
• To show how apartheid reveals contradictions within Western liberalism during the Cold War, with special reference to racial inequality.
Methodologically, I introduce the model of thick convergence to analyze three periods:
1. Kliptown & Bandung: Novel possibilities, 1948-1960.
2. Sharpeville & Memphis: Drumming up resistance, 1960-1976.
3. From Soweto to Berlin: Spectacle at the barricades, 1976-1990.
Each explores a cultural dominant in the form of texts, soundscapes or photographs. My work stands at the frontier of transnational research, furnishing powerful new insights into why South Africa matters on the stage of global history.
Max ERC Funding
1 861 238 €
Duration
Start date: 2014-05-01, End date: 2019-04-30
Project acronym APHOTOREACTOR
Project Entirely Self-organized: Arrayed Single-Particle-in-a-Cavity Reactors for Highly Efficient and Selective Catalytic/Photocatalytic Energy Conversion and Solar Light Reaction Engineering
Researcher (PI) Patrik Schmuki
Host Institution (HI) FRIEDRICH-ALEXANDER-UNIVERSITAET ERLANGEN NUERNBERG
Call Details Advanced Grant (AdG), PE5, ERC-2013-ADG
Summary The proposal is built on the core idea to use an ensemble of multiple-level self-organization processes to create a next-generation photocatalytic platform that provides unprecedented property and reactivity control. As a main output, the project will yield a novel highly precise combined catalyst/photocatalyst assembly to: 1) provide a massive step ahead in photocatalytic applications such as direct solar hydrogen generation, pollution degradation (incl. CO2 decomposition), N2 fixation, or photocatalytic organic synthesis. It will drastically enhance efficiency and selectivity of photocatalytic reactions, and enable a high number of organic synthetic reactions to be carried out economically (and ecologically) via combined catalytic/photocatalytic pathways. Even more, it will establish an entirely new generation of “100% depoisoning”, anti-aggregation catalysts with substantially enhanced catalyst lifetime. For this, a series of self-assembly processes on the mesoscale will be used to create highly uniform arrays of single-catalyst-particle-in-a-single-TiO2-cavity; the target is a 100% reliable placement of a single <10 nm particle in a 10 nm cavity. Thus catalytic features of, for example, Pt nanoparticles can ideally interact with the photocatalytic properties of a TiO2 cavity. The cavity will be optimized for optical and electronic properties by doping and band-gap engineering; the geometry will be tuned to the range of a few nm. This nanoscopic design leads to a radical change in the controllability of length and time scales (reactant, charge carrier and ionic transport in the substrate) in combined photocatalytic/catalytic reactions. It is of key importance that all nanoscale assembly principles used in this work are scalable and make it possible to create square meters of nanoscopically ordered catalyst surfaces. We aim to demonstrate the feasibility of implementing the nanoscale principles in a prototype macroscopic reactor.
Max ERC Funding
2 427 000 €
Duration
Start date: 2014-03-01, End date: 2019-02-28
Project acronym APPL
Project Anionic PhosPhoLipids in plant receptor kinase signaling
Researcher (PI) Yvon Jaillais
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), LS3, ERC-2013-StG
Summary "In plants, receptor kinases form the largest family of plasma membrane (PM) receptors and they are involved in virtually all aspects of plant life, including development, immunity and reproduction. In animals, key molecules that orchestrate the recruitment of signaling proteins to membranes are anionic phospholipids (e.g. phosphatidylinositol phosphates or PIPs). Moreover, recent reports in animal and yeast cells suggest the existence of PM nanodomains that are independent of cholesterol and lipid phase and rely on anionic phospholipids as well as electrostatic protein/lipid interactions. Strikingly, we know very little about the role of anionic phospholipids in plant signaling. However, our preliminary data suggest that BKI1, an inhibitory protein of the steroid receptor kinase BRI1, interacts with various PIPs in vitro and is likely targeted to the PM by electrostatic interactions with these anionic lipids. These results open the possibility that BRI1, but also other receptor kinases, might be regulated by anionic phospholipids in plants. Here, we propose to analyze the function of anionic phospholipids in BRI1 signaling, using the root epidermis as a model system. First, we will ask what are the lipids that control membrane surface charge in this tissue and recruit BR-signaling components to the PM. Second, we will probe the presence of PIP-enriched nanodomains at the plant PM using super-resolution microscopy techniques and investigate the roles of these domains in BRI1 signaling. Finally, we will analyze the function of the BKI1-related plant-specific family of anionic phospholipid effectors in plant development. In summary, using a transversal approach ranging from in vitro studies to in vivo validation and whole organism physiology, this work will unravel the interplay between anionic phospholipids and receptor signaling in plants."
Max ERC Funding
1 797 840 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym APPLAUSE
Project Adolescent Precursors to Psychiatric Disorders – Learning from Analysis of User-Service Engagement
Researcher (PI) Sara Evans
Host Institution (HI) LONDON SCHOOL OF ECONOMICS AND POLITICAL SCIENCE
Call Details Starting Grant (StG), LS7, ERC-2013-StG
Summary APPLAUSE’s aim is to produce a body of evidence that illustrates how young people with mental health problems currently interact with both formal mental health services and informal social and familial support structures. Careful analysis of data gathered in the UK and Brazil will allow formulation of globally relevant insights into mental health care delivery for young people, which will be presented internationally as a resource for future health care service design.
APPLAUSE will allow the collection of an important data set that does not currently exist in this field, and will look to other disciplines for innovative approaches to data analysis. Whilst standard analysis may allow for snapshots of health service use, using innovative life course methods will allow us to characterise patterns of complete service use of each individual participant’s experience of accessing mental health care and social support.
Adolescence is a critical period in mental health development, which has been largely neglected by public health efforts. Psychiatric disorders rank as the primary cause of disability among individuals aged 10-24 years, worldwide. Moreover, many health risk behaviours emerge during adolescence, and 70% of adult psychiatric disorders are preceded by mental health problems during adolescent years. However, delays to receiving care for psychiatric disorders, following disorder onset, average more than ten years, and little is known about factors which impede access to and continuity of care among young people with mental health problems. APPLAUSE will analyse current access models, reports of individual experiences of positive and negative interactions with health care services, and the culturally embedded social factors that impact on such access. Addressing this complex problem from a global perspective will advance the development of a more diverse and innovative set of strategies for improving earlier access to care.
Max ERC Funding
1 499 948 €
Duration
Start date: 2014-01-01, End date: 2018-12-31
Project acronym AQSER
Project Automorphic q-series and their application
Researcher (PI) Kathrin Bringmann
Host Institution (HI) UNIVERSITAET ZU KOELN
Call Details Starting Grant (StG), PE1, ERC-2013-StG
Summary This proposal aims to unravel mysteries at the frontier of number theory and other areas of mathematics and physics. The main focus will be to understand and exploit “modularity” of q-hypergeometric series. “Modular forms are functions on the complex plane that are inordinately symmetric.” (Mazur) The motivation comes from the wide-reaching applications of modularity in combinatorics, percolation, Lie theory, and physics (black holes).
The interplay between automorphic forms, q-series, and other areas of mathematics and physics is often two-sided. On the one hand, the other areas provide interesting examples of automorphic objects and predict their behavior. Sometimes these even motivate new classes of automorphic objects which have not been previously studied. On the other hand, knowing that certain generating functions are modular gives one access to deep theoretical tools to prove results in other areas. “Mathematics is a language, and we need that language to understand the physics of our universe.” (Ooguri) Understanding this interplay has attracted the attention of researchers from a variety of areas. However, proofs of modularity of q-hypergeometric series currently fall far short of a comprehensive theory to describe the interplay between them and automorphic forms. A recent conjecture of W. Nahm relates the modularity of such series to K-theory. In this proposal I aim to fill this gap and provide a better understanding of this interplay by building a general structural framework enveloping these q-series. For this I will employ new kinds of automorphic objects and embed the functions of interest into bigger families.
A successful outcome of the proposed research will open further horizons and also answer open questions, even those in other areas which were not addressed in this proposal; for example the new theory could be applied to better understand Donaldson invariants.
Max ERC Funding
1 240 500 €
Duration
Start date: 2014-01-01, End date: 2019-04-30