Project acronym dEMORY
Project Dissecting the Role of Dendrites in Memory
Researcher (PI) Panayiota Poirazi
Host Institution (HI) FOUNDATION FOR RESEARCH AND TECHNOLOGY HELLAS
Call Details Starting Grant (StG), LS5, ERC-2012-StG_20111109
Summary Understanding the rules and mechanisms underlying memory formation, storage and retrieval is a grand challenge in neuroscience. In light of accumulating evidence for non-linear dendritic events (dendritic spikes, branch strength potentiation, temporal sequence detection, etc.), together with activity-dependent rewiring of the connection matrix, the classical notion of information storage via Hebbian-like changes in synaptic connections is inadequate. While more recent plasticity theories consider non-linear dendritic properties, a unifying theory of how dendrites are utilized to achieve memory encoding, storage and/or retrieval is sorely lacking. Using computational models, we will simulate memory processes in three key brain regions: the hippocampus, the amygdala and the prefrontal cortex. Models will incorporate biologically constrained dendrites and state-of-the-art plasticity rules, and will span different levels of abstraction, ranging from detailed biophysical single neurons and circuits to integrate-and-fire networks and abstract theoretical models. Our main goal is to dissect the role of dendrites in information processing and storage across the three regions by systematically altering their anatomical, biophysical and plasticity properties. Findings will further our understanding of the fundamental computations supported by these structures and of how these computations, reinforced by plasticity mechanisms, subserve memory formation and associated dysfunctions, thus opening new avenues for hypothesis-driven experimentation and the development of novel treatments for memory-related diseases. Identification of dendrites as the key processing units across brain regions and complexity levels will lay the foundations for a new era in computational and experimental neuroscience, and will serve as the basis for groundbreaking advances in robotics and artificial intelligence while also having a large impact on the machine learning community.
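To illustrate the simplest level of abstraction mentioned in the summary (integrate-and-fire networks), the sketch below simulates a single leaky integrate-and-fire neuron. All parameter values and the constant-current example are illustrative assumptions, not values taken from the project's models.

```python
import numpy as np

# Minimal leaky integrate-and-fire (LIF) neuron: a hedged sketch of the
# lowest abstraction level mentioned in the summary. Parameters below
# (time constant, threshold, reset, resistance) are illustrative only.
def simulate_lif(input_current, dt=0.1, tau_m=20.0, v_rest=-70.0,
                 v_reset=-75.0, v_thresh=-54.0, r_m=10.0):
    """Integrate dV/dt = (-(V - v_rest) + r_m * I) / tau_m, spiking at threshold."""
    v = v_rest
    spikes, trace = [], []
    for step, i_ext in enumerate(input_current):
        v += (-(v - v_rest) + r_m * i_ext) * dt / tau_m
        if v >= v_thresh:            # threshold crossing -> spike, then reset
            spikes.append(step * dt)
            v = v_reset
        trace.append(v)
    return np.array(trace), spikes

# Example: constant 2 nA drive for 200 ms of simulated time (dt = 0.1 ms)
trace, spikes = simulate_lif(np.full(2000, 2.0))
print(f"{len(spikes)} spikes in 200 ms")
```

Dendritic non-linearities of the kind the project studies are precisely what such a point-neuron model leaves out, which is what makes it a useful baseline for comparison.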
Max ERC Funding
1 398 000 €
Duration
Start date: 2012-10-01, End date: 2017-09-30
Project acronym NGHCS
Project NGHCS: Creating the Next-Generation Mobile Human-Centered Systems
Researcher (PI) Vasiliki (Vana) Kalogeraki
Host Institution (HI) ATHENS UNIVERSITY OF ECONOMICS AND BUSINESS - RESEARCH CENTER
Call Details Starting Grant (StG), PE6, ERC-2012-StG_20111012
Summary Advances in sensor networking and the availability of everyday, low-cost, sensor-enabled devices have led to the integration of sensors that instrument the physical world in a variety of economically vital sectors: agriculture, transportation, healthcare, critical infrastructure and emergency response. At the same time, social computing is undergoing a major revolution: social networks, as exemplified by Twitter or Facebook, have significantly changed the way humans interact with one another. We are now entering a new era in which people and systems are becoming increasingly integrated, a development that is effectively leading us to large-scale mobile human-centered systems. Our goal is to develop a comprehensive framework that simplifies the development of mobile human-centered systems and makes them predictable and reliable. Our work has the following research thrusts. First, we will develop techniques for dealing efficiently with the dynamic, unpredictable factors that such complex systems face, including dynamic workloads, the unpredictable occurrence of events, the real-time demands of applications, as well as user changes and urban dynamics. To achieve this, we will investigate the use of mathematical models to control the behavior of applications in the absence of perfect system models and a priori information on load and human usage patterns. Second, we will develop the foundations needed to meet the end-to-end timeliness and reliability demands of the range of distributed systems we consider, by developing novel techniques at different layers of the distributed environment and studying the trade-offs involved. Third, we will develop general techniques to push computation and data storage as much as possible to the mobile devices, and to integrate participatory sensing and crowdsourcing techniques. The outcome of the proposed work is expected to have significant impact on a wide variety of distributed systems application domains.
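The first research thrust, controlling application behavior without a perfect system model, is in the spirit of feedback control. The sketch below shows one such controller that adapts a device's sensing rate to track a latency target; the gain, the target, and the notion of a measured end-to-end latency are illustrative assumptions, not elements of the proposal itself.

```python
# A hedged sketch of feedback-driven adaptation: adjust a mobile device's
# sensing rate so that observed end-to-end latency tracks a target, without
# a model of the workload. Gains, targets, and the latency samples are
# illustrative assumptions only.
def adapt_sensing_rate(measured_latencies_ms, target_ms=100.0,
                       initial_rate_hz=10.0, gain=0.05,
                       min_rate_hz=1.0, max_rate_hz=50.0):
    """Proportional controller: lower the rate when latency exceeds the
    target, raise it when there is slack."""
    rate = initial_rate_hz
    rates = []
    for latency in measured_latencies_ms:
        error = target_ms - latency          # positive error -> slack
        rate += gain * error                 # proportional update
        rate = max(min_rate_hz, min(max_rate_hz, rate))
        rates.append(rate)
    return rates

# Example: latency spikes above the 100 ms target, then recovers
print(adapt_sensing_rate([80, 90, 150, 200, 120, 95, 85]))
```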
Max ERC Funding
960 000 €
Duration
Start date: 2013-03-01, End date: 2019-02-28
Project acronym PHOTOMETA
Project Photonic Metamaterials: From Basic Research to Applications
Researcher (PI) Costas Soukoulis
Host Institution (HI) IDRYMA TECHNOLOGIAS KAI EREVNAS
Call Details Advanced Grant (AdG), PE3, ERC-2012-ADG_20120216
Summary Novel artificial materials (photonic crystals (PCs), negative index materials (NIMs), and plasmonics) enable the realization of innovative EM properties unattainable in naturally existing materials. These materials, called metamaterials (MMs), have been in the foreground of scientific interest in the last ten years. However, many serious obstacles must be overcome before the impressive possibilities of MMs, especially in the optical regime, become real applications.
The present project combines NIMs, PCs, and aspects of plasmonics in a unified way in order to promote the development of functional MMs, and mainly functional optical MMs (OMMs). It identifies the main obstacles, proposes specific approaches to deal with them, and intends to study unexplored capabilities of OMMs. The project objectives are: (a) design and realize 3D OMMs, and achieve new metasurface designs by applying Babinet’s principle; (b) understand and reduce the losses in OMMs by incorporating gain and electromagnetically induced transparency (EIT); (c) achieve highly efficient PC nanolasers and surface plasmon (SP) lasers; (d) use chiral MMs and SPs to reduce and manipulate Casimir forces; (e) use MMs, combined with nonlinear materials, for THz generation and a tunable response; and (f) calculate electron-phonon scattering and edge collisions in graphene and in graphene-based molecules. The unifying link across these objectives is the endowment of photons with novel properties through imaginative use of EM-field/artificial-matter interactions. Some of these objectives seem almost certainly realizable; others are riskier but carry higher reward if accomplished; some are directed towards new specific applications, while others explore new physical reality.
The accomplishment of these objectives requires novel ideas, advanced computational techniques, nanofabrication approaches, and testing. The broad expertise of the PI and his team, and their pioneering contributions to NIMs, PCs, and plasmonics, qualify them to face the challenges and ensure the maximum possible success of the project.
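For context on the negative-index materials (NIMs) named above, a standard textbook relation (not a result of the project) is that a lossless medium with simultaneously negative permittivity and permeability must take the negative branch of the refractive index,

\[
  n(\omega) \;=\; -\sqrt{\varepsilon_r(\omega)\,\mu_r(\omega)},
  \qquad \varepsilon_r(\omega) < 0,\; \mu_r(\omega) < 0,
\]

so that the phase velocity is antiparallel to the energy flow (Poynting vector), the defining signature of a negative-index metamaterial.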
Max ERC Funding
2 100 000 €
Duration
Start date: 2013-03-01, End date: 2019-02-28
Project acronym SPADE
Project Sophisticated Program Analysis, Declaratively
Researcher (PI) Ioannis Smaragdakis
Host Institution (HI) ETHNIKO KAI KAPODISTRIAKO PANEPISTIMIO ATHINON
Call Details Starting Grant (StG), PE6, ERC-2012-StG_20111012
Summary Static program analysis is a fundamental computing challenge. We have recently demonstrated significant advantages from expressing analyses for Java declaratively, in the Datalog language. This means that the algorithm is in a form that resembles a pure logical specification, rather than a step-by-step definition of the execution. The declarative specification does not merely cover the main logic of the algorithm, but its entire implementation, including the handling of complex semantic features (such as native methods, reflection, and threads) of the Java language. Surprisingly, the declarative specification can be made to execute up to an order of magnitude faster than the dominant pre-existing implementations of the same algorithms. Armed with this past experience, the SPADE project aims to develop a next-generation approach to the design and declarative implementation of static program analyses. This will include a) a substantially more flexible notion of context-sensitive analysis, which allows context to vary according to introspective observations; b) a flow-sensitive analysis framework that can be used as the basis for dataflow analysis; c) an approach to producing parallel implementations of analyses by exploiting the parallelism inherent in the declarative specification; d) an exploration of adapting analysis logic to multiple languages and paradigms, including C (using the LLVM infrastructure), functional languages (e.g., Scheme), and dynamic languages (notably, JavaScript); and e) client analysis algorithms (e.g., may-happen-in-parallel, and bug-finding analyses such as race and atomicity-violation detectors) expressed modularly over the underlying substrate of points-to analysis.
The work will have applications to multiple languages and a variety of analyses. Concretely, our precise and scalable analysis algorithms will enhance optimizing compilers, program analyzers for error detection, and program understanding tools.
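To make the declarative flavor concrete, here is a minimal sketch of a flow-insensitive, context-insensitive points-to analysis evaluated as a naive Datalog-style fixpoint in Python. The three rules (allocation/copy, field store, field load) and the toy input facts are illustrative assumptions, not the project's actual analysis or rule set.

```python
# Hedged sketch: Andersen-style points-to analysis as Datalog-like rules
# evaluated by naive fixpoint iteration. Facts and rules are illustrative.

# Input relations (EDB) for a toy program:
alloc = {("a", "new1"), ("b", "new2")}     # var := new Obj
move  = {("c", "a")}                       # to := from
store = {("a", "f", "b")}                  # base.f := from
load  = {("d", "c", "f")}                  # to := base.f

def points_to_analysis(alloc, move, store, load):
    vpt = set(alloc)   # VarPointsTo(var, obj); allocations seed the relation
    hpt = set()        # HeapPointsTo(baseObj, field, obj)
    while True:
        new_vpt, new_hpt = set(vpt), set(hpt)
        # VarPointsTo(to, o) :- Move(to, from), VarPointsTo(from, o).
        for to, frm in move:
            new_vpt |= {(to, o) for v, o in vpt if v == frm}
        # HeapPointsTo(b, f, o) :- Store(base, f, from),
        #                          VarPointsTo(base, b), VarPointsTo(from, o).
        for base, f, frm in store:
            for v1, b in vpt:
                if v1 == base:
                    new_hpt |= {(b, f, o) for v2, o in vpt if v2 == frm}
        # VarPointsTo(to, o) :- Load(to, base, f),
        #                       VarPointsTo(base, b), HeapPointsTo(b, f, o).
        for to, base, f in load:
            for v, b in vpt:
                if v == base:
                    new_vpt |= {(to, o) for b2, f2, o in hpt
                                if b2 == b and f2 == f}
        if new_vpt == vpt and new_hpt == hpt:   # fixpoint reached
            return vpt, hpt
        vpt, hpt = new_vpt, new_hpt

vpt, _ = points_to_analysis(alloc, move, store, load)
print(sorted(vpt))   # ('d', 'new2') is derived via the store/load rules
```

A real Datalog engine would express only the rules and let the solver handle iteration, indexing, and parallel evaluation, which is the point of the declarative approach advocated above.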
Max ERC Funding
1 042 616 €
Duration
Start date: 2013-01-01, End date: 2019-03-31
Project acronym TRANSARREST
Project Keeping gene expression in check: eliciting the role of transcription in the maintenance of genome integrity
Researcher (PI) Maria Fousteri
Host Institution (HI) BIOMEDICAL SCIENCES RESEARCH CENTER ALEXANDER FLEMING
Call Details Starting Grant (StG), LS1, ERC-2012-StG_20111109
Summary Genomic integrity is essential for accurate gene expression and epigenetic inheritance. On the other hand, a prolonged transcriptional arrest can challenge genome stability, contributing to genetic and epigenetic defects and the mechanisms of ageing and disease.
Here we aim to identify the molecular mechanisms that couple transcriptional arrest to chromatin alteration and repair. We wish to explore the idea that transcription suppresses cellular toxicity and preserves genetic and epigenetic inheritance.
Towards these goals our work will be focused on:
1. Deciphering the molecular events that shape how cells respond when the progress of a transcribing RNA polymerase II is blocked.
2. Exploring a novel, so far unanticipated, repair-independent function of key players in the transcription-associated repair pathways, such as the Cockayne syndrome (CS) proteins.
3. Understanding the role of transcription in chemotherapeutic-driven toxicity.
4. Investigating novel post-translational modifications of CS and determining their function.
These objectives will be addressed using advanced proteomics and genome-wide technologies, in combination with biochemical and cellular techniques, in normal human cells and a large battery of patient-derived cell lines. Our rationale is that a better understanding of CS function will help us reach our ultimate goal, which is to identify the regulatory cascades involved in the interplay between genomic stability and transcription. The novel key idea put forward in this proposal is that active transcription itself directly contributes to genome integrity. While the role of DNA damage-driven transcription blockage in promoting repair is well established, the protective role of active transcription in genome stability is entirely unexplored.
If successful, the proposed studies may help reveal the underlying causes of related disorders and explain their clinical features.
Max ERC Funding
1 500 000 €
Duration
Start date: 2012-11-01, End date: 2018-10-31