Project acronym 5D-NanoTrack
Project Five-Dimensional Localization Microscopy for Sub-Cellular Dynamics
Researcher (PI) Yoav SHECHTMAN
Host Institution (HI) TECHNION - ISRAEL INSTITUTE OF TECHNOLOGY
Call Details Starting Grant (StG), PE7, ERC-2018-STG
Summary The sub-cellular processes that control the most critical aspects of life occur in three dimensions (3D) and are intrinsically dynamic. While super-resolution microscopy has revolutionized cellular imaging in recent years, our current capability to observe the dynamics of life on the nanoscale remains extremely limited, due to inherent trade-offs between spatial, temporal and spectral resolution in existing approaches.
We propose to develop and demonstrate an optical microscopy methodology that would enable live sub-cellular observation in unprecedented detail. Making use of multicolor 3D point-spread-function (PSF) engineering, a technique I have recently developed, we will be able to simultaneously track multiple markers inside live cells, at high speed and in five dimensions (3D, time, and color).
Multicolor 3D PSF engineering holds the potential of being a uniquely powerful method for 5D tracking. However, it is not yet applicable to live-cell imaging, due to significant bottlenecks in optical engineering and signal processing, which we plan to overcome in this project. Importantly, we will also demonstrate the efficacy of our method using a challenging biological application: real-time visualization of chromatin dynamics - the spatiotemporal organization of DNA. This is a highly suitable problem due to its fundamental importance, its role in a variety of cellular processes, and the lack of appropriate tools for studying it.
The project is divided into 3 aims:
1. Technology development: diffractive-element design for multicolor 3D PSFs.
2. System design: volumetric tracking of dense emitters.
3. Live-cell measurements: chromatin dynamics.
Looking ahead, here we create the imaging tools that pave the way towards the holy grail of chromatin visualization: dynamic observation of the 3D positions of the ~3 billion DNA base-pairs in a live human cell. Beyond that, our results will be applicable to numerous 3D micro/nanoscale tracking applications.
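The principle behind 3D PSF engineering, encoding an emitter's axial (z) position in the shape of its image, can be illustrated with a toy astigmatic model in which defocus stretches the spot along one axis and shrinks it along the other. This is only a rough sketch of the idea, not the project's multicolor diffractive designs; the linear width-to-z calibration and all parameters below are invented for the illustration.

```python
import numpy as np

def render_spot(z, size=33, s0=2.0, d=10.0):
    """Toy astigmatic PSF: defocus z stretches the spot in x and shrinks it in y."""
    sx = s0 * (1 + z / d)
    sy = s0 * (1 - z / d)
    c = (size - 1) / 2
    y, x = np.mgrid[0:size, 0:size]
    return np.exp(-((x - c) ** 2 / (2 * sx ** 2) + (y - c) ** 2 / (2 * sy ** 2)))

def estimate_z(img, s0=2.0, d=10.0):
    """Recover z from the spot's second moments (its x/y widths)."""
    total = img.sum()
    y, x = np.mgrid[0:img.shape[0], 0:img.shape[1]]
    mx, my = (x * img).sum() / total, (y * img).sum() / total
    sx = np.sqrt(((x - mx) ** 2 * img).sum() / total)
    sy = np.sqrt(((y - my) ** 2 * img).sum() / total)
    # Invert the assumed calibration sx - sy = 2 * s0 * z / d.
    return d * (sx - sy) / (2 * s0)

z_true = 3.0
z_hat = estimate_z(render_spot(z_true))
```

In a real system the PSF shape is produced by a diffractive element in the pupil plane and decoded by fitting or by a learned estimator rather than by raw moments.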
Max ERC Funding
1 802 500 €
Duration
Start date: 2018-11-01, End date: 2023-10-31
Project acronym AGALT
Project Asymptotic Geometric Analysis and Learning Theory
Researcher (PI) Shahar Mendelson
Host Institution (HI) TECHNION - ISRAEL INSTITUTE OF TECHNOLOGY
Call Details Starting Grant (StG), PE1, ERC-2007-StG
Summary In a typical learning problem one tries to approximate an unknown function by a function from a given class using random data, sampled according to an unknown measure. In this project we will be interested in parameters that govern the complexity of a learning problem. It turns out that this complexity is determined by the geometry of certain sets in high dimension that are connected to the given class (random coordinate projections of the class). Thus, one has to understand the structure of these sets as a function of the dimension, which is given by the cardinality of the random sample. The resulting analysis leads to many theoretical questions in Asymptotic Geometric Analysis, Probability (most notably, Empirical Processes Theory) and Combinatorics, which are of independent interest beyond the application to Learning Theory. Our main goal is to describe the role of various complexity parameters involved in a learning problem, to analyze the connections between them and to investigate the way they determine the geometry of the relevant high-dimensional sets. Some of the questions we intend to tackle are well-known open problems, and making progress towards their solution will have a significant theoretical impact. Moreover, this project should lead to a more complete theory of learning and is likely to have some practical impact, for example, in the design of more efficient learning algorithms.
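The setting described above (approximating an unknown function by a member of a given class, from random data drawn from an unknown measure) can be made concrete with a minimal empirical-risk-minimization sketch. The hypothesis class (thresholds on [0,1]), the noise level, and the sample size are illustrative choices for this sketch, not taken from the proposal.

```python
import numpy as np

rng = np.random.default_rng(0)

# Unknown target: a threshold function on [0, 1], observed with label noise.
true_t = 0.37

def target(x):
    return (x >= true_t).astype(float)

# Random data sampled from an (here uniform) measure.
n = 2000
X = rng.random(n)
y = target(X)
flip = rng.random(n) < 0.05        # 5% of the labels are flipped
y[flip] = 1 - y[flip]

# Hypothesis class: threshold functions on a grid.
# Empirical risk minimization picks the class member with fewest mistakes.
grid = np.linspace(0, 1, 201)
emp_risk = np.array([np.mean((X >= t).astype(float) != y) for t in grid])
t_hat = grid[int(np.argmin(emp_risk))]
```

The complexity parameters studied in the project govern, in far greater generality, how fast such an empirical minimizer approaches the best function in the class as the sample grows.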
Max ERC Funding
750 000 €
Duration
Start date: 2009-03-01, End date: 2014-02-28
Project acronym ARITHQUANTUMCHAOS
Project Arithmetic and Quantum Chaos
Researcher (PI) Zeev Rudnick
Host Institution (HI) TEL AVIV UNIVERSITY
Call Details Advanced Grant (AdG), PE1, ERC-2012-ADG_20120216
Summary Quantum Chaos is an emerging discipline which is crossing over from Physics into Pure Mathematics. The recent crossover is driven in part by a connection with Number Theory. This project explores several aspects of this interrelationship and is composed of a number of sub-projects. The sub-projects deal with: statistics of energy levels and wave functions of pseudo-integrable systems, a hitherto unexplored subject in the mathematical community which is not well understood in the physics community; with statistics of zeros of zeta functions over function fields, a purely number theoretic topic which is linked to the subproject on Quantum Chaos through the mysterious connections to Random Matrix Theory and an analogy between energy levels and zeta zeros; and with spatial statistics in arithmetic.
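The analogy between energy levels and zeta zeros runs through their spacing statistics. A minimal numerical sketch (crude constant unfolding over the spectral bulk, illustrative matrix sizes) contrasts the level repulsion of GOE random-matrix eigenvalues with uncorrelated, Poissonian levels.

```python
import numpy as np

rng = np.random.default_rng(1)

def goe_spacings(n=400):
    """Nearest-neighbour spacings from the bulk of a GOE random matrix."""
    a = rng.standard_normal((n, n))
    h = (a + a.T) / 2
    ev = np.sort(np.linalg.eigvalsh(h))
    bulk = ev[n // 4 : 3 * n // 4]   # stay inside the bulk of the semicircle
    s = np.diff(bulk)
    return s / s.mean()              # crude unfolding: mean spacing set to 1

def poisson_spacings(m=200):
    """Spacings of uncorrelated (Poissonian) levels, for comparison."""
    s = np.diff(np.sort(rng.random(m + 1)))
    return s / s.mean()

goe = goe_spacings()
poi = poisson_spacings()
# Level repulsion: tiny spacings are rare for GOE, common for Poisson.
frac_goe = np.mean(goe < 0.1)
frac_poi = np.mean(poi < 0.1)
```

Pseudo-integrable systems, one focus of the project, are conjectured to exhibit intermediate statistics between these two extremes.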
Max ERC Funding
1 714 000 €
Duration
Start date: 2013-02-01, End date: 2019-01-31
Project acronym BeyondA1
Project Set theory beyond the first uncountable cardinal
Researcher (PI) Assaf Shmuel Rinot
Host Institution (HI) BAR ILAN UNIVERSITY
Call Details Starting Grant (StG), PE1, ERC-2018-STG
Summary We propose to establish a research group that will unveil the combinatorial nature of the second uncountable cardinal. This includes its Ramsey-theoretic, order-theoretic, graph-theoretic and topological features. Among others, we will be directly addressing fundamental problems due to Erdos, Rado, Galvin, and Shelah.
While some of these problems are old and well known, an unexpected series of breakthroughs from the last three years suggests that now is a promising point in time to carry out such a project. Indeed, within a short period, four previously unattainable problems concerning the second uncountable cardinal were successfully tackled: Aspero solved a club-guessing problem of Shelah, Krueger the club-isomorphism problem for Aronszajn trees, Neeman the isomorphism problem for dense sets of reals, and the PI the Souslin problem. Each of these results was obtained through the development of a completely new technical framework, and these frameworks could now pave the way to the solution of some major open questions.
A goal of the highest risk in this project is the discovery of a consistent (possibly, parameterized) forcing axiom that will (preferably, simultaneously) provide structure theorems for stationary sets, linearly ordered sets, trees, graphs, and partition relations, as well as the refutation of various forms of club-guessing principles, all at the level of the second uncountable cardinal. In comparison, at the level of the first uncountable cardinal, a forcing axiom due to Foreman, Magidor and Shelah achieves exactly that.
To approach our goals, the proposed project is divided into four core areas: Uncountable trees, Ramsey theory on ordinals, Club-guessing principles, and Forcing Axioms. There is a rich bilateral interaction between any pair of the four different cores, but the proposed division will allow an efficient allocation of manpower, and will increase the chances of parallel success.
Max ERC Funding
1 362 500 €
Duration
Start date: 2018-10-01, End date: 2023-09-30
Project acronym BNYQ
Project Breaking the Nyquist Barrier: A New Paradigm in Data Conversion and Transmission
Researcher (PI) Yonina Eldar
Host Institution (HI) TECHNION - ISRAEL INSTITUTE OF TECHNOLOGY
Call Details Consolidator Grant (CoG), PE7, ERC-2014-CoG
Summary Digital signal processing (DSP) is a revolutionary paradigm shift enabling processing of physical data in the digital domain, where design and implementation are considerably simplified. However, state-of-the-art analog-to-digital converters (ADCs) preclude high-rate wideband sampling and processing with low cost and energy consumption, presenting a major bottleneck. This is mostly due to the traditional assumption that sampling must be performed at the Nyquist rate, that is, twice the signal bandwidth. Modern applications, including communications, medical imaging and radar, use signals with high bandwidth, resulting in prohibitively large Nyquist rates.
Our ambitious goal is to introduce a paradigm shift in ADC design that will enable systems capable of low-rate, wideband sensing and low-rate DSP.
While DSP has a rich history in exploiting structure to reduce dimensionality and perform efficient parameter extraction, current ADCs do not exploit such knowledge.
We challenge current practice that separates the sampling stage from the processing stage and exploit structure in analog signals already in the ADC, to drastically reduce the sampling and processing rates.
Our preliminary data show that this allows substantial savings in sampling and processing rates: we demonstrate rate reductions of 1/28 in ultrasound imaging and 1/30 in radar detection.
To achieve our overarching goal we focus on three interconnected objectives: developing the 1) theory, 2) hardware, and 3) applications of sub-Nyquist sampling.
Our methodology ties together two areas on the frontier of signal processing: compressed sensing (CS), focused on finite length vectors, and analog sampling. Our research plan also inherently relies on advances in several other important areas within signal processing and combines multi-disciplinary research at the intersection of signal processing, information theory, optimization, estimation theory and hardware design.
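The claim that structure in the signal permits sampling far below the Nyquist rate can be illustrated with a standard compressed-sensing toy, not the project's analog hardware: a sparse vector is recovered exactly from 4x fewer random linear measurements via Orthogonal Matching Pursuit. All dimensions and the signal model are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# A length-256 signal that is 5-sparse in the canonical basis, observed
# through only 64 random linear measurements (a 4x rate reduction).
n, m, k = 256, 64, 5
x = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x[support] = rng.choice([-1.0, 1.0], size=k) * rng.uniform(1.0, 3.0, size=k)
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x

def omp(A, y, k):
    """Orthogonal Matching Pursuit: greedily grow the support, refit by least squares."""
    r, idx = y.copy(), []
    for _ in range(k):
        idx.append(int(np.argmax(np.abs(A.T @ r))))
        coef, *_ = np.linalg.lstsq(A[:, idx], y, rcond=None)
        r = y - A[:, idx] @ coef
    x_hat = np.zeros(A.shape[1])
    x_hat[idx] = coef
    return x_hat

x_hat = omp(A, y, k)
```

The project's harder question is how to realize measurement operators of this kind in analog hardware acting on continuous-time signals, rather than on finite vectors.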
Max ERC Funding
2 400 000 €
Duration
Start date: 2015-08-01, End date: 2020-07-31
Project acronym CASe
Project Combinatorics with an analytic structure
Researcher (PI) Karim ADIPRASITO
Host Institution (HI) THE HEBREW UNIVERSITY OF JERUSALEM
Call Details Starting Grant (StG), PE1, ERC-2016-STG
Summary Combinatorics, and its interplay with geometry, has fascinated our ancestors, as shown by early stone carvings in the Neolithic period. Modern combinatorics is motivated by the ubiquity of its structures in both pure and applied mathematics.
The work of Hochster and Stanley, who realized the relation of enumerative questions to commutative algebra and toric geometry, made a vital contribution to the development of this subject. Their work was a central contribution to the classification of face numbers of simple polytopes, and the initial success led to a wealth of research in which combinatorial problems were translated to algebra and geometry and then solved using deep results such as Saito's hard Lefschetz theorem. As a caveat, this also made branches of combinatorics reliant on algebra and geometry to provide new ideas.
In this proposal, I want to reverse this approach and extend our understanding of geometry and algebra guided by combinatorial methods. In this spirit I propose new combinatorial approaches to the interplay of curvature and topology, to isoperimetry, geometric analysis, and intersection theory, to name a few. In addition, while these subjects are interesting by themselves, they are also designed to advance classical topics, for example, the diameter of polyhedra (as in the Hirsch conjecture), arrangement theory (and the study of arrangement complements), Hodge theory (as in Grothendieck's standard conjectures), and realization problems of discrete objects (as in Connes' embedding problem for type II factors).
This proposal is supported by the review of some already developed tools, such as relative Stanley-Reisner theory (which is equipped to deal with combinatorial isoperimetries), combinatorial Hodge theory (which extends the "Kähler package" to purely combinatorial settings), and discrete PDEs (which were used to construct counterexamples to old problems in discrete geometry).
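The Hochster-Stanley translation of face-number questions into algebra runs through the h-vector. As a small worked illustration (a standard textbook transformation, with an example chosen for this sketch), the code below computes the h-vector of the boundary of the 4-simplex and exhibits the palindromic symmetry expressed by the Dehn-Sommerville relations.

```python
from math import comb

def h_vector(f, d):
    """h-vector of a (d-1)-dimensional simplicial complex from its f-vector
    f = (f_0, ..., f_{d-1}); the empty face contributes f_{-1} = 1."""
    full = (1,) + tuple(f)
    return [sum((-1) ** (k - i) * comb(d - i, k - i) * full[i] for i in range(k + 1))
            for k in range(d + 1)]

# Boundary of the 4-simplex (a 3-sphere): f-vector (5, 10, 10, 5), d = 4.
h = h_vector((5, 10, 10, 5), 4)
```

For simplicial spheres the h-vector is symmetric, and the hard Lefschetz property mentioned above is what forces the deeper unimodality statements in the Stanley-era results.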
Max ERC Funding
1 337 200 €
Duration
Start date: 2016-12-01, End date: 2021-11-30
Project acronym CloudRadioNet
Project Cloud Wireless Networks: An Information Theoretic Framework
Researcher (PI) Shlomo Shamai Shitz
Host Institution (HI) TECHNION - ISRAEL INSTITUTE OF TECHNOLOGY
Call Details Advanced Grant (AdG), PE7, ERC-2015-AdG
Summary This five-year research proposal focuses on the development of novel information-theoretic concepts and techniques, and on their use to identify the ultimate communication limits and potential of different cloud radio network structures, in which central signal processing is migrated to the cloud (remote central units) via fronthaul/backhaul infrastructure links. It is also directed at introducing and studying optimal, or close to optimal, strategies for those systems, motivated by the developed theory. We plan to address wireless networks, having future cellular technology in mind, but the basic tools and approaches to be built and researched are relevant to other communication networks as well. Cloud communication networks motivate novel information-theoretic views and perspectives that put backhaul/fronthaul connections in the center, thus deviating considerably from the standard theoretical studies of communication links and networks applied to this domain. Our approach accounts for the fact that in such networks information-theoretic separation concepts are no longer optimal, hence isolating simple basic components of the network is essentially suboptimal. The proposed view incorporates, in a unified way, under the general cover of information theory: multi-terminal distributed networks; basic and timely concepts of distributed coding and communications; network communications and primarily network coding, index coding, as associated with interference alignment and caching; information-estimation relations and signal processing, addressing the impact of distributed channel state information directly; and a variety of fundamental concepts in optimization and random matrix theory. This path provides a natural theoretical framework directed towards better understanding the potential and limitations of cloud networks on the one hand, and paves the way to innovative communication design principles on the other.
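One recurring primitive in analyses of this kind is the cut-set idea: a rate delivered through a remote unit can exceed neither the wireless access link's capacity nor the capacity of the fronthaul link behind it. The sketch below is a minimal illustration using the standard AWGN capacity formula with invented numbers; it is not a bound derived in the proposal.

```python
import numpy as np

def awgn_capacity(snr_db):
    """Shannon capacity of a real AWGN channel, in bits per channel use."""
    return 0.5 * np.log2(1.0 + 10.0 ** (snr_db / 10.0))

def cloud_rate_bound(access_snr_db, fronthaul_bits):
    """Cut-set style bound for one user served via a remote radio unit:
    the delivered rate is limited by the weaker of the two links."""
    return min(awgn_capacity(access_snr_db), fronthaul_bits)
```

Part of what makes the cloud setting interesting is precisely that simple per-link bounds like this are loose once compression, caching and joint processing across remote units enter the picture.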
Max ERC Funding
1 981 782 €
Duration
Start date: 2016-07-01, End date: 2021-06-30
Project acronym CONC-VIA-RIEMANN
Project High-Dimensional Convexity, Isoperimetry and Concentration via a Riemannian Vantage Point
Researcher (PI) Emanuel Milman
Host Institution (HI) TECHNION - ISRAEL INSTITUTE OF TECHNOLOGY
Call Details Starting Grant (StG), PE1, ERC-2014-STG
Summary In recent years, the importance of superimposing the contribution of the measure on that of the metric, in determining the underlying space's (generalized Ricci) curvature, has been clarified in the works of Lott, Sturm, Villani and others, following the definition of Curvature-Dimension introduced by Bakry and Emery. We wish to systematically incorporate this important idea of considering the measure and metric in tandem in the study of questions pertaining to isoperimetric and concentration properties of convex domains in high-dimensional Euclidean space, where a priori there is only a trivial metric (Euclidean) and trivial measure (Lebesgue).
The first step of enriching the class of uniform measures on convex domains to that of non-negatively curved ("log-concave") measures in Euclidean space has been very successfully implemented in the last decades, leading to substantial progress in our understanding of volumetric properties of convex domains, mostly regarding concentration of linear functionals. However, the potential advantages of altering the Euclidean metric into a more general Riemannian one, or of exploiting related Riemannian structures, have not been systematically explored. Our main paradigm is that in order to progress in non-linear questions pertaining to concentration in Euclidean space, it is imperative to cast and study these problems in the more general Riemannian context.
As witnessed by our own work over the last years, we expect that broadening the scope and incorporating tools from the Riemannian world will lead to significant progress in our understanding of the qualitative and quantitative structure of isoperimetric minimizers in the purely Euclidean setting. Such progress would have dramatic impact on long-standing fundamental conjectures regarding concentration of measure on high-dimensional convex domains, as well as on closely related fields such as Probability Theory, Learning Theory, Random Matrix Theory and Algorithmic Geometry.
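The concentration of linear functionals mentioned above is easy to see numerically in the simplest convex body, the cube with uniform measure: the spread of a one-dimensional marginal stays bounded while the body's typical diameter grows like the square root of the dimension. The Monte Carlo sketch below is purely illustrative (a product measure, the easiest case; the project concerns far more general convex domains).

```python
import numpy as np

rng = np.random.default_rng(0)

def cube_marginal(n, samples=20000):
    """Spread of a linear functional <x, theta> (|theta| = 1) for x uniform
    in the cube [-1, 1]^n, versus the typical size of x itself."""
    theta = rng.standard_normal(n)
    theta /= np.linalg.norm(theta)
    X = rng.uniform(-1.0, 1.0, (samples, n))
    return (X @ theta).std(), np.linalg.norm(X, axis=1).mean()

# The marginal's std stays near 1/sqrt(3) while ||x|| grows like sqrt(n/3).
stds, norms = zip(*(cube_marginal(n) for n in (10, 100, 1000)))
```

The hard questions begin with non-linear functionals and with convex bodies far from product structure, which is where the Riemannian viewpoint is proposed to enter.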
Max ERC Funding
1 194 190 €
Duration
Start date: 2015-10-01, End date: 2020-09-30
Project acronym CRYOMATH
Project Cryo-electron microscopy: mathematical foundations and algorithms
Researcher (PI) Yoel SHKOLNISKY
Host Institution (HI) TEL AVIV UNIVERSITY
Call Details Consolidator Grant (CoG), PE1, ERC-2016-COG
Summary The importance of understanding the functions of the basic building blocks of life, such as proteins, cannot be overstated (as asserted by two recent Nobel prizes in Chemistry), as this understanding unravels the mechanisms that control all organisms. The critical step towards such an understanding is to reveal the structures of these building blocks. A leading method for resolving such structures is cryo-electron microscopy (cryo-EM), in which the structure of a molecule is recovered from its images taken by an electron microscope, by using sophisticated mathematical algorithms (to which my group has made several key mathematical and algorithmic contributions). Due to hardware breakthroughs in the past three years, cryo-EM has made a giant leap forward, introducing capabilities that until recently were unimaginable, opening an opportunity to revolutionize our biological understanding. As extracting information from cryo-EM experiments completely relies on mathematical algorithms, the method’s deep mathematical challenges that have emerged must be solved as soon as possible. Only then cryo-EM could realize its nearly inconceivable potential. These challenges, for which no adequate solutions exist (or none at all), focus on integrating information from huge sets of extremely noisy images reliability and efficiently. Based on the experience of my research group in developing algorithms for cryo-EM data processing, gained during the past eight years, we will address the three key open challenges of the field – a) deriving reliable and robust reconstruction algorithms from cryo-EM data, b) developing tools to process heterogeneous cryo-EM data sets, and c) devising validation and quality measures for structures determined from cryo-EM data. 
The fourth goal of the project, which ties all goals together and promotes the broad interdisciplinary impact of the project, is to merge all our algorithms into a software platform for state-of-the-art processing of cryo-EM data.
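The summary's claim that structure can be extracted from "huge sets of extremely noisy images" rests on a basic statistical fact: averaging M aligned copies of a signal shrinks the noise by a factor of √M. The toy sketch below (illustrative only, not one of the project's algorithms, and ignoring the hard alignment and heterogeneity problems the project actually targets) demonstrates this with a synthetic 1D "projection":

```python
import numpy as np

# Toy illustration: averaging M aligned, extremely noisy copies of a signal
# reduces the RMS noise roughly as 1/sqrt(M) -- the statistical leverage that
# lets cryo-EM recover structure from very low-SNR micrographs.
rng = np.random.default_rng(0)

signal = np.sin(np.linspace(0, 2 * np.pi, 256))  # stand-in for a clean projection
sigma = 5.0                                      # noise far stronger than the signal

def reconstruction_error(num_images: int) -> float:
    """RMS error of the pixel-wise average of num_images noisy copies."""
    noisy = signal + rng.normal(0.0, sigma, size=(num_images, signal.size))
    return float(np.sqrt(np.mean((noisy.mean(axis=0) - signal) ** 2)))

err_small = reconstruction_error(10)      # a handful of images: signal still buried
err_large = reconstruction_error(10_000)  # a large data set: signal emerges
print(err_small, err_large)
```

In real cryo-EM the images are 2D projections at unknown orientations, so the averaging must be preceded by the alignment and classification steps that constitute the project's open challenges; the sketch only shows why large noisy data sets are worth the effort.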
Max ERC Funding
1 751 250 €
Duration
Start date: 2017-03-01, End date: 2022-02-28
Project acronym DANCER
Project DAtacommunications based on NanophotoniC Resonators
Researcher (PI) John William Whelan-Curtin
Host Institution (HI) CORK INSTITUTE OF TECHNOLOGY
Call Details Starting Grant (StG), PE7, ERC-2013-StG
Summary A key challenge for the 21st century is to provide billions of people with the means to access, move and manipulate what have become huge volumes of information. The environmental and economic implications are becoming serious, making energy-efficient data communications key to the operation of today’s society.
In this project, the Principal Investigator will develop a new framework for optical interconnects and provide a common platform that spans fibre-to-the-home and chip-to-chip links, even as far as global on-chip interconnects. The project is based on the efficient coupling of Photonic Crystal resonators with the outside world. These provide the ultimate confinement of light in both space and time, allowing orders-of-magnitude improvements in performance relative to the state of the art, yet in a simpler system – the innovator’s dream. New versions of the key components of optical links – light sources, modulators and photo-detectors – will be realised in this new framework, providing a new paradigm for energy-efficient communication.
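The "confinement of light in both space and time" mentioned above is usually quantified by a cavity's quality factor Q (temporal confinement) and mode volume V (spatial confinement); their ratio sets how strongly light and matter interact. A back-of-the-envelope sketch, with illustrative numbers that are assumptions rather than figures from the project:

```python
import math

# Illustrative cavity figures of merit (assumed values, not project data):
# Q sets the photon lifetime tau = Q / omega, and Q/V sets the Purcell
# enhancement F = (3 / 4*pi^2) * Q / V, with V in units of (lambda/n)^3.
c = 299_792_458.0     # speed of light, m/s
wavelength = 1.55e-6  # telecom-band wavelength, m (assumed)
Q = 1e6               # quality factor of a high-Q photonic-crystal cavity (assumed)

omega = 2 * math.pi * c / wavelength  # angular frequency, rad/s
photon_lifetime = Q / omega           # cavity photon storage time, s

V_norm = 1.0  # mode volume in units of (lambda/n)^3; ~1 for photonic crystals
purcell = (3 / (4 * math.pi ** 2)) * Q / V_norm

print(f"photon lifetime: {photon_lifetime * 1e9:.2f} ns")
print(f"Purcell factor:  {purcell:.0f}")
```

A wavelength-scale mode volume with a million-scale Q yields sub-nanosecond photon storage and very large light-matter enhancement, which is why such resonators promise light sources, modulators and detectors operating at far lower energy per bit than conventional components.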
Max ERC Funding
1 495 450 €
Duration
Start date: 2013-12-01, End date: 2019-05-31