Project acronym CULTIVATE MSS
Project Cultural Values and the International Trade in Medieval European Manuscripts, c. 1900-1945
Researcher (PI) Laura Janet CLEAVER
Host Institution (HI) THE PROVOST, FELLOWS, FOUNDATION SCHOLARS & THE OTHER MEMBERS OF BOARD OF THE COLLEGE OF THE HOLY & UNDIVIDED TRINITY OF QUEEN ELIZABETH NEAR DUBLIN
Call Details Consolidator Grant (CoG), SH5, ERC-2018-COG
Summary CULTIVATE MSS aims to assess the significance of the trade in medieval manuscripts for the development of ideas about the nature and value of European culture in the early 20th century, a crucial period for the development of modern European nation states. Although recent technological developments have facilitated quantitative analyses of provenance data, charting in increasing detail the early-20th-century movement of manuscripts, including an exodus of works to America, qualitative analyses have failed to keep pace, leaving questions of how and why particular books were valued underexplored. The PI’s role in the development of the Schoenberg Database of Manuscripts, which has begun to make available historic data about books, has revealed the need for a reassessment of the relationship between collecting and scholarship, and the potential for existing data about the manuscript trade to be used, with unpublished archival sources, to identify and compare the economic and philosophical values projected onto books. Thus the project uses the PI’s expertise to develop a multi-disciplinary approach to assess the roles of collectors, scholars and dealers in the formation of collections of medieval manuscripts, and the impact of this on scholarship, comparing the English-speaking world, France and Germany. It will analyse published and unpublished accounts of manuscripts, together with price data, to reconstruct values projected onto books. It will seek to contextualise these values within the history of the early 20th century, assessing the impact of two world wars and other political and economic shifts on the trade in books and attitudes to manuscripts as objects of national significance. The Middle Ages are often identified with the emergence of European cultural identities, thus a reappraisal of the historiography of the study of medieval manuscripts has the potential to impact research about attitudes to European culture and identity in a wide range of disciplines.
Max ERC Funding
1 832 711 €
Duration
Start date: 2019-05-01, End date: 2024-04-30
Project acronym CuP
Project Circuit-level Photonic design
Researcher (PI) Peter BIENSTMAN
Host Institution (HI) UNIVERSITEIT GENT
Call Details Proof of Concept (PoC), PC1, ERC-2012-PoC
Summary "In the ERC-project NaResCo, we have developed a tool Caphe to model and simulate complex photonic integrated circuits (PICs). Caphe is aimed at narrowing the ""design gap"" experienced by photonic IC designers today. The ever increasing complexity of photonic ICs is making their design much harder, while existing tools for photonic IC design are lagging. The result is a gap between what is needed and what is available to do the job.
In the course of NaResCo, we have developed Caphe to the point where we have demonstrated its capabilities, and have applied it to the simulation of photonic reservoirs, which pose a considerable design challenge. The results encouraged us to look into the commercial potential of this tool. In CuP, we will push Caphe to a pre-commercial level. The main activities will focus on the development of a roadmap, technical improvements to make the tool commercially viable, elaboration of a licensing scheme (pre-commercial and commercial), demonstrate and document user cases (alpha-customers).The final exploitation strategy can be either through a new university spin-off company or licensing to an existing software vendor."
Max ERC Funding
149 907 €
Duration
Start date: 2013-02-01, End date: 2014-01-31
Project acronym CUREORCURSE
Project Non-elected politics. Cure or Curse for the Crisis of Representative Democracy?
Researcher (PI) Jean-Benoit PILET
Host Institution (HI) UNIVERSITE LIBRE DE BRUXELLES
Call Details Consolidator Grant (CoG), SH2, ERC-2017-COG
Summary Evidence of a growing disengagement of citizens from politics is multiplying. Electoral turnout reaches historically low levels. Anti-establishment and populist parties are on the rise. Fewer and fewer Europeans trust their representative institutions. In response, we have observed a multiplication of institutional reforms aimed at revitalizing representative democracy. Two in particular stand out: the delegation of some political decision-making powers to (1) selected citizens and to (2) selected experts. But there is a paradox in attempting to cure the crisis of representative democracy by introducing such reforms. In representative democracy, control over political decision-making is vested in elected representatives. Delegating political decision-making to selected experts/citizens is at odds with this definition. It empowers the non-elected. If these reforms show that politics could work without elected officials, could we really expect that citizens’ support for representative democracy would be boosted and that citizens would re-engage with representative politics? In that sense, would it be a cure for the crisis of representative democracy, or rather a curse? Our central hypothesis is that there is no universal and univocal healing (or harming) effect of non-elected politics on support for representative democracy. In order to verify it, I propose to collect data across Europe on three elements: (1) a detailed study of the preferences of Europeans on how democracy should work and on institutional reforms towards non-elected politics, (2) a comprehensive inventory of all actual cases of empowerment of citizens and experts implemented across Europe since 2000, and (3) an analysis of the impact of exposure to non-elected politics on citizens’ attitudes towards representative democracy. An innovative combination of online survey experiments and panel surveys will be used to answer this topical research question with far-reaching societal implications.
Max ERC Funding
1 981 589 €
Duration
Start date: 2018-09-01, End date: 2023-08-31
Project acronym CutLoops
Project Loop amplitudes in quantum field theory
Researcher (PI) Ruth Britto
Host Institution (HI) THE PROVOST, FELLOWS, FOUNDATION SCHOLARS & THE OTHER MEMBERS OF BOARD OF THE COLLEGE OF THE HOLY & UNDIVIDED TRINITY OF QUEEN ELIZABETH NEAR DUBLIN
Call Details Consolidator Grant (CoG), PE2, ERC-2014-CoG
Summary The traditional formulation of relativistic quantum theory is ill-equipped to handle the range of difficult computations needed to describe particle collisions at the Large Hadron Collider (LHC) within a suitable time frame. Yet, recent work shows that probability amplitudes in quantum gauge field theories, such as those describing the Standard Model and its extensions, take surprisingly simple forms. The simplicity indicates deep structure in gauge theory that has already led to dramatic computational improvements, but remains to be fully understood. For precision calculations and investigations of the deep structure of gauge theory, a comprehensive method for computing multi-loop amplitudes systematically and efficiently must be found.
The goal of this proposal is to construct a new and complete approach to computing amplitudes from a detailed understanding of their singularities, based on prior successes of so-called on-shell methods combined with the latest developments in the mathematics of Feynman integrals. Scattering processes relevant to the LHC and to formal investigations of quantum field theory will be computed within the new framework.
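For orientation, the structure that on-shell methods exploit can be stated compactly at one loop: any four-dimensional one-loop amplitude reduces to a basis of scalar box, triangle and bubble integrals plus a rational remainder, with the coefficients fixed by unitarity cuts. This is the standard decomposition underlying cut-based techniques, not a formula specific to this proposal; extending such a systematic basis-plus-cuts picture to multi-loop integrals is the open problem the project addresses.

```latex
% Standard one-loop basis decomposition exploited by on-shell/unitarity methods
% (generic form; the multi-loop generalisation is what the proposal targets).
\[
  A_n^{\text{1-loop}}
  \;=\; \sum_i d_i\, I_4^{(i)}
  \;+\; \sum_j c_j\, I_3^{(j)}
  \;+\; \sum_k b_k\, I_2^{(k)}
  \;+\; R_n ,
\]
% I_4, I_3, I_2: scalar box, triangle and bubble master integrals;
% d_i, c_j, b_k: rational coefficients determined from cuts; R_n: rational remainder.
```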
Max ERC Funding
1 954 065 €
Duration
Start date: 2015-10-01, End date: 2020-09-30
Project acronym CUTS
Project Creative Undoing and Textual Scholarship: A Rapprochement between Genetic Criticism and Scholarly Editing
Researcher (PI) Dirk Van Hulle
Host Institution (HI) UNIVERSITEIT ANTWERPEN
Call Details Starting Grant (StG), SH5, ERC-2012-StG_20111124
Summary "In the past few decades, the disciplines of textual scholarship and genetic criticism have insisted on their respective differences. Nonetheless, a rapprochement would be mutually beneficial. The proposed research endeavours to innovate scholarly editing with the combined forces of these two disciplines. Since genetic criticism has objected to the subservient role of manuscript research in textual criticism, the proposed research suggests a reversal of roles: instead of employing manuscript research with a view to making an edition, an electronic edition can be designed in such a way that it becomes a tool for manuscript research and genetic criticism. The research hypothesis is that such a rapprochement can be achieved by means of an approach to textual variants that values creative undoing (ways of de-composing a text as an integral part of composition and literary invention) more than has hitherto been the case in textual scholarship. This change of outlook will be tested by means of the marginalia, notes and manuscripts of an author whose work is paradigmatic for genetic criticism: Samuel Beckett. His manuscripts will serve as a case study to determine the functions of creative undoing in the process of literary invention and its theoretical and practical implications for electronic scholarly editing and the genetic analysis of modern manuscripts. Extrapolating from this case study, the results are employed to tackle a topical issue in European textual scholarship. The envisaged rapprochement between the disciplines of genetic criticism and textual scholarship is the core of this proposal’s endeavour to advance the state of the art in these disciplines by giving shape to a new orientation within scholarly editing."
Max ERC Funding
1 147 740 €
Duration
Start date: 2013-01-01, End date: 2017-12-31
Project acronym CYRE
Project Cytokine Receptor Signaling Revisited: Implementing novel concepts for cytokine-based therapies
Researcher (PI) Jan Tavernier
Host Institution (HI) VIB
Call Details Advanced Grant (AdG), LS1, ERC-2013-ADG
Summary "Cytokine receptor signaling is an essential part of the intercellular communication networks that govern key physiological processes in the body. Cytokine dysfunction is associated with numerous pathologies including autoimmune disorders and cancer, and both cytokines and cytokine antagonists have found their way into the clinic. Yet, there are still many unfulfilled promises and opportunities. In this project we will reinvestigate key aspects of cytokine receptor activation and signaling using novel insights and techniques recently developed in our laboratory. This will include the AcTakine concept for cell-specific targeting of cytokine activity, and applications of our MAPPIT, KISS and Virotrap toolboxes to systematically map protein interactions involved in cytokine signaling. We expect to obtain important new insights, both in fundamental and in applied medical sciences."
Max ERC Funding
2 487 728 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym DAFINET
Project Dynamic Attitude Fixing: A novel theory of opinion dynamics in social networks and its implications for computational propaganda in hybrid social networks (containing humans and bots)
Researcher (PI) Michael QUAYLE
Host Institution (HI) UNIVERSITY OF LIMERICK
Call Details Starting Grant (StG), SH3, ERC-2018-STG
Summary Understanding the coordination of attitudes in societies is vitally important for many disciplines and global social challenges. Network opinion dynamics are poorly understood, especially in hybrid networks where automated (bot) agents seek to influence economic or political processes (e.g. USA: Trump vs Clinton; UK: Brexit). A dynamic fixing theory of attitudes is proposed, premised on three features of attitudes demonstrated in ethnomethodology and social psychology: that people 1) simultaneously hold a repertoire of multiple (sometimes ambivalent) attitudes, 2) express attitudes to enact social identity, and 3) are accountable for attitude expression in interaction. It is proposed that interactions between agents generate symbolic links between attitudes, with the emergent social-symbolic structure generating perceived ingroup similarity and outgroup difference in a multilayer network. Thus attitudes can become dynamically fixed when constellations of attitudes are locked into identities via multilayer networks of attitude agreement and disagreement; a process intensified by conflict, threat or zero-sum partisan processes (e.g. elections/referenda). Agent-based simulations will validate the theory and explore the hypothesized channels of bot influence. Network experiments with human and hybrid networks will test theoretically derived hypotheses. Observational network studies will assess model fit using historical Twitter data. Results will provide a social-psychological-network theory for attitude dynamics and vulnerability to computational propaganda in hybrid networks.
The theory will explain:
(a) when and how consensus can propagate rapidly through networks (since identity processes fix attitudes already contained within repertoires);
(b) limits of identity-related attitude propagation (since attitudes outside of repertoires will not be easily adopted); and
(c) how attitudes can often ‘roll back’ after events (since contextual changes ‘unfix’ attitudes).
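To make the proposed mechanism concrete, a minimal agent-based sketch follows. It assumes two identity groups, a small per-issue repertoire of positions, and simple pairwise rules (align with ingroup partners only if their position is already in one's repertoire; differentiate from outgroup partners when an alternative exists); none of these modelling choices come from the proposal, which leaves the formal model to be developed.

```python
# Minimal, illustrative sketch of "dynamic attitude fixing" (assumptions noted above).
import random

N_AGENTS, N_ISSUES, N_STEPS = 100, 5, 2000
random.seed(1)

class Agent:
    def __init__(self, group):
        self.group = group
        # repertoire: positions the agent can defensibly express on each issue
        self.repertoire = [set(random.sample([-1, 0, 1], k=2)) for _ in range(N_ISSUES)]
        # currently expressed attitude, drawn from the repertoire
        self.expressed = [random.choice(sorted(r)) for r in self.repertoire]

agents = [Agent(group=i % 2) for i in range(N_AGENTS)]

def step():
    a, b = random.sample(agents, 2)
    issue = random.randrange(N_ISSUES)
    if a.group == b.group:
        # ingroup interaction: adopt the partner's position only if it is in the repertoire
        if b.expressed[issue] in a.repertoire[issue]:
            a.expressed[issue] = b.expressed[issue]
    else:
        # outgroup interaction: differentiate if an alternative position is available
        alternatives = a.repertoire[issue] - {b.expressed[issue]}
        if alternatives:
            a.expressed[issue] = random.choice(sorted(alternatives))

for _ in range(N_STEPS):
    step()

def group_alignment(group):
    """Mean share of agents expressing their group's modal position, averaged over issues."""
    members = [ag for ag in agents if ag.group == group]
    shares = []
    for i in range(N_ISSUES):
        positions = [ag.expressed[i] for ag in members]
        modal = max(set(positions), key=positions.count)
        shares.append(positions.count(modal) / len(members))
    return sum(shares) / N_ISSUES

print("ingroup alignment, group 0:", round(group_alignment(0), 2))
print("ingroup alignment, group 1:", round(group_alignment(1), 2))
```

Even with these toy rules, repeated interaction tends to raise within-group alignment while leaving positions outside an agent's repertoire unadopted, which is the qualitative pattern points (a) and (b) above describe.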
Max ERC Funding
1 499 925 €
Duration
Start date: 2018-12-01, End date: 2023-11-30
Project acronym DaMon
Project Datacenter Monitoring for Improving Insight and Efficiency
Researcher (PI) Lieven EECKHOUT
Host Institution (HI) UNIVERSITEIT GENT
Call Details Proof of Concept (PoC), PC1, ERC-2012-PoC
Summary We have developed a novel monitoring mechanism for the datacenter that has the potential to be a holistic monitoring solution that will help optimize day-to-day operations, identify business opportunities, reduce operational expenses, and reduce carbon footprint. The key idea is to monitor the types and the latencies of the requests coming into the datacenter, and to correlate the request stream with the hardware resources used in the datacenter. Although conceptually simple, this monitoring solution provides a wealth of invaluable information and creates a number of opportunities, on both the technical and the business side, which no other existing monitoring solution offers.
In this project, we will study the broader market to determine where to exploit the technology, ranging from web application companies, to datacenter and cloud providers, to datacenter owners; define business needs and requirements; evaluate the technology across market segments with friendly customers; and develop an IP portfolio and business plan. The end result of the project is to create the essential conditions to start the early commercial stage of this idea.
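A minimal sketch of the core monitoring idea follows, assuming hypothetical request types and a single CPU-utilisation metric: per-type request latencies are collected over monitoring windows and correlated with the hardware resource usage observed in the same windows. The concrete metrics, request taxonomy and infrastructure are not specified in the summary above.

```python
# Illustrative only: correlate per-request-type latency with a hardware metric
# sampled over the same monitoring windows (data and names are hypothetical).
from collections import defaultdict
from statistics import correlation, mean  # statistics.correlation requires Python 3.10+

# (request_type, latency_ms, cpu_utilisation_percent) per monitoring window
samples = [
    ("search", 42.0, 61.0), ("search", 55.0, 78.0), ("search", 38.0, 54.0),
    ("checkout", 120.0, 80.0), ("checkout", 95.0, 65.0), ("checkout", 140.0, 91.0),
    ("search", 60.0, 83.0), ("checkout", 101.0, 70.0),
]

by_type = defaultdict(lambda: ([], []))
for req_type, latency, cpu in samples:
    lat, util = by_type[req_type]
    lat.append(latency)
    util.append(cpu)

for req_type, (lat, util) in by_type.items():
    print(f"{req_type:8s} mean latency {mean(lat):6.1f} ms, "
          f"latency/CPU correlation {correlation(lat, util):+.2f}")
```

A strong positive correlation for one request type but not another would, in this toy setting, point to the resource that bounds that type of request, which is the kind of insight the monitoring mechanism aims to surface.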
Max ERC Funding
149 880 €
Duration
Start date: 2012-11-01, End date: 2013-10-31
Project acronym DAMONA
Project Mutation and Recombination in the Cattle Germline: Genomic Analysis and Impact on Fertility
Researcher (PI) Michel Alphonse Julien Georges
Host Institution (HI) UNIVERSITE DE LIEGE
Call Details Advanced Grant (AdG), LS2, ERC-2012-ADG_20120314
Summary "Mutation and recombination are fundamental biological processes that determine adaptability of populations. The mutation rate reflects the equilibrium between the need to adapt, the burden of mutation load, the “cost of fidelity”, and random drift that determines a lower limit in achievable fidelity. Recombination fulfills an essential mechanistic role during meiosis, ensuring proper chromosomal segregation. Recombination affects the rate of creation and loss of favorable haplotypes, imposing 2nd-order selection pressure on modifiers of recombination.
It is becoming apparent that recombination and mutation rates vary between individuals, and that these differences are in part inherited. Both processes are therefore “evolvable”, and amenable to genomic analysis. Identifying genetic determinants underlying these differences will provide insights in the regulation of mutation and recombination. The mutational load, and in particular the number of lethal equivalents per individual, remains poorly defined as epidemiological and molecular data yield estimates that differ by one order of magnitude. A relationship between recombination and fertility has been reported in women but awaits confirmation.
Population structure (small effective population size; large harems), phenotypic data collection (systematic recording of > 50 traits on millions of cows), and large-scale SNP genotyping (for genomic selection), make cattle populations uniquely suited for genetic analysis. DAMONA proposes to exploit these unique resources, combined with recent advances in next generation sequencing and genotyping, to:
(i) quantify and characterize inter-individual variation in male and female mutation and recombination rates,
(ii) map, fine-map and identify causative genes underlying QTL for these four phenotypes,
(iii) test the effect of loss-of-function variants on >50 traits including fertility, and
(iv) study the effect of variation in recombination on fertility.
Max ERC Funding
2 258 000 €
Duration
Start date: 2013-03-01, End date: 2018-02-28
Project acronym DANCER
Project DAtacommunications based on NanophotoniC Resonators
Researcher (PI) John William Whelan-Curtin
Host Institution (HI) CORK INSTITUTE OF TECHNOLOGY
Call Details Starting Grant (StG), PE7, ERC-2013-StG
Summary A key challenge for the 21st century is to provide billions of people with the means to access, move and manipulate what have become huge volumes of information. The environmental and economic implications are becoming serious, making energy-efficient data communications key to the operation of today’s society.
In this project, the Principal Investigator will develop a new framework for optical interconnects and provide a common platform that spans fibre-to-the-home, chip-to-chip links, and even global on-chip interconnects. The project is based on the efficient coupling of Photonic Crystal resonators with the outside world. These provide the ultimate confinement of light in both space and time, allowing orders-of-magnitude improvements in performance relative to the state of the art, yet in a simpler system: the innovator’s dream. New versions of the key components of optical links (light sources, modulators and photo-detectors) will be realised in this new framework, providing a new paradigm for energy-efficient communication.
Max ERC Funding
1 495 450 €
Duration
Start date: 2013-12-01, End date: 2019-05-31
Project acronym DARWIN
Project Deep mm-Wave RF-CMOS Integrated Circuits
Researcher (PI) Michel Steyaert
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Advanced Grant (AdG), PE7, ERC-2008-AdG
Summary Wireless and mobile communication systems have become an important part of our daily environment. Since the introduction of the GSM network in the early nineties, different wireless applications such as WiFi, Bluetooth, GPS, etc. have been brought to the market. This has become possible due to the high integration of integrated circuits in relatively cheap technologies. Besides the digital signal processing, those wireless applications require complex analog circuits operating at very high frequencies (RF circuits). In the early days these were implemented as discrete components or standalone ICs in expensive technologies such as GaAs, InP and SiGe. Due to the research towards nanometer CMOS technologies, and due to improved RF circuit techniques, RF-CMOS has been introduced since the mid nineties. The intention of this research project is to take the next big leap forward in wireless applications, i.e. the exploration and research, based on the vast RF-CMOS knowledge already existing, towards the Extremely High Frequency range, above 70 GHz and up to 300 GHz, with wavelengths close to 1 mm. The research project is a logical evolution of the team's RF-CMOS research knowledge. For that reason, the "natural evolution" acronym DARWIN (Deep mm-Wave RF CMOS Integrated Circuits, with the M of CMOS inverted to a W) was chosen. Implementing circuit techniques in standard CMOS technologies at those frequencies is again an enormous challenge and will open up many new opportunities and applications in the future, thanks to possibilities in safety monitoring (e.g. collision radar detection for automobiles at 77 GHz), the need for high data-rate telecommunication systems with capacities of 1-10 Gbps, and imaging for medical and security systems. The goal of the proposed project is to perform the necessary fundamental basic research to be able to implement these 70-300 GHz applications in CMOS technology (45 nm and below).
Max ERC Funding
2 042 640 €
Duration
Start date: 2009-01-01, End date: 2013-12-31
Project acronym DBSModel
Project Multiscale Modelling of the Neuromuscular System for Closed Loop Deep Brain Stimulation
Researcher (PI) Madeleine Mary Lowery
Host Institution (HI) UNIVERSITY COLLEGE DUBLIN, NATIONAL UNIVERSITY OF IRELAND, DUBLIN
Call Details Consolidator Grant (CoG), PE7, ERC-2014-CoG
Summary Deep brain stimulation (DBS) is an effective therapy for treating the symptoms of Parkinson’s disease (PD). Despite its success, the mechanisms of DBS are not understood, and there is a need to improve DBS to support long-term stimulation in a wider patient population, limit side-effects, and extend battery life. Currently DBS operates in ‘open-loop’, with stimulus parameters empirically set. Closed-loop DBS, which adjusts parameters based on the state of the system, has the potential to overcome current limitations to increase therapeutic efficacy while reducing side-effects, costs and energy. Several key questions need to be addressed before closed-loop DBS can be implemented clinically.
This research will develop a new multiscale model of the neuromuscular system for closed-loop DBS. The model will simulate neural sensing and stimulation on a scale not previously considered, encompassing the electric field around the electrode, the effect on individual neurons and neural networks, and the generation of muscle force. This will involve integration across multiple temporal and spatial scales, in a complex system with incomplete knowledge of system variables. Experiments will be conducted to validate the model, and to identify new biomarkers of neural activity that can be used with signals from the brain to enable continuous symptom monitoring. The model will be used to design a new control strategy for closed-loop DBS that can accommodate the nonlinear nature of the system, and short- and long-term changes in system behavior.
Though challenging, this research will provide new insights into the changes that take place in PD and the mechanisms by which DBS exerts its therapeutic influence. This knowledge will be used to design a new strategy for closed-loop DBS, ready for testing in patients, with the potential to significantly improve patient outcomes in PD and fundamentally change the way in which implanted devices utilise electrical stimulation to modulate neural activity.
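As an illustration of the feedback structure described above (not the control strategy the project will design), the sketch below assumes a hypothetical beta-band power biomarker and a simple proportional-integral rule that adjusts stimulation amplitude toward a target biomarker level; the "plant response" is a toy stand-in for the multiscale model.

```python
# Illustrative closed-loop DBS sketch: biomarker in, stimulation amplitude out.
# Biomarker choice, gains and plant dynamics are placeholders, not the proposal's method.
from dataclasses import dataclass

@dataclass
class PIController:
    kp: float             # proportional gain
    ki: float             # integral gain
    target: float         # desired biomarker level
    amp_min: float = 0.0  # stimulation amplitude limits (mA)
    amp_max: float = 4.0
    _integral: float = 0.0

    def update(self, biomarker: float, dt: float) -> float:
        error = biomarker - self.target
        self._integral += error * dt
        amplitude = self.kp * error + self.ki * self._integral
        return min(self.amp_max, max(self.amp_min, amplitude))  # clamp to safe range

# toy biomarker trace: elevated beta power that stimulation gradually suppresses
controller = PIController(kp=2.0, ki=0.5, target=1.0)
beta_power, dt = 3.0, 0.1
for step in range(50):
    amplitude = controller.update(beta_power, dt)
    # hypothetical plant response: stimulation lowers beta power, which otherwise drifts up
    beta_power += dt * (-0.8 * amplitude + 0.5 * (3.0 - beta_power))
print(f"final beta power {beta_power:.2f}, final amplitude {amplitude:.2f} mA")
```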
Max ERC Funding
1 999 474 €
Duration
Start date: 2015-08-01, End date: 2020-07-31
Project acronym DC_Nutrient
Project Investigating nutrients as key determinants of DC-induced CD8 T cell responses
Researcher (PI) David FINLAY
Host Institution (HI) THE PROVOST, FELLOWS, FOUNDATION SCHOLARS & THE OTHER MEMBERS OF BOARD OF THE COLLEGE OF THE HOLY & UNDIVIDED TRINITY OF QUEEN ELIZABETH NEAR DUBLIN
Call Details Consolidator Grant (CoG), LS6, ERC-2017-COG
Summary A new immunoregulatory axis has emerged in recent years demonstrating that cellular metabolism is crucial in controlling immune responses. This regulatory axis is acutely sensitive to nutrients that fuel metabolic pathways and support nutrient-sensitive signalling pathways. My recent research demonstrates that nutrients are dynamically controlled and are not equally available to all immune cells. The data shows that activated T cells, clustered around a dendritic cell (DC), can consume the available nutrients, leaving the DC nutrient deprived in vitro. This local regulation of the DC nutrient microenvironment by neighbouring cells has profound effects on DC function and T cell responses. Nutrient-deprived DC have altered signalling (decreased mTORC1 activity), increased pro-inflammatory functions (IL12 and costimulatory molecule expression) and induce enhanced T cell responses (proliferation, IFNγ production). However, proving this, particularly in vivo, is a major challenge as the tools to investigate nutrient dynamics within complex microenvironments have not yet been developed. This research programme will generate innovative new technologies that allow the local distribution of glucose, glutamine and leucine (all of which control mTORC1 signalling) to be visualised and quantified. These technologies will pioneer a new era of in vivo nutrient analysis. Nutrient deprivation of antigen-presenting DC will then be investigated (using our new technologies) in response to various stimuli within the inflammatory lymph node and correlated to CD8 T cell responses. We will generate state-of-the-art transgenic mice to specifically knock down nutrient transporters for glucose, glutamine, or leucine in DC to definitively prove that the availability of these nutrients to antigen-presenting DC is a key mechanism for controlling CD8 T cell responses. This would be a paradigm-shifting discovery that would open new horizons for the study of nutrient-regulated immune responses.
Max ERC Funding
1 995 861 €
Duration
Start date: 2018-05-01, End date: 2023-04-30
Project acronym DCRIDDLE
Project A novel physiological role for IRE1 and RIDD..., maintaining the balance between tolerance and immunity?
Researcher (PI) Sophie Janssens
Host Institution (HI) VIB
Call Details Consolidator Grant (CoG), LS3, ERC-2018-COG
Summary Dendritic cells (DCs) play a crucial role as gatekeepers of the immune system, coordinating the balance between protective immunity and tolerance to self antigens. What determines the switch between immunogenic versus tolerogenic antigen presentation remains one of the most puzzling questions in immunology. My team recently discovered an unanticipated link between a conserved stress response in the endoplasmic reticulum (ER) and tolerogenic DC maturation, thereby setting the stage for new insights in this fundamental branch in immunology.
Specifically, we found that one of the branches of the unfolded protein response (UPR), the IRE1/XBP1 signaling axis, is constitutively active in murine dendritic cells (cDC1s), without any signs of an overt UPR gene signature. Based on preliminary data we hypothesize that IRE1 is activated by apoptotic cell uptake, orchestrating a metabolic response from the ER to ensure tolerogenic antigen presentation. This entirely novel physiological function for IRE1 entails a paradigm shift in the UPR field, as it reveals that IRE1’s functions might stretch far from its well-established function induced by chronic ER stress. The aim of my research program is to establish whether IRE1 in DCs is the hitherto elusive switch between tolerogenic and immunogenic maturation. To this end, we will dissect its function in vivo both in steady-state conditions and in conditions of danger (viral infection models). In line with our data, IRE1 has recently been identified as a candidate gene for autoimmune disease based on Genome Wide Association Studies (GWAS). Therefore, I envisage that my research program will not only have a large impact on the field of DC biology and apoptotic cell clearance, but will also yield new insights in diseases like autoimmunity, graft versus host disease or tumor immunology, all associated with disturbed balances between tolerogenic and immunogenic responses.
Max ERC Funding
1 999 196 €
Duration
Start date: 2019-02-01, End date: 2024-01-31
Project acronym DelCancer
Project The role of loss-of-heterozygosity in cancer development and progression
Researcher (PI) Anna Sablina
Host Institution (HI) VIB
Call Details Starting Grant (StG), LS4, ERC-2012-StG_20111109
Summary Somatically acquired loss-of-heterozygosity (LOH) is extremely common in cancer; deletions of recessive cancer genes, miRNAs, and regulatory elements can confer selective growth advantage, whereas deletions over fragile sites are thought to reflect an increased local rate of DNA breakage. However, most LOHs in cancer genomes remain unexplained. Here we plan to combine TALEN technology with experimental models of cell transformation derived from primary human cells to delete specific chromosomal regions that are frequently lost in cancer samples. The development of novel strategies to introduce large chromosomal rearrangements into the genome of primary human cells will offer new perspectives for studying gene function, for elucidating chromosomal organisation, and for increasing our understanding of the molecular mechanisms and pathways underlying cancer development. Using this technology to genetically engineer cells that model cancer-associated genetic alterations, we will identify LOH regions critical for the development and progression of human cancers, and will investigate the cooperative effect of loss of genes, non-coding RNAs, and regulatory elements located within the deleted regions on cancer-associated phenotypes. We will assess how disruption of the three-dimensional chromosomal network in cells with specific chromosomal deletions contributes to cell transformation. Isogenic cell lines harbouring targeted chromosomal alterations will also serve us as a platform to identify compounds with specificity for particular genetic abnormalities. As a next step, we plan to unravel the mechanisms by which particular homozygous deletions contribute to cancer-associated phenotypes. If successful, the results of these studies will represent an important step towards understanding oncogenesis, and could yield new diagnostic and prognostic markers as well as identify potential therapeutic targets.
Max ERC Funding
1 498 764 €
Duration
Start date: 2012-10-01, End date: 2017-09-30
Project acronym DEMIURGE
Project Automatic Design of Robot Swarms
Researcher (PI) Mauro Birattari
Host Institution (HI) UNIVERSITE LIBRE DE BRUXELLES
Call Details Consolidator Grant (CoG), PE6, ERC-2015-CoG
Summary The scope of this project is the automatic design of robot swarms. Swarm robotics is an appealing approach to the coordination of large groups of robots. Up to now, robot swarms have been designed via some labor-intensive process.
My goal is to advance the state of the art in swarm robotics by developing the DEMIURGE: an intelligent system that is able to design and realize robot swarms in a totally integrated and automatic way.
The DEMIURGE is a novel concept. Starting from requirements expressed in a specification language that I will define, the DEMIURGE will design all aspects of a robot swarm - hardware and control software.
The DEMIURGE will cast a design problem into an optimization problem and will tackle it in a computation-intensive way. In this project, I will study different control software structures, optimization algorithms, ways to specify requirements, validation protocols, on-line adaptation mechanisms and techniques for re-design at run time.
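As a rough illustration of casting design as optimisation, the sketch below scores candidate controller parameters with a stand-in simulator and improves them with a simple random-restart hill climber. The specification language, simulator and optimisation machinery are all placeholders for what the project will actually define; only the overall design-as-search structure is intended.

```python
# Illustrative "automatic design as optimisation" loop (all names are placeholders).
import random

random.seed(0)

def simulate_swarm(params):
    """Stand-in for a swarm simulation: returns a performance score for a
    candidate controller described by `params` (higher is better)."""
    # hypothetical objective: get close to a target parameter configuration
    return -sum((p - t) ** 2 for p, t in zip(params, (0.3, 0.7, 0.5)))

def automatic_design(n_params=3, restarts=20, steps=200, sigma=0.05):
    best_params, best_score = None, float("-inf")
    for _ in range(restarts):
        params = [random.random() for _ in range(n_params)]
        score = simulate_swarm(params)
        for _ in range(steps):
            # propose a small perturbation of the current candidate, clamped to [0, 1]
            candidate = [min(1.0, max(0.0, p + random.gauss(0, sigma))) for p in params]
            cand_score = simulate_swarm(candidate)
            if cand_score > score:  # keep only improving candidates
                params, score = candidate, cand_score
        if score > best_score:
            best_params, best_score = params, score
    return best_params, best_score

params, score = automatic_design()
print("best controller parameters:", [round(p, 3) for p in params], "score:", round(score, 4))
```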
Max ERC Funding
2 000 000 €
Duration
Start date: 2016-10-01, End date: 2021-09-30
Project acronym DEVHEALTH
Project UNDERSTANDING HEALTH ACROSS THE LIFECOURSE: AN INTEGRATED DEVELOPMENTAL APPROACH
Researcher (PI) James Joseph Heckman
Host Institution (HI) UNIVERSITY COLLEGE DUBLIN, NATIONAL UNIVERSITY OF IRELAND, DUBLIN
Call Details Advanced Grant (AdG), SH1, ERC-2010-AdG_20100407
Summary This proposal seeks support for a research group led by James Heckman of the Geary Institute at University College Dublin to produce an integrated developmental approach to health that studies the origins and the evolution of health inequalities over the lifecourse and across generations, and the role played by cognition, personality, genes, and environments. Major experimental and nonexperimental international datasets will be analyzed. A practical guide to implementing related policy will be produced. We will build a science of human development that draws on, extends, and unites research on the biology and epidemiology of health disparities with medical economics and the economics of skill formation. The goal is to develop an integrated framework to jointly model the economic, social and biological mechanisms that produce the evolution and the intergenerational transmission of health and of the capabilities that foster health. The following tasks will be undertaken: (1) We will quantify the importance of early-life conditions in explaining the existence of health disparities across the lifecourse. (2) We will understand how health inequalities are transmitted across generations. (3) We will assess the health benefits from early childhood interventions. (4) We will examine the role of genes and environments in the aetiology and evolution of disease. (5) We will analyze how health inequalities emerge and evolve across the lifecourse. (6) We will give biological foundations to both our models and the health measures we will use. The proposed research will investigate causal channels for promoting health. It will compare the relative effectiveness of interventions at various stages of the life cycle and the benefits and costs of later remediation if early adversity is not adequately eliminated. It will guide the design of current and prospective experimental and longitudinal studies and policy formulation, and will train young scholars in frontier methods of research.
Max ERC Funding
2 505 222 €
Duration
Start date: 2011-05-01, End date: 2016-04-30
Project acronym DigitalMemories
Project We are all Ayotzinapa: The role of Digital Media in the Shaping of Transnational Memories on Disappearance
Researcher (PI) Silvana Mandolessi
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Starting Grant (StG), SH5, ERC-2015-STG
Summary The project seeks to study the role of digital media in the shaping of transnational memories on disappearance. It investigates a novel case that is still taking shape: the disappearance of 43 students in Mexico in September 2014. The role of the new media in capturing citizens’ attention and in marking a “turning point” was crucial to the upsurge of a counter-movement against the Mexican government and qualifies the event as significant for the transnational arena.
The groundbreaking aspect of the project consists in proposing a double approach:
a) a theoretical approach in which “disappearance” is considered a particular crime that becomes a model for analyzing digital memory. Disappearance is a technology that produces a subject with a new ontological status: the disappeared are non-beings, because they are neither alive nor dead. This ontological status transgresses the clear boundaries separating life and death, past, present and future, materiality and immateriality, and personal and collective spheres. “Digital memory”, i.e. a memory mediated by digital technology, is also determined by the transgression of the boundaries of given categories.
b) a multidisciplinary approach situating Mexico’s case in a long transnational history of disappearance in the Hispanic World, including Argentina and Spain. This longer history makes it possible to compare disappearance as a mnemonic object developed in the global sphere – on social network sites such as blogs, Facebook, Twitter and YouTube – in Mexico with the social performances and artistic representations – literature, photo exhibitions, and films – developed in Spain and Argentina.
The Mexican case represents a paradigm for the redefinition of the relationship between media and memory. The main output of the project will consist in constructing a theoretical model for analyzing digital mnemonic objects in the rise of networked social movements with a transnational scope.
Max ERC Funding
1 444 125 €
Duration
Start date: 2016-07-01, End date: 2021-06-30
Project acronym DISPATCH Neuro-Sense
Project Distributed Signal Processing Algorithms for Chronic Neuro-Sensor Networks
Researcher (PI) Alexander BERTRAND
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Starting Grant (StG), PE6, ERC-2018-STG
Summary The possibility to chronically monitor the brain 24/7 in daily-life activities would revolutionize human-machine interactions and health care, e.g., in the context of neuroprostheses, neurological disorders, and brain-computer interfaces (BCI). Such chronic systems must satisfy challenging energy and miniaturization constraints, leading to modular designs in which multiple networked miniature neuro-sensor modules form a ‘neuro-sensor network’ (NSN).
However, current multi-channel neural signal processing (NSP) algorithms were designed for traditional neuro-sensor arrays with central access to all channels. These algorithms are not suited for NSNs, as they require unrealistic bandwidth budgets to centralize the data, yet a joint neural data analysis across NSN modules is crucial.
The central idea of this project is to remove this algorithm bottleneck by designing novel scalable, distributed NSP algorithms to let the modules of an NSN jointly process the recorded neural data through in-network data fusion and with a minimal exchange of data.
To guarantee impact, we mainly focus on establishing a new non-invasive NSN concept based on electroencephalography (EEG). By combining multiple ‘smart’ mini-EEG modules into an ‘EEG sensor network’ (EEG-Net), we compensate for the lack of spatial information captured by current stand-alone mini-EEG devices, without compromising on ‘wearability’. Equipping such EEG-Nets with distributed NSP algorithms will make it possible to process high-density EEG data at viable energy levels, which is a game changer towards high-performance chronic EEG for, e.g., epilepsy monitoring, neuroprostheses, and BCI.
We will validate these claims in an EEG-Net prototype in the above 3 use cases, benefiting from ongoing collaborations with the KUL university hospital. In addition, to demonstrate the general applicability of our novel NSP algorithms, we will validate them in other emerging NSN types as well, such as modular or untethered neural probes.
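To illustrate the kind of in-network fusion with minimal data exchange described above, here is a hedged Python sketch: each module compresses its multi-channel recording to a single signal and only that compressed signal is shared and fused. The compression rule (first principal component) and the averaging fusion are generic stand-ins, not the project's algorithms; all names are hypothetical.

import numpy as np

def local_compress(node_data):
    # Each node reduces its multi-channel recording to one compressed channel
    # (its first principal component), so only one signal per node has to be
    # exchanged instead of all raw channels.
    centered = node_data - node_data.mean(axis=1, keepdims=True)
    u, _, _ = np.linalg.svd(centered, full_matrices=False)
    comp = u[:, 0] @ centered
    if comp @ centered.mean(axis=0) < 0:  # fix the arbitrary sign of the PC
        comp = -comp
    return comp

def network_fusion(nodes):
    # Fuse the compressed per-node signals into a joint estimate by simple
    # averaging, a placeholder for adaptive in-network fusion rules.
    compressed = np.vstack([local_compress(d) for d in nodes])
    return compressed.mean(axis=0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    source = np.sin(np.linspace(0, 20, 1000))                      # common neural source
    nodes = [source + 0.5 * rng.standard_normal((8, 1000)) for _ in range(4)]
    fused = network_fusion(nodes)
    print("correlation with source:", np.corrcoef(fused, source)[0, 1])

The point of the sketch is the bandwidth argument: four 8-channel modules exchange four signals rather than thirty-two, yet the fused estimate still tracks the shared source.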
Max ERC Funding
1 489 656 €
Duration
Start date: 2019-01-01, End date: 2023-12-31
Project acronym DIVISIONPLANESWITCH
Project Control mechanisms that pattern microtubules for switching cell division planes during plant morphogenesis
Researcher (PI) Pankaj Bacharam Dhonukshe
Host Institution (HI) VIB
Call Details Starting Grant (StG), LS3, ERC-2012-StG_20111109
Summary Oriented cell divisions dictate morphogenesis by shaping the tissues and organs of multicellular organisms. Oriented cell divisions have a profound influence in plants because cell positions are locked in place by shared cell walls. A relay of cell divisions involving precise division plane switches determines the embryonic body plan, organ layout and organ architecture in plants. Cell division planes in plants are specified by reorganization of the premitotic cortical microtubule array, and how this occurs is a long-standing key question.
My recent results establish, for the first time in plants, an in vivo inducible and traceable, precise 90° cell division plane switch system. With this system I identified a pathway that proceeds from transcriptional activation through a signaling module all the way to the activation of microtubule regulators that orchestrate switches in premitotic microtubule organization and cell division planes. My findings provide a first paradigm in plants of how genetic circuitry patterns cell division planes by feeding into the cellular machinery, and pave the way for unraveling mechanistic control of the cell division plane switch.
By establishing a precise cell division plane switch system I am in a unique position to answer:
1. What transcriptional program and molecular players control premitotic microtubule reorganization?
2. Which mechanisms switch the premitotic microtubule array?
3. What influence do identified players and mechanisms have on different types of oriented cell divisions in plants?
For this I propose a systematic research plan combining (i) forward genetics and expression profile screens for identifying a suite of microtubule regulators, (ii) state-of-the-art microscopy and modeling approaches for uncovering mechanisms of their actions and (iii) their tissue-specific manipulations to modify plant form.
By unraveling these players and mechanisms, this proposal will resolve the regulation of oriented cell divisions and expand the plant engineering toolbox.
Max ERC Funding
1 500 000 €
Duration
Start date: 2013-01-01, End date: 2017-12-31
Project acronym DOSE
Project Dosage sensitive genes in evolution and disease
Researcher (PI) Aoife Mclysaght
Host Institution (HI) THE PROVOST, FELLOWS, FOUNDATION SCHOLARS & THE OTHER MEMBERS OF BOARD OF THE COLLEGE OF THE HOLY & UNDIVIDED TRINITY OF QUEEN ELIZABETH NEAR DUBLIN
Call Details Starting Grant (StG), LS8, ERC-2012-StG_20111109
Summary Evolutionary change of gene copy number through gene duplication is a relatively pervasive phenomenon in eukaryotic genomes. However, for a subset of genes such changes are deleterious because they result in imbalances in the cell. Such dosage-sensitive genes have been increasingly implicated in disease, particularly through the association of copy number variants (CNVs) with pathogenicity.
In my lab we have previously discovered that many genes in the human genome which were retained after whole genome duplication (WGD) are refractory to gene duplication both over evolutionary timescales and within populations. These are expected characteristics of dosage-balanced genes. Many of these genes are implicated in human disease. I now propose to take a computational (dry-lab) approach to examine the evolution of dosage-balanced genes further and to develop a sophisticated model of evolutionary constraint of copy number. These models will enable the identification of dosage-balanced genes and their consideration as novel candidate disease loci.
Recognising and interpreting patterns of constraint is the cornerstone of molecular evolution. Through careful analysis of genome sequences with respect to gene duplication over evolutionary times and within populations, we will develop a formal and generalised model of copy-number evolution and constraint. We will use these models to identify candidate disease loci within pathogenic CNVs. We will also study the characteristics of known disease genes in order to identify novel candidate loci for dosage-dependent disease.
This is an ambitious and high-impact project that has the potential to yield major insights into gene copy-number constraint and its relationship to complex disease.
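The core modelling idea, quantifying how refractory a gene is to copy-number change, can be sketched very simply: compare the duplications observed for a gene with the number expected under a neutral model, and ask how surprising a depletion is under a Poisson null. The sketch below is a hypothetical illustration of that logic only; the function names, the Poisson null and the numbers are assumptions, not the project's model.

from math import exp, factorial

def poisson_sf(k, lam):
    # P(X >= k) for X ~ Poisson(lam)
    return 1.0 - sum(exp(-lam) * lam ** i / factorial(i) for i in range(k))

def duplication_constraint(observed_dups, expected_dups):
    # Toy copy-number constraint score: observed/expected duplication ratio,
    # plus the Poisson probability of seeing this few (or fewer) duplications
    # under the neutral expectation. Low ratio and low p suggest dosage constraint.
    ratio = observed_dups / expected_dups if expected_dups else float("nan")
    p_depleted = 1.0 - poisson_sf(observed_dups + 1, expected_dups)  # P(X <= observed)
    return ratio, p_depleted

if __name__ == "__main__":
    # hypothetical gene: 1 duplication observed where 9 were expected
    print(duplication_constraint(observed_dups=1, expected_dups=9.0))

A full model would of course have to estimate the expectation from gene length, mutation rate and population sampling rather than assume it, which is where the proposed statistical work lies.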
Max ERC Funding
1 358 534 €
Duration
Start date: 2013-01-01, End date: 2018-12-31
Project acronym DOUBLE EXPRESS
Project Gene expression level as a keystone to understanding gene duplication: evolutionary constraints, opportunities, and disease
Researcher (PI) Aoife MCLYSAGHT
Host Institution (HI) THE PROVOST, FELLOWS, FOUNDATION SCHOLARS & THE OTHER MEMBERS OF BOARD OF THE COLLEGE OF THE HOLY & UNDIVIDED TRINITY OF QUEEN ELIZABETH NEAR DUBLIN
Call Details Consolidator Grant (CoG), LS8, ERC-2017-COG
Summary Duplicate genes are important in disease, are a major source of evolutionary novelty, and for many years we thought we understood them. We thought that duplication relieved selective constraints. We thought that gene knockout neutrality was due to redundancy. We thought that a duplicate is a duplicate is a duplicate. Evidence is accumulating that challenges each of these views. Rather than being the result of an unbiased process, the genes that tend to duplicate in our genome and others are quickly evolving, non-essential genes, irrespective of current duplication status. Conversely, genes retained after whole genome duplication (WGD) are slowly evolving, important genes.
I propose that different resolution of the evolutionary constraints imposed by the demands of gene expression can explain these contrasting relationships. I propose that the opposing constraints on gene-by-gene duplications as compared to WGD channel these different sets of genes into remarkably different evolutionary trajectories. In particular, in much the same way that individual gene duplication creates an opportunity for the evolution of a new gene, the co-evolution of expression of sets of interacting genes after WGD creates an opportunity for the evolution of new biochemical pathways and protein complexes. Furthermore, I suggest a common mechanism of pathogenicity for many duplication events independent of the biochemical function of the encoded genes.
With the availability of abundant high-quality genomics data, now is an opportune time to address these questions. Primarily through computational and statistical analysis I will reveal the relationship between gene duplication and expression and test a model that the indirect costs of gene expression are a major determinant of the outcome of gene duplication. I will explore the effects this has on gene and genome evolution. Finally, I will link the patterns of gene expression and duplicability to pathogenic effects.
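One way to picture the expression-cost hypothesis is a toy population-genetic calculation: duplicating a gene adds an expression cost proportional to its expression level, so the fixation probability of the new duplicate (here via Kimura's standard formula) falls as expression rises. This is a hedged illustration of the reasoning only; the cost and benefit parameters, the constant benefit term and the population size are arbitrary assumptions, not the model the project will build.

import math

def retention_probability(expression_level, cost_per_molecule=1e-6, benefit=1e-4, N=10_000):
    # A duplicate roughly doubles a gene's expression burden, so its net
    # selection coefficient is any benefit minus the added expression cost.
    added_cost = cost_per_molecule * expression_level
    s = benefit - added_cost
    # Kimura's fixation probability for a new mutation (initial frequency 1/2N)
    if abs(s) < 1e-12:
        return 1.0 / (2 * N)
    return (1 - math.exp(-2 * s)) / (1 - math.exp(-4 * N * s))

if __name__ == "__main__":
    for level in (100, 1_000, 10_000):   # hypothetical expression levels (molecules/cell)
        print(level, retention_probability(level))

Under these made-up numbers the highly expressed gene's duplicate is effectively never retained, which is the qualitative pattern the proposal sets out to test against real expression and duplication data.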
Max ERC Funding
1 824 794 €
Duration
Start date: 2019-01-01, End date: 2023-12-31
Project acronym DOUBLE-TROUBLE
Project Replaying the ‘genome duplication’ tape of life: the importance of polyploidy for adaptation in a changing environment
Researcher (PI) Yves VAN DE PEER
Host Institution (HI) VIB
Call Details Advanced Grant (AdG), LS8, ERC-2018-ADG
Summary Thousands of species are polyploid. However, the long-term establishment of organisms that have undergone ancient whole genome duplications (WGDs) has been exceedingly rare, and when we analyse the genomes of plants and animals, we can, at most, find evidence for a very limited number of WGDs that survived over the longer term. The paucity of (established) ancient genome duplications and the existence of so many species that are currently polyploid provide a fascinating paradox. There is growing evidence that the majority of ancient WGDs were established at specific times in evolution, for instance during periods of environmental change and periods of mass extinction. The reason for this ‘stress’-polyploidy relationship has been the subject of considerable speculation and several hypotheses have been put forward to explain this observation: (a) stressful conditions promote polyploid formation; (b) polyploidisation causes a niche shift allowing polyploids to grow in conditions that are unsuitable for their non-polyploid ancestors; and (c) polyploids have an increased evolvability and consequently adapt faster to a changing environment. Here, we want to unravel the mechanistic underpinnings of why and how polyploids can outcompete non-polyploids. We will address these questions by replaying the ‘genome duplication tape of life’ in two different model systems, namely Chlamydomonas and Spirodela. We will run long-term evolutionary (and resequencing) experiments. We will complement these experiments with in-silico experiments based on so-called digital organisms running on artificial genomes. Complementary modelling approaches will also be employed to study the effects of polyploidy from an eco-evolutionary dynamics perspective. By integrating the results obtained from these in vivo and in silico experiments, we will obtain important novel insights into the adaptive potential of polyploids under stressful conditions or during times of environmental and/or climate change.
Max ERC Funding
2 500 000 €
Duration
Start date: 2020-01-01, End date: 2024-12-31
Project acronym DOUBLE-UP
Project The importance of gene and genome duplications for natural and artificial organism populations
Researcher (PI) Yves Eddy Philomena Van De Peer
Host Institution (HI) VIB
Call Details Advanced Grant (AdG), LS2, ERC-2012-ADG_20120314
Summary The long-term establishment of ancient organisms that have undergone whole genome duplications has been exceedingly rare. On the other hand, tens of thousands of now-living species are polyploid and contain multiple copies of their genome. The paucity of ancient genome duplications and the existence of so many species that are currently polyploid provide an interesting and fascinating enigma. A question that remains is whether these older genome duplications have survived by coincidence or because they did occur at very specific times, for instance during major ecological upheavals and periods of extinction. It has indeed been proposed that chromosome doubling confers greater stress tolerance by fostering slower development, delayed reproduction and longer life span. Furthermore, polyploids have also been considered to have a greater ability to colonize new or disturbed habitats. If polyploidy allowed many plant lineages to survive and adapt during global changes, as suggested, we might wonder whether polyploidy will confer a similar advantage in the current period of global warming and general ecological pressure caused by the human race. Given predictions that species extinction is now occurring at rates as high as during previous mass extinctions, will the presumed extra adaptability of polyploid plants mean they will become the dominant species? In the current proposal, we hope to address these questions at different levels through 1) the analysis of whole plant genome sequence data and 2) the in silico modelling of artificial gene regulatory networks to mimic the genomic consequences of genome doubling and how this may affect network structure and dosage balance. Furthermore, we aim to use simulated robotic models running on artificial gene regulatory networks in complex environments to evaluate how both natural and artificial organism populations can potentially benefit from gene and genome duplications for adaptation, survival, and evolution in general.
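The dosage-balance intuition behind the network modelling can be shown with a deliberately tiny Python example: a two-subunit complex assembles in 1:1 stoichiometry, so whole-genome doubling preserves the balance while a single-gene duplication creates an unproductive excess. The numbers and the min-limited assembly rule are illustrative assumptions, not part of the proposed gene regulatory network models.

def complex_yield(a_copies, b_copies):
    # Toy dosage-balance illustration: the functional yield of a 1:1 complex is
    # limited by the scarcer subunit; any excess of the other subunit is
    # unproductive and potentially harmful.
    assembled = min(a_copies, b_copies)
    unbalanced_excess = abs(a_copies - b_copies)
    return assembled, unbalanced_excess

if __name__ == "__main__":
    print("ancestral state:      ", complex_yield(100, 100))
    print("whole-genome doubling:", complex_yield(200, 200))   # balance preserved
    print("single-gene duplicate:", complex_yield(200, 100))   # imbalance created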
Max ERC Funding
2 217 525 €
Duration
Start date: 2013-10-01, End date: 2018-09-30
Project acronym DPMP
Project Dependable Performance on Many-Thread Processors
Researcher (PI) Lieven Eeckhout
Host Institution (HI) UNIVERSITEIT GENT
Call Details Starting Grant (StG), PE6, ERC-2010-StG_20091028
Summary Contemporary microprocessors seek to improve performance through thread-level parallelism by co-executing multiple threads on a single microprocessor chip. Projections suggest that future processors will feature multiple tens to hundreds of threads, hence called many-thread processors. Many-thread processors, however, lead to non-dependable performance: co-executing threads affect each other's performance in unpredictable ways because of resource sharing across threads. Failure to deliver dependable performance leads to missed deadlines, priority inversion, unbalanced parallel execution, etc., which will severely impact the usage model and the performance growth path for many important future and emerging application domains (e.g., media, medical, datacenter).
DPMP envisions that performance introspection, using a cycle accounting architecture that tracks per-thread performance, will be the breakthrough to delivering dependable performance in future many-thread processors. To this end, DPMP will develop a hardware cycle accounting architecture that estimates single-thread progress during many-thread execution. The ability to track per-thread progress enables system software to deliver dependable performance by assigning hardware resources to threads depending on their relative progress. Through this cooperative hardware-software approach, this project addresses a fundamental problem in multi-threaded and multi/many-core processing.
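The cycle accounting idea can be caricatured in software: split each thread's elapsed cycles into a base component (what it would have needed alone) and an interference component (added by resource sharing), estimate progress as the base fraction, and give priority to the thread lagging furthest behind. The split is taken as given here; estimating it in hardware is precisely the proposed research. A hedged Python sketch with made-up cycle counts follows.

def estimate_progress(base_cycles, interference_cycles):
    # Per-thread progress estimate: fraction of elapsed cycles the thread would
    # also have consumed when running in isolation (toy cycle accounting).
    total = base_cycles + interference_cycles
    return base_cycles / total if total else 1.0

def assign_priorities(threads):
    # Rank threads from least to most relative progress, so that system
    # software can give the most-slowed-down thread the most resources.
    ranked = sorted(threads, key=lambda t: estimate_progress(t["base"], t["interf"]))
    return [t["name"] for t in ranked]  # first entry gets priority

if __name__ == "__main__":
    threads = [
        {"name": "A", "base": 8_000_000, "interf": 2_000_000},
        {"name": "B", "base": 5_000_000, "interf": 5_000_000},
        {"name": "C", "base": 9_500_000, "interf": 500_000},
    ]
    print(assign_priorities(threads))  # B is most slowed down, so it comes first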
Max ERC Funding
1 389 000 €
Duration
Start date: 2010-10-01, End date: 2016-09-30
Project acronym DRAIOCHT
Project DRAIOCHT - A low-cost, minimally invasive platform medical device for the treatment of disorders of the cardiovascular system.
Researcher (PI) Martin O'HALLORAN
Host Institution (HI) NATIONAL UNIVERSITY OF IRELAND GALWAY
Call Details Proof of Concept (PoC), ERC-2017-PoC
Summary Microwave ablation produces a small and highly controllable heating zone, ideal for the thermal treatment of a range of medical conditions where damage to adjacent structures is a significant risk.
Potential applications range from the treatment of small soft-tissue cancers to the management of various venous disorders. The DRAIOCHT project will involve the design, development and pre-clinical evaluation of a novel microwave ablation device for the treatment of disorders of the venous system. The proposed device will be a new low-cost therapeutic medical device that places greater value on the protection of crucial functional tissue and nerves than current ablation devices on the market. The project offers significant economic and societal benefits due to the high prevalence of venous disease in Europe.
Max ERC Funding
149 954 €
Duration
Start date: 2019-01-01, End date: 2020-06-30
Project acronym DRY-2-DRY
Project Do droughts self-propagate and self-intensify?
Researcher (PI) Diego González Miralles
Host Institution (HI) UNIVERSITEIT GENT
Call Details Starting Grant (StG), PE10, ERC-2016-STG
Summary Droughts cause agricultural loss, forest mortality and drinking water scarcity. Their predicted increase in recurrence and intensity poses serious threats to future global food security. Several historically unprecedented droughts have already occurred over the last decade in Europe, Australia and the USA. The cost of the ongoing Californian drought is estimated to be about US$3 billion. Still today, the knowledge of how droughts start and evolve remains limited, and so does the understanding of how climate change may affect them.
Positive feedbacks from land have been suggested as critical for the occurrence of recent droughts: as rainfall deficits dry out soil and vegetation, the evaporation of land water is reduced, then the local air becomes too dry to yield rainfall, which further enhances drought conditions. Importantly, this is not just a 'local' feedback, as remote regions may rely on evaporated water transported by winds from the drought-affected region. Following this rationale, droughts self-propagate and self-intensify.
However, a global capacity to observe these processes is lacking. Furthermore, climate and forecast models are immature when it comes to representing the influences of land on rainfall. Do climate models underestimate this land feedback? If so, future drought aggravation will be greater than currently expected. At the moment, this remains largely speculative, given the limited number of studies of these processes.
I propose to use novel in situ and satellite records of soil moisture, evaporation and precipitation, in combination with new mechanistic models that can map water vapour trajectories and explore multi-dimensional feedbacks. DRY-2-DRY will not only advance our fundamental knowledge of the mechanisms triggering droughts, it will also provide independent evidence of the extent to which managing land cover can help 'dampen' drought events, and enable progress towards more accurate short-term and long-term drought forecasts.
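The self-intensification pathway described above (less soil water, less evaporation, less locally recycled rainfall, further drying) can be caricatured with a toy bucket model. The sketch below is purely illustrative, with made-up parameters and a fixed precipitation-recycling fraction; it is not one of the mechanistic models the project will develop.

def simulate_drought_feedback(days=120, recycling=0.4, forcing=2.0):
    # Toy soil-moisture bucket with precipitation recycling: part of local
    # rainfall is assumed to come from local evaporation, so a rainfall deficit
    # lowers evaporation, which in turn lowers rainfall (self-intensification).
    soil = 50.0          # mm of stored soil water
    capacity = 100.0
    history = []
    for day in range(days):
        external_rain = forcing if day < 30 else 0.5 * forcing   # imposed deficit after day 30
        evaporation = 4.0 * (soil / capacity)                    # mm/day, moisture-limited
        rain = external_rain + recycling * evaporation           # local recycling term
        soil = max(0.0, min(capacity, soil + rain - evaporation))
        history.append((day, round(soil, 1), round(evaporation, 2), round(rain, 2)))
    return history

if __name__ == "__main__":
    for row in simulate_drought_feedback()[::20]:
        print(row)   # (day, soil moisture, evaporation, rainfall)

Even this caricature shows the qualitative signature the project will look for in observations: after the imposed rainfall deficit, evaporation and rainfall decline together and the soil settles at a drier equilibrium than the external forcing alone would suggest.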
Max ERC Funding
1 465 000 €
Duration
Start date: 2017-02-01, End date: 2022-01-31
Project acronym DYNPOR
Project First principle molecular dynamics simulations for complex chemical transformations in nanoporous materials
Researcher (PI) Véronique Van Speybroeck
Host Institution (HI) UNIVERSITEIT GENT
Call Details Consolidator Grant (CoG), PE4, ERC-2014-CoG
Summary Chemical transformations in nanoporous materials are vital in many application domains, such as catalysis, molecular separations and sustainable chemistry. Model-guided design is indispensable for tailoring materials at the nanometer scale.
At real operating conditions, chemical transformations taking place at the nanometer scale have a very complex nature, due to the interplay of several factors such as the number of particles present in the pores of the material, framework flexibility, competitive pathways and entropy effects. The textbook concept of a single transition state is far too simplistic in such cases. A restricted number of configurations of the potential energy surface is not sufficient to capture the complexity of the transformation.
My objective is to simulate complex chemical transformations in nanoporous materials using first-principle molecular dynamics methods at real operating conditions, capturing the full complexity of the free energy surface. To achieve these goals, advanced sampling methods will be used to explore the interesting regions of the free energy surface. The number of guest molecules at real operating conditions will be derived and the diffusion of small molecules through pores with blocking molecules will be studied. New theoretical models will be developed to keep track of both the framework flexibility and entropy of the lattice.
The selected applications are timely and rely on an extensive network with prominent experimental partners. The applications will encompass contemporary catalytic conversions in zeolites, active site engineering in metal organic frameworks and structural transitions in nanoporous materials, and the expected outcomes will have the potential to yield groundbreaking new insights.
The results are expected to have impact far beyond the horizon of the current project, as they will contribute to the transition from static to dynamics-based modeling tools within heterogeneous catalysis.
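To fix ideas about what "advanced sampling of a free energy surface" means in practice, here is a minimal metadynamics-flavoured sketch in Python: Metropolis Monte Carlo on a one-dimensional double-well potential, with Gaussian hills periodically deposited at the visited position so the walker escapes metastable basins. This is a generic textbook-style illustration under simplified assumptions (1D collective variable, toy potential, arbitrary parameters), not the first-principle molecular dynamics machinery of the project.

import math, random

def potential(x):
    # Toy double-well free energy surface with two metastable states at x = ±1.
    return (x ** 2 - 1.0) ** 2

def metadynamics(n_steps=20_000, dx=0.1, temp=0.2, hill_h=0.02, hill_w=0.2, stride=100):
    hills = []  # centres of the deposited Gaussian bias hills

    def bias(x):
        return sum(hill_h * math.exp(-((x - c) ** 2) / (2 * hill_w ** 2)) for c in hills)

    x = -1.0
    for step in range(n_steps):
        x_new = x + random.uniform(-dx, dx)
        dE = (potential(x_new) + bias(x_new)) - (potential(x) + bias(x))
        if dE <= 0 or random.random() < math.exp(-dE / temp):
            x = x_new                      # Metropolis acceptance on the biased surface
        if step % stride == 0:
            hills.append(x)                # deposit a hill where the walker currently sits
    return hills

if __name__ == "__main__":
    hills = metadynamics()
    crossings = sum(1 for a, b in zip(hills, hills[1:]) if a * b < 0)
    print("deposited hills:", len(hills), "well-to-well crossings:", crossings)

The accumulated bias gradually fills the basins, which is what allows rare barrier crossings, and hence the relevant regions of the free energy surface, to be sampled within a feasible simulation time.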
Max ERC Funding
1 993 750 €
Duration
Start date: 2015-08-01, End date: 2020-07-31
Project acronym E-DUALITY
Project Exploring Duality for Future Data-driven Modelling
Researcher (PI) Johan SUYKENS
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Advanced Grant (AdG), PE7, ERC-2017-ADG
Summary Future data-driven modelling is increasingly challenging for many systems due to higher complexity levels, such as in energy systems, environmental and climate modelling, traffic and transport, industrial processes, health, safety, and others. This requires powerful concepts and frameworks that enable the design of high-quality predictive models. In this proposal E-DUALITY we will explore and engineer the potential of duality principles for future data-driven modelling. An existing example illustrating the important role of duality in this context is support vector machines, which possess primal and dual model representations, in terms of feature maps and kernels, respectively. Within this project, besides using existing notions of duality that are relevant for data-driven modelling (e.g. Lagrange duality, Legendre-Fenchel duality, Monge-Kantorovich duality), we will also explore new ones. Duality principles will be employed for obtaining a generically applicable framework with unifying insights, handling different system complexity levels, optimal model representations and designing efficient algorithms. This will require taking an integrative approach across different research fields. The new framework should be able to include e.g. multi-view and multiple function learning, multiplex and multilayer networks, tensor models, multi-scale and deep architectures as particular instances and to combine several such characteristics, in addition to simple basic schemes. It will include both parametric and kernel-based approaches for tasks such as regression, classification, clustering, dimensionality reduction, outlier detection and dynamical systems modelling. Higher risk elements are the search for new standard forms in modelling systems with different complexity levels, matching models and representations to system characteristics, and developing algorithms for large scale applications within this powerful new framework.
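The primal/dual pair mentioned above can be made concrete with ridge regression, a close relative of the support vector machine example: the same model can be written in the primal, with weights over an explicit feature map, or in the dual, with coefficients over training points and a kernel, and both give identical predictions. The feature map and data below are arbitrary illustrations, not the project's framework.

import numpy as np

def primal_ridge(Phi, y, lam):
    # Primal representation: weights w in feature space, w = (Phi^T Phi + lam I)^-1 Phi^T y
    d = Phi.shape[1]
    return np.linalg.solve(Phi.T @ Phi + lam * np.eye(d), Phi.T @ y)

def dual_ridge(K, y, lam):
    # Dual representation: coefficients over training points, alpha = (K + lam I)^-1 y
    n = K.shape[0]
    return np.linalg.solve(K + lam * np.eye(n), y)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.standard_normal((50, 3))
    y = rng.standard_normal(50)
    Phi = np.hstack([X, X ** 2])           # explicit (toy) feature map
    K = Phi @ Phi.T                        # kernel induced by that feature map
    lam = 0.1
    w = primal_ridge(Phi, y, lam)
    alpha = dual_ridge(K, y, lam)
    x_new = rng.standard_normal((1, 3))
    phi_new = np.hstack([x_new, x_new ** 2])
    primal_pred = phi_new @ w
    dual_pred = (phi_new @ Phi.T) @ alpha   # k(x_new, X) @ alpha
    print(np.allclose(primal_pred, dual_pred))   # True: both representations agree

Which representation is cheaper or more insightful depends on the number of samples versus features and on whether the feature map is available explicitly, which is exactly the kind of representation choice the proposal aims to systematize.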
Max ERC Funding
2 492 500 €
Duration
Start date: 2018-10-01, End date: 2023-09-30
Project acronym E-SWARM
Project Engineering Swarm Intelligence Systems
Researcher (PI) Marco Dorigo
Host Institution (HI) UNIVERSITE LIBRE DE BRUXELLES
Call Details Advanced Grant (AdG), PE6, ERC-2009-AdG
Summary Swarm intelligence is the discipline that deals with natural and artificial systems composed of many individuals that coordinate using decentralized control and self-organization. In this project, we focus on the design and implementation of artificial swarm intelligence systems for the solution of complex problems. Our current understanding of how to use swarms of artificial agents largely relies on rules of thumb and intuition based on the experience of individual researchers. This is not sufficient for us to design swarm intelligence systems at the level of complexity required by many real-world applications, or to accurately predict the behavior of the systems we design. The goal of E-SWARM is to develop a rigorous engineering methodology for the design and implementation of artificial swarm intelligence systems. We believe that in the future, swarm intelligence will be an important tool for researchers and engineers interested in solving certain classes of complex problems. To build the foundations of this discipline and to develop an appropriate methodology, we will proceed in parallel both at an abstract level and by tackling a number of challenging problems in selected research domains. The research domains we have chosen are optimization, robotics, networks, and data mining.
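As one canonical instance of the swarm-intelligence principles at stake in the optimization domain, the sketch below gives a minimal ant colony optimization for a four-city travelling salesman instance: ants build tours guided by pheromone and distance, pheromone evaporates and is reinforced along the best tour of each iteration. It is a generic textbook illustration with arbitrary parameters, not the methodology to be developed in the project.

import random

def ant_colony_tsp(dist, n_ants=20, n_iters=100, evap=0.5, alpha=1.0, beta=2.0):
    # dist: symmetric distance matrix; tau: pheromone levels on edges.
    n = len(dist)
    tau = [[1.0] * n for _ in range(n)]
    best_tour, best_len = None, float("inf")
    for _ in range(n_iters):
        tours = []
        for _ in range(n_ants):
            start = random.randrange(n)
            tour, visited = [start], {start}
            while len(tour) < n:
                i = tour[-1]
                choices = [j for j in range(n) if j not in visited]
                weights = [tau[i][j] ** alpha * (1.0 / dist[i][j]) ** beta for j in choices]
                j = random.choices(choices, weights=weights)[0]   # probabilistic construction
                tour.append(j)
                visited.add(j)
            length = sum(dist[tour[k]][tour[(k + 1) % n]] for k in range(n))
            tours.append((length, tour))
            if length < best_len:
                best_len, best_tour = length, tour
        tau = [[(1 - evap) * t for t in row] for row in tau]       # evaporation
        it_len, it_tour = min(tours)                               # iteration-best reinforcement
        for k in range(n):
            a, b = it_tour[k], it_tour[(k + 1) % n]
            tau[a][b] += 1.0 / it_len
            tau[b][a] += 1.0 / it_len
    return best_tour, best_len

if __name__ == "__main__":
    dist = [[0, 2, 9, 10], [2, 0, 6, 4], [9, 6, 0, 3], [10, 4, 3, 0]]
    print(ant_colony_tsp(dist))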
Max ERC Funding
2 016 000 €
Duration
Start date: 2010-06-01, End date: 2015-05-31
Project acronym EASY
Project Ejection Accretion Structures in YSOs (EASY)
Researcher (PI) Thomas RAY
Host Institution (HI) DUBLIN INSTITUTE FOR ADVANCED STUDIES
Call Details Advanced Grant (AdG), PE9, ERC-2016-ADG
Summary For a number of reasons, in particular their proximity and the abundant range of diagnostics to determine their characteristics, outflows from young stellar objects (YSOs) offer us the best opportunity of discovering how astrophysical jets are generated and the nature of the link between outflows and their accretion disks. Models predict that the jet is initially launched from within 0.1 to a few au of the star and focused on scales at most ten times larger. Thus, even for the nearest star formation region, we need high spatial resolution to image the “central engine” and test current models.
With these ideas in mind, and the availability of a whole new set of observational and computational resources, it is proposed to investigate the origin of YSO jets, and the jet/accretion zone link, using a number of highly novel approaches to test magneto-hydrodynamic (MHD) models including:
(a) Near-infrared interferometry to determine the spatial distribution and kinematics of the outflow as it is launched as a way of discriminating between competing models.
(b) A multi-epoch study of the strength and configuration of the magnetic field of the parent star to see whether model values and geometries agree with observations and the nature of its variability.
(c) Examining, through high spatial resolution radio observations, how the ionized component of these jets is collimated very close to the source and how shocks in the flow can give rise to low energy cosmic rays.
(d) Using the James Webb Space Telescope (JWST) and, in particular, the Mid-Infrared Instrument (MIRI) and Near-Infrared Spectrograph (NIRSpec) to investigate, with high spatial resolution, atomic jets from protostars that are still acquiring most of their mass. In addition, we will study how accretion is affected by metallicity by studying young solar-like stars in the low-metallicity Magellanic Clouds.
In all cases the required observational campaigns have been approved.
Max ERC Funding
1 853 090 €
Duration
Start date: 2017-10-01, End date: 2022-09-30
Project acronym ECHR
Project Strengthening the European Court of Human Rights: More Accountability Through Better Legal Reasoning
Researcher (PI) Eva Brems
Host Institution (HI) UNIVERSITEIT GENT
Call Details Starting Grant (StG), SH2, ERC-2009-StG
Summary Human rights are under pressure, in Europe as elsewhere, due to several developments, namely: [1] the war on terror and the pressures generated by competing discourses; [2] coping with the dangers of rights inflation; [3] conflicting rights and how to handle rights as contested claims; and [4] the challenges of dealing with universality under fire. In this context, the human rights leadership of the European Court of Human Rights is of crucial importance. Yet the Court is not fit for purpose. Inconsistencies and sloppy legal reasoning undermine both its credibility and the impact of its decisions. The research programme that I propose will strengthen the consistency and persuasiveness of the Court's legal reasoning so as to improve its accountability and transparency. My aim is to identify new technical solutions for important human rights problems, by the development and application of creative methodologies. The substantive innovations within the field of European human rights law that I propose to make are: [a] the development of new legal tools, which will consistently integrate the accommodation of the particularities of non-dominant groups into the reasoning of the European Court of Human Rights; [b] the development of a new theoretical framework combining minimum and maximum approaches to human rights protection, followed by its translation into clear legal criteria for use by the European Court of Human Rights; and [c] the development of a script that will enable the adoption of a consistent approach by the European Court of Human Rights to conflicts between human rights. My methodological approach is characterized by the combination of empirical and normative dimensions, a 360° comparison, and the integration of qualitative research methods (interviews and focus groups with key stakeholders).
Max ERC Funding
1 370 000 €
Duration
Start date: 2009-11-01, End date: 2014-10-31
Project acronym ECMETABOLISM
Project Targeting endothelial metabolism: a novel anti-angiogenic therapy
Researcher (PI) Peter Frans Martha Carmeliet
Host Institution (HI) VIB
Call Details Advanced Grant (AdG), LS2, ERC-2010-AdG_20100317
Summary Current anti-angiogenesis-based anti-tumor therapy relies on starving tumors by blocking their vascular supply via inhibition of growth factors. However, limitations such as resistance and toxicity mandate conceptually distinct approaches. We will explore an entirely novel and long-overlooked strategy to discover additional anti-angiogenic candidates, based on the following innovative concept: "rather than STARVING TUMORS BY BLOCKING THEIR VASCULAR SUPPLY, we intend TO STARVE BLOOD VESSELS BY BLOCKING THEIR METABOLIC ENERGY SUPPLY", so that new vessels cannot form and nourish the growing tumor. This project is a completely new research avenue in our group, but we expect that it will offer refreshing long-term research and translational opportunities for the field.
Because so little is known about endothelial cell (EC) metabolism, we will (i) via a multi-disciplinary systems-biology approach of transcriptomics, proteomics, computational network modeling, metabolomics and flux-omics, draw an endothelio-metabolic map in angiogenesis. This will allow us to identify metabolic regulators of angiogenesis, which will be further validated and characterized in (ii) loss- and gain-of-function studies in various angiogenesis models in vitro and (iii) in vivo in zebrafish (knockdown; zinc finger nuclease mediated knockout), providing prescreen data to select the most promising candidates. (iv) EC-specific down-regulation (miR RNAi) or knockout studies of selected candidates in mice will confirm their relevance for angiogenic phenotypes in a preclinical model; and ultimately (v) a translational study evaluating EC metabolism-targeted anti-angiogenic strategies (pharmacological inhibitors, antibodies, small molecular compounds) will be performed in tumor models in the mouse.
Max ERC Funding
2 365 224 €
Duration
Start date: 2011-05-01, End date: 2016-04-30
Project acronym EcoBox
Project Ecosystem in a box: Dissecting the dynamics of a defined microbial community in vitro
Researcher (PI) Karoline FAUST
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Starting Grant (StG), LS2, ERC-2018-STG
Summary The dynamics of microbial communities may be driven by the interactions between community members, controlled by the environment, shaped by immigration or random events, influenced by evolutionary processes or result from an interplay of all these factors. This project aims to improve our understanding of how community structure and the environment impact community dynamics. Towards this aim, a defined in vitro community of human gut bacteria will be assembled, since their genomes are available and their metabolism is comparatively well resolved.
In the first step, we will quantify the intrinsic variability of community dynamics and look for alternative stable states. Next, we will systematically vary community structure as well as nutrient supply and monitor their effects on the dynamics. Finally, we will measure model parameters, evaluate to what extent different community models predict observed community dynamics and validate the models by identifying and experimentally validating keystone species.
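One widely used family of community models that such data could be confronted with is the generalized Lotka-Volterra (gLV) model; the sketch below is purely illustrative, and the species count, growth rates and interaction coefficients are arbitrary assumptions, not project data.

    # Minimal generalized Lotka-Volterra (gLV) simulation for a three-species community.
    import numpy as np
    from scipy.integrate import solve_ivp

    growth = np.array([0.8, 0.5, 0.6])                  # intrinsic growth rates r_i
    interactions = np.array([[-1.0,  0.2, -0.3],        # A[i, j]: effect of species j on species i
                             [-0.4, -1.0,  0.1],
                             [ 0.3, -0.2, -1.0]])

    def glv(t, x):
        # dx_i/dt = x_i * (r_i + sum_j A[i, j] * x_j)
        return x * (growth + interactions @ x)

    sol = solve_ivp(glv, (0.0, 50.0), y0=[0.1, 0.1, 0.1])
    print("abundances at t = 50:", sol.y[:, -1])

Fitting such growth rates and interaction coefficients to densely sampled time series, and comparing the fit against alternative model families, is the kind of model evaluation described above.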
Studies of microbial community dynamics are hampered by the cost of obtaining densely sampled time series in replicates and by the difficulty of community manipulation. We will address these challenges by setting up an in vitro system for parallel and automated cultivation in well-controlled conditions and by working with defined communities, where every community member is known.
The proposed project will discern how external factors and community structure drive community dynamics and encode this knowledge in mathematical models. Moreover, the project has the potential to transform our view on alternative microbial communities and their interpretation. In addition, the project will extend our knowledge of human gut microorganisms and their interactions. These insights will ease the design of defined gut communities optimized for therapeutic purposes.
Max ERC Funding
1 493 899 €
Duration
Start date: 2019-01-01, End date: 2023-12-31
Project acronym ELECTRIC
Project Chip Scale Electrically Powered Optical Frequency Combs
Researcher (PI) Bart Johan KUYKEN
Host Institution (HI) UNIVERSITEIT GENT
Call Details Starting Grant (StG), PE7, ERC-2017-STG
Summary In ELECTRIC, I will integrate electrically powered optical frequency combs on mass manufacturable silicon chips. This will allow for making use of all the advantageous properties of these light sources in real-life situations.
Optical frequency combs are light sources with a spectrum consisting of millions of laser lines, equally spaced in frequency. This equifrequency spacing provides a link between the radio frequency band and the optical frequency band of the electromagnetic spectrum. This property has literally revolutionized the field of frequency metrology and precision laser spectroscopy. Recently, their application field has been extended. Amongst others, their unique properties have been exploited in precision distance measurement experiments as well as optical waveform and microwave synthesis demonstrators. Moreover, so-called “dual-comb spectroscopy” experiments have demonstrated broadband Fourier Transform Infrared spectroscopy with ultra-high resolution and record acquisition speeds. However, most of these demonstrations required large, bulky experimental setups, which hampers their wide deployment.
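As a small worked example of the comb structure described above, every line of a comb sits at f_n = f_ceo + n * f_rep, so two radio-frequency quantities (the offset f_ceo and the repetition rate f_rep) determine millions of optical frequencies; the numerical values below are arbitrary illustrations, not parameters of the proposed devices.

    # Comb line frequencies from two radio-frequency parameters (illustrative values only).
    f_rep = 10e9      # repetition rate: 10 GHz (assumed)
    f_ceo = 2.5e9     # carrier-envelope offset frequency: 2.5 GHz (assumed)

    def comb_line(n):
        return f_ceo + n * f_rep

    # Find the line index closest to the 1550 nm telecom band (~193.4 THz).
    n = round((193.4e12 - f_ceo) / f_rep)
    print(f"line n = {n}: {comb_line(n) / 1e12:.4f} THz")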
I will build frequency combs on optical chips that can be mass-manufactured. Unlike current chip-scale Kerr-comb-based solutions, they do not need to be optically pumped with a powerful continuous-wave laser and can have a narrower comb spacing. The challenge here is two-fold. First, we need to make electrically powered integrated low-noise oscillators. Second, we need to lower the threshold of current on-chip nonlinear optical interactions by an order of magnitude to use them in on-chip OFC generators.
Specifically, I will achieve this goal by:
• Making use of ultra-efficient nonlinear optical interactions based on soliton compression in dispersion-engineered III-V waveguides and plasmonic-enhanced second-order nonlinear materials.
• Enhancing the performance of ultra-low-noise silicon nitride mode-locked lasers with these nonlinear components.
Max ERC Funding
1 391 250 €
Duration
Start date: 2018-02-01, End date: 2023-01-31
Project acronym ELECTROTALK
Project Starting an electrical conversation between microorganisms and electrodes to achieve bioproduction
Researcher (PI) Korneel Pieter Herman Leo Ann Rabaey
Host Institution (HI) UNIVERSITEIT GENT
Call Details Starting Grant (StG), LS9, ERC-2012-StG_20111109
Summary "Electrochemically active bacteria enable a host of novel processes in bioproduction, bioenergy and bioremediation. Key to the success of these processes is effective adherence of the bacterial cells to an electrode surface and subsequent equally effective electron exchange with the electrode. While the cellular mechanisms for electron transfer are increasingly known, what drives bacterial adsorption and desorption to positively or negatively polarized electrodes is largely unknown. Particularly processes driven by cathodes tend to be slow, and suffer from limited microbial adherence and lack of growth of the microorganisms. ELECTROTALK aims at developing a mechanistic understanding of mobility towards and microbial adherence at surfaces, from single cell level to complete biofilm formation. Based on this knowledge, effectively catalyzed bio-electrodes will be developed for novel bioproduction processes. Such bioproduction processes, termed microbial electrosynthesis, are independent of arable land availability, promise high production densities and enable the capture of CO2 or more efficient resource-usage for a range of products. Understanding the nature of the microorganism-electrode interaction will create a window of opportunity to improve this process and achieve effective bioproduction. Moreover, as the electrical interaction directly relates to microbial activity electrodes may serve as a means to start up a conversation with the cells. To achieve our aims we will: (i) select and characterize biocatalysts both as pure cultures and microbial communities; (ii) investigate cell adherence and electron transfer in function of electrode topography and chemistry as well as under different operational conditions; (iii) develop an electrode-microorganism combination achieving effective electron transfer; and (iv) electrochemically construct biofilms with defined structure or stratification."
Max ERC Funding
1 494 126 €
Duration
Start date: 2012-10-01, End date: 2017-09-30
Project acronym ELITE
Project Early Life Traces, Evolution, and Implications for Astrobiology
Researcher (PI) Emmanuelle J Javaux
Host Institution (HI) UNIVERSITE DE LIEGE
Call Details Starting Grant (StG), PE10, ERC-2012-StG_20111012
Summary Tracking the early traces of life preserved in very old rocks and reconstructing the major steps of its evolution is an exciting and most challenging domain of research. How amazing it is to have a cell that is 1.5 or 3.2 billion years old under a microscope! From these and other disseminated fragments of life preserved along the geological timescale, one can build the puzzle of biosphere evolution and rising biological complexity. The possibility that life may exist beyond Earth on other habitable planets lies yet at another scale of scientific debates and popular dreams. We have the chance now to live at a time when technology enables us to study in the finest detail the very old record of life, or to land on planets with microscopes and analytical tools, mimicking a geologist exploring extraterrestrial rocky outcrops to find traces of water and perhaps life. There is still a lot to be done, however, to solve major questions of life evolution on Earth, and to look for unambiguous life traces, on Earth or beyond. The project ELiTE aims to provide key answers to some of these fundamental questions.
Astrobiology studies the origin, evolution and distribution of life in the Universe, starting with life on Earth, the only biological planet known so far. The ambitious objectives of the project ELiTE are the following:
1) The identification of early traces of life and their preservation conditions in Precambrian rocks of established age
2) The characterization of their biological affinities, using innovative approaches comprising micro to nanoscale morphological, ultrastructural and chemical analyses of fossil and recent analog material
3) The determination of the timing of major steps in evolution. In particular, the project ELiTE aims to decipher two major and inter-related steps in early life evolution and the rise of biological complexity: the evolution of cyanobacteria, responsible for Earth's oxygenation and ancestor of the chloroplast, which drastically influenced the evolution of life and the planet Earth, and the evolution of the domain Eucarya since LECA (the Last Eukaryotic Common Ancestor).
4) The determination of the causes of the observed patterns of evolution in relation to the environmental context (oxygenation, impacts, glaciations, tectonics, nutrient availability in changing ocean chemistry) and biological innovations and interactions (ecosystem evolution).
Objective 1 has implications for the search for unambiguous traces of life on Earth and beyond Earth. Objectives 2 to 4 have implications for the understanding of causes and patterns of biological evolution and rise of complexity in Earth life. Providing answers to these most fundamental questions will have major impact on our understanding of early life evolution, with implications for the search for life beyond Earth.
Max ERC Funding
1 470 736 €
Duration
Start date: 2013-01-01, End date: 2018-12-31
Project acronym ELR1K
Project Enhancing Large-scale chemical Reactions based on Elementary Kinetics
Researcher (PI) Joris Wilfried Maria Cornelius THYBAUT
Host Institution (HI) UNIVERSITEIT GENT
Call Details Proof of Concept (PoC), PC1, ERC-2015-PoC
Summary A proof-of-concept software tool for microkinetic model construction, ready for commercialization, will be developed starting from an available version of the tool that has been validated in an academic research context.
A detailed understanding of the elementary steps involved in large-scale chemical reactions provides several advantages: it may not only lead to a better control of the corresponding processes and, hence, safer operation, it also provides a sound basis for enhanced process design. More particularly in the area of catalysis, material development has typically occurred using trial-and-error procedures. While the number of catalysts that could be evaluated has been augmented by so-called high-throughput techniques, an adequate understanding of the underlying phenomena was still limited. Microkinetic modeling is ideally suited to get a view on these phenomena, as it accounts for all elementary steps in the reaction mechanism without any simplifying assumption.
The construction of such microkinetic models and corresponding determination of kinetics and catalyst descriptors is not straightforward and requires dedicated (software) tools. The microkinetic engine (μKE), developed within the ‘Catalytic Reaction Engineering’ research group of the Laboratory for Chemical Technology at Ghent University, is such a unique piece of software to facilitate microkinetic model construction. Without requiring any programming from the end-user, kinetic models based on elementary steps can be developed. It allows identifying the kinetically relevant steps that entail the opportunity to further improve the concerned catalytic materials and, implicitly, also the reactors and processes in which they are employed. The microkinetic engine has been validated against in-house data as well as within a limited number of - exclusive - collaborations. Its anticipated commercialization requires enhanced robustness, complementary functionalities and an increased user friendliness.
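As a generic illustration of what a microkinetic model is (this is not the μKE software, and all rate constants below are arbitrary assumptions), the sketch integrates the elementary-step rate equations of a toy surface reaction A + * <-> A*, A* -> B + *:

    # Toy microkinetic model: surface coverages evolve according to elementary-step rates.
    from scipy.integrate import solve_ivp

    k_ads, k_des, k_rxn = 5.0, 1.0, 2.0   # adsorption, desorption, surface reaction (assumed values)
    p_A = 1.0                             # constant gas-phase pressure of A (arbitrary units)

    def rates(t, y):
        theta_A, theta_free = y           # coverage of adsorbed A and of free sites
        r_ads = k_ads * p_A * theta_free  # A + * -> A*
        r_des = k_des * theta_A           # A* -> A + *
        r_rxn = k_rxn * theta_A           # A* -> B + *
        return [r_ads - r_des - r_rxn, -r_ads + r_des + r_rxn]

    sol = solve_ivp(rates, (0.0, 5.0), y0=[0.0, 1.0])
    print("steady-state coverage of A*: %.3f" % sol.y[0, -1])   # ~0.625 for these constants

A sensitivity analysis over such rate constants is what reveals the kinetically relevant steps referred to above.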
Max ERC Funding
150 000 €
Duration
Start date: 2016-09-01, End date: 2017-08-31
Project acronym EMIS
Project An Intense Summer Monsoon in a Cool World, Climate and East Asian Monsoon during Interglacials with a special emphasis on the Interglacials 500,000 years ago and before
Researcher (PI) André, Léon Berger
Host Institution (HI) UNIVERSITE CATHOLIQUE DE LOUVAIN
Call Details Advanced Grant (AdG), PE10, ERC-2008-AdG
Summary The Asian monsoon is a spectacular occurrence in the climate system. What makes it so powerful is the combination of the thermal contrast between the world's largest landmass (the Eurasian continent) and ocean basin (the Indo-Pacific Ocean) and the presence of the world's largest ridge, the Tibetan Plateau. Climatologically, monsoon regions are the most convectively active areas and account for the majority of global atmospheric heat and moisture transport. Moreover, the economy, culture and rhythms of life of 60% of humanity are critically influenced by the evolution and variability of the Asian monsoon. The need to better understand the monsoon leads inevitably to the close inspection of its activity during geological times to provide a long-term perspective from which any future change may be more effectively assessed. Our research proposal aims to understand the seeming paradox of the exceptionally intense East Asian summer monsoon (actually the strongest over the last one million years) which occurred during the relatively cool interglacial (MIS-13), 500,000 years ago. This will be done first using a model of intermediate complexity (LOVECLIM) to carry out a number of sensitivity experiments on the astronomical forcing, the Eurasian and North American ice sheets, the Tibetan Plateau and the ocean. Ocean-atmosphere coupled general circulation models will then be used to confirm the main processes highlighted by LOVECLIM, in particular those related to the wave train topographically induced by the Eurasian ice sheet, to the Tibetan Plateau, to the sea-surface temperature and to their role in reinforcing the East Asian summer monsoon. This MIS-13 monsoon will be compared with the monsoons that occurred during the other interglacials of the upper Pleistocene and Holocene (about the last 700,000 years). All simulation results will be compared with the available proxy records, in particular, but not exclusively, those coming from the loess-soil sequences in China.
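For readers unfamiliar with sensitivity experiments, the toy example below varies a single forcing in a zero-dimensional energy-balance model; it is vastly simpler than LOVECLIM or a coupled GCM, and every parameter value is a textbook-style assumption rather than a number from the proposal.

    # Zero-dimensional energy-balance model: equilibrium temperature vs. solar forcing.
    SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
    EPS = 0.61               # effective emissivity, a crude greenhouse parameter (assumed)

    def equilibrium_temp(solar_constant, albedo=0.3):
        absorbed = solar_constant * (1.0 - albedo) / 4.0
        return (absorbed / (EPS * SIGMA)) ** 0.25

    for s in (1360.0, 1365.0, 1370.0):   # vary the incoming solar forcing
        print(f"S = {s:.0f} W/m^2 -> T = {equilibrium_temp(s):.2f} K")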
Max ERC Funding
893 880 €
Duration
Start date: 2008-11-01, End date: 2013-10-31
Project acronym ENIGMO
Project "Gut microbiota, innate immunity and endocannabinoid system interactions link metabolic inflammation with the hallmarks of obesity and type 2 diabetes"
Researcher (PI) Patrice Daniel Cani
Host Institution (HI) UNIVERSITE CATHOLIQUE DE LOUVAIN
Call Details Starting Grant (StG), LS4, ERC-2013-StG
Summary "Obesity and type 2 diabetes are characterized by metabolic inflammation and an altered endocannabinoid system (eCB) tone. We have provided evidence that gut microbiota modulate both intestinal and adipose tissue eCB system tone. Insulin resistance and inflammation have been linked to microbiota-host interaction via different Toll-Like Receptors (TLR’s). Our preliminary data show that tamoxifen-induced epithelial intestinal cells deletion of the key signalling adaptor MyD88 (myeloid differentiation primary-response gene 88), that encompass most of the TLR’s, protect mice against diet-induced obesity and inflammation. A phenomenon closely linked with changes in the intestinal eCB system tone and antimicrobial peptides production. Moreover, we discovered that the recently identified bacteria living in the mucus layer, namely Akkermansia muciniphila, plays a central role in the regulation of host energy metabolism by putative mechanisms linking both the intestinal eCB system and the innate immune system. Thus these preliminary data support the existence of unidentified mechanisms linking the innate immune system, the gut microbiota and host metabolism. In this high-risk/high-gain research program, we propose to elucidate what could be one of the most fundamental processes shared by different key hallmarks of obesity and related diseases. A careful and thorough analysis of the molecular and cellular events linking gut microbiota, the innate immune system and eCB system in specific organs has the potential to unravel new therapeutic targets. We anticipate the key role of MyD88 and the enzyme NAPE-PLD (N-acylphosphatidylethanolamine phospholipase-D) involved in the synthesis of N-acylethanolamines family to be key determinant in such pathophysiological aspects. Thus, these approaches could provide different perspectives about disease pathogenesis and knowledge-based evidence of new therapeutic options for obesity and associated metabolic disorders in the future."
Max ERC Funding
1 494 640 €
Duration
Start date: 2013-10-01, End date: 2018-09-30
Project acronym ENVIROIMMUNE
Project Environmental modulators of the immune cell balance in health and disease
Researcher (PI) Markus Kleinewietfeld
Host Institution (HI) VIB
Call Details Starting Grant (StG), LS6, ERC-2014-STG
Summary The incidence of autoimmune diseases in developed societies is increasing at high rates, but the underlying cause for this phenomenon has not been elucidated yet. Since the genetic architecture remains considerably stable, this increase is likely associated with changes in the environment. Autoimmunity is linked to an imbalance of pro-inflammatory Th17 cells and anti-inflammatory Foxp3+ regulatory T cells (Treg). However, little is known regarding environmental factors that influence the Th17/Treg balance. We recently discovered that a sodium-rich diet severely exacerbates experimental autoimmune encephalomyelitis (EAE) through an increased induction of pathogenic Th17 cells. Surprisingly, our preliminary data indicate that high-salt conditions also significantly impair Treg function, resembling a phenotype observed in several human autoimmune diseases. In addition, we have evidence that a high-salt diet affects the gut microbiota, implicating possible indirect effects on immune cells in vivo. Based on these findings we hypothesize that excess dietary salt represents an environmental risk factor for autoimmune diseases by modulating the Th17/Treg balance through several direct and indirect mechanisms. To address this hypothesis we will 1) examine the underlying mechanisms of high-salt-induced Treg dysfunction and effects on the Treg/Th17 balance by molecular and functional analysis in vitro and compare them to known risk variants of human autoimmune diseases, and 2) define direct and indirect effects of excess dietary salt on the Th17/Treg balance and autoimmunity in vivo and explore potential novel pathways for targeted interventions. Thus, the proposed study will uncover the impact of a newly discovered environmental modulator of the immune cell balance and will ultimately pave the way for new approaches in the therapy and prevention of autoimmune diseases.
Max ERC Funding
1 499 041 €
Duration
Start date: 2015-08-01, End date: 2020-07-31
Project acronym ENVIRONAGE
Project Unravelling environmental exposures in the core axis of ageing
Researcher (PI) Tim Nawrot
Host Institution (HI) UNIVERSITEIT HASSELT
Call Details Starting Grant (StG), LS7, ERC-2012-StG_20111109
Summary "AGEing is a complex phenotype responsive to a plethora of ENVIRONmental inputs (ENVIRONAGE). Age related conditions in adults often find their origin in risk factors operative in early life. The molecular mechanisms behind these phenomena remain largely unknown. Mitochondria are involved in a variety of critical cell functions, including oxidative energy production and programmed cell death. Recently, I established in a study of 175 newborns, a strong association between mitochondrial DNA content and in utero exposure to particulate air pollution. Telomere length is highly heritable and erosion leads to an increasingly vulnerable structural integrity of the chromosomes. It is considered a marker of overall biological age compared with chronological age. In this regard, I demonstrated the heritability of telomere length and the influence of smoking on telomere erosion. These results support the ENVIRONAGE hypothesis, i.e. that environmental inputs influence biomolecular markers of ageing including mitochondrial function, telomere length along with DNA repair and epigenetics as the ‘core axis of ageing’. The aim is to establish prospective epidemiological evidence for molecular mechanisms or early biomarkers, which may underlie the origins or reflect the risk of age-related diseases and to understand its association with other processes and the influence of environmental factors. To this end, I will establish a birth cohort and a cohort in middle-aged and elderly. I measure environmental pollutants, in interaction with parameters that I consider to have an important role in the ageing process (mitochondrial function, telomere length, epigenetics and DNA repair capacity). ENVIRONAGE integrates environmental influences and molecular mechanisms on ageing. The common molecular epidemiological strategies in newborns, middle-aged and elderly to unravel the environmental influence on ageing are groundbreaking."
Max ERC Funding
1 473 910 €
Duration
Start date: 2013-03-01, End date: 2018-02-28
Project acronym EPIC
Project Earth-like Planet Imaging with Cognitive computing
Researcher (PI) Olivier ABSIL
Host Institution (HI) UNIVERSITE DE LIEGE
Call Details Consolidator Grant (CoG), PE9, ERC-2018-COG
Summary One of the most ambitious goals of modern astrophysics is to characterise the physical and chemical properties of rocky planets orbiting in the habitable zone of nearby Sun-like stars. Although the observation of planetary transits could in a few limited cases be used to reach such a goal, it is widely recognised that only direct imaging techniques will enable such a feat on a statistically significant sample of planetary systems. Direct imaging of Earth-like exoplanets is however a formidable challenge due to the huge contrast and minute angular separation between such planets and their host star. The proposed EPIC project aims to enable the direct detection and characterisation of terrestrial planets located in the habitable zone of nearby stars using ground-based high-contrast imaging in the thermal infrared domain. To reach that ambitious goal, the project will focus on two main research directions: (i) the development and implementation of high-contrast imaging techniques and technologies addressing the smallest possible angular separations from bright, nearby stars, and (ii) the adaptation of state-of-the-art machine learning techniques to the problem of image processing in high-contrast imaging. While the ultimate goal of this research can likely only be reached with the advent of giant telescopes such as the Extremely Large Telescope (ELT) around 2025, the EPIC project will lay the stepping stones towards that goal and produce several high-impact results along the way, e.g. by re-assessing the occurrence rate of giant planets in direct imaging surveys at the most relevant angular separations (i.e., close to the snow line), by conducting the deepest high-contrast imaging search for rocky planets in the alpha Centauri system, by preparing the scientific exploitation of the ELT, and by providing the first open-source high-contrast image processing toolbox relying on supervised machine learning techniques.
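Purely as an illustration of supervised learning applied to high-contrast image processing (this is not the project's toolbox; the synthetic patches, injected signal strength and choice of classifier are all arbitrary assumptions), one can label small image patches as containing an injected point source or only residual speckle noise and train a standard classifier on them:

    # Train a classifier to flag patches containing a faint injected point source.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    def make_patch(with_planet):
        patch = rng.normal(0.0, 1.0, size=(8, 8))   # residual speckle noise (synthetic)
        if with_planet:
            patch[3:5, 3:5] += 2.0                  # faint point-source bump (synthetic)
        return patch.ravel()

    X = np.array([make_patch(i % 2 == 1) for i in range(2000)])
    y = np.array([i % 2 for i in range(2000)])       # 1 = point source injected

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
    print("held-out detection accuracy:", clf.score(X_te, y_te))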
Max ERC Funding
2 178 125 €
Duration
Start date: 2019-05-01, End date: 2024-04-30
Project acronym EpiTALL
Project Dynamic interplay between DNA methylation, histone modifications and super enhancer activity in normal T cells and during malignant T cell transformation
Researcher (PI) Pieter Van vlierberghe
Host Institution (HI) UNIVERSITEIT GENT
Call Details Starting Grant (StG), LS4, ERC-2014-STG
Summary Dynamic interplay between histone modifications and DNA methylation defines the chromatin structure of the human genome and serves as a conceptual framework to understand transcriptional regulation in normal development and human disease. The ultimate goal of this research proposal is to study the chromatin architecture during normal and malignant T cell differentiation in order to define how DNA methylation drives oncogenic gene expression as a novel concept in cancer research.
T-cell acute lymphoblastic leukemia (T-ALL) accounts for 15% of pediatric and 25% of adult ALL cases and was originally identified as a highly aggressive tumor entity. T-ALL therapy has been intensified, leading to gradual improvements in survival. However, 20% of pediatric and 50% of adult T-ALL patients still relapse and ultimately die of refractory disease. Research efforts have unravelled the complex genetic basis of T-ALL but failed to identify promising new targets for precision therapy.
Recent studies have identified a subset of T-ALLs whose transcriptional programs resemble those of early T-cell progenitors (ETPs), myeloid precursors and hematopoietic stem cells. Importantly, these so-called ETP-ALLs are characterized by early treatment failure and an extremely poor prognosis. The unique ETP-ALL gene expression signature suggests that the epigenomic landscape in ETP-ALL is markedly different as compared to other genetic subtypes of human T-ALL.
My project aims to identify genome-wide patterns of DNA methylation and histone modifications in genetic subtypes of human T-ALL as a basis for elucidating how DNA methylation drives the expression of critical oncogenes in the context of poor-prognosis ETP-ALL. Given that these ETP-ALL patients completely fail current chemotherapy treatment, tackling this novel aspect of ETP-ALL genetics will yield new targets for therapeutic intervention in this aggressive haematological malignancy.
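As a minimal illustration of how genome-wide methylation patterns might be compared between T-ALL subtypes, the sketch below contrasts per-region methylation levels in two hypothetical groups with a t-test and false-discovery-rate correction; the group sizes, beta-value distributions and test choice are illustrative assumptions, not the project's analysis plan.

```python
# Illustrative sketch only: comparing per-region DNA methylation between two
# hypothetical T-ALL groups with a t-test and Benjamini-Hochberg correction.
import numpy as np
from scipy import stats
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(1)
n_regions, n_etp, n_other = 500, 10, 20

# Beta values in [0, 1]; a handful of regions are hypermethylated in "ETP-ALL".
etp = rng.beta(2, 5, size=(n_regions, n_etp))
other = rng.beta(2, 5, size=(n_regions, n_other))
etp[:20] = rng.beta(8, 2, size=(20, n_etp))

t, p = stats.ttest_ind(etp, other, axis=1)
reject, p_adj, _, _ = multipletests(p, alpha=0.05, method="fdr_bh")
print("Differentially methylated regions (FDR < 0.05):", int(reject.sum()))
```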
Max ERC Funding
958 750 €
Duration
Start date: 2015-07-01, End date: 2020-06-30
Project acronym EPLORE
Project EPidemiological Left ventriclar Outcomes Research in Europe
Researcher (PI) Jan Albert Hendrik Staessen
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Advanced Grant (AdG), LS7, ERC-2011-ADG_20110310
Summary Heart failure (HF) affects 15 million Europeans and entails higher mortality and health care costs than cancer. EPLORE addresses this issue by prospective epidemiological research in 4 European countries and by a proof-of-concept clinical trial. WP1 will, for the first time at the population level, document the incidence and progression of subclinical LV dysfunction and clarify whether asymptomatic LV dysfunction, as picked up by the newest echocardiographic techniques, predicts cardiovascular (CV) outcomes, including HF. WP2 will investigate the contribution of ventricular-arterial coupling disease and mechanical LV dyssynchrony to subclinical LV dysfunction. WP3 will identify a set of urinary polypeptides that signify early LV dysfunction and validate these biomarkers by showing that they predict deterioration of LV function, progression to HF and the incidence of CV complications over and above established risk factors. WP3 will also search for novel panels of circulating biomarkers, whose combined measurement will add information (accuracy, sensitivity and specificity) to established biomarkers (e.g., NT-proBNP), and identify genetic variants involved in the progression of LV dysfunction, either causally or as biomarkers. WP4 consists of a randomised clinical trial to translate, in a high-risk, high-gain setting, the results of WPs 1-3 into clinical practice and to identify a new treatment modality that potentially slows progression of diastolic LV dysfunction. Dissemination in WP5 will contribute to new guidelines for the prevention and treatment of HF. WP6 includes governance, monitoring of research strategies and output, and protection of IPR. In conclusion, EPLORE will advance risk stratification and the early diagnosis of subclinical HF. The project may result in specific treatments for diastolic LV dysfunction and inform guidelines for the prevention and treatment of HF. It will benefit the 20% of Europeans who currently have subclinical LV dysfunction.
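To illustrate how the added value of a candidate biomarker over an established one (e.g., NT-proBNP) could be quantified, the sketch below compares the discrimination of two nested logistic models on synthetic data; the variable names, effect sizes and modelling choices are hypothetical, not EPLORE's statistical plan.

```python
# Illustrative sketch only: does a hypothetical urinary biomarker add
# discrimination for LV dysfunction beyond an established marker?
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
n = 1000
ntprobnp = rng.normal(0, 1, n)   # standardised established biomarker (assumed)
urinary = rng.normal(0, 1, n)    # standardised candidate biomarker (assumed)
logit = -1.0 + 0.8 * ntprobnp + 0.6 * urinary
outcome = rng.binomial(1, 1 / (1 + np.exp(-logit)))

base = LogisticRegression().fit(ntprobnp.reshape(-1, 1), outcome)
full = LogisticRegression().fit(np.column_stack([ntprobnp, urinary]), outcome)

auc_base = roc_auc_score(outcome, base.predict_proba(ntprobnp.reshape(-1, 1))[:, 1])
auc_full = roc_auc_score(outcome, full.predict_proba(np.column_stack([ntprobnp, urinary]))[:, 1])
print(f"AUC base: {auc_base:.3f}, AUC base + urinary marker: {auc_full:.3f}")
```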
Max ERC Funding
2 391 440 €
Duration
Start date: 2012-07-01, End date: 2017-06-30
Project acronym EPOS CRYSTALLI
Project Epitaxial thin-film organic semiconductor crystals and devices
Researcher (PI) Paul Heremans
Host Institution (HI) INTERUNIVERSITAIR MICRO-ELECTRONICA CENTRUM
Call Details Advanced Grant (AdG), PE7, ERC-2012-ADG_20120216
Summary "Today, organic semiconductor devices are severely limited by the strong disorder in the amorphous or polycrystalline semiconductor films. This disorder is in fact due to the nature of the films, and is NOT an intrinsic molecular property. Indeed, single-crystal organic semiconductors are known, and display exciting characteristics and high performance. Unfortunately, they are today only grown as individual objects, not applicable to integrable thin-film transistors (TFT), solar cells (OPV), and light-emitting diodes (OLED) or transistors (OLET).
In this project, we propose a radical shift in the film formation of organic semiconductors, to master the nucleation and growth of highly crystalline thin films on arbitrary surfaces. We propose several possible templates for crystal growth, control of nucleation sites and new techniques to impose gradients in supersaturation of the environment from which the molecules condense in a growing crystal. Fundamental understanding of the thin-film crystal forming processes will be acquired by in-situ monitoring, and by modelling of nucleation and growth processes. We will apply similar methodologies to hetero-epitaxy of thin-film crystals, i.e. growth of crystalline layers of different types of molecules, and to doping of crystals. This will open a gateway to use the immense libraries of organic semiconducting molecules for application in high-performance crystalline heterojunction devices.
Proof-of-principle devices will complement the materials science study and establish new research domains. We propose integrable crystalline TFTs, as these are also useful to further probe the physics of crystalline organic semiconductors. Crystalline heterojunction OPVs promise combined high exciton diffusion lengths and carrier mobilities. We will explore the benefits of crystallinity in heterojunction OLEDs and OLETs towards higher current densities and brightness, which may lead to the elusive electrically pumped organic laser."
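Because the proposal emphasises controlling nucleation through gradients in supersaturation, a back-of-envelope classical nucleation theory calculation can illustrate how strongly the homogeneous nucleation barrier depends on supersaturation; the surface energy and molecular volume below are arbitrary placeholders, not measured values for any organic semiconductor.

```python
# Illustrative sketch only: classical nucleation theory, showing how the
# homogeneous nucleation barrier falls as supersaturation S rises.
# Surface energy and molecular volume are arbitrary placeholders.
import math

k_B = 1.380649e-23   # J/K
T = 300.0            # K
gamma = 0.05         # J/m^2, assumed effective surface energy
v = 5e-28            # m^3, assumed molecular volume

for S in (1.05, 1.2, 1.5, 2.0):
    dG_star = 16 * math.pi * gamma**3 * v**2 / (3 * (k_B * T * math.log(S))**2)
    print(f"S = {S:4.2f}  ->  barrier = {dG_star / (k_B * T):8.1f} k_B*T")
```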
Max ERC Funding
2 499 408 €
Duration
Start date: 2013-01-01, End date: 2017-12-31
Project acronym EQUOP
Project Equal opportunities for migrant youth in educational systems with high levels of social and ethnic segregation: assessing the impact of school team resources
Researcher (PI) Dirk Jean Alexander Jacobs
Host Institution (HI) UNIVERSITE LIBRE DE BRUXELLES
Call Details Starting Grant (StG), SH2, ERC-2011-StG_20101124
Summary Although a gap in educational performance between migrant children and children without a migration background is observed in most industrialized countries, it is particularly large in countries such as Belgium, Germany, Austria and the Netherlands, as attested by the PISA data. Social and ethnic segregation, which is particularly high in these educational systems, appears to be one of the important explanatory factors. This project aims to disentangle the crucial factors through which this high level of segregation translates into unequal opportunities for immigrant children. Going beyond the classic composition-effect model (looking at peer-group effects, i.e. positive or negative influences of pupils on each other), the project will also examine the potential impact of differentiated teacher profiles on group performance. It will test the hypothesis that the link between school composition and educational performance is a (partly) spurious effect, caused by a mediating effect of teacher characteristics. We hypothesize that better-skilled and more positively oriented teachers are overrepresented in schools with an 'easier' school population, while so-called 'difficult' schools (populated by working-class immigrant children) have difficulty in attracting and, especially, keeping competent and motivated staff. To examine this hypothesis, a mixed-methods approach will be used, combining quantitative statistical analysis (on new and existing data, for instance multi-level analysis of the PISA data set and other eligible datasets), qualitative case studies and focus groups. Secondary analysis of existing data sets (PISA, TIMSS, PIRLS) will be undertaken and new data will be collected (taking the Flemish and Francophone educational systems in Belgium as case studies).
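As an illustration of the multi-level analysis mentioned above, the sketch below fits a random-intercept model to simulated pupil scores nested in schools; the variable names and data are hypothetical stand-ins for PISA-style variables, not the project's specification.

```python
# Illustrative sketch only: a random-intercept model of pupil scores nested in
# schools, of the kind used in multi-level analysis of assessment data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n_schools, pupils_per_school = 50, 30
school_id = np.repeat(np.arange(n_schools), pupils_per_school)
school_effect = rng.normal(0, 10, n_schools)[school_id]   # between-school variation
migrant = rng.binomial(1, 0.3, school_id.size)
ses = rng.normal(0, 1, school_id.size)
score = 500 + 20 * ses - 15 * migrant + school_effect + rng.normal(0, 30, school_id.size)

df = pd.DataFrame({"score": score, "ses": ses, "migrant": migrant, "school": school_id})
model = smf.mixedlm("score ~ ses + migrant", data=df, groups=df["school"]).fit()
print(model.summary())
```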
Max ERC Funding
1 276 071 €
Duration
Start date: 2012-01-01, End date: 2017-06-30
Project acronym ERQUAF
Project Entanglement and Renormalisation for Quantum Fields
Researcher (PI) Jutho Jan J HAEGEMAN
Host Institution (HI) UNIVERSITEIT GENT
Call Details Starting Grant (StG), PE2, ERC-2016-STG
Summary Over the past fifteen years, the paradigm of quantum entanglement has revolutionised the understanding of strongly correlated lattice systems. Entanglement and closely related concepts originating from quantum information theory are optimally suited for quantifying and characterising quantum correlations and have therefore proven instrumental for the classification of the exotic phases discovered in condensed quantum matter. One groundbreaking development originating from this research is a novel class of variational many-body wave functions known as tensor network states. Their explicit local structure and unique entanglement features make them very flexible and extremely powerful both as a numerical simulation method and as a theoretical tool.
The goal of this proposal is to lift this “entanglement methodology” into the realm of quantum field theory. In high energy physics, the widespread interest in entanglement has only been triggered recently by the intriguing connections between entanglement and the structure of spacetime that arise in black hole physics and quantum gravity. During the past few years, direct continuum limits of various tensor network ansätze have been formulated. However, their application is largely unexplored territory and holds great promise. This proposal formulates several advancements and developments for the theoretical and computational study of continuous quantum systems, gauge theories and exotic quantum phases, but also for establishing the intricate relation between entanglement, renormalisation and geometry in the context of the holographic principle. Ultimately, these developments will radically alter the way in which some of the most challenging questions in physics are approached, ranging from the simulation of cold atom systems to non-equilibrium or high-density situations in quantum chromodynamics and the standard model.
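For readers unfamiliar with tensor network states, the sketch below builds a small random matrix product state (MPS) and contracts it back into a full state vector; the chain length, physical dimension and bond dimension are arbitrary and the example is purely didactic, unrelated to the continuum methods proposed here.

```python
# Illustrative sketch only: a random open-boundary matrix product state (MPS)
# for a small spin chain, contracted into the full state vector and normalised.
import numpy as np

rng = np.random.default_rng(4)
L, d, D = 6, 2, 4   # sites, physical dimension, bond dimension (all arbitrary)

# Tensor at site i has shape (left bond, physical, right bond).
dims = [1] + [D] * (L - 1) + [1]
mps = [rng.normal(size=(dims[i], d, dims[i + 1])) for i in range(L)]

# Contract left to right: psi accumulates (physical indices..., right bond).
psi = mps[0][0]                        # shape (d, D)
for A in mps[1:]:
    psi = np.tensordot(psi, A, axes=([-1], [0]))
psi = psi[..., 0].reshape(d**L)        # drop trivial final bond, flatten
psi /= np.linalg.norm(psi)
print("State vector dimension:", psi.size)   # 2**6 = 64
```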
Max ERC Funding
1 499 375 €
Duration
Start date: 2017-02-01, End date: 2022-01-31
Project acronym ERSTRESS
Project Role of Endoplasmic Reticulum Stress in dendritic cells and immune-mediated lung diseases
Researcher (PI) Bart Lambrecht
Host Institution (HI) UNIVERSITEIT GENT
Call Details Starting Grant (StG), LS6, ERC-2010-StG_20091118
Summary My overall aim is to understand the physiologic and medical importance of lung dendritic cells (DC) and to define the suitability of inhibitors of their function for the treatment of inflammatory lung diseases like asthma and COPD.
Lung dendritic cells (DC) play crucial roles in the regulation of lung immunity. We still do not fully understand how they get activated in response to different types of environmental triggers like allergens, cigarette smoke and pathogens. Although recognition of conserved motifs by pattern recognition receptors on DCs could be a key event, these stimuli are also accompanied by accumulation of unfolded proteins in the endoplasmic reticulum (ER). Cells respond by mounting the unfolded protein response (UPR) that acts to ameliorate protein folding, but intersects with metabolism, induction of alarm signals and cellular suicide mechanisms. I hypothesize that the presence of unfolded proteins and ER stress in DCs is a crucial endogenous danger signal that is vital to understanding their biology and their involvement in inflammatory lung diseases.
My specific aims are to:
1. Define the fine-tuning of ER stress pathways in various lung DC subsets in health and disease
2. Define the specific role of the ER stress proteins XBP1, JIK and ORMDL3 in DCs
3. Test if interfering with ER stress pathways alters the course of inflammatory lung disease
To approach these aims, I have developed mouse models of lung disease that are centered around lung DCs and where ER stress pathways can be genetically deleted. Using a combination of cell biological and immunological techniques I hope to achieve definitive answers as to how ER stress pathways regulate the function of DCs. Manipulation of ER stress pathways by drugs will have a major impact on very common diseases like diabetes, cardiovascular and neurodegenerative disease. Through the current proposal, I hope to extend this exciting field to lung biology.
Max ERC Funding
1 499 580 €
Duration
Start date: 2010-12-01, End date: 2015-11-30
Project acronym European Unions
Project Labour Politics and the EU's New Economic Governance Regime
Researcher (PI) Roland ERNE
Host Institution (HI) UNIVERSITY COLLEGE DUBLIN, NATIONAL UNIVERSITY OF IRELAND, DUBLIN
Call Details Consolidator Grant (CoG), SH2, ERC-2016-COG
Summary Trade unions play a major role in democratic interest intermediation. This role is currently threatened by the increasingly authoritarian strain in the EU’s new economic governance (NEG). This project aims to explore the challenges and possibilities that the NEG poses to labour politics. Until recently, European labour politics has mainly been shaped by horizontal market integration through the free movement of goods, capital, services and people. After the financial crisis, the latter was complemented by vertical integration effected through the direct surveillance of member states. The resulting NEG opens contradictory possibilities for labour movements in Europe.
On the one hand, the reliance of the NEG on vertical surveillance makes decisions taken in its name more tangible, offering concrete targets for contentious transnational collective action. On the other hand, however, the NEG mimics the governance structures of multinational firms by using key performance indicators that put countries in competition with one another. This constitutes a deterrent to transnational collective action. The NEG’s interventionist and competitive strains also pose the threat of nationalist counter-movements, thus making European collective action ever more vital for the future of EU integration and democracy.
This project has the following objectives:
1. To understand the interrelation between NEG and existing ‘horizontal’ EU economic governance; and the shifts in labour politics triggered by NEG;
2. To open up novel analytical approaches that are able to capture both national and transnational social processes at work;
3. To analyse the responses of established trade unions and new social movements to NEG in selected subject areas and economic sectors at national and EU levels, and their feedback effects on NEG;
4. To develop a new scientific paradigm capable of accounting for the interplay between EU economic governance, labour politics and EU democracy.
Max ERC Funding
1 997 132 €
Duration
Start date: 2017-10-01, End date: 2022-09-30
Project acronym EUTHORITY
Project Conflict and Cooperation in the EU Heterarchical Legal System
Researcher (PI) Arthur Dyevre
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Starting Grant (StG), SH2, ERC-2014-STG
Summary Supranational legal regimes are increasingly enforced by multi-level, non-hierarchical court systems, in which judges at the upper, supranational echelon do not have the power to reverse domestic court decisions. Yet the incentives and dynamics that shape the complex patterns of conflict and cooperation observed in the most important of all such court structures, the EU legal system, are still poorly understood. To what extent are domestic courts able to negotiate the terms of their cooperation with the Court of Justice? How do national courts differ in that respect? Are the non-compliance threats issued by domestic courts all equally credible? Do the rare cases where these threats have been carried out pose a systemic risk to the authority of EU law? How are the domestic courts’ incentives to cooperate with EU judges affected by the sort of political backsliding witnessed in Hungary and Romania in recent years? Our interdisciplinary research project addresses these puzzles of legal integration with the avowed aim of helping judges and policy-makers make more informed choices when faced with compliance problems in the judicial realm. Grounded in a general theory of judicial behaviour, our central hypothesis is that the authority of EU law at the domestic level is determined by domestic politics as well as by judicial attitudes towards integration. EUTHORITY seeks to refine this hypothesis using game-theoretic modelling to analyse strategic interactions among the Court of Justice, domestic courts and politicians. Theory-building combines with a large-scale data-collection effort. We undertake to compile longitudinal data on the institutional characteristics and doctrinal responses to legal integration of 68 domestic apex courts across the 28 EU Member States. With a view to constructing an annual, court-specific index of legal integration, we also conduct an expert survey asking academic lawyers and practitioners to assess their courts' attitudes towards EU law.
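As a toy illustration of the game-theoretic modelling mentioned above, the sketch below defines a two-by-two game between the Court of Justice and a national apex court with made-up payoffs and searches for pure-strategy Nash equilibria; it is not the project's actual model, and the strategy labels and payoffs are purely hypothetical.

```python
# Illustrative sketch only: a toy CJEU vs national-court game with made-up
# payoffs; brute-force search for pure-strategy Nash equilibria.
import numpy as np

# Row player: CJEU (0 = accommodate, 1 = insist on primacy).
# Column player: national court (0 = comply, 1 = threaten non-compliance).
cjeu = np.array([[3, 1],
                 [4, 0]])
national = np.array([[2, 3],
                     [1, 0]])

equilibria = []
for i in range(2):
    for j in range(2):
        row_best = cjeu[i, j] >= cjeu[:, j].max()      # CJEU cannot gain by deviating
        col_best = national[i, j] >= national[i, :].max()  # national court cannot gain
        if row_best and col_best:
            equilibria.append((i, j))
print("Pure-strategy Nash equilibria (row, column):", equilibria)
```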
Max ERC Funding
1 475 150 €
Duration
Start date: 2015-09-01, End date: 2020-08-31
Project acronym EVALISA
Project "The Evolution of Case, Alignment and Argument Structure in Indo-European"
Researcher (PI) Jóhanna Barðdal
Host Institution (HI) UNIVERSITEIT GENT
Call Details Starting Grant (StG), SH4, ERC-2012-StG_20111124
Summary "Alignment and argument structure lies at the heart of all current theoretical models in linguistics, both syntactic models and research within typology. In spite of that, no large-scale comprehensive study of the historical development of case marking and argument structure has been carried out in modern times, using modern linguistic approaches and frameworks, and covering an entire language family from its first documentation until modern times. The project EVALISA aims to investigate case marking and argument structure from a historical perspective, or more precisely non-nominative case marking of subjects, focusing on its development through the history of the Indo-European languages. One of the products emerging from the project is an electronically searchable database of predicates taking non-nominative subject marking, available to the research community at large, for further research on the topic. Another product is a typology of grammaticalization paths of non-nominative case marking of subjects. This is a timely enterprise given that non-nominative subject marking is extremely common in the languages of world. A third product is a methodology for reconstructing syntax and grammar, based on the tools of Construction Grammar. The theoretical framework of Construction Grammar is easily extendible to syntactic reconstruction, due to the basic status of form–meaning pairings in that model, and hence the more lexicon-like status of the grammar. This creates a natural leap for Construction Grammar from synchronic form–meaning pairings to historical reconstruction, based on form–meaning pairings. This methodology is of importance for scholars within anthropological linguistics, working on the history of oral or less-documented languages."
Max ERC Funding
1 498 744 €
Duration
Start date: 2013-10-01, End date: 2018-09-30
Project acronym EVERBONE
Project Mechano-activated Extracellular Vesicle Based Repair of Bone
Researcher (PI) David HOEY
Host Institution (HI) THE PROVOST, FELLOWS, FOUNDATION SCHOLARS & THE OTHER MEMBERS OF BOARD OF THE COLLEGE OF THE HOLY & UNDIVIDED TRINITY OF QUEEN ELIZABETH NEAR DUBLIN
Call Details Proof of Concept (PoC), ERC-2018-PoC
Summary Every 30 seconds a person suffers an osteoporosis-related bone fracture in Europe, resulting in significant morbidity, mortality and healthcare costs estimated at €36 billion annually. Current anti-catabolic therapeutics are associated with severe side effects and are purely preventative, while the only anabolic treatment plateaus at 6 months. This has resulted in a crisis in osteoporosis treatment. Therefore, the aim of EVERBONE is to develop and commercialise a new anabolic multitargeted medicinal product for bone regeneration that mimics the beneficial effects of physical exercise, by recruiting endogenous cells to promote angiogenesis, osteogenesis and bone formation. Building on the discovery that osteocyte-derived mechanically activated extracellular vesicles (MAEVs) mediate the anabolic bone forming response to physical exercise, this project will develop novel bioreactor systems to increase MAEV yield in vitro and scale up for clinical use. Secondly, it will identify a standard mechanical conditioning procedure to generate optimised angiogenically and osteogenically primed MAEVs. Finally, preclinical evaluation of MAEV treatment will be performed in a model of osteoporosis and bone healing, whereby it is expected that MAEVs will localize to bone and mobilise endogenous endothelial and stem cell populations to enhance bone regeneration in osteoporosis and improve healing rates in bone defects. This regrowth of bone will overcome current limitations of anti-resorptive treatments, reducing the risk of fracture and thus the financial burden of treatment. The project will leverage the applicant's experience in bone mechanobiology to develop a new product with significant commercial potential. Therefore, the impact of EVERBONE will be multifaceted: it will transform how osteoporosis is treated, it will create economic value through the commercialisation of IP, and most importantly it will improve patient experience and their long-term health and well-being.
Max ERC Funding
149 891 €
Duration
Start date: 2019-01-01, End date: 2020-06-30
Project acronym EVO-HAFT
Project Evolution of stone tool hafting in the Palaeolithic
Researcher (PI) Veerle Lutgard Petra Rots
Host Institution (HI) UNIVERSITE DE LIEGE
Call Details Starting Grant (StG), SH6, ERC-2012-StG_20111124
Summary "Palaeolithic stone tool hafting has been considered important for decades, both in terms of technological and cognitive evolutions, but it has been hard to design methods that allow detailed insight into the appearance of hafting and its evolution through time. The main reason is that handles were manufactured from organic materials and these are only rarely preserved. The issue thus appears to largely escapes us, but as finds become more and more numerous, promising new techniques have also been developed, which allow a more detailed investigation of hafting. It has been demonstrated that a microscopic investigation of stone tools allows a distinction between tools that were used in the hand and those that were mounted in / on a handle, as well as an interpretation of the hafting arrangement. Knowing whether and how stone tools were hafted provides crucial data for improving our understanding of past human behaviour. It is invaluable for a better comprehension of technological evolutions, it provides insight into the organic tool component that is rarely preserved, and it allows understanding the complete life cycle of stone tools. The goal of this research project is to gain insights in the appearance, regional and chronological variability, and evolution of Palaeolithic stone tool hafting in Europe and the remaining Old World through a comprehensive functional investigation of key sites, which includes the analysis of wear traces and residues, bio/physico-chemical analyses, next to an elaborate experimental program. The proposed project starts from the conviction that many of the changes observed during the Palaeolithic can be understood based on functional data. Consequently, this research project will contribute significantly to our understanding of archaeological assemblages and their variability, and of past human behaviour and its evolution through time."
Max ERC Funding
1 192 300 €
Duration
Start date: 2013-01-01, End date: 2017-12-31
Project acronym EVODIS
Project Exploiting vortices to suppress dispersion and reach new separation power boundaries
Researcher (PI) Wim De Malsche
Host Institution (HI) VRIJE UNIVERSITEIT BRUSSEL
Call Details Starting Grant (StG), PE4, ERC-2015-STG
Summary The 21st century is expected to develop towards a society depending ever more on (bio-)chemical measurements of fluids and matrices that are so complex they are well beyond current analytical capabilities. Incremental improvements can no longer satisfy the needs of, e.g., the proteomics field, which requires the separation of tens of thousands of components. The pace of progress in these fields is therefore predominantly determined by that of analytical tools, whereby liquid chromatography is the most prominent technique to separate small molecules as well as macromolecules, based on the differential interaction of each analyte with support structures, giving it a unique migration velocity. To improve its performance, faster transport between these structures needs to be generated. Unfortunately, the commonly pursued strategy, relying on diffusion and reducing the structure size, has reached its limits due to practical limitations related to the packing and fabrication of sub-micron support structures, pressure tolerance and viscous heating.
A ground-breaking step to advance chromatographic performance to another level would be to accelerate mass transport in the lateral direction, beyond the rate of diffusion only. To meet this requirement, an array of microstructures and local electrodes can be defined to create lateral electroosmotic vortices in a pressure-driven column, aiming to accelerate the local mass transfer in an anisotropic fashion. The achievement of ordered arrays of vortices is intimately linked to this requirement, which is also of broader importance for mixing, anti-fouling of membrane and reactor surfaces, enhanced mass transfer in reactor channels, emulsification, etc. Understanding and implementing anisotropic vortex flows will therefore not only revolutionize analytical and preparative separation procedures, but will also be highly relevant in all flow systems that benefit from enhanced mass transfer.
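A simple order-of-magnitude calculation illustrates why purely diffusive lateral transport limits current columns and why the proposed vortices target exactly this step: the time to diffuse across a channel scales with the square of its width. The diffusion coefficient and channel widths below are generic assumptions, not measured values from this project.

```python
# Illustrative sketch only: lateral diffusion time t ~ w^2 / (2 D) across a
# channel of width w, for a generic small-molecule diffusion coefficient.
D = 1.0e-9  # m^2/s, assumed diffusion coefficient of a small analyte in water

for w_um in (1, 5, 10, 50):
    w = w_um * 1e-6                    # channel width in metres
    t = w**2 / (2 * D)                 # characteristic lateral diffusion time
    print(f"channel width {w_um:3d} um  ->  lateral diffusion time ~ {t*1e3:8.3f} ms")
```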
Max ERC Funding
1 460 688 €
Duration
Start date: 2016-03-01, End date: 2021-02-28
Project acronym EVOLECOCOG
Project The evolutionary ecology of cognition across a heterogeneous landscape
Researcher (PI) John Leo Quinn
Host Institution (HI) UNIVERSITY COLLEGE CORK - NATIONAL UNIVERSITY OF IRELAND, CORK
Call Details Consolidator Grant (CoG), LS8, ERC-2013-CoG
Summary "Why do individuals vary in their cognitive abilities? This proposal takes the disciplines of cognition and evolutionary biology into a natural setting to answer this question by investigating a variety of proximate causes and population-level consequences of individual variation in cognitive ability. It represents the first large-scale integrative study of cognitive ability on any wild population. State of the art observational (remote sensing and automated self-administration trials of learning in the wild), chemical (stable isotope analysis of diet), physiological (stress, energetics, immunocompetence), molecular (DNA fingerprinting and metabarcoding) and analytical (reaction norm, quantitative genetic) techniques will be used. The chosen study system, the great tit Parus major, is one of the most widely used in Europe, but uniquely here will consist of 12 subpopulations across deciduous and conifer woodland fragments. The proposal’s broad scope is captured in three objectives: 1) To characterise proximate causes of variation in cognitive (associative/reversal learning; problem solving; brain size) and other traits (the reactive-proactive personality axis; bill morphology), all of which can influence similar ecologically important behaviour. Quantitative genetic, social, parasite-mediated, and physiological causes will be explored. 2) To examine links between these traits, and key behaviours and trade-offs, e.g., space use, niche specialization, predation, parental care and promiscuity; and 3) To examine the consequences of this variation for life histories and fitness. The research team consists of the PI, five early career biologists, and three PhD students, and will collaborate with eight researchers from Europe and further afield. The project will reveal ground-breaking insight into why individuals vary in their cognitive ability. It aims to impact a wide scientific community, to raise public interest in science, and to inform EU biodiversity policy."
Max ERC Funding
1 993 189 €
Duration
Start date: 2015-03-01, End date: 2020-12-31
Project acronym EVWRIT
Project Everyday Writing in Graeco-Roman and Late Antique Egypt (I - VIII AD). A Socio-Semiotic Study of Communicative Variation
Researcher (PI) Klaas BENTEIN
Host Institution (HI) UNIVERSITEIT GENT
Call Details Starting Grant (StG), SH5, ERC-2017-STG
Summary This five-year project aims to generate a paradigm shift in the understanding of Graeco-Roman and Late Antique communication. Non-literary, ‘documentary’ texts from Ancient Egypt such as letters, petitions and contracts have provided and continue to provide a key witness for our knowledge of the administration, education, economy, etc. of Ancient Egypt. This project argues that since documentary texts represent originals, their external characteristics should also be brought into the interpretation: elements such as handwriting, linguistic register or writing material transmit indirect social messages concerning hierarchy, status, and power relations, and can therefore be considered ‘semiotic resources’. The project’s driving hypothesis is that communicative variation – variation that is functionally insignificant but socially significant (e.g. there are ~ there’s ~ it’s a lot of people) – enables the expression of social meaning. The main aim of this project is to analyse the nature of this communicative variation. To this end, a multidisciplinary team of six researchers (one PI, one post-doc, and four PhDs) will apply recent insights from socio-semiotic and socio-linguistic theory to a corpus of Graeco-Roman and Late Antique documentary texts (I – VIII AD) by means of a three-level approach: (i) an open-access database of annotated documentary texts will be created; (ii) the ‘semiotic potential’ of the different semiotic resources that play a role in documentary writing will be analysed; (iii) the interrelationships between the different semiotic resources will be studied. The project will have a significant scientific impact: (i) it will be the first to offer a holistic perspective on the ‘meaning’ of documentary texts; (ii) the digital tool will open up new ways to investigate Ancient texts; (iii) it will make an important contribution to current socio-semiotic and socio-linguistic research; (iv) it will provide new insights about humans as social beings.
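As an illustration of how linguistic and material ‘semiotic resources’ might be combined in a single annotation record for such a database, the sketch below defines a toy data structure; the field names and example values are hypothetical and do not reflect the project's actual schema.

```python
# Illustrative sketch only: one possible annotation record for a documentary
# text, combining textual and material features. Field names are hypothetical.
from dataclasses import dataclass, asdict

@dataclass
class DocumentAnnotation:
    text_id: str           # e.g. a papyrological identifier (placeholder)
    text_type: str         # letter, petition, contract, ...
    century_ad: int
    handwriting: str       # e.g. "practised cursive", "slow, unpractised"
    register: str          # e.g. "formal", "colloquial"
    material: str          # papyrus, ostracon, wooden tablet, ...
    sender_status: str     # assumed social-role label

example = DocumentAnnotation(
    text_id="P.Hypothetical 1", text_type="petition", century_ad=3,
    handwriting="practised cursive", register="formal",
    material="papyrus", sender_status="village official",
)
print(asdict(example))
```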
Max ERC Funding
1 476 250 €
Duration
Start date: 2018-06-01, End date: 2023-05-31
Project acronym EXPAND
Project Defining the cellular dynamics leading to tissue expansion
Researcher (PI) Cedric Blanpain
Host Institution (HI) UNIVERSITE LIBRE DE BRUXELLES
Call Details Consolidator Grant (CoG), LS3, ERC-2013-CoG
Summary Stem cells (SCs) ensure the development of the different tissues during morphogenesis, their physiological turnover during adult life and tissue repair after injuries.
Our lab has recently developed new methods to study, by lineage tracing, the cellular hierarchy that sustains homeostasis and repair of the epidermis, and to identify distinct populations of SCs and progenitors ensuring mammary gland and prostate postnatal development.
While quantitative clonal analysis combined with mathematical modeling has recently been used to decipher the cellular basis of tissue homeostasis, such experimental approaches have not yet been used in mammals to investigate the cellular hierarchy at work during tissue expansion, such as postnatal development and tissue repair.
In this project, we will use a multi-disciplinary approach combining mouse genetic lineage tracing and clonal analysis, mathematical modeling, proliferation kinetics, transcriptional profiling, and functional experiments to investigate the cellular and molecular mechanisms regulating tissue expansion during epithelial development and tissue repair and how the fate of these cells is controlled during this process.
1. We will define the clonal and proliferation dynamics of tissue expansion in the epidermis, the mammary gland and the prostate during postnatal growth and adult tissue regeneration.
2. We will define the clonal and proliferation dynamics of tissue expansion in the adult epidermis following wounding and mechanical force mediated tissue expansion.
3. We will define the mechanisms that regulate the switch from multipotent to unipotent cell fate during development of glandular epithelia.
Defining the cellular and molecular mechanisms underlying tissue growth and expansion during development and how these mechanisms differ from tissue regeneration in adult may have important implications for understanding the causes of certain developmental defects and for regenerative medicine.
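As a hedged illustration of what quantitative clonal analysis combined with mathematical modeling can look like in practice, the toy simulation below (not the project's model; all division probabilities are assumptions) tracks the size of labelled clones when each progenitor either renews symmetrically, differentiates, or divides asymmetrically.

```python
# A toy sketch, not the project's model: each labelled progenitor either renews
# symmetrically, differentiates symmetrically, or divides asymmetrically, and the
# resulting clone sizes are recorded. All probabilities are illustrative assumptions.
import random

def simulate_clone(divisions=40, p_renew=0.25, p_differentiate=0.25):
    """Number of progenitors remaining in one labelled clone after `divisions` rounds."""
    progenitors = 1
    for _ in range(divisions):
        if progenitors == 0:
            break
        next_generation = 0
        for _ in range(progenitors):
            r = random.random()
            if r < p_renew:                          # P -> P + P
                next_generation += 2
            elif r < p_renew + p_differentiate:      # P -> D + D
                next_generation += 0
            else:                                    # P -> P + D
                next_generation += 1
        progenitors = next_generation
    return progenitors

sizes = [simulate_clone() for _ in range(1000)]
surviving = [s for s in sizes if s > 0]
print(f"{len(surviving)} clones persist; mean size {sum(surviving) / max(len(surviving), 1):.1f}")
```

Comparing measured clone-size distributions from lineage tracing with distributions simulated under candidate rules of this kind is one way proliferation and fate choices can be inferred.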
Max ERC Funding
2 400 000 €
Duration
Start date: 2014-06-01, End date: 2019-05-31
Project acronym EyeRegen
Project Engineering a scaffold based therapy for corneal regeneration
Researcher (PI) Mark Joseph Ahearne
Host Institution (HI) THE PROVOST, FELLOWS, FOUNDATION SCHOLARS & THE OTHER MEMBERS OF BOARD OF THE COLLEGE OF THE HOLY & UNDIVIDED TRINITY OF QUEEN ELIZABETH NEAR DUBLIN
Call Details Starting Grant (StG), PE8, ERC-2014-STG
Summary Corneal blindness resulting from disease, physical injury or chemical burns affects millions worldwide and has a considerable economic and social impact on the lives of people across Europe. In many cases corneal transplants can restore vision; however, the shortage of donor corneas suitable for transplantation has necessitated the development of alternative treatments. The aim of this project is to develop a new approach to corneal tissue regeneration. Previous approaches to engineering corneal tissue have required access to donor cells and lengthy culture periods in an attempt to grow tissue in vitro prior to implantation, with only limited success and at great expense. Our approach will differ fundamentally from these in that we will design artificial corneal scaffolds that do not require donated cells or in vitro culture but instead will recruit the patient’s own cells to regenerate the cornea post-implantation. These biomaterial scaffolds will incorporate specific chemical and physical cues with the deliberate aim of attracting cells and inducing tissue formation. Studies will be undertaken to examine how different chemical, biochemical, physical and mechanical cues can be used to control the behaviour of corneal epithelial, stromal and endothelial cells. Once the optimal combination of these cues has been determined, this information will be incorporated into the design of the scaffold. Recent advances in manufacturing and material processing technology will enable us to develop scaffolds with organized nanometric architectures that incorporate controlled growth-factor release mechanisms. Techniques such as 3D bio-printing and nanofiber electrospinning will be used to fabricate scaffolds. The ability of the scaffold to attract cells and promote matrix remodelling will be examined by developing an in vitro bioreactor system capable of mimicking the ocular environment and by performing in vivo tests using a live animal model.
Max ERC Funding
1 498 734 €
Duration
Start date: 2015-07-01, End date: 2020-06-30
Project acronym facessvep
Project UNDERSTANDING THE NATURE OF FACE PERCEPTION: NEW INSIGHTS FROM STEADY-STATE VISUAL EVOKED POTENTIALS
Researcher (PI) Bruno Rossion
Host Institution (HI) UNIVERSITE CATHOLIQUE DE LOUVAIN
Call Details Starting Grant (StG), SH4, ERC-2011-StG_20101124
Summary Face recognition is one of the most complex functions of the human mind/brain: no artificial device can surpass human abilities in this function. The goal of this project is to understand a fundamental aspect of face recognition, individual face perception: how, from sensory information, does the human mind/brain build a visual representation of a particular face? To clarify this question, I will introduce the method of steady-state visual evoked potentials (SSVEPs) in the field of face perception. This approach has never been applied to face perception, but we recently started using it and collected strong data demonstrating the feasibility of the approach. It is based on the repetitive stimulation of the visual system at a fixed frequency, and the recording on the human scalp of an electrical response (electroencephalogram, EEG) that oscillates at that specific frequency. Because of its extremely high signal-to-noise ratio and its non-ambiguity with respect to the measurement of the signal of interest, this method is ideal to assess the human brain’s sensitivity to facial identity, non-invasively, and with the exact same approach in normal adults, infants and children, as well as clinical populations. SSVEP will also allow “tagging” different features of a stimulus with different stimulation frequencies (“frequency-tagging” method), and thus measure the representation and processing of these features independently, as well as their potential integration. Overall, this proposal should shed light on one of the most complex functions of the human mind/brain, while its realization will undoubtedly generate relevant data and paradigms useful for understanding other aspects of face processing (e.g., perception of facial expression) and high-level visual perception processes in general.
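A toy numerical sketch of the frequency-tagging logic described above: a response locked to a known stimulation frequency is concentrated in a single spectral bin, which is why the signal-to-noise ratio is so favourable. The sampling rate, stimulation frequency and noise level are arbitrary assumptions and the 'EEG' trace is synthetic.

```python
# Synthetic illustration of frequency tagging: a response locked to a known
# stimulation frequency is concentrated in one spectral bin, giving a very high
# signal-to-noise ratio. Sampling rate, frequency and noise level are assumptions.
import numpy as np

fs = 512.0                        # sampling rate (Hz), assumed
f_stim = 5.88                     # stimulation frequency (Hz), assumed
t = np.arange(0, 20, 1 / fs)      # 20 s of recording

eeg = 0.5 * np.sin(2 * np.pi * f_stim * t) + np.random.normal(0, 2.0, t.size)

amplitude = np.abs(np.fft.rfft(eeg)) / t.size
freqs = np.fft.rfftfreq(t.size, 1 / fs)

target = int(np.argmin(np.abs(freqs - f_stim)))               # bin at the tagged frequency
neighbours = np.r_[target - 12:target - 2, target + 3:target + 13]
snr = amplitude[target] / amplitude[neighbours].mean()
print(f"amplitude at {freqs[target]:.2f} Hz: {amplitude[target]:.3f}, SNR vs neighbours: {snr:.1f}")
```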
Max ERC Funding
1 490 360 €
Duration
Start date: 2012-02-01, End date: 2017-01-31
Project acronym FAT NKT
Project Targeting iNKT cell and adipocyte crosstalk for control of metabolism and body weight
Researcher (PI) Lydia Lynch
Host Institution (HI) THE PROVOST, FELLOWS, FOUNDATION SCHOLARS & THE OTHER MEMBERS OF BOARD OF THE COLLEGE OF THE HOLY & UNDIVIDED TRINITY OF QUEEN ELIZABETH NEAR DUBLIN
Call Details Starting Grant (StG), LS6, ERC-2015-STG
Summary Obesity has reached epidemic proportions globally. At least 2.8 million people die each year as a result of being overweight or obese, the biggest burden being obesity-related diseases. It is now clear that inflammation is an underlying cause of, or contributor to, many of these diseases, including type 2 diabetes, atherosclerosis, and cancer. Recognition that the immune system can regulate metabolic pathways has prompted a new way of thinking about diabetes and weight management. Despite much recent progress, most immunometabolic pathways, and how to target them, are currently unknown. One such pathway is the cross-talk between invariant natural killer T (iNKT) cells and neighboring adipocytes. iNKT cells are the innate lipid-sensing arm of the immune system. Since our discovery that mammalian adipose tissue is enriched for iNKT cells, we have identified a critical role for iNKT cells in regulating adipose inflammation and body weight. The goal of this project is to use a multi-disciplinary approach to identify key signals and molecules used by iNKT cells to induce metabolic control and weight loss in obesity. Using immunological assays and multi-photon intravital microscopy, cells and pathways that control the unique regulatory functions of adipose iNKT cells will be identified and characterised. Novel lipid antigens in adipose tissue will be identified using a biochemical approach, perhaps explaining iNKT cell conservation in adipose depots, and providing safe tools for iNKT cell manipulation in vivo. Finally, using proteomics and whole-body metabolic analysis in vivo, novel ‘weight-loss inducing’ factors produced by adipose iNKT cells will be identified. This ambitious and high-impact project has the potential to yield major insights into immunometabolic interactions at steady state and in obesity. The ability to activate or induce adipose iNKT cells holds remarkable potential as an entirely new therapeutic direction for treating obesity and type 2 diabetes.
Max ERC Funding
1 804 052 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym FHiCuNCAG
Project Foundations for Higher and Curved Noncommutative Algebraic Geometry
Researcher (PI) Wendy Joy Lowen
Host Institution (HI) UNIVERSITEIT ANTWERPEN
Call Details Consolidator Grant (CoG), PE1, ERC-2018-COG
Summary With this research programme, inspired by open problems within noncommutative algebraic geometry (NCAG) as well as by recent developments in algebraic topology, it is our aim to lay out new foundations for NCAG. On the one hand, the categorical approach to geometry put forth in NCAG has seen a wide range of applications both in mathematics and in theoretical physics. On the other hand, algebraic topology has received a vast impetus from the development of higher topos theory by Lurie and others. The current project is aimed at cross-fertilisation between the two subjects, in particular through the development of “higher linear topos theory”. We will approach the higher structure on Hochschild-type complexes from two angles. Firstly, focusing on intrinsic incarnations of spaces as large categories, we will use the tensor products developed jointly with Ramos González and Shoikhet to obtain a “large version” of the Deligne conjecture. Secondly, focusing on concrete representations, we will develop new operadic techniques in order to endow complexes like the Gerstenhaber-Schack complex for prestacks (due to Dinh Van-Lowen) and the deformation complexes for monoidal categories and pasting diagrams (due to Shrestha and Yetter) with new combinatorial structure. In another direction, we will move from Hochschild cohomology of abelian categories (in the sense of Lowen-Van den Bergh) to Mac Lane cohomology for exact categories (in the sense of Kaledin-Lowen), extending the scope of NCAG to “non-linear deformations”. One of the mysteries in algebraic deformation theory is the curvature problem: in the process of deformation we are brought to the boundaries of NCAG territory through the introduction of a curvature component which disables the standard approaches to cohomology. Ultimately, it is our goal to set up a new framework for NCAG which incorporates curved objects, drawing inspiration from the realm of higher categories.
Max ERC Funding
1 171 360 €
Duration
Start date: 2019-06-01, End date: 2024-05-31
Project acronym FIAT
Project The Foundations of Institutional AuThority: a multi-dimensional model of the separation of powers
Researcher (PI) Eoin CAROLAN
Host Institution (HI) UNIVERSITY COLLEGE DUBLIN, NATIONAL UNIVERSITY OF IRELAND, DUBLIN
Call Details Consolidator Grant (CoG), SH2, ERC-2018-COG
Summary ‘Almost three centuries later, it is past time to rethink Montesquieu’s holy trinity’ (Ackerman, 2010).
As Ackerman and many others have observed, political reality has long left the traditional model of the separation of powers behind. The problems posed by this gap between constitutional theory and political practice have recently acquired fresh urgency as developments in Hungary, Poland, Turkey, Russia, the UK, the US, Bolivia and elsewhere place the separation of powers under strain. These include the emergence of authoritarian leaders; the personalisation of political authority; recourse to non-legal plebiscites; and the capture or de-legitimisation of other constitutional bodies.
This project argues that these difficulties are rooted in a deeper problem with constitutional thinking about institutional power: a constitution-as-law approach that equates the conferral of legal power with the authority to exercise it. This makes it possible for a gap to emerge between legal accounts of authority and its diverse – and potentially conflicting (Cotterrell) – sociological foundations. Where that gap exists, the practical authority of an institution (or constitution) may be vulnerable to challenge from rival and more socially-resonant claims (Scheppele (2017)).
It is this gap between legal norms and social facts that the project aims to investigate – and ultimately bridge.
How is authority established? How is it maintained? How might it fail? And how does the constitution (as rule? representation (Saward)? mission statement (King)?) shape, re-shape and come to be shaped by those processes? By investigating these questions across six case studies, the project will produce a multi-dimensional account of institutional authority that takes seriously the sociological influence of both law and culture.
The results from these cases provide the evidential foundation for the project’s final outputs: a new model and new evaluative measures of the separation of powers.
Max ERC Funding
1 997 628 €
Duration
Start date: 2020-01-01, End date: 2024-12-31
Project acronym FibreRemodel
Project Frontier research in arterial fibre remodelling for vascular disease diagnosis and tissue engineering
Researcher (PI) Caitriona Lally
Host Institution (HI) THE PROVOST, FELLOWS, FOUNDATION SCHOLARS & THE OTHER MEMBERS OF BOARD OF THE COLLEGE OF THE HOLY & UNDIVIDED TRINITY OF QUEEN ELIZABETH NEAR DUBLIN
Call Details Starting Grant (StG), PE8, ERC-2014-STG
Summary Each year cardiovascular diseases such as atherosclerosis and aneurysms cause 48% of all deaths in Europe. Arteries may be regarded as fibre-reinforced materials, with the stiffer collagen fibres present in the arterial wall bearing most of the load during pressurisation. Degenerative vascular diseases such as atherosclerosis and aneurysms alter the macroscopic mechanical properties of arterial tissue and therefore change the arterial wall composition and the quality and orientation of the underlying fibrous architecture. Information on the complex fibre architecture of arterial tissues is therefore at the core of understanding the aetiology of vascular diseases. The current proposal aims to use a combination of in vivo Diffusion Tensor Magnetic Resonance Imaging and parallel in silico modelling to non-invasively identify differences in the fibre architecture of human carotid arteries that can be directly linked with carotid artery disease and hence used to assess the risk of vulnerable plaque rupture.
Knowledge of arterial fibre patterns, and how these fibres alter in response to their mechanical environment, also provides a means of understanding remodelling of tissue engineered vessels. Therefore, in the second phase of this project, this novel imaging framework will be used to determine fibre patterns of decellularised arterial constructs in vitro with a view to directing mesenchymal stem cell growth and differentiation and creating a biologically and mechanically compatible tissue engineered vessel. In silico mechanobiological models will also be used to help identify the optimum loading environment for the vessels to encourage cell repopulation but prevent excessive intimal hyperplasia.
This combination of novel in vivo, in vitro and in silico work has the potential to revolutionise approaches to early diagnosis of vascular diseases and vascular tissue engineering strategies.
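As a hedged illustration of how diffusion tensor imaging relates to fibre architecture, the sketch below derives the dominant fibre direction and the fractional anisotropy from a single synthetic diffusion tensor; the tensor values are invented for the example and are not patient data.

```python
# Synthetic, illustrative voxel (not patient data): the principal eigenvector of a
# diffusion tensor gives the dominant fibre direction, and the fractional
# anisotropy (FA) quantifies how strongly the fibres are aligned.
import numpy as np

D = np.array([[1.7e-3, 0.1e-3, 0.0],     # hypothetical diffusion tensor (mm^2/s)
              [0.1e-3, 0.4e-3, 0.0],
              [0.0,    0.0,    0.3e-3]])

eigvals, eigvecs = np.linalg.eigh(D)
l1, l2, l3 = sorted(eigvals, reverse=True)
md = (l1 + l2 + l3) / 3.0                                  # mean diffusivity
fa = np.sqrt(1.5 * ((l1 - md) ** 2 + (l2 - md) ** 2 + (l3 - md) ** 2)
             / (l1 ** 2 + l2 ** 2 + l3 ** 2))              # fractional anisotropy
fibre_direction = eigvecs[:, np.argmax(eigvals)]           # principal eigenvector

print(f"FA = {fa:.2f}, dominant fibre direction = {fibre_direction.round(2)}")
```

Voxel-wise maps of such quantities are the kind of measure that would be compared between healthy and diseased carotid arteries.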
Max ERC Funding
1 521 875 €
Duration
Start date: 2015-09-01, End date: 2020-08-31
Project acronym FireSpec
Project Integrated spectroscopic sensors for the risk assessment of fires
Researcher (PI) Gunther Roelkens
Host Institution (HI) UNIVERSITEIT GENT
Call Details Proof of Concept (PoC), PC1, ERC-2015-PoC
Summary Most fire deaths are not caused by burns, but by smoke inhalation. Alongside heat and lack of oxygen, toxic gases in smoke are the major lethal factor in uncontrolled fires. Drones are increasingly considered the ultimate assistants during firefighting operations. Drones or Unmanned Aerial Vehicles (UAVs) provide critical top-down perspectives on dangerous areas. This is helpful in dealing with most fires, and especially important and effective when dealing with fires at high-risk sites. There is an urgent need for small, low-weight, low-power, real-time gas sensors with high specificity that can be integrated in UAVs. The availability of such sensors could save lives, as UAVs might detect and map toxic gas fields at fire scenes and evaluate how they spread. Besides integration in UAVs, such sensors can be integrated in firefighters’ clothing or installed in fire-prone places. In FireSpec, we propose the concept of an integrated wavelength modulation spectroscopic sensor for real-time hazardous gas detection based on a silicon photonics integrated chip comprising a series of InP lasers, a gas probe and an InP detector. The wavelengths of the lasers are matched with the mid-infrared (MIR) absorption lines of the gases to be detected. The concept builds directly upon knowledge and technology developed in the ERC project Miracle (Mid-InfraRed Active photonic integrated Circuits for Life sciences and Environment). In Miracle, we are investigating and developing the field of photonic integrated circuits for the MIR wavelength band based on high-index-contrast waveguide structures. To extend the functionality of the photonic integrated circuit, we pursue the heterogeneous integration of other materials, such as III-V dies, on the high-index-contrast waveguide system for particular optical functions such as lasers and detectors. FireSpec technology is expected to be a game changer in mobile real-time toxic gas detection applications.
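For orientation only, the sketch below shows the textbook Beer-Lambert relation that absorption-based gas sensing in general relies on to turn a dip in transmitted laser light at an absorption line into a gas concentration; the cross-section, path length and intensities are invented example values, not FireSpec specifications.

```python
# For orientation only: the Beer-Lambert law that underlies absorption-based gas
# sensing. Cross-section, path length and intensities are invented example values,
# not FireSpec specifications.
import math

def number_density(i_measured, i_reference, cross_section_cm2, path_length_cm):
    """Molecules per cm^3 inferred from the dip in transmitted light at one MIR line."""
    absorbance = math.log(i_reference / i_measured)        # -ln(I / I0)
    return absorbance / (cross_section_cm2 * path_length_cm)

n = number_density(i_measured=0.98, i_reference=1.00,
                   cross_section_cm2=1.0e-19, path_length_cm=10.0)
print(f"estimated number density: {n:.2e} molecules/cm^3")
```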
Max ERC Funding
149 685 €
Duration
Start date: 2016-03-01, End date: 2017-08-31
Project acronym FitteR-CATABOLIC
Project Survival of the Fittest: On how to enhance recovery from critical illness through learning from evolutionary conserved catabolic pathways
Researcher (PI) Greta Herman VAN DEN BERGHE
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Advanced Grant (AdG), LS7, ERC-2017-ADG
Summary For several decades, human patients who suffer from severe illnesses or multiple trauma, conditions that were previously lethal, have been treated in intensive care units (ICUs). Modern intensive care medicine bridges patients from life-threatening conditions to recovery with the use of mechanical devices, vasoactive drugs and powerful anti-microbial agents. By postponing death, a new unnatural condition, intensive-care-dependent prolonged (>1 week) critical illness, has been created. About 25% of ICU patients today require prolonged intensive care, sometimes for weeks or months, and these patients are at high risk of death while consuming 75% of resources. Although the primary insult has been adequately dealt with, many long-stay patients typically suffer from hypercatabolism, ICU-acquired brain dysfunction and polyneuropathy/myopathy leading to severe muscle weakness, further increasing the risk of late death. As hypercatabolism was considered the culprit, several anabolic interventions were tested, but these showed harm instead of benefit. We previously showed that fasting early during illness is superior to forceful feeding, pointing to certain benefits of catabolic responses. In healthy humans, fasting activates catabolism to provide substrates essential to protect and maintain brain and muscle function. This proposal aims to investigate whether evolutionarily conserved catabolic fasting pathways, specifically lipolysis and ketogenesis, can be exploited in the search for prevention of brain dysfunction and muscle weakness in long-stay ICU patients, with the goal to identify a new metabolic intervention to enhance their recovery. The project builds further on our experience with bi-directional translational research - using human material whenever possible and a validated mouse model of sepsis-induced critical illness for objectives that cannot be addressed in patients - and aims to close the loop, from a novel concept to a large randomized controlled trial in patients.
Max ERC Funding
2 500 000 €
Duration
Start date: 2018-10-01, End date: 2023-09-30
Project acronym FLICs
Project Enabling flexible integrated circuits and applications
Researcher (PI) Kris Jef Ria Myny
Host Institution (HI) INTERUNIVERSITAIR MICRO-ELECTRONICA CENTRUM
Call Details Starting Grant (StG), PE7, ERC-2016-STG
Summary Thin-film transistor technologies are present in many products today that require an active transistor backplane, e.g. flat-panel displays and flat-panel photodetector arrays. Unipolar n-type transistors based on amorphous Indium-Gallium-Zinc-Oxide (a-IGZO) as the semiconductor are currently the most promising technology for next-generation products demanding a high-performance, low-power transistor, manufacturable on flexible substrates enabling curved, bendable and even rollable displays. a-IGZO is a wide-bandgap material characterized by extremely low off-state leakage currents and an electron mobility of ~20 cm2/Vs. IGZO transistors fabricated on flexible substrates will also find their use in applications that require flexible integrated circuits.
The goal of this FLICs proposal is to develop disruptive technology and ground-breaking design innovations with amorphous oxide TFTs on plastic substrates, targeting large-scale or very-large-scale flexible integrated circuits with unprecedented characteristics in terms of power consumption, supply voltage and operating speed, for applications in IoT and wearable healthcare sensor patches.
We introduce a new logic style, “quasi-CMOS”, which is based on unipolar, oxide dual-gate thin-film transistors. This logic style will drastically decrease the power consumption of unipolar logic gates in a novel way by taking advantage of dynamic backgate driving and of the transistor’s uniquely low off-state leakage current, without compromising on switching speed. In addition, we introduce downscaling of the transistor’s dimensions, while remaining compatible with upscaling to large-area manufacturing platforms. Finally, we will investigate novel ultralow-power design techniques at the system level, while exploiting the quasi-CMOS logic gates.
We will demonstrate the power of this innovation with circuits for item-level Internet-of-Things, UHF RFID, and wearable health sensor patches.
Max ERC Funding
1 499 155 €
Duration
Start date: 2017-01-01, End date: 2021-12-31
Project acronym FLUOROCODE
Project FLUOROCODE: a super-resolution optical map of DNA
Researcher (PI) Johan M. V. Hofkens
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Advanced Grant (AdG), PE4, ERC-2011-ADG_20110209
Summary "There has been an immense investment of time, effort and resources in the development of the technologies that enable DNA sequencing in the past 10 years. Despite the significant advances made, all of the current genomic sequencing technologies suffer from two important shortcomings. Firstly, sample preparation is time-consuming and expensive, and requiring a full day for sample preparation for next-generation sequencing experiments. Secondly, sequence information is delivered in short fragments, which are then assembled into a complete genome. Assembly is time-consuming and often results in a highly fragmented genomic sequence and the loss of important information on large-scale structural variation within the genome.
We recently developed a super-resolution DNA mapping technology, which allows us to uniquely study genetic-scale features in genomic length DNA molecules. Labelling the DNA with fluorescent molecules at specific sequences and using high-resolution fluorescence microscopy enabled us to produce a map of a genomic DNA sequence with unparalleled resolution, the so called FLUOROCODE. In this project we aim to extend our methodology to map longer DNA molecules and to include a multi-colour version of the FLUOROCODE that will allow us to read genomic DNA molecules like a barcode and probe DNA methylation status. The sample preparation, DNA labelling and deposition for imaging will be integrated to allow rapid mapping of DNA molecules. At the same time nanopores will be explored as a route to high-throughput DNA mapping.
FLUOROCODE will develop technology that aims to complement the information derived from current DNA sequencing platforms. The technology developed by FLUOROCODE will enable DNA mapping at unprecedented speed and for a fraction of the cost of a typical DNA sequencing project. We aniticipate that our method will find applications in the rapid identification of pathogens and in producing genomic scaffolds to improve genome sequence assembly."
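A toy sketch of the barcode idea: marking every occurrence of a short recognition sequence along a molecule yields a pattern of label positions and spacings that identifies the underlying DNA. The motif and the randomly generated sequence below are purely illustrative, not the project's labelling chemistry.

```python
# Toy sketch of the barcode idea: every occurrence of a short recognition sequence
# becomes a label, and the spacings between labels form the optical 'barcode'.
# The motif and the random sequence are purely illustrative.
import random
import re

random.seed(1)
genome = "".join(random.choice("ACGT") for _ in range(50_000))   # synthetic DNA
motif = "GCTCTC"                                                  # made-up recognition site

label_positions = [m.start() for m in re.finditer(motif, genome)]
spacings = [b - a for a, b in zip(label_positions, label_positions[1:])]
print(f"{len(label_positions)} labels; first spacings (bases): {spacings[:5]}")
```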
Max ERC Funding
2 423 160 €
Duration
Start date: 2012-09-01, End date: 2017-08-31
Project acronym FMF-Dia
Project Immunological Diagnosis of Familial Mediterranean Fever
Researcher (PI) Mohamed LAMKANFI
Host Institution (HI) VIB
Call Details Proof of Concept (PoC), ERC-2016-PoC
Summary Familial Mediterranean fever (FMF) is the most common monogenic autoinflammatory disease worldwide. The disease is highly prevalent in countries of the Mediterranean basin, the Caucasus and the Middle East, with 1:1000 up to 1:500 inhabitants being affected in countries like Turkey and Armenia. FMF has further spread to other regions of the world with migrations of these populations in modern history and today. Colchicine therapy is the gold standard for FMF patients, but given the risks involved, correct FMF diagnosis is desired before onset of therapy. Diagnosis of FMF is currently based on clinical presentation, and is further supported by review of ethnic origin, family history and genetic analysis of disease-associated MEFV alleles. However, clinical and genetic diagnosis of FMF are complicated by significant overlap of the clinical picture with other autoinflammatory diseases, and over 280 FMF alleles of MEFV have been described. Common genetic tests focus on the most common mutations in exons 2 and 10 of MEFV, but mutations may also occur in other parts of the MEFV gene. Consequently, genetic analysis of FMF is sometimes inconclusive, and correct diagnosis of FMF may sometimes be delayed for years. Here, we describe an immunological assay that for the first time allows selective identification of FMF patients based on the differential inflammasome activation response of their blood monocytes. The availability of a single robust, affordable and convenient assay that immunologically stratifies FMF patients from other autoinflammatory disease patients will be instrumental for improving timely and correct identification of FMF patients for colchicine therapy, and (where desired) will enable more cost-effective selection of patients for genetic confirmation of MEFV mutations.
Max ERC Funding
148 413 €
Duration
Start date: 2016-10-01, End date: 2018-03-31
Project acronym FOODCULT
Project Food, Culture and Identity in Ireland, 1550-1650
Researcher (PI) Susan O'Connor Flavin
Host Institution (HI) THE PROVOST, FELLOWS, FOUNDATION SCHOLARS & THE OTHER MEMBERS OF BOARD OF THE COLLEGE OF THE HOLY & UNDIVIDED TRINITY OF QUEEN ELIZABETH NEAR DUBLIN
Call Details Starting Grant (StG), SH6, ERC-2018-STG
Summary FOODCULT is the first project to establish both the fundamentals of everyday diet, and the cultural ‘meaning’ of food and drink, in early modern Ireland. Exploring the period 1550-1650, one of major economic development, unprecedented intercultural contact, but also of conquest, colonisation and war, it focusses on Ireland as a case-study for understanding the role of food in the demonstration and negotiation of authority and power, and as a site for the development of emergent ‘national’ food cultures. Moving well beyond the colonial narrative of Irish social and economic development, however, it enlarges the study of food to examine neglected themes in Irish historiography, including gender, class, kinship and religious identities, as expressed through the consumption and exchange of food and drink.
Taking advantage of recent archaeological discoveries and the unprecedented accessibility of the archaeological evidence, the project develops a ground-breaking interdisciplinary approach, merging micro-historical analytical techniques with science and experimental archaeology, to examine what was eaten, where, why and by whom, at a level of detail deemed impossible for Irish history. Such questions will be explored in a comparative British Isles context, situating Irish developments within a broader analytical framework, whilst moving English food historiography beyond its current insular focus.
As a corollary, the project will produce the first major database of diet-related archaeological evidence for this period, Mapping Diet: Comparative Food-Ways in Early Modern Ireland, while making accessible the only existing household accounts, a hugely significant, and previously overlooked, quantitative and qualitative source for dietary trends. These resources will shed light, not just on consumption patterns, but on Ireland’s broader economic and social development, whilst significantly furthering research agendas in early modern historical and archaeological scholarship.
Max ERC Funding
1 433 133 €
Duration
Start date: 2019-02-01, End date: 2024-01-31
Project acronym FOREFRONT
Project Frontiers of Extended Formulations
Researcher (PI) Samuel Fiorini
Host Institution (HI) UNIVERSITE LIBRE DE BRUXELLES
Call Details Consolidator Grant (CoG), PE6, ERC-2013-CoG
Summary "Linear programming has proved to be an invaluable tool both in theory and practice. Semidefinite programming surpasses linear programming in terms of expressivity while remaining tractable. This project proposal investigates the modeling power of linear and semidefinite programming, in the context of combinatorial optimization. Within the emerging framework of extended formulations (EFs), I seek a decisive answer to the following question: Which problems can be modeled by a linear or semidefinite program, when the number of constraints and variables are limited? EFs are based on the idea that one should choose the ""right"" variables to model a problem. By extending the set of variables of a problem by a few carefully chosen variables, the number of constraints can in some cases dramatically decrease, making the problem easier to solve. Despite previous high-quality research, the theory of EFs is still on square one. This project proposal aims at (i) transforming our current zero-dimensional state of knowledge to a truly three-dimensional state of knowledge by pushing the boundaries of EFs in three directions (models, types and problems); (ii) using EFs as a lens on complexity by proving strong consequences of important conjectures such as P != NP, and leveraging strong connections to geometry to make progress on the log-rank conjecture. The proposed methodology is: (i) experiment-aided; (ii) interdisciplinary; (iii) constructive."
Max ERC Funding
1 455 479 €
Duration
Start date: 2014-09-01, End date: 2019-08-31
Project acronym FORMICA
Project Microclimatic buffering of plant responses to macroclimate warming in temperate forests
Researcher (PI) Pieter DE FRENNE
Host Institution (HI) UNIVERSITEIT GENT
Call Details Starting Grant (StG), LS9, ERC-2017-STG
Summary Recent global warming is acting across ecosystems and threatening biodiversity. Yet, due to slow responses, many biological communities are lagging behind warming of the macroclimate (the climate of a large geographic region). The buffering of microclimates near the ground, measured in localized areas and arising from terrain features such as vegetation and topography, can explain why many species are lagging behind macroclimate warming. However, almost all studies ignore the effects of microclimatic buffering, and key uncertainties still exist about this mechanism. Microclimates are particularly evident in forests, where understorey habitats are buffered by overstorey trees. In temperate forests, the understorey contains the vast majority of plant diversity and plays an essential role in driving ecosystem processes.
The overall goal of FORMICA (FORest MICroclimate Assessment) is to quantify and understand the role of microclimatic buffering in modulating forest understorey plant responses to macroclimate warming. We will perform the best assessment to date of the effects of microclimates on plants by applying microtemperature loggers, experimental heating, fluorescent tubes and a large-scale transplant experiment in temperate forests across Europe. For the first time, plant data from the individual to ecosystem level will be related to microclimate along wide temperature gradients and forest management regimes. The empirical results will then be integrated in cutting-edge demographic distribution models to forecast plant diversity in temperate forests as macroclimate warms.
FORMICA will provide the first integrative study on microclimatic buffering of macroclimate warming in forests. Interdisciplinary concepts and methods will be applied, including from climatology, forestry and ecology. FORMICA will reshape our current understanding of the impacts of climate change on forests and help land managers and policy makers to develop urgently needed adaptation strategies.
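As a minimal illustration of how microclimatic buffering can be quantified (a hypothetical Python sketch, not the project's measurement protocol), a below-canopy logger series can be compared with a paired free-air macroclimate series: the mean offset summarises average cooling or warming under the canopy, and a regression slope below 1 indicates damping of temperature extremes.

import numpy as np

# Hypothetical sketch of quantifying microclimatic buffering: compare a
# below-canopy (understorey) logger series with a paired free-air macroclimate
# series over one year of daily mean temperatures.
rng = np.random.default_rng(7)
days = np.arange(365)

macro = 10 + 12 * np.sin(2 * np.pi * days / 365) + rng.normal(0, 2, 365)  # free air
micro = 3 + 0.7 * macro + rng.normal(0, 0.5, 365)                         # under canopy

mean_offset = (micro - macro).mean()    # average understorey cooling/warming
slope = np.polyfit(macro, micro, 1)[0]  # slope < 1 indicates damped extremes
print(f"mean offset: {mean_offset:.2f} degrees C, buffering slope: {slope:.2f}")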
Max ERC Funding
1 498 469 €
Duration
Start date: 2018-02-01, End date: 2023-01-31
Project acronym FORSIED
Project Formalizing Subjective Interestingness in Exploratory Data Mining
Researcher (PI) Tijl De Bie
Host Institution (HI) UNIVERSITEIT GENT
Call Details Consolidator Grant (CoG), PE6, ERC-2013-CoG
Summary "The rate at which research labs, enterprises and governments accumulate data is high and fast increasing. Often, these data are collected for no specific purpose, or they turn out to be useful for unanticipated purposes: Companies constantly look for new ways to monetize their customer databases; Governments mine various databases to detect tax fraud; Security agencies mine and cross-associate numerous heterogeneous information streams from publicly accessible and classified databases to understand and detect security threats. The objective in such Exploratory Data Mining (EDM) tasks is typically ill-defined, i.e. it is unclear how to formalize how interesting a pattern extracted from the data is. As a result, EDM is often a slow process of trial and error.
During this fellowship we aim to develop the mathematical principles of what makes a pattern interesting in a very subjective sense. Crucial in this endeavour will be research into automatic mechanisms to model and duly consider the prior beliefs and expectations of the user for whom the EDM patterns are intended, thus relieving the users of the complex task to attempt to formalize themselves what makes a pattern interesting to them.
This project will represent a radical change in how EDM research is done. Currently, researchers typically imagine a specific purpose for the patterns, try to formalize interestingness of such patterns given that purpose, and design an algorithm to mine them. However, given the variety of users, this strategy has led to a multitude of algorithms. As a result, users need to be data mining experts to understand which algorithm applies to their situation. To resolve this, we will develop a theoretically solid framework for the design of EDM systems that model the user's beliefs and expectations as much as the data itself, so as to maximize the amount of useful information transmitted to the user. This will ultimately bring the power of EDM within reach of the non-expert."
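To make the idea of subjective interestingness concrete, the following is a deliberately simplified, hypothetical Python sketch, not the project's actual formalism: a pattern is scored by how surprising it is under a background model encoding the user's prior expectations, relative to the cost of describing the pattern to the user.

import numpy as np

# Hypothetical toy sketch: score a binary "tile" (a block of rows x columns that
# are all 1 in the data) by its surprise under a background model that encodes
# the user's prior expectations as independent Bernoulli cell probabilities.

def background_model(expected_row_density, expected_col_density):
    # Cell probabilities under an independence assumption: the user expects
    # only certain row and column densities, nothing more (a crude stand-in
    # for a maximum-entropy background distribution).
    r = np.asarray(expected_row_density)[:, None]
    c = np.asarray(expected_col_density)[None, :]
    return r * c  # P(cell == 1)

def subjective_interestingness(prob, rows, cols):
    # Self-information of observing an all-ones tile, per unit of description.
    p = prob[np.ix_(rows, cols)]
    information = -np.log2(p).sum()              # bits of surprise
    description_length = len(rows) + len(cols)   # cost of communicating the tile
    return information / description_length

prob = background_model([0.3, 0.3, 0.8], [0.2, 0.9, 0.5])
print(subjective_interestingness(prob, rows=[0, 1], cols=[0, 2]))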
Max ERC Funding
1 549 315 €
Duration
Start date: 2014-05-01, End date: 2019-04-30
Project acronym FOUNDCOG
Project Curiosity and the Development of the Hidden Foundations of Cognition
Researcher (PI) Rhodri CUSACK
Host Institution (HI) THE PROVOST, FELLOWS, FOUNDATION SCHOLARS & THE OTHER MEMBERS OF BOARD OF THE COLLEGE OF THE HOLY & UNDIVIDED TRINITY OF QUEEN ELIZABETH NEAR DUBLIN
Call Details Advanced Grant (AdG), SH4, ERC-2017-ADG
Summary How do human infants develop complex cognition? We propose that artificial intelligence (AI) provides crucial insight into human curiosity-driven learning and the development of infant cognition. Deep learning—a technology that has revolutionised AI—involves the acquisition of informative internal representations through pre-training, as a critical precursory step to learning any specific task. We propose that, similarly, curiosity guides human infants to develop ‘hidden’ mature mental representations through pre-training well before the manifestation of behaviour. To test this proposal, for the first time we will use neuroimaging to measure the hidden changes in representations during infancy and compare these to predictions from deep learning in machines. Research Question 1 will ask how infants guide pre-training through directed curiosity, by testing quantitative models of curiosity adapted from developmental robotics. We will also test the hypothesis from pilot data that the fronto-parietal brain network guides curiosity from the start. Research Question 2 will further test the parallel with deep learning by characterising the developing infant’s mental representations within the visual system using the powerful neuroimaging technique of representational similarity analysis. Research Question 3 will investigate how individual differences in curiosity affect later cognitive performance, and test the prediction from deep learning that the effects of early experience during pre-training grow rather than shrink with subsequent experience. Finally, Research Question 4 will test the novel prediction from deep learning that, following perinatal brain injury, pre-training creates resilience provided that curiosity is intact. The investigations will answer the overarching question of how pre-training learning lays the foundations for cognition and pioneer the new field of Computational Developmental Cognitive Neuroscience.
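Representational similarity analysis, the neuroimaging technique named above, compares how two systems organise the same set of stimuli without requiring any mapping between measurement channels. The following minimal Python sketch uses hypothetical random data and is not the project's analysis pipeline:

import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

# Minimal representational similarity analysis (RSA) sketch on hypothetical
# random data: compare how two systems (e.g. infant visual cortex and a deep
# network layer) organise the same stimuli, without mapping channels to units.
rng = np.random.default_rng(0)
n_stimuli = 20
brain_responses = rng.normal(size=(n_stimuli, 100))      # stimuli x voxels (hypothetical)
network_activations = rng.normal(size=(n_stimuli, 512))  # stimuli x units (hypothetical)

# Representational dissimilarity matrices: pairwise distances between stimuli.
rdm_brain = pdist(brain_responses, metric="correlation")
rdm_network = pdist(network_activations, metric="correlation")

# RSA score: rank correlation between the two dissimilarity structures.
rho, _ = spearmanr(rdm_brain, rdm_network)
print(f"RSA similarity (Spearman rho): {rho:.3f}")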
Max ERC Funding
2 500 000 €
Duration
Start date: 2019-01-01, End date: 2023-12-31
Project acronym FUTURE-PRINT
Project Tuneable 2D Nanosheet Networks for Printed Electronics
Researcher (PI) Jonathan Nesbitt COLEMAN
Host Institution (HI) THE PROVOST, FELLOWS, FOUNDATION SCHOLARS & THE OTHER MEMBERS OF BOARD OF THE COLLEGE OF THE HOLY & UNDIVIDED TRINITY OF QUEEN ELIZABETH NEAR DUBLIN
Call Details Advanced Grant (AdG), PE5, ERC-2015-AdG
Summary In the future, even the most mundane objects will contain electronic circuitry allowing them to gather, process, display and transmit information. The resulting vast network, often called the Internet of Things, will revolutionise society. To realise this will require the ability to produce electronic circuitry extremely cheaply, often on unconventional substrates. This will be achieved through printed electronics, by the assembly of devices from solution (i.e. ink) using methods adapted from printing technology. However, while printed electronics has been advancing rapidly, the development of new, nano-materials-based inks is required for this area to meet its true potential.
We believe recent developments in liquid exfoliation of 2D nanosheets have given us the ideal family of materials to revolutionise electronic ink production. Liquid exfoliation can transform layered crystals into suspensions of nanosheets in very large quantities. In this way we can produce liquid-dispersed nanosheets of a wide range of types including conducting (e.g. graphene, MXenes, TiB2 etc), semiconducting (e.g. MoS2, WSe2, GaS, black phosphorus etc), insulating (e.g. BN, talc) or electrochemically active (e.g. MoO3, Ni(OH)2, MnO2 etc). These nanosheets can be deposited from liquid to form porous networks of defined electronic type. While these networks have huge applications potential, a large amount of work must be done to translate them into working printed devices.
In this project, we will develop methods to transform large volume suspensions of exfoliated nanosheets into bespoke 2D inks with properties engineered for a range of specific printed device applications. We will learn to use this 2D ink to print patterned or large area 2D nanosheet networks with controlled structure, allowing us to tune the electrical properties of the network during printing. We will combine networks of different nanosheet types into complex heterostructures. This will allow us to print all device components (electrodes, active layers, dielectrics, energy storage layers) from one contiguous, multi-component network. In this way we will produce 2D network transistors, solar cells, displays and energy storage systems. FUTURE-PRINT will revolutionise electronic inks and will offer a new path forward for printed electronics.
Max ERC Funding
2 213 317 €
Duration
Start date: 2016-11-01, End date: 2021-10-31
Project acronym GENCOMMONS
Project Institutionalizing global genetic-resource commons. Global Strategies for accessing and using essential public knowledge assets in the life sciences
Researcher (PI) Tom Dedeurwaerdere
Host Institution (HI) UNIVERSITE CATHOLIQUE DE LOUVAIN
Call Details Starting Grant (StG), SH2, ERC-2011-StG_20101124
Summary There has been a dramatic increase in interest in commons in the last 10 to 15 years, from traditional commons managing the use of exhaustible natural resources by fixed numbers of people within natural borders, to global information commons, dealing with non-rival, non-excludable goods by a potentially limitless number of unknown users. The emerging global genetic-resource commons fits somewhere in between, shifting in the direction of information commons as digital-information infrastructures allow physically distributed collections to be networked in virtual global pools. In this research project we propose that networking pools of genetic resources in a global commons is potentially a workable alternative to proprietary market-based solutions, which have been shown to be unable to generate sufficient investment in the vast quantities of genetic resources that are neglected because of their unknown and/or unlikely commercial value. These neglected resources are the building blocks for future scientific research and have enormous value for sustaining biodiversity and livelihoods in developing and industrialized countries. Our hypothesis is that implementing collective intellectual property strategies and standard material transfer agreements for access to these pre-competitive research materials has become feasible in a cost-effective manner through new hybrid approaches to governance which combine design features from natural resource commons and digital information commons. To substantiate these proposals, this research project will conduct a comparative institutional analysis of the use and exchange practices in the genetic-resource commons, and propose a set of governance arrangements that would put these practices on a sound legal and institutional basis.
Max ERC Funding
1 041 600 €
Duration
Start date: 2011-10-01, End date: 2016-09-30
Project acronym GENDERBALL
Project Implications of the Shifting Gender Balance in Education for Reproductive Behaviour in Europe
Researcher (PI) Jan Van Bavel
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Starting Grant (StG), SH3, ERC-2012-StG_20111124
Summary This project is the first comprehensive study of the demographic consequences of a major recent development in Europe: while men have always received more education than women in the past, this gender balance in education has now turned around. For the first time in history, there are more highly educated women than men reaching the reproductive ages and looking for a partner. I expect that this will have profound consequences for the demography of reproduction because mating practices have always presupposed that men are the majority in higher education. These traditional practices are no longer compatible with the new gender distribution in education. The objective of my project is to study in depth the consequences of this historically new situation for reproductive behaviour. The first step of the project is to reconstruct country-specific time series charting the shifting gender balance in education across time and space at different ages. These can then be used as contextual information in subsequent multilevel analyses of reproductive behaviour. In the second part, I will investigate how the reversal of the gender balance is influencing patterns of assortative mating by level of education. Third, I will study how the shifting gender balance is connected to the timing and probability of marriage versus unmarried cohabitation and to the timing and quantum of fertility. Finally, I will investigate the consequences for divorce and separation. Existing data sources will be used that cover a wide range of European countries. This project will not only be ground breaking by setting the research agenda for a new era in the European reproductive landscape. It will also introduce methodological innovations. First, agent-based modelling will be used as a method to study assortative mating. Second, I propose a new way to study the causal effect of the gender balance in education. These methodological innovations will prove useful for many other social science projects.
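To illustrate the kind of agent-based modelling envisaged for assortative mating, the following is a deliberately simplified, hypothetical Python sketch rather than the project's model: a marriage market in which highly educated women outnumber highly educated men, with agents preferring educationally similar partners, mechanically raises the share of couples in which the woman is more educated than the man.

import random

# Deliberately simplified, hypothetical marriage-market sketch. Education
# levels: 0 = low, 1 = medium, 2 = high. Highly educated women outnumber
# highly educated men, and each woman picks the most similar partner among a
# few randomly encountered men.
random.seed(42)

women = [2] * 60 + [1] * 25 + [0] * 15
men = [2] * 40 + [1] * 35 + [0] * 25
random.shuffle(women)
random.shuffle(men)

couples = []
for w in women:
    if not men:
        break
    candidates = random.sample(men, k=min(5, len(men)))
    chosen = min(candidates, key=lambda m: abs(m - w))  # closest education level
    men.remove(chosen)
    couples.append((w, chosen))

homogamy = sum(m == w for w, m in couples)   # equal education
hypergamy = sum(m > w for w, m in couples)   # she marries "up"
hypogamy = sum(m < w for w, m in couples)    # she marries "down"
print(f"homogamy={homogamy}, hypergamy={hypergamy}, hypogamy={hypogamy}")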
Max ERC Funding
1 489 000 €
Duration
Start date: 2013-01-01, End date: 2017-12-31
Project acronym GENDEVOCORTEX
Project Genetic links between development and evolution of the human cerebral cortex
Researcher (PI) Pierre Vanderhaeghen
Host Institution (HI) UNIVERSITE LIBRE DE BRUXELLES
Call Details Advanced Grant (AdG), LS5, ERC-2013-ADG
Summary "The mechanisms underlying the evolution of the human brain constitute one of the most fascinating unresolved questions of biology. The cerebral cortex has evolved rapidly in size and complexity in the hominid lineage, which is likely linked to quantitative and qualitative divergence in patterns of cortical development.
On the other hand, comparative genomics has revealed recently the existence of a number of ""hominid-specific"" genes, which constitute attractive candidates to underlie critical aspects of human brain evolution, but their function remains essentially unexplored, mostly because of the lack of appropriate experimental systems.
Here we propose to test a simple and radical hypothesis: that key species-specific features of the development of the human cerebral cortex, in particular the generation and differentiation of pyramidal neurons, are linked functionally to the emergence of hominid-specific (HS) genes controlling corticogenesis.
To achieve this high risk / high gain goal, we will first determine which HS genes are expressed in the human developing cortex in vivo, using a combination of genome-wide and in situ gene detection analyses, in order to select those most likely to impact corticogenesis.
The function of candidate HS genes will be determined using innovative models of human corticogenesis based on pluripotent stem cells, developed recently in our laboratory, as well as ex vivo cultures of human fetal cortex. In addition, the developmental and evolutionary impact of HS genes will be examined in a non-hominid context, the mouse embryonic cortex.
By identifying the function of hominid-specific genes in cortical developpment, we will uncover specific genetic mechanisms linking functionally the development and evolution of the human brain, with broad implications in neurobiology, developmental and evolutionary biology."
Max ERC Funding
2 473 937 €
Duration
Start date: 2014-08-01, End date: 2019-07-31
Project acronym GENOMIA
Project Genomic Modifiers of Inherited Aortapathy
Researcher (PI) Bart Leo LOEYS
Host Institution (HI) UNIVERSITEIT ANTWERPEN
Call Details Consolidator Grant (CoG), LS4, ERC-2017-COG
Summary Thoracic aortic aneurysm and dissection (TAAD) is an important cause of morbidity and mortality in the western world. As 20% of all affected individuals have a positive family history, the genetic contribution to the development of TAAD is significant. Over the last decade dozens of genes were identified underlying syndromic and non-syndromic forms of TAAD. Although mutations in these disease culprits do not yet explain all cases, their identification and functional characterization were essential in deciphering three key aortic aneurysm/dissection patho-mechanisms: disturbed extracellular matrix homeostasis, dysregulated TGFbeta signaling and altered aortic smooth muscle cell contractility. Owing to the recent advent of next-generation sequencing technologies, I anticipate that the identification of additional genetic TAAD causes will remain quite straightforward in the coming years. Importantly, in many syndromic and non-syndromic families, significant non-penetrance and both inter- and intra-familial clinical variation are observed. So, although the primary underlying genetic mutation is identical in all these family members, the clinical spectrum varies widely from completely asymptomatic to sudden death due to aortic dissection at young age. The precise mechanisms underlying this variability remain largely elusive. Consequently, a better understanding of the functional effects of the primary mutation is highly needed and the identification of genetic variation that modifies these effects is becoming increasingly important. In this project, I carefully selected four different innovative strategies to discover mother nature’s own modifying capabilities in human and mouse aortopathy. The identification of these genetic modifiers will advance knowledge significantly beyond the current understanding, individualize current treatment protocols to deliver true precision medicine and offer promising new leads to novel therapeutic strategies.
Max ERC Funding
1 987 860 €
Duration
Start date: 2019-01-01, End date: 2023-12-31
Project acronym GEOFIN
Project Western banks in Eastern Europe: New geographies of financialisation
Researcher (PI) Martin Sokol
Host Institution (HI) THE PROVOST, FELLOWS, FOUNDATION SCHOLARS & THE OTHER MEMBERS OF BOARD OF THE COLLEGE OF THE HOLY & UNDIVIDED TRINITY OF QUEEN ELIZABETH NEAR DUBLIN
Call Details Consolidator Grant (CoG), SH3, ERC-2015-CoG
Summary Financialisation, or the growing power of finance over societies and economies, is increasingly recognised as the key feature of contemporary capitalism. However, significant gaps in our understanding of this process remain. Indeed, despite growing recognition that financialisation is an inherently spatial process, a geographically-informed view of financialisation remains underdeveloped. In addition, and related to this, the extent and the ways in which post-socialist ‘transition’ societies in East-Central Europe have been financialised remain under-researched and under-theorised. Yet, the examination of former state-socialist societies (built on the very opposite economic logic to that of financialisation) provides an unmatched opportunity to learn about financialisation itself, how it ‘penetrates’ societies and with what social and spatial implications. East-Central Europe in this sense constitutes a unique terrain for frontier research. GEOFIN will address the above shortcomings by producing empirical and theoretical insights to develop a geographically-informed view of financialisation. The objective is to examine how states, banks and households in post-socialist contexts have been financialised and to consider what implications this has for the societies in question and for Europe as a whole. The project will pilot a novel approach based on the concept of ‘financial chains’ which are understood both as channels of value transfer and as social relations that shape socio-economic processes and attendant economic geographies. A set of interlocking case studies will be mobilised to reveal the different ways in which banks, states and households across post-socialist East-Central Europe are interconnected by financial chains with each other and with a wider political economy. GEOFIN will fundamentally advance our understanding of new geographies of financialisation, opening up new horizons in studies of finance and its future role in society.
Max ERC Funding
1 806 536 €
Duration
Start date: 2016-12-01, End date: 2021-11-30
Project acronym GlycoTarget
Project Exploring the targeted delivery of biopharmaceuticals enabled by glycosylation control
Researcher (PI) Nico Luc Marc Callewaert
Host Institution (HI) VIB
Call Details Consolidator Grant (CoG), LS7, ERC-2013-CoG
Summary Most biotechnological therapeutics used in the clinic today and under current development are proteins. Eukaryotic expression systems (such as yeasts and mammalian cells) for these therapeutic proteins add carbohydrate moieties (glycans) to the proteins, and these glycans strongly modulate the protein's in vivo biodistribution and therapeutic efficacy. Until recently, no adequate tools were available to accurately control glycosylation structure in these expression systems, but bio-engineering research in our lab and elsewhere has now largely overcome this problem.
In the GlycoTarget ERC Consolidator grant project, we aim at exploring the relation between the structure of the glycans on therapeutic proteins and the in vivo targeting properties of these modified proteins to different tissues/cells/subcellular organelles.
As highly medically relevant test cases for this exploration, we have selected three diseases with strong unmet therapeutic need, that could potentially be treated with glyco-targeted biopharmaceuticals through three different routes of protein delivery: progressive liver disease (intravenous), allergic asthma (subcutaneous immunization) and active tuberculosis (intrapulmonary delivery).
Max ERC Funding
1 994 760 €
Duration
Start date: 2014-03-01, End date: 2019-02-28
Project acronym GRAPH
Project The Great War and Modern Philosophy
Researcher (PI) Nicolas James Laurent Fernando De Warren
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Consolidator Grant (CoG), SH5, ERC-2013-CoG
Summary "The First World War was an unprecedented event of destruction, transformation, and renewal that left no aspect of European culture unchanged. Philosophy proved no exception: the war motivated an historically singular mobilization of philosophers to write about the war during the years of conflict; significant works of philosophy were written during the war years and immediately thereafter; the postwar decades of the 1920s and 1930s witnessed a systematic reconfiguration of the landscape of philosophical thought that still largely defines contemporary philosophy. Surprisingly, while the impact of the war on literature, poetry, and the arts, political thought has been a subject of intense inquiry and interpretation, the significance of the war for modern philosophy remains relatively unexamined, often misunderstood or simply taken for granted.
This project aims at understanding the impact of the Great War on modern philosophy. It aims to chart an original course and establish a new standard for the philosophical study of the relation between the First World War and 20th-century philosophy through a comparative and critical approach to a diverse array of thinkers. Specifically, this project will investigate the hypothesis of whether diverse philosophical responses, direct and indirect, immediately or postponed, can be understood as formulations of different questions posed, or better: catalyzed by the war itself. This project will additionally argue that the very idea that war could reveal, challenge or legitimate cultural or philosophical meaning is itself a legacy of a distinctive kind of war-philosophy produced during the war.
This project will be divided into four sub-projects: (1) ""Philosophy of War and the Wars of Philosophy,""; (2) ""The Philosophy of Language and the Languages of Philosophy""; (3) ""The Care of the Soul""; (4) ""Europe after Europe."""
Max ERC Funding
1 652 102 €
Duration
Start date: 2014-10-01, End date: 2019-09-30
Project acronym GSVRotor
Project Development of the Gas-Solid Vortex-Rotor Reactor
Researcher (PI) Guy Marin
Host Institution (HI) UNIVERSITEIT GENT
Call Details Proof of Concept (PoC), PC1, ERC-2014-PoC
Summary The project includes Proof-of-Concept research, as well as preparation and submission of a new patent application. The intention of the current proposal is to develop a novel gas-solid reactor that drastically reduces the limitations of existing gas-solid reactors. Moreover, novel fields of application will be sought. This new intellectual property of UGent will subsequently be the subject of licensing.
Max ERC Funding
149 900 €
Duration
Start date: 2015-06-01, End date: 2016-11-30
Project acronym HA-NFKB-VILI
Project Hypercapnic Acidosis and NF-kB in Ventilator Induced Lung Injury: Developing strategies to minimize lung injury and facilitate repair
Researcher (PI) John Laffey
Host Institution (HI) NATIONAL UNIVERSITY OF IRELAND GALWAY
Call Details Starting Grant (StG), LS6, ERC-2007-StG
Summary Acute Respiratory Distress Syndrome and Acute Lung Injury [ALI/ARDS] are devastating diseases, causing over 20,000 deaths annually in the US. Mechanical ventilation may worsen ALI/ARDS, a process termed Ventilator Induced Lung Injury [VILI]. Hypercapnic acidosis (HA) is a central component of lung ventilatory strategies to minimize VILI, and is a potent biologic agent, exerting a myriad of effects on diverse biologic pathways. Deliberately induced HA is protective in multiple lung injury models. However, HA may inhibit the host response to bacterial sepsis. Furthermore, HA may retard the repair process and slow recovery following ALI/ARDS. Hence, the diverse biologic actions of HA may result in net beneficial – or deleterious – effects depending on the specific context. An alternative approach is to manipulate a single key effector pathway, central to the protective effects of HA, which would also be effective in patients in whom hypercapnia is contra-indicated. Hypercapnia attenuates NF-kB activation, and may exert its effects – both beneficial and deleterious – via this mechanism. NF-kB is a pivotal regulator of the pro-inflammatory response, but is also a key epithelial cytoprotectant. Selective modulation of the NF-kB pathway, at the pulmonary epithelial surface, may accentuate the beneficial effects of HA on injury but minimize the potential for delayed tissue repair. We will investigate the contribution of NF-kB to the effects of HA, and characterize the direct effects of modulation of NF-kB, in both in vitro and preclinical models of lung injury and repair. We will utilize pulmonary gene therapy, which facilitates delivery of high quantities of the therapeutic agent directly to the injury site, to maximize the potential for therapeutic benefit. These studies will provide novel insights into: key pathways contributing to lung injury and to repair; the role of HA and NF-kB in these processes; and the potential of pulmonary gene therapy in ALI/ARDS.
Max ERC Funding
1 052 556 €
Duration
Start date: 2009-01-01, End date: 2013-12-31
Project acronym HANDLING
Project Writers handling pictures: a material intermediality (1880-today)
Researcher (PI) anne REVERSEAU
Host Institution (HI) UNIVERSITE CATHOLIQUE DE LOUVAIN
Call Details Starting Grant (StG), SH5, ERC-2018-STG
Summary Not only does the writer’s hand hold the pen, it manipulates pictures as well. Writers touch, hoard, cut, copy, pin and paste various kinds of pictures and these actions integrate literature in visual culture in many ways that have never been tackled as a whole before.
Some writers spent their life surrounded by pictures taken from magazines, creating an inspirational environment; yet others nurtured their imagination with touristic leaflets and visual advertisements; others created fictional characters based on collected portraits. What do writers do with pictures? How does literature stage the pictures handled? From very concrete and banal uses of pictures will emerge a new vision of literature as intermediality in action.
This investigation applies the tool set of visual anthropology and visual studies to writers for a deeper understanding of visual ecosystems. Covering a large period, from the beginning of mass reproduction in the 1880s to the digital practices of today, HANDLING focuses on the French and French-speaking field and stands as a laboratory to refashion a broader model for relationships between image and text. Its main challenge is to get to the root of contemporary iconographic practices.
HANDLING is unconventional because literary studies usually focus on the text: contrary to the norm, it sets the image at the very centre of the literary act. This approach might yield promising results for the visibility of literature in the future, especially in exhibitions. Making these practices visible will make literature itself more visible.
As an internationally recognized specialist of text-image relationships with an in-depth knowledge of French/Belgian literature and photography, I will build a team and lead this 5-year ambitious project. Grounded in interdisciplinarity, it will show the significant and unexpected role of literature in material visual culture.
Max ERC Funding
1 500 000 €
Duration
Start date: 2019-07-01, End date: 2024-06-30
Project acronym HDEM
Project High Definition Electron Microscopy: Greater clarity via multidimensionality
Researcher (PI) Timothy PENNYCOOK
Host Institution (HI) UNIVERSITEIT ANTWERPEN
Call Details Starting Grant (StG), PE4, ERC-2018-STG
Summary Atomic resolution microscopy relies on beams of energetic electrons. These beams quickly destroy fragile materials, making imaging them a major challenge. I have recently developed a new approach that provides the greatest possible resolving power per electron. The method provides both double resolution and excellent noise rejection, via multidimensional data acquisition and analysis. Here I propose to couple the new method with breakthroughs in high-speed cameras to achieve unprecedented clarity at low doses, almost guaranteeing major advances for imaging beam-sensitive materials. Proof of principle will be achieved for biochemical imaging using the easy-to-handle, commercially available GroEL chaperone molecule. We will combine our enhanced imaging capabilities with the averaging methods recently recognized by the Nobel Prize in Chemistry for imaging biomolecules at ultra-low doses. After proving our low-dose capabilities we will apply them to imaging proteins of current interest at greater resolution. Similar techniques will be used for fragile materials science samples, for instance metal-organic framework, Li-ion battery, 2D, catalyst and perovskite solar cell materials. Furthermore, the same reconstruction algorithms can be applied to simultaneously acquired spectroscopic images, allowing us not only to locate all the atoms but also to identify them. The properties of all materials are determined by the arrangement and identity of their atoms, and therefore our work will impact all major areas of science, from biology to chemistry and physics.
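As a point of reference for the low-dose regime (a back-of-envelope relation assuming simple Poisson counting statistics, with N the number of electrons detected per pixel; the notation is introduced here for illustration only):
\[ \frac{\sigma_N}{N}=\frac{1}{\sqrt{N}}, \]
so every factor-of-four reduction in dose doubles the relative shot noise per pixel, which is why any gain in resolving power and noise rejection per electron translates directly into larger tolerable dose reductions for beam-sensitive samples.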
Max ERC Funding
1 500 000 €
Duration
Start date: 2019-03-01, End date: 2024-02-29
Project acronym HEARTMAPAS
Project Single Heart beat MApping of myocardial Performance, Activation, and Scar by ultrasound
Researcher (PI) Jan Robert Michel D'hooge
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Starting Grant (StG), LS7, ERC-2011-StG_20101109
Summary Heart failure affects about 2% of the European population, with an annual mortality rate of 10%. Cardiac Resynchronization Therapy (CRT) has been proven to reduce morbidity and mortality and has become a recommended treatment. Optimal lead placement for CRT requires not only exact anatomically mapped information on the mechanical activation sequence to be corrected but also maps of tissue viability and performance. Unfortunately, to date, no imaging technique allows building such maps non-invasively in a single examination in CRT patients.
In this project, a new ultrasound imaging approach is proposed that will not only allow mapping all of these characteristics in a single examination but that will actually do this in a single heart beat.
To this end, real-time segmentation of the left ventricle will be used to limit data acquisition to the spatial regions that contain myocardium. In this way unnecessary spatial sampling can be avoided, which, combined with new beamforming strategies, will allow imaging of the entire left ventricle at frame rates above 1000 Hz. This very high temporal resolution will be used to accurately measure the onset of local deformation of the left ventricle in order to construct a mechanical activation map. Subsequently, cardiac motion estimates will be used to track anatomical regions throughout the cardiac cycle in order to construct a temporally averaged backscatter intensity map. As scar tissue is known to be more reflective than normal myocardium, this should allow mapping of scar. Finally, the automatic segmentation process will allow local measurement of wall thickness and curvature, from which the mechanical load distribution within the ventricle can be derived. This, combined with estimates of regional myocardial deformation, will produce a map of myocardial performance.
The proposed system will thus provide important new diagnostic and therapeutic information and will therefore allow better CRT planning for the individual heart failure patient.
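A minimal sketch of how a mechanical activation map could be assembled from such high-frame-rate data, assuming per-segment strain curves have already been estimated; the segment names, threshold and sampling rate below are illustrative placeholders, not specifications from this proposal:

import numpy as np

def activation_map(strain_curves, fs=1000.0, threshold=-0.02):
    """Estimate the onset of local deformation per myocardial segment.

    strain_curves: dict mapping segment name -> 1D array of strain over one
    beat (negative = shortening), sampled at fs frames per second.
    Returns a dict mapping segment name -> activation time in milliseconds
    (first sample at which shortening exceeds the threshold).
    """
    onsets_ms = {}
    for segment, curve in strain_curves.items():
        crossed = np.asarray(curve) < threshold
        idx = int(np.argmax(crossed)) if crossed.any() else len(curve) - 1
        onsets_ms[segment] = 1000.0 * idx / fs
    return onsets_ms

# Synthetic example: the lateral segment starts shortening ~70 ms after the septum.
t = np.arange(0.0, 0.4, 1.0 / 1000.0)
curves = {"septal": -0.2 * np.clip(t - 0.05, 0.0, None),
          "lateral": -0.2 * np.clip(t - 0.12, 0.0, None)}
print(activation_map(curves))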
Max ERC Funding
1 602 401 €
Duration
Start date: 2012-02-01, End date: 2018-01-31
Project acronym HELIOS
Project Heavy Element Laser Ionization Spectroscopy
Researcher (PI) Pieter Van Duppen
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Advanced Grant (AdG), PE2, ERC-2011-ADG_20110209
Summary The aim of this proposal is to develop a novel laser-spectroscopy method and to study nuclear and atomic properties of the heaviest elements in order to address the following key questions:
- Is the existence of the heaviest isotopes determined by the interplay between single-particle and collective nucleon degrees of freedom in the atomic nucleus?
- How do relativistic effects and isotopic composition influence the valence atomic structure of the heaviest elements?
The new approach is based on in-gas jet, high-repetition, high-resolution laser resonance ionization spectroscopy of short-lived nuclear-reaction products stopped in a buffer gas cell. The final goal is to couple the new system to the strongest production facility under construction at the ESFRI-listed SPIRAL-2 facility at GANIL (France) and to study isotopes from actinium to nobelium and heavier elements.
An increase of the primary intensity, efficiency, selectivity and spectral resolution by one order of magnitude compared to present-day techniques is envisaged, which is essential to obtain the required data.
The challenges are:
- decoupling the high-intensity heavy ion production beam (> 10^14 particles per second) from the low-intensity reaction products (few atoms per second)
- cooling of the reaction products from MeV/u to meV/u within less than a hundred milliseconds (quantified below)
- separating the wanted isotopes from the unwanted ones, which outnumber them by orders of magnitude
- performing high-resolution laser spectroscopy on a minute amount of atoms in an efficient way.
Nuclear properties (charge radii, nuclear moments and spins) as well as atomic properties (transition energies and ionization potentials) will be deduced in regions of the nuclear chart where they are not known: the neutron-deficient isotopes of the actinide elements, up to nobelium (Z = 102) and beyond. The data will validate state-of-the-art calculations, identify critical weaknesses and guide further theoretical developments.
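To put the cooling challenge listed above in numbers (a back-of-envelope figure, not taken from the proposal): going from MeV/u to meV/u means removing nine orders of magnitude of kinetic energy per nucleon,
\[ \frac{E_{\mathrm{initial}}}{E_{\mathrm{final}}}\sim\frac{1\ \mathrm{MeV/u}}{1\ \mathrm{meV/u}}=10^{9}, \]
and the buffer-gas cell and gas jet must achieve this in well under a hundred milliseconds if the short-lived reaction products are to survive until laser ionization.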
Max ERC Funding
2 458 397 €
Duration
Start date: 2012-03-01, End date: 2017-02-28
Project acronym HEXTREME
Project Hexahedral mesh generation in real time
Researcher (PI) Jean-François REMACLE
Host Institution (HI) UNIVERSITE CATHOLIQUE DE LOUVAIN
Call Details Advanced Grant (AdG), PE8, ERC-2015-AdG
Summary Over one million finite element analyses are performed in engineering offices every day, and finite elements come with the price of mesh generation. This proposal aims at creating two breakthroughs in the art of mesh generation that will be directly beneficial to the finite element community at large. The first challenge of HEXTREME is to take advantage of the massively multi-threaded nature of modern computers and to parallelize all the aspects of the mesh generation process at a fine-grain level. Reducing the meshing time by more than one order of magnitude is an ambitious objective: if minutes can become seconds, then success in this research would radically change the way in which engineers deal with mesh generation. This project then proposes an innovative approach to overcoming the major difficulty associated with mesh generation: it aims at providing a fast and reliable solution to the problem of conforming hexahedral mesh generation. Quadrilateral meshes in 2D and hexahedral meshes in 3D are usually considered to be superior to triangular/tetrahedral meshes. Even though direct tetrahedral meshing techniques have reached a level of robustness that allows us to treat general 3D domains, there may never exist a direct algorithm for building unstructured hex-meshes in general 3D domains. In HEXTREME, an indirect approach is envisaged that relies on recent developments in various domains of applied mathematics and computer science such as graph theory, combinatorial optimization and computational geometry. The methodology proposed for hex meshing is finally extended to the difficult problem of boundary layer meshing. Mesh generation is one important step of the engineering analysis process. Yet a mesh is a tool, not an aim. A specific task of the project is dedicated to interaction with research partners who are committed to beta-testing the results of HEXTREME. All the results of HEXTREME will be provided as open source in Gmsh.
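Since the results are to be delivered in Gmsh, a minimal sketch of how a user might already request a recombined, all-hexahedra mesh through Gmsh's Python API illustrates the target workflow; the unit-box geometry, mesh size and thread count below are arbitrary placeholders, and the call sequence reflects the existing public API rather than the new algorithms this project will contribute:

import gmsh

gmsh.initialize()
gmsh.model.add("demo")
gmsh.model.occ.addBox(0, 0, 0, 1, 1, 1)                # stand-in domain: a unit cube
gmsh.model.occ.synchronize()

gmsh.option.setNumber("General.NumThreads", 8)         # let the mesher use several threads
gmsh.option.setNumber("Mesh.RecombineAll", 1)          # recombine simplices into quads/hexes
gmsh.option.setNumber("Mesh.SubdivisionAlgorithm", 2)  # 2 = subdivide into all hexahedra

gmsh.model.mesh.setSize(gmsh.model.getEntities(0), 0.1)
gmsh.model.mesh.generate(3)
gmsh.write("demo_hex.msh")
gmsh.finalize()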
Max ERC Funding
2 244 238 €
Duration
Start date: 2016-10-01, End date: 2021-09-30
Project acronym HHNCDMIR
Project Hochschild cohomology, non-commutative deformations and mirror symmetry
Researcher (PI) Wendy Lowen
Host Institution (HI) UNIVERSITEIT ANTWERPEN
Call Details Starting Grant (StG), PE1, ERC-2010-StG_20091028
Summary "Our research programme addresses several interesting current issues in non-commutative algebraic geometry, and important links with symplectic geometry and algebraic topology. Non-commutative algebraic geometry is concerned with the study of algebraic objects in geometric ways. One of the basic philosophies is that, in analogy with (derived) categories of (quasi-)coherent sheaves over schemes and (derived) module categories, non-commutative spaces can be represented by suitable abelian or triangulated categories. This point of view has proven extremely useful in non-commutative algebra, algebraic geometry and more recently in string theory thanks to the Homological Mirror Symmetry conjecture. One of our main aims is to set up a deformation framework for non-commutative spaces represented by ""enhanced"" triangulated categories, encompassing both the non-commutative schemes represented by derived abelian categories and the derived-affine spaces, represented by dg algebras. This framework should clarify and resolve some of the important problems known to exist in the deformation theory of derived-affine spaces. It should moreover be applicable to Fukaya-type categories, and yield a new way of proving and interpreting instances of ""deformed mirror symmetry"". This theory will be developed in interaction with concrete applications of the abelian deformation theory developed in our earlier work, and with the development of new decomposition and comparison techniques for Hochschild cohomology. By understanding the links between the different theories and fields of application, we aim to achieve an interdisciplinary understanding of non-commutative spaces using abelian and triangulated structures."
Max ERC Funding
703 080 €
Duration
Start date: 2010-10-01, End date: 2016-09-30
Project acronym Hidden Galleries
Project Creative Agency and Religious Minorities: ‘hidden galleries’ in the secret police archives in 20th Century Central and Eastern Europe
Researcher (PI) James Alexander Kapalo
Host Institution (HI) UNIVERSITY COLLEGE CORK - NATIONAL UNIVERSITY OF IRELAND, CORK
Call Details Starting Grant (StG), SH5, ERC-2015-STG
Summary This project concerns the creative agency of religious minorities in the transformation of Central and Eastern European societies in the 20th century. It constitutes the first comparative research on the secret police archives in the region from the perspective of the history and anthropology of religion and offers a radical perspectival shift on the value and uses of the secret police archives, away from questions of justice and truth to questions of creative agency and cultural patrimony. Interdisciplinary in nature, it combines archival, anthropological and cultural studies approaches to provide a re-examination and re-contextualization of the holdings of secret police archives in three states: Romania, Moldova and Hungary. The secret police archives, in addition to containing millions of files on individuals monitored by the state, also constitute a hidden repository of confiscated religious art and publications of religious minorities that were persecuted in the 20th century under fascism and communism. The investigation of these materials will be complemented by ethnographic research, and the impact of the research will be extended through a public exhibition of previously hidden materials. The project has three principal stages: 1) copy/retrieve and catalogue examples of this creative material from the archives; 2) engage in ethnographic research with the communities that produced this material in order to explore the meaning and power of these artistic creations at the time of their production and in the context of post-socialism; 3) curate and stage a touring exhibition that re-presents the narratives and experiences of religious groups through their own artistic creations in order to conduct research in real time on questions of religious pluralism and intolerance in contemporary society. Through these three steps, this project will shed fresh light on the role that minority religious groups played in challenging the hegemonic order and in extending pluralism.
Max ERC Funding
990 087 €
Duration
Start date: 2016-09-01, End date: 2020-08-31
Project acronym High-Spin-Grav
Project Higher Spin Gravity and Generalized Spacetime Geometry
Researcher (PI) Marc HENNEAUX
Host Institution (HI) UNIVERSITE LIBRE DE BRUXELLES
Call Details Advanced Grant (AdG), PE2, ERC-2015-AdG
Summary Extensions of Einstein’s gravity containing higher spin gauge fields (massless fields with spin greater than two) constitute a very active and challenging field of research, raising many fascinating issues and questions in different areas of physics. However, in spite of the impressive achievements already obtained, it is fair to say that higher spin gravity has not delivered its full potential yet and still faces a large number of challenges, both conceptual and technical. The objective of this proposal is to deepen our understanding of higher spin gravity, following five interconnected central themes that will constitute the backbone of the project: (i) how to construct an action principle; (ii) how to understand the generalized space-time geometry invariant under the higher-spin gauge symmetry – a key fundamental issue in the project; (iii) what is the precise asymptotic structure of the theory at infinity; (iv) what is the connection of the higher spin algebras with the hidden symmetries of gravitational theories; (v) what are the implications of hypersymmetry, which is the higher-spin version of supersymmetry. Holography in three and higher dimensions will constitute an essential tool.
One of the motivations of the project is the connection of higher spin gravity with tensionless string theory and consistent theories of quantum gravity.
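For readers unfamiliar with the objects involved, the free (linearized) description of a higher spin gauge field is standard and worth recalling, independently of the interacting theory studied here: a massless spin-s field is a totally symmetric tensor with the gauge symmetry
\[ \delta\varphi_{\mu_{1}\dots\mu_{s}}=\partial_{(\mu_{1}}\xi_{\mu_{2}\dots\mu_{s})}, \]
where the parameter ξ is a symmetric, traceless tensor of rank s−1; for s = 2 this reduces to the linearized diffeomorphisms of the graviton, so the generalized geometry sought in theme (ii) must accommodate these richer gauge symmetries.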
Max ERC Funding
1 841 868 €
Duration
Start date: 2016-10-01, End date: 2021-09-30
Project acronym HIGHWAVE
Project Breaking of highly energetic waves
Researcher (PI) Frederic DIAS
Host Institution (HI) UNIVERSITY COLLEGE DUBLIN, NATIONAL UNIVERSITY OF IRELAND, DUBLIN
Call Details Advanced Grant (AdG), PE8, ERC-2018-ADG
Summary HIGHWAVE is an interdisciplinary project at the frontiers of coastal/ocean engineering, earth system science, statistics and fluid mechanics that will explore fundamental open questions in wave breaking. Why do waves break, how do they dissipate energy and why is this important? A central element of the work builds on recent international developments in the field of wave breaking and wave run-up led by the PI that have provided the first universal criterion for predicting the onset of breaking of water waves in uniform water depths from deep to intermediate. This work has also shown that the run-up of nonlinear waves impinging on a vertical wall can exceed up to 12 times the far-field amplitude of the incoming waves. These results have now opened up the possibility for more accurate operational wave models. They have practical and economic benefits in determining structural loads on ships and coastal/offshore infrastructure, evaluating seabed response to extreme waves, and optimizing operational strategies for maritime and marine renewable energy enterprises. This is a tremendous advance comparable to the introduction of wave prediction during World War II, and the PI aims to be at the forefront of this research effort to take research in wave breaking into fundamentally new directions. The objectives of the project are: (i) to develop an innovative approach to include accurate wave breaking physics into coupled sea state and ocean weather forecasting models; (ii) to obtain improved criteria for the design of ships and coastal/offshore infrastructure; (iii) to quantify erosion by powerful breaking waves, and (iv) to develop new concepts in wave measurement with improved characterization of wave breaking using real-time instrumentation. This highly interdisciplinary project will involve an ambitious and unconventional combination of computational simulation/theory, laboratory experiments, and field measurements of sea waves, closely informed by application needs.
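Schematically, the kind of kinematic breaking-onset criterion referred to above compares the fluid speed at the crest with the speed of the crest itself (the threshold value is quoted only approximately here, as reported in the recent breaking-onset literature):
\[ B=\frac{u_{\mathrm{crest}}}{c}\;\gtrsim\;0.85\quad\Rightarrow\quad \text{breaking onset}, \]
where u_crest is the horizontal fluid velocity at the crest and c the local crest propagation speed; wave groups whose maximum B stays below the threshold recur without breaking.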
Max ERC Funding
2 499 946 €
Duration
Start date: 2019-09-01, End date: 2024-08-31
Project acronym HiMISER
Project High resolution Miniature Implantable nerve Stimulator for Electroceutical Research (HiMISER)
Researcher (PI) Robert M.O. PUERS
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Proof of Concept (PoC), ERC-2018-PoC
Summary In my ERC project ‘µThalys’, we are researching future concepts for electronic medical implants. The main idea is to replace current medical implants, which typically consist of a titanium casing with electronics and a few lead wires, with less intrusive, softer, tissue-like implants. These future-generation implants consist of soft, modular, miniature transducer nodes that can act stand-alone or be connected in a network, thus forming e.g. an implantable sensor network. The main idea driving this proof-of-concept (PoC) project application is that the results of our research could be well suited to an emerging field that uses stimulation of peripheral autonomic nerves for therapeutic ends. This is mostly referred to as ‘electroceuticals’ or ‘bioelectric medicine’. Rather than using chemical drugs that circulate through the entire body, the field of electroceuticals aims to develop therapies based on local stimulation of a nerve of the autonomic nervous system that leads to the target organ. For example, instead of taking drugs to reduce stomach acid formation, the part of the vagus nerve going to the stomach can be stimulated to achieve the same therapeutic effect while reducing side-effects. However, current nerve stimulation devices are rather crude and stimulate all fibres in a nerve using a cuff electrode. Moreover, they are too large to be used in small animal studies. The unavailability of a miniature, high-resolution peripheral nerve stimulator is therefore a significant roadblock for researchers in the field, and for future application in humans. In this project, we will combine our existing research results from the µThalys project to create a new generation of stimulators. Miniature electronic devices, packaging and soft high-resolution neural implants will enable miniature proof-of-concept devices that demonstrate precision stimulation in peripheral nerves. First steps towards commercialisation will be taken.
Max ERC Funding
150 000 €
Duration
Start date: 2019-01-01, End date: 2020-06-30
Project acronym HOLOBHC
Project Holography for realistic black holes and cosmologies
Researcher (PI) Geoffrey Gaston Joseph Jean-Vincent Compère
Host Institution (HI) UNIVERSITE LIBRE DE BRUXELLES
Call Details Starting Grant (StG), PE2, ERC-2013-StG
Summary String theory provides a consistent framework that combines quantum mechanics and gravity. Two grand challenges of fundamental physics - building realistic models of black holes and cosmologies - can be addressed in this framework thanks to novel holographic methods.
Recent astrophysical evidence indicates that some black holes rotate extremely fast, as close as 98% to the extremality bound. No quantum gravity model for such black holes has been formulated so far. My first objective is building the first model in string theory of an extremal black hole. Taking on this challenge is made possible thanks to recent advances in a remarkable duality known as the gauge/gravity correspondence. If successful, this program will pave the way to a description of quantum gravity effects that have been conjectured to occur close to the horizon of very fast rotating black holes.
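In standard notation, the extremality bound in question is the Kerr bound on the angular momentum of a black hole of mass M,
\[ J\le\frac{GM^{2}}{c},\qquad a_{*}\equiv\frac{cJ}{GM^{2}}\le 1, \]
so rotating at 98% of extremality means a dimensionless spin parameter a_* ≈ 0.98, whose near-horizon region approaches the extremal geometry that the proposed string-theory modelling targets.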
Supernovae detection has established that our universe is entering a phase of accelerated expansion. This brings a pressing need to better understand the still enigmatic features of the de Sitter spacetime that models our universe at late times. My second objective is to derive new universal properties of the cosmological horizon of de Sitter spacetime using tools inspired from the gauge/gravity correspondence. These results will contribute to understanding its remarkable entropy, which, according to the standard model of cosmology, bounds the entropy of our observable universe.
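The remarkable entropy of the cosmological horizon is the Gibbons–Hawking entropy, which takes the same area form as black hole entropy; for a de Sitter horizon of area A set by the cosmological constant Λ,
\[ S_{\mathrm{dS}}=\frac{k_{B}c^{3}A}{4G\hbar},\qquad A=4\pi\ell_{\mathrm{dS}}^{2},\qquad \ell_{\mathrm{dS}}=\sqrt{\frac{3}{\Lambda}}, \]
and it is this horizon entropy that, within the standard cosmological model, bounds the entropy accessible to any single observer in our universe.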
Max ERC Funding
1 020 084 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym HoloQosmos
Project Holographic Quantum Cosmology
Researcher (PI) Thomas Hertog
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Consolidator Grant (CoG), PE9, ERC-2013-CoG
Summary The current theory of cosmic inflation is largely based on classical physics. This undermines its predictivity in a world that is fundamentally quantum mechanical. With this project we will develop a novel approach towards a quantum theory of inflation. We will do this by introducing holographic techniques in cosmology. The notion of holography is the most profound conceptual breakthrough that has emerged from fundamental high-energy physics in recent years. It postulates that (quantum) gravitational systems such as the universe as a whole have a precise ‘holographic’ description in terms of quantum field theories defined on their boundary. Our aim is to develop a holographic framework for quantum cosmology. We will then apply this to three areas of theoretical cosmology where a quantum approach is of critical importance. First, we will put forward a holographic description of inflation that clarifies its microphysical origin and is rigorously predictive. Using this we will derive the distinct observational signatures of novel, truly holographic models of the early universe where inflation has no description in terms of classical cosmic evolution. Second, we will apply holographic cosmology to improve our understanding of eternal inflation. This is a phase deep into inflation where quantum effects dominate the evolution and affect the universe’s global structure. Finally, we will work towards generalizing our holographic models of the primordial universe to include the radiation, matter and vacuum eras. The resulting unification of cosmic history in terms of a single holographic boundary theory may lead to intriguing predictions of correlations between early and late time observables, tying together the universe’s origin with its ultimate fate. Our project has the potential to revolutionize our perspective on cosmology and to further deepen the fruitful interaction between cosmology and high-energy physics.
Max ERC Funding
1 995 900 €
Duration
Start date: 2014-08-01, End date: 2019-07-31
Project acronym HOM
Project Homo Mimeticus: Theory and Criticism
Researcher (PI) Nidesh LAWTOO
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Starting Grant (StG), SH5, ERC-2016-STG
Summary Mimesis is one of the most influential concepts in Western thought. Originally invoked to define humans as the “most imitative” creatures in classical antiquity, mimesis (imitation) has recently been at the centre of theoretical debates in the humanities, social sciences, and the neurosciences concerning the role of “mimicry,” “identification,” “contagion,” and “mirror neurons” in the formation of subjectivity. And yet, despite the growing confirmations that imitation is constitutive of human behaviour, mimesis still tends to be confined to the sphere of realistic representation. The HOM project combines approaches that are usually split in different areas of disciplinary specialization to provide a correction to this tendency.
Conceived as a trilogy situated at the crossroads between literary criticism, cinema studies, and critical theory, HOM will result in two monographs and accompanying articles that explore the aesthetic, affective, and conceptual implications of the mimetic faculty. The first radically reframes a major proponent of anti-mimetic aesthetics in modern literature, Oscar Wilde, by looking back to the classical foundations of theatrical mimesis that inform his corpus; the second considers the material effects of virtual simulation by looking ahead to new digital media via contemporary science-fiction films; and the third establishes an interdisciplinary dialogue between philosophical accounts of mimesis and recent discoveries in the neurosciences. Together, these new perspectives on homo mimeticus reconsider the aesthetic foundations of a major literary author, open up a new line of inquiry in film studies, and steer philosophical debates on mimesis in new interdisciplinary directions.
Max ERC Funding
1 044 000 €
Duration
Start date: 2016-11-01, End date: 2021-10-31
Project acronym Human Decisions
Project The Neural Determinants of Perceptual Decision Making in the Human Brain
Researcher (PI) Redmond O'connell
Host Institution (HI) THE PROVOST, FELLOWS, FOUNDATION SCHOLARS & THE OTHER MEMBERS OF BOARD OF THE COLLEGE OF THE HOLY & UNDIVIDED TRINITY OF QUEEN ELIZABETH NEAR DUBLIN
Call Details Starting Grant (StG), LS5, ERC-2014-STG
Summary How do we make reliable decisions given sensory information that is often weak or ambiguous? Current theories center on a brain mechanism whereby sensory evidence is integrated over time into a “decision variable” which triggers the appropriate action upon reaching a criterion. Neural signals fitting this role have been identified in monkey electrophysiology but efforts to study the neural dynamics underpinning human decision making have been hampered by technical challenges associated with non-invasive recording. This proposal builds on a recent paradigm breakthrough made by the applicant that enables parallel tracking of discrete neural signals that can be unambiguously linked to the three key information processing stages necessary for simple perceptual decisions: sensory encoding, decision formation and motor preparation. Chief among these is a freely-evolving decision variable signal which builds at an evidence-dependent rate up to an action-triggering threshold and precisely determines the timing and accuracy of perceptual reports at the single-trial level. This provides an unprecedented neurophysiological window onto the distinct parameters of the human decision process such that the underlying mechanisms of several major behavioral phenomena can finally be investigated. This proposal seeks to develop a systems-level understanding of perceptual decision making in the human brain by tackling three core questions: 1) what are the neural adaptations that allow us to deal with speed pressure and variations in the reliability of the physically presented evidence? 2) What neural mechanism determines our subjective confidence in a decision? and 3) How does aging impact on the distinct neural components underpinning perceptual decision making? Each of the experiments described in this proposal will definitively test key predictions from prominent theoretical models using a combination of temporally precise neurophysiological measurement and psychophysical modelling.
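A minimal simulation of the accumulation-to-bound mechanism described above, in the generic sequential-sampling form; the drift, noise and bound values are illustrative and are not parameters of the applicant's paradigm:

import numpy as np

rng = np.random.default_rng(0)

def simulate_decision(drift=0.15, noise=1.0, bound=1.0, dt=0.001, max_t=3.0):
    """Integrate noisy evidence until a decision bound is crossed.

    Returns (choice, reaction_time_s): choice is +1/-1 for the upper/lower
    bound, or 0 if neither bound is reached within max_t seconds.
    """
    dv, t = 0.0, 0.0
    while t < max_t:
        dv += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
        t += dt
        if abs(dv) >= bound:
            return (1 if dv > 0 else -1), t
    return 0, t

# Stronger evidence (larger drift rate) yields faster, more accurate decisions.
for d in (0.5, 0.1):
    outcomes = [simulate_decision(drift=d) for _ in range(200)]
    accuracy = np.mean([c == 1 for c, _ in outcomes])
    mean_rt = np.mean([rt for _, rt in outcomes])
    print(f"drift={d}: accuracy={accuracy:.2f}, mean RT={mean_rt:.2f}s")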
Max ERC Funding
1 382 643 €
Duration
Start date: 2015-05-01, End date: 2020-04-30
Project acronym HurdlingOxoWall
Project Late First-Row Transition Metal-Oxo Complexes for C–H Bond Activation
Researcher (PI) Aidan McDonald
Host Institution (HI) THE PROVOST, FELLOWS, FOUNDATION SCHOLARS & THE OTHER MEMBERS OF BOARD OF THE COLLEGE OF THE HOLY & UNDIVIDED TRINITY OF QUEEN ELIZABETH NEAR DUBLIN
Call Details Starting Grant (StG), PE5, ERC-2015-STG
Summary The chemical, pharmaceutical, and materials industries rely heavily upon chemicals from oil and natural gas feed-stocks (saturated hydrocarbons) that require considerable functionalisation prior to use. Catalytic oxidative functionalisation (e.g. CH4 + [O] + cat. → CH3OH), using first row transition metal catalysts, is potentially a sustainable, cheap, and green route to these high-commodity chemicals. However, catalytic oxidation remains a great modern challenge because such hydrocarbons contain remarkably strong inert C–H bonds that can only be activated with potent catalysts. We will take a Nature-inspired approach to designing and preparing powerful oxidation catalysts: we will interrogate the active oxidant, a metal-oxo (M=O) species, to guide our catalyst design. Specifically, we will prepare unprecedented Late first-row transition Metal-Oxo complexes (LM=O’s, LM = Co, Ni, Cu) that will activate the strongest of C–H bonds (e.g. CH4).
This will be accomplished using a family of novel low coordinate ligands that will support LM=O’s. Due to their expected potent reactivity we will prepare LM=O’s under unique oxidatively robust, low-temperature conditions to ensure their stabilisation. The poorly understood factors (thermodynamics, metal, d-electron count) that control the reactivity of M=O’s will be thoroughly investigated. Based on these investigations LM=O reactivity will be manipulated and optimised. We expect LM=O’s will be significantly more reactive than any early transition metal-oxo’s (EM=O’s), because they will display a greater thermodynamic driving force for C–H activation. It is thus expected that LM=O’s will be capable of the activation of the strongest of C–H bonds (i.e. CH4). Driven by the knowledge acquired from these investigations, we will design and prepare the next generation of molecular oxidation catalysts - a family of late first-row transition metal compounds capable of catalysing hydrocarbon functionalisation under ambient conditions.
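The thermodynamic driving force invoked here can be made concrete for the elementary hydrogen-atom transfer step (the bond strength is quoted approximately, and the relation below is the standard bond-energy estimate rather than a result of this project):
\[ \mathrm{LM{=}O}+\mathrm{R{-}H}\longrightarrow \mathrm{LM{-}OH}+\mathrm{R}^{\bullet},\qquad \Delta H\approx \mathrm{BDE}(\mathrm{R{-}H})-\mathrm{BDE}(\mathrm{[M]O{-}H}), \]
so abstracting a hydrogen atom from the ~105 kcal/mol C–H bond of CH4 is only downhill if the O–H bond formed at the metal-oxo is at least comparably strong, which is the sense in which late metal-oxo species are expected to offer a larger driving force than their early-metal counterparts.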
Max ERC Funding
1 499 865 €
Duration
Start date: 2016-03-01, End date: 2021-02-28
Project acronym HydroLieve
Project A long-lasting non-migrating hydrogel for relieving chronic pain
Researcher (PI) Martin O'HALLORAN
Host Institution (HI) NATIONAL UNIVERSITY OF IRELAND GALWAY
Call Details Proof of Concept (PoC), ERC-2018-PoC
Summary Chronic Neuropathic Pain (NeP) is an enormous problem globally. Estimates indicate that 20% of adults suffer from pain worldwide and 10% are newly diagnosed with chronic pain annually [1]. According to the EuroPain Innovative Medicines Initiative (IMI) [2], one in five adults in Europe suffers from chronic pain. One type of neuropathic pain that is particularly detrimental to patients’ quality of life is Trigeminal Neuralgia (TN). The trigeminal nerve is responsible for facial sensory function, and trauma to this nerve can result in sudden, severe, stabbing and recurrent episodes of pain. Patients with TN experience a prolonged debilitating condition, and the pain is often described as one of the most painful conditions in medicine. TN is infamously called the “suicide disease”, as there have been numerous reports of suicide associated with the condition. Epidemiologists estimate that the prevalence of chronic TN pain is also large, at 7% [3], which amounts to 36 million Europeans. Similarly, in the US, the American Association of Neurological Surgeons reports that 150,000 people are diagnosed with TN every year, with chronic TN affecting 1.7% of the population [4]. In this proposal, we therefore describe the development of a long-lasting and drug-free treatment for chronic Trigeminal Neuralgia.
Max ERC Funding
149 954 €
Duration
Start date: 2018-08-01, End date: 2020-01-31