Project acronym FICKLEFORMS
Project Fickle Formulas. The Political Economy of Macroeconomic Measurement
Researcher (PI) Daniel Kolja Mügge
Host Institution (HI) UNIVERSITEIT VAN AMSTERDAM
Call Details Starting Grant (StG), SH2, ERC-2014-STG
Summary Macroeconomic indicators are integral to economic governance. Measurements of growth, unemployment, inflation and public deficits inform policy, for example through growth targets and the inflation-indexation of wages. These indicators tell us “how economies are doing” and citizens often punish politicians who fail to deliver on them.
Their air of objectivity notwithstanding, it is far from self-evident how these indicators should be defined and measured. Our choices here have deeply distributional consequences, producing winners and losers, and will shape our future, for example when GDP figures hide the cost of environmental degradation. So why do we measure our economies the way we do?
Criticisms of particular measures are hardly new but their real-world effect has been limited. The project therefore asks: which social, political and economic factors shape the formulas used to calculate macroeconomic indicators? Extant research offers detailed histories of statistics, mostly in single countries. But we lack theoretical and empirical tools to describe and explain differences in measurement formulas between countries and over time.
FICKLEFORMS will provide such understanding through five sub-projects. The first systematically compares the evolution of four indicators in four central OECD countries: the United Kingdom, the United States, France and Germany. The second analyses the timing and content of statistical harmonization efforts through the United Nations, the IMF and the World Bank. The third constructs a new database of “measures of measures” to quantitatively test hypotheses emerging from the previous sub-projects. The final two sub-projects reach beyond the OECD and study the politics of macroeconomic measurement in China, India, Brazil and South Africa.
This project will promote public debate over meaningful measures, allow policy-makers to reflect on current practices, and sensitize academics who use macroeconomic data about their political roots.
Max ERC Funding
1 499 875 €
Duration
Start date: 2015-09-01, End date: 2020-08-31
Project acronym FLOVISP
Project Flow Visualization Based Pressure
Researcher (PI) Fulvio Scarano
Host Institution (HI) TECHNISCHE UNIVERSITEIT DELFT
Call Details Proof of Concept (PoC), PC1, ERC-2014-PoC
Summary The ERC-FLOVIST project has focused on advancing Tomographic Particle Image Velocimetry (PIV) towards a versatile technique for the non-intrusive diagnostics of aero-acoustic problems.
One of the milestones has been the use of Tomo-PIV to infer the instantaneous three-dimensional pressure field from the measured velocity field. Using this laser-based technique to detect pressure fluctuations both around and on the surface of aerodynamic models has the advantage that neither surface pressure transducers nor their connecting cables for power supply and data transfer need to be installed.
The technique has demonstrated high scalability, with pressure fluctuations detected from low-speed up to supersonic flows. This is an important advance over standard technologies (surface pressure transducers and microphone arrays), favouring a broader utilization of PIV in aero-acoustics, flow-induced vibrations and bio-fluid mechanics.
The potential of this innovative approach has been recognized in science. Industry, however, has taken a more conservative position, partly justified by system complexity and the high level of skill required to perform the experiments. Yet, when correctly implemented, this method can yield important economic benefits by saving the cost of integrating dedicated instrumentation. The targeted industrial areas are aeronautics (aircraft aerodynamics and propulsion) and energy systems (turbomachinery and wind energy). In wind energy, the study of unsteady loads may lead to designs that reduce fatigue loads and increase system durability, while growing interest in noise emissions from wind turbines calls for increased aero-acoustic analysis capabilities.
The proposal intends to move these capabilities from research labs to industrial facilities. The main task is to bring together the current advances of the Tomo-PIV technique and make it broadly usable by research centres and for industrial innovation.
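To make concrete how a pressure field can be obtained from velocity data alone, the sketch below gives the standard route used in PIV-based pressure evaluation (a general formulation, not necessarily the exact one adopted here): for incompressible flow, the measured velocity field is substituted into the momentum equation to obtain the pressure gradient, whose divergence yields a Poisson equation for the pressure.
\[
\nabla p = -\rho\left(\frac{\partial \mathbf{u}}{\partial t} + (\mathbf{u}\cdot\nabla)\mathbf{u}\right) + \mu\,\nabla^{2}\mathbf{u},
\qquad
\nabla^{2} p = -\rho\,\nabla\cdot\big[(\mathbf{u}\cdot\nabla)\mathbf{u}\big],
\]
where \(\mathbf{u}\) is the velocity measured by Tomo-PIV, \(\rho\) the density and \(\mu\) the dynamic viscosity; the Poisson equation is solved with boundary conditions taken from points where the pressure is known or where the momentum equation supplies its gradient.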
Max ERC Funding
148 750 €
Duration
Start date: 2015-06-01, End date: 2016-11-30
Project acronym GliaInnateSensing
Project Glia-derived factors in innate lymphoid cell sensing and intestinal defence
Researcher (PI) Jose Henrique Veiga Fernandes
Host Institution (HI) FUNDACAO D. ANNA SOMMER CHAMPALIMAUD E DR. CARLOS MONTEZ CHAMPALIMAUD
Call Details Consolidator Grant (CoG), LS6, ERC-2014-CoG
Summary The interplay between intestinal microbes and immune cells ensures vital functions of the organism. However, inadequate host-microbe relationships lead to inflammatory diseases that are major public health concerns.
Innate lymphoid cells (ILC) are an emergent family of effectors abundantly present at mucosal sites. Group 3 ILC (ILC3) produce pro-inflammatory cytokines and regulate mucosal homeostasis, anti-microbial defence and adaptive immune responses.
ILC development and function have been widely perceived to be programmed. However, recent evidence indicates that ILC are also controlled by dietary signals. Nevertheless, how ILC3 perceive, integrate and respond to environmental cues remains utterly unexplored.
We hypothesise that ILC3 sense their environment and exert their function as part of a novel epithelial-glial-ILC unit orchestrated by neurotrophic factors. Thus, we propose to employ genetic, cellular and molecular approaches to decipher how this unconventional multi-cellular unit is controlled and how glial-derived factors set ILC3 function and intestinal homeostasis.
In order to achieve this, we will assess ILC3-autonomous functions of neurotrophic factor receptors. ILC3-specific loss- and gain-of-function mutant mice for neuroregulatory receptors will be used to define the role of these molecules in ILC3 function, mucosal homeostasis, gut defence and microbial ecology. Sequentially, we propose to decipher the anatomical and functional basis of the enteric epithelial-glial-ILC unit. To this end we will employ high-resolution imaging, genome-wide expression analysis and tissue-specific mutants for defined target genes.
Our ground-breaking research will establish a novel sensing program by which ILC3 integrate environmental cues and will define a key multi-cellular unit at the core of intestinal homeostasis and defence. Finally, our work will reveal new pathways that may be targeted in inflammatory diseases that are major public health concerns.
Max ERC Funding
2 270 000 €
Duration
Start date: 2015-07-01, End date: 2020-06-30
Project acronym GTCMR
Project Global Terrorism and Collective Moral Responsibility: Redesigning Military, Police and Intelligence Institutions in Liberal Democracies
Researcher (PI) Seumas Miller
Host Institution (HI) TECHNISCHE UNIVERSITEIT DELFT
Call Details Advanced Grant (AdG), SH5, ERC-2014-ADG
Summary International terrorism, e.g. Al Qaeda, IS, is a major global security threat. Counter-terrorism is a morally complex enterprise involving police, military, intelligence agencies and non-security agencies. Counter-terrorism should be framed as a collective moral responsibility of governments, security institutions and citizens. (1) How is international terrorism to be defined? (2) What is the required theoretical notion of collective moral responsibility? (3) What counter-terrorist strategies and tactics are effective, morally permissible and consistent with liberal democracy? Tactics include targeted killing, drone warfare, preventative detention, and bulk metadata collection (e.g. by the NSA). (4) How is this inchoate collective moral responsibility to be institutionally embedded in security agencies? (i) How are security institutions to be redesigned to enable them to realise and coordinate their counter-terrorism strategies without over-reaching their various core institutional purposes, which have hitherto been disparate (e.g. law enforcement versus military combat), and without compromising human rights (e.g. the right to life of innocent civilians, the right to freedom, the right to privacy), including by means of morally unacceptable counter-terrorism tactics? (ii) How are these tactics to be integrated with a broad-based counter-terrorism strategy that includes such measures as anti-radicalisation and state-to-state engagement to address key sources of terrorism, such as the dissemination of extremist religious ideology (e.g. militant Wahhabi ideology emanating from Saudi Arabia) and the legitimate grievances of some terrorist groups (e.g. regarding a Palestinian state)? What ought a morally permissible and efficacious (i) structure of counter-terrorist institutional arrangements and (ii) set of counter-terrorist tactics consist of, for a contemporary liberal democracy collaborating with other liberal democracies facing the common problem of international terrorism?
Max ERC Funding
2 479 810 €
Duration
Start date: 2016-01-01, End date: 2020-12-31
Project acronym HandsandBible
Project The Hands that Wrote the Bible: Digital Palaeography and Scribal Culture of the Dead Sea Scrolls
Researcher (PI) Mladen Popovic
Host Institution (HI) RIJKSUNIVERSITEIT GRONINGEN
Call Details Starting Grant (StG), SH5, ERC-2014-STG
Summary The discovery of the Dead Sea Scrolls has fundamentally transformed our knowledge of Jewish and Christian origins. The scrolls provide a unique vantage point for studying the dynamic and creative engagement with authoritative scriptures that were to become the Bible. They also offer evidence for a scribal culture ‘in action’. Palaeography can provide access to this scribal culture, showing the human hand behind what came to be regarded as holy texts.
The main objective of this interdisciplinary project is to shed new light on ancient Jewish scribal culture and the making of the Bible by freshly investigating two aspects of the scrolls’ palaeography: the typological development of writing styles and writer identification. We will combine three different approaches to study these two aspects: palaeography, computational intelligence, and 14C-dating.
The combination of new 14C samples and the use of computational intelligence as quantitative methods in order to assess the development of handwriting styles and to identify individual scribes is a unique strength of this project, which will provide a new and much-needed scientific and quantitative basis for the typological estimations of traditional palaeography of the Dead Sea Scrolls. The quantitative evidence will be used to cluster manuscripts as products of scribal activity in order to profile scribal production and to determine a more precise location in time for their activity, focusing, from literary and cultural-historical perspectives, on the content and genres of the texts that scribes wrote and copied and on the scripts and languages that they used.
Through their scribal activities these anonymous scribes constructed a ‘textual community’ and negotiated identities of the movement behind the Dead Sea Scrolls. The exciting aspect of this project is that, through the innovative and unconventional digital palaeographic analysis we will be using, it will bring these scribal identities back to life.
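As an illustration only, and not the project's actual pipeline, the short Python sketch below shows the kind of quantitative clustering step described above: manuscript fragments represented by handwriting feature vectors are grouped so that fragments written by the same hand fall into the same cluster. The feature values are invented and the choice of hierarchical clustering is an assumption made for the example.

import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Hypothetical handwriting descriptors per manuscript fragment
# (e.g. mean letter width, stroke curvature, inter-letter spacing).
rng = np.random.default_rng(0)
scribe_a = rng.normal(loc=[2.1, 0.8, 1.4], scale=0.05, size=(5, 3))
scribe_b = rng.normal(loc=[2.6, 0.5, 1.1], scale=0.05, size=(5, 3))
features = np.vstack([scribe_a, scribe_b])

# Agglomerative (Ward) clustering groups fragments by handwriting similarity.
tree = linkage(features, method="ward")
labels = fcluster(tree, t=2, criterion="maxclust")
print(labels)  # fragments from the same simulated scribe share a cluster label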
Max ERC Funding
1 484 274 €
Duration
Start date: 2015-11-01, End date: 2020-10-31
Project acronym HAP-PHEN
Project From haplotype to phenotype: a systems integration of allelic variation, chromatin state and 3D genome data
Researcher (PI) Elzo de Wit
Host Institution (HI) STICHTING HET NEDERLANDS KANKER INSTITUUT-ANTONI VAN LEEUWENHOEK ZIEKENHUIS
Call Details Starting Grant (StG), LS2, ERC-2014-STG
Summary High-throughput sequencing methods are breaching the barrier of $1000 per genome. This means that it will become feasible to sequence the genomes of many individuals and to create a deep catalog of the bulk of human genetic variation. A great task will lie in assigning function to all this genetic variation. Genome-wide association studies have already shown that 40% of all loci significantly associated with disease are found in intergenic, supposedly regulatory regions. One of the current challenges in human genetics is that variants that affect expression on a single allele cannot be directly linked to that allele, because we only have genotype information rather than haplotype information. The overarching aim of the project is to resolve haplotypes in order to identify genetic variants that affect gene expression. We will do this in three sub-projects. In the first main project we will use 3D genome information gathered from Hi-C experiments to haplotype the genomes of six lymphoblastoid cell lines. We will integrate these data with chromatin profiling and RNAseq data in order to build integrative models for the prediction of gene expression and the effect of genetic variation on gene expression. In the second project we will haplotype the breast cancer genes BRCA1/2 in a large cohort of individuals from families with a high risk of hereditary breast cancer. Allelic imbalance in BRCA1/2 expression levels is known to be associated with an increased risk for breast cancer. We will aim to find genetic variants that are associated with a decreased allelic expression of BRCA1/2 to improve breast cancer risk assessment. Finally, we will develop a novel tool to study the 3D genome organization of single alleles, which will allow us to identify how individual alleles are organized in the nucleus and to identify multi-way interactions (i.e. involving more than two genomic loci). With this we hope to better understand how complex 3D organization contributes to gene regulation.
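As a minimal illustration of the allelic-imbalance idea invoked for BRCA1/2 above (not the project's actual analysis), the Python sketch below tests whether allele-specific RNA-seq read counts at a heterozygous site deviate from the 50/50 split expected when both alleles are expressed equally. The read counts are invented for the example.

from scipy.stats import binomtest

# Hypothetical allele-specific read counts at a heterozygous BRCA1 site.
reads_allele_a = 180   # reads carrying one allele
reads_allele_b = 320   # reads carrying the other allele
total = reads_allele_a + reads_allele_b

# Under balanced expression each allele contributes ~50% of the reads.
result = binomtest(reads_allele_a, n=total, p=0.5, alternative="two-sided")
print(f"allelic ratio = {reads_allele_a / total:.2f}, p = {result.pvalue:.2e}")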
Max ERC Funding
1 500 000 €
Duration
Start date: 2015-09-01, End date: 2020-08-31
Project acronym HEARTOFSTROKE
Project The heart of stroke: Pipes, Perfusion, Parenchyma
Researcher (PI) Jeroen Hendrikse
Host Institution (HI) UNIVERSITAIR MEDISCH CENTRUM UTRECHT
Call Details Starting Grant (StG), LS7, ERC-2014-STG
Summary My aim is to understand the cause of stroke in every single patient. Brain microinfarcts and macroinfarcts cause a major healthcare burden in Western societies, both in terms of morbidity and costs. Cardiovascular thromboemboli from the heart, aorta and neck arteries are considered the main cause. Still, the vast majority of brain infarcts remain unexplained. In contrast to the thromboembolic explanation for brain infarcts, heart infarcts are known to be caused by local atherosclerotic plaque in the coronary arteries and impaired perfusion. This insight has led to successful preventive and therapeutic strategies against myocardial infarction. For brain infarcts, a blind eye is turned to local atherosclerotic plaque in the intracranial vasculature (Pipes) and impaired Perfusion as possible causes of macro- and microinfarcts (Parenchyma). For the ‘3Ps’ (Pipes, Perfusion, Parenchyma) I have created new research fields based on innovative noninvasive arterial spin labeling perfusion MRI, perfusion reserve, perfusion territory and vessel wall MRI methods. In this project, I will go an important step beyond the state of the art by investigating the total intracranial burden of disease across these ‘3Ps’ through systematic evaluations in patients. Pipes: novel methods to visualise and characterise intracranial plaque, including inflammatory plaque enhancement, intraplaque haemorrhage and calcification detection. Perfusion: novel methods to investigate hemodynamic impairment in areas with critically low perfusion, using noninvasive arterial spin labeling MRI methods and perfusion reserve methods specific for each intracranial perfusion territory. Parenchyma: novel methods to detect microinfarcts. Technical innovations (Pipes, Perfusion) will be applied in synergy to explain micro- and macroinfarcts (Parenchyma). Patient-specific biomarkers will, as in the heart, pave the way for designing preventive and therapeutic strategies aimed at reducing the burden of neurodegenerative diseases.
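For orientation on how arterial spin labeling (ASL) yields quantitative perfusion, the expression below is the widely used single-compartment quantification from the ASL consensus recommendations; it is given as general background and is not necessarily the exact model that will be applied in this project.
\[
\mathrm{CBF} \;=\; \frac{6000\,\lambda\,\Delta M\, e^{\mathrm{PLD}/T_{1,\mathrm{blood}}}}{2\,\alpha\, T_{1,\mathrm{blood}}\, M_{0}\left(1 - e^{-\tau/T_{1,\mathrm{blood}}}\right)} \;\;\left[\mathrm{ml}/100\,\mathrm{g}/\mathrm{min}\right],
\]
where \(\Delta M\) is the label-control signal difference, \(M_0\) the equilibrium signal, \(\lambda\) the blood-brain partition coefficient, \(\alpha\) the labeling efficiency, \(\tau\) the labeling duration and PLD the post-labeling delay.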
Max ERC Funding
1 499 450 €
Duration
Start date: 2015-03-01, End date: 2020-02-29
Project acronym HInDI
Project The historical dynamics of industrialization in Northwestern Europe and China ca. 1800-2010: A regional interpretation
Researcher (PI) Bas van Leeuwen
Host Institution (HI) KONINKLIJKE NEDERLANDSE AKADEMIE VAN WETENSCHAPPEN - KNAW
Call Details Starting Grant (StG), SH6, ERC-2014-STG
Summary The Industrial Revolution is one of the most important events in human history: within a century, some countries multiplied their per capita income while others stagnated, exacerbating international inequality. Reducing this enduring inequality through industrial development has been a crucial policy goal; however, as no precise understanding of industrial development exists, no consistent international policy has been formulated.
The understanding of this phenomenon has been hampered by a lack of data, theory, and historical dynamism. Many studies are, due to the lack of data, conducted on the national level, despite the fact that industrialization is ultimately a regional (i.e. intra-national) phenomenon. The basic principle of regional industrialization was investigated until the 1990s, when the focus shifted to other areas. These studies remained unconnected to economic location theory, which in the 1990s was still unable to explain the regional spread of industrialization. Yet, more recent location theories have relaxed certain theoretical assumptions, allowing their application to the spread of industrialization as well. However, even these theories often lack historical dynamism, i.e. the capability to predict and explain the considerable changes that industrialization underwent over the past 200 years.
Using the regional approach, merging it with recent location theory, and creating a systematic regional dataset, this project will fundamentally alter our insights into the spread and development of industrialization. Analysis will focus on four macro regions and their sub-regions: two in Europe (England and the Low Countries) and two in China (the Yangtze delta and the Yungui area). These macro regions cover the timeline of industrialization (England, then the Low Countries, and much later, the Yangtze and finally the Yungui area), which may have caused different patterns of local industrialization within each of these macro regions.
Max ERC Funding
1 452 309 €
Duration
Start date: 2015-09-01, End date: 2020-08-31
Project acronym HYMEDNA
Project Hypermethylated DNA detection using NanoGaps
Researcher (PI) Wilfred Gerard van der Wiel
Host Institution (HI) UNIVERSITEIT TWENTE
Call Details Proof of Concept (PoC), PC1, ERC-2014-PoC
Summary HYMEDNA prepares commercialization of very sensitive point-of-care biosensors for early-stage cancer detection based on the electrical detection of hypermethylated DNA (hmDNA) inside nanogaps.
Only recently has the awareness arisen that local hypermethylation of DNA provides a generic marker for a wide range of cancers. A robust, simple and cheap method for detecting hmDNA at low concentrations in blood, urine or faeces would be a major step forward in the early-stage detection of cancer. Existing hmDNA detection relies on fluorescent read-out, which requires dedicated laboratory handling.
Our technology is based on electrodes separated by a tunable nanogap. hmDNA is trapped and highly concentrated in between the electrodes using methyl binding domain (MBD) proteins. MBD binds specifically to methylated CpG sequences and thus provides direct recognition of the targeted methylated moieties, in contrast to existing DNA detection methods, which commonly employ DNA (or PNA) oligos and rely on sequence specificity. After target binding, the conductivity of the trapped hmDNA is enhanced, for which we provide alternative routes. The detection step is a simple measurement of the electrical conduction between the electrodes. As our detection scheme relies on completely turning on the conduction instead of merely modulating it (as in other electrical detection schemes), an exceptionally high sensitivity is expected. We identify the following as the most competitive advantages: (1) high selectivity (specific chemistry) and sensitivity (“on-off” effect), (2) a simple, scalable device concept, (3) robustness against the environment, (4) small size (allowing for implementation in a bioassay device for multiple target molecules), and (5) low cost.
Max ERC Funding
149 579 €
Duration
Start date: 2015-10-01, End date: 2016-09-30
Project acronym ICONICAL
Project In control of exciton and charge dynamics in molecular crystals
Researcher (PI) Ferdinand Cornelius Grozema
Host Institution (HI) TECHNISCHE UNIVERSITEIT DELFT
Call Details Consolidator Grant (CoG), PE4, ERC-2014-CoG
Summary The aim of the work proposed here is to achieve control over charge and excited-state dynamics in organic crystalline materials and in this way to arrive at solid-state materials with explicit built-in functionality. The charge and excited-state dynamics do not only depend on the properties of individual molecules but are to a large extent determined by the interactions between multiple molecules. By careful engineering of the properties of individual molecules and of the way they aggregate in the solid crystalline state it is in principle possible to design materials that exhibit a specific functionality. Examples of this are materials that are optimized to give high charge carrier mobilities and high exciton diffusion coefficients. It is also possible to design more complex functionality. An example of this is singlet exciton fission, a process by which one singlet excited state transforms into a combination of two triplet states. This spin-allowed process can in principle increase the efficiency of organic solar cells by a factor of 1.5. A second example is upconversion of low-energy photons into higher-energy photons. This is possible by combining two low-energy triplet excited states into a single singlet excited state by triplet-triplet annihilation. Finally, it is possible to gain control over charge separation at the interface of two different materials to increase the charge-separation efficiency of photovoltaic cells.
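To make the energetic reasoning behind these two processes explicit (a standard condition, not a result of this project): singlet fission is only downhill when one singlet exciton carries at least the energy of two triplets, whereas triplet-triplet annihilation upconversion requires the reverse inequality, so that two low-energy excitations can be pooled into one higher-energy emitting state.
\[
E(S_1) \;\gtrsim\; 2\,E(T_1) \quad (\text{singlet fission: } S_1 \rightarrow T_1 + T_1),
\qquad
2\,E(T_1) \;\gtrsim\; E(S_1) \quad (\text{triplet-triplet annihilation: } T_1 + T_1 \rightarrow S_1).
\]
Because fission converts one absorbed photon into two triplet excitons, it can in principle raise the photocurrent of a solar cell, which is the origin of the factor-of-1.5 efficiency gain mentioned above.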
In this work, we will explore ways to achieve control of charge and exciton dynamics in a combined effort including organic synthesis, computational chemistry and time-resolved spectroscopy and conductivity experiments. This research represents a major step forward in the understanding of the relation between molecular and solid state structure and the electronic properties of organic crystalline materials. This is of considerable fundamental interest but also has direct implications for the utilization of these materials in electronic devices.
Max ERC Funding
2 000 000 €
Duration
Start date: 2015-06-01, End date: 2020-05-31