Project acronym 3D-nanoMorph
Project Label-free 3D morphological nanoscopy for studying sub-cellular dynamics in live cancer cells with high spatio-temporal resolution
Researcher (PI) Krishna AGARWAL
Host Institution (HI) UNIVERSITETET I TROMSOE - NORGES ARKTISKE UNIVERSITET
Call Details Starting Grant (StG), PE7, ERC-2018-STG
Summary Label-free optical nanoscopy, free from photobleaching and photochemical toxicity of fluorescence labels and yielding 3D morphological resolution of <50 nm, is the future of live cell imaging. 3D-nanoMorph breaks the diffraction barrier and shifts the paradigm in label-free nanoscopy, providing isotropic 3D resolution of <50 nm. To achieve this, 3D-nanoMorph performs non-linear inverse scattering for the first time in nanoscopy and decodes scattering between sub-cellular structures (organelles).
3D-nanoMorph innovatively devises complementary roles for the light measurement system and the computational nanoscopy algorithm. A novel illumination system and a novel light collection system together enable measurement of only the most relevant intensity component and create a fresh perspective on label-free measurements. A new computational nanoscopy approach employs non-linear inverse scattering. Harnessing non-linear inverse scattering for resolution enhancement opens new possibilities in label-free 3D nanoscopy.
I will apply 3D-nanoMorph to study organelle degradation (autophagy) in live cancer cells over extended durations with high spatial and temporal resolution, a study presently limited by the lack of high-resolution label-free 3D morphological nanoscopy. Successful 3D mapping of the nanoscale biological process of autophagy will open new avenues for cancer treatment and showcase 3D-nanoMorph for wider applications.
My cross-disciplinary expertise of 14 years, spanning inverse problems, electromagnetism, optical microscopy, integrated optics and live cell nanoscopy, paves the path for successful implementation of 3D-nanoMorph.
Max ERC Funding
1 499 999 €
Duration
Start date: 2019-07-01, End date: 2024-06-30
Project acronym AAATSI
Project Advanced Antenna Architecture for THZ Sensing Instruments
Researcher (PI) Andrea Neto
Host Institution (HI) TECHNISCHE UNIVERSITEIT DELFT
Call Details Starting Grant (StG), PE7, ERC-2011-StG_20101014
Summary The Tera-Hertz portion of the spectrum presents unique potential for advanced applications. Currently the THz spectrum is revealing the mechanisms at the origin of our universe and provides the means to monitor the health of our planet via satellite-based sensing of critical gases. Time-domain sensing of the THz spectrum could potentially be the ideal tool for a vast variety of medical and security applications.
Presently, systems in the THz regime are extremely expensive, and consequently the THz spectrum is still the domain of only niche (expensive) scientific applications. The main problems are the lack of power and sensitivity. The wide unused THz spectral bandwidth is itself the only widely available resource that can compensate for these problems in the future. But so far, when scientists try to actually use this bandwidth, they run into an insurmountable physical limit: antenna dispersion. Antenna dispersion modifies the signal’s spectrum in a wavelength-dependent manner in all types of radiation, but it is particularly deleterious to THz signals because the spectrum is too wide and, with foreseeable technology, cannot be digitized.
The goal of this proposal is to introduce breakthrough antenna technology that will eliminate the dispersion bottleneck and revolutionize Time Domain sensing and Spectroscopic Space Science. In achieving these goals, the project will vault THz imaging technology into the 21st century and develop critically important enabling technologies that will satisfy the electrical engineering needs of the next 30 years and, in the long run, enable multi-terabit wireless communications.
In order to achieve these goals, I will first build upon two major breakthrough radiation mechanisms that I pioneered: Leaky Lenses and Connected Arrays. Eventually, ultra-wideband imaging arrays constituted by thousands of components will be designed on the basis of the new theoretical findings and demonstrated.
Max ERC Funding
1 499 487 €
Duration
Start date: 2011-11-01, End date: 2017-10-31
Project acronym ADULT
Project Analysis of the Dark Universe through Lensing Tomography
Researcher (PI) Hendrik Hoekstra
Host Institution (HI) UNIVERSITEIT LEIDEN
Call Details Starting Grant (StG), PE9, ERC-2011-StG_20101014
Summary The discoveries that the expansion of the universe is accelerating due to an unknown “dark energy” and that most of the matter is invisible highlight our lack of understanding of the major constituents of the universe. These surprising findings set the stage for research in cosmology at the start of the 21st century. The objective of this proposal is to advance observational constraints to a level where we can distinguish between the physical mechanisms proposed to explain the properties of dark energy and the observed distribution of dark matter throughout the universe. We use a relatively new technique called weak gravitational lensing: the accurate measurement of correlations in the orientations of distant galaxies enables us to map the dark matter distribution directly and to extract the cosmological information that is encoded by the large-scale structure.
To study the dark universe we will analyse data from a new state-of-the-art imaging survey: the Kilo-Degree Survey (KiDS) will cover 1500 square degrees in 9 filters. The combination of its large survey area and the availability of exquisite photometric redshifts for the sources makes KiDS the first project that can place interesting constraints on the dark energy equation of state using lensing data alone. Combined with complementary results from Planck, our measurements will provide one of the best views of the dark side of the universe before much larger space-based projects commence.
To reach the desired accuracy we need to measure the shapes of distant background galaxies carefully. We also need to account for any intrinsic alignments that arise through tidal interactions rather than through lensing. Reducing these observational and physical biases to negligible levels is a necessary step to ensure the success of KiDS and an important part of our preparation for more challenging projects such as the European-led space mission Euclid.
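The correlation measurement described above can be illustrated with a toy estimator (a sketch on synthetic data only, not the KiDS pipeline): bin pairs of galaxies by angular separation and average the product of their ellipticities in each bin.

```python
import numpy as np

# Toy two-point ellipticity correlation on synthetic data. Real weak-lensing
# analyses decompose shear into tangential/cross components over catalogues of
# millions of galaxies; here we only illustrate the binned pair average.
rng = np.random.default_rng(42)
n = 500
x = rng.uniform(0.0, 1.0, n)          # RA-like coordinate (deg)
y = rng.uniform(0.0, 1.0, n)          # Dec-like coordinate (deg)
e = rng.normal(0.0, 0.3, n)           # one ellipticity component (pure noise here)

bins = np.linspace(0.0, 0.5, 6)       # angular separation bins (deg)
num = np.zeros(len(bins) - 1)
cnt = np.zeros(len(bins) - 1)
for i in range(n):
    r = np.hypot(x - x[i], y - y[i])
    r[i] = np.inf                     # exclude self-pairs
    idx = np.digitize(r, bins) - 1
    for b in range(len(bins) - 1):
        sel = idx == b
        num[b] += np.sum(e[i] * e[sel])
        cnt[b] += np.sum(sel)

xi = num / np.maximum(cnt, 1)         # <e_i * e_j> per separation bin
# With uncorrelated ellipticities xi is consistent with zero; a lensing
# signal would appear as a positive correlation at small separations.
```

The challenge the proposal refers to is that both shape-measurement biases and intrinsic alignments enter this estimator additively, so both must be controlled before the residual signal can be attributed to lensing.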
Max ERC Funding
1 316 880 €
Duration
Start date: 2012-01-01, End date: 2016-12-31
Project acronym ANISOTROPIC UNIVERSE
Project The anisotropic universe -- a reality or fluke?
Researcher (PI) Hans Kristian Kamfjord Eriksen
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Starting Grant (StG), PE9, ERC-2010-StG_20091028
Summary During the last decade, a strikingly successful cosmological concordance model has been established. With only six free parameters, nearly all observables, comprising millions of data points, may be fitted with outstanding precision. However, in this beautiful picture a few “blemishes” have turned up, apparently not consistent with the standard model: while the model predicts that the universe is isotropic (i.e., looks the same in all directions) and homogeneous (i.e., the statistical properties are the same everywhere), subtle hints of the contrary are now seen. For instance, peculiar preferred directions and correlations are observed in the cosmic microwave background; some studies considering nearby galaxies suggest the existence of anomalous large-scale cosmic flows; a study of distant quasars hints towards unexpected large-scale correlations. All of these reports are individually highly intriguing, and together they hint toward a more complicated and interesting universe than previously imagined -- but none of the reports can be considered decisive. One major obstacle in many cases has been the relatively poor data quality.
This is currently about to change, as the next generation of new and far more powerful experiments are coming online. Of special interest to me are Planck, an ESA-funded CMB satellite currently taking data; QUIET, a ground-based CMB polarization experiment located in Chile; and various large-scale structure (LSS) data sets, such as the SDSS and 2dF surveys, and in the future Euclid, a proposed galaxy survey satellite also funded by ESA. By combining the world’s best data from both CMB and LSS measurements, I will in the proposed project attempt to settle this question: Is our universe really anisotropic? Or are these recent claims only the results of systematic errors or statistical flukes? If the claims turn out to hold against this tide of new and high-quality data, then cosmology as a whole may need to be re-written.
Max ERC Funding
1 500 000 €
Duration
Start date: 2011-01-01, End date: 2015-12-31
Project acronym APROCS
Project Automated Linear Parameter-Varying Modeling and Control Synthesis for Nonlinear Complex Systems
Researcher (PI) Roland TOTH
Host Institution (HI) TECHNISCHE UNIVERSITEIT EINDHOVEN
Call Details Starting Grant (StG), PE7, ERC-2016-STG
Summary Linear Parameter-Varying (LPV) systems are flexible mathematical models capable of representing Nonlinear (NL)/Time-Varying (TV) dynamical behaviors of complex physical systems (e.g., wafer scanners, car engines, chemical reactors), often encountered in engineering, via a linear structure. The LPV framework provides computationally efficient and robust approaches to synthesize digital controllers that can ensure desired operation of such systems, making it attractive to (i) high-tech mechatronic, (ii) automotive and (iii) chemical-process applications. Such a framework is important to meet the increasing operational demands of systems in these industrial sectors and to realize future technological targets. However, recent studies have shown that, to fully exploit the potential of the LPV framework, a number of limiting factors of the underlying theory call for serious innovation, as currently it is not understood how to (1) automate exact and low-complexity LPV modeling of real-world applications and refine uncertain aspects of these models efficiently with the help of measured data, (2) incorporate control objectives directly into modeling and develop model reduction approaches for control, and (3) treat modeling & control synthesis as a unified, closed-loop system synthesis approach directly oriented towards the underlying NL/TV system. Furthermore, due to the increasingly cyber-physical nature of applications, (4) control synthesis is needed in a plug & play fashion, where if sub-systems are modified or exchanged, the control design and the model of the whole system are only incrementally updated. This project aims to surmount Challenges (1)-(4) by establishing an innovative revolution of the LPV framework, supported by a software suite and extensive empirical studies on real-world industrial applications, with the potential to ensure a leading role for the EU in technological innovation in the high-impact industrial sectors (i)-(iii).
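As a minimal illustration of what an LPV model is (a sketch with made-up matrices, not taken from the project): the state equation stays linear, but its matrices depend on a measurable scheduling parameter p that absorbs the nonlinear/time-varying behavior.

```python
import numpy as np

# Sketch of an LPV state-space model x[k+1] = A(p[k]) x[k] + B u[k].
# The matrices below are hypothetical, chosen only to show the structure:
# A interpolates between two local linear behaviours as p sweeps [0, 1].

def A(p):
    # Parameter-dependent state matrix (stable for all p in [0, 1])
    return np.array([[0.9 - 0.4 * p, 0.1],
                     [0.0,           0.5 + 0.3 * p]])

B = np.array([[0.0], [1.0]])

x = np.array([[1.0], [0.0]])          # initial state
for k in range(50):
    p = 0.5 * (1 + np.sin(0.1 * k))   # scheduling signal varying in [0, 1]
    x = A(p) @ x + B * 0.0            # autonomous response (input u = 0)

print(np.linalg.norm(x))              # the state decays under this scheduling
```

Controller synthesis in the LPV framework then amounts to designing a gain that is itself scheduled on p, which is what makes the approach attractive for the nonlinear plants listed above.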
Max ERC Funding
1 493 561 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym BinCosmos
Project The Impact of Massive Binaries Through Cosmic Time
Researcher (PI) Selma DE MINK
Host Institution (HI) UNIVERSITEIT VAN AMSTERDAM
Call Details Starting Grant (StG), PE9, ERC-2016-STG
Summary Massive stars play many key roles in Astrophysics. As COSMIC ENGINES they transformed the pristine Universe left after the Big Bang into our modern Universe. We use massive stars, their explosions and products as COSMIC PROBES to study the conditions in the distant Universe and the extreme physics inaccessible on Earth. Models of massive stars are thus widely applied. A central common assumption is that massive stars are non-rotating single objects, in stark contrast with new data. Recent studies show that the majority (70% according to our data) will experience severe interaction with a companion (Sana, de Mink et al., Science 2012).
I propose to conduct the most ambitious and extensive exploration to date of the effects of binarity and rotation on the lives and fates of massive stars to (I) transform our understanding of the complex physical processes and how they operate in the vast parameter space and (II) explore the cosmological implications after calibrating and verifying the models. To achieve this ambitious objective I will use an innovative computational approach that combines the strength of two highly complementary codes and seek direct confrontation with observations to overcome the computational challenges that inhibited previous work.
This timely project will provide the urgent theory framework needed for interpretation and guiding of observing programs with the new facilities (JWST, LSST, aLIGO/VIRGO). Public release of the model grids and code will ensure wide impact of this project. I am in the unique position to successfully lead this project because of my (i) extensive experience modeling the complex physical processes, (ii) leading role in introducing large statistical simulations in the massive star community and (iii) direct involvement in surveys that will be used in this project.
Max ERC Funding
1 926 634 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym BLOCKCHAINSOCIETY
Project The Disrupted Society: mapping the societal effects of blockchain technology diffusion
Researcher (PI) Balazs BODO
Host Institution (HI) UNIVERSITEIT VAN AMSTERDAM
Call Details Starting Grant (StG), SH3, ERC-2017-STG
Summary Recent advances in cryptography yielded blockchain technology, which enables a radically new and decentralized method of maintaining authoritative records, without the need for trusted intermediaries. Bitcoin, a cryptocurrency blockchain application, has already demonstrated that it is possible to operate a purely cryptography-based, global, distributed, decentralized, anonymous financial network, independent from central and commercial banks, regulators and the state.
The same technology is now being applied to other social domains (e.g. public registries of ownership and deeds, voting systems, the internet domain name registry). But research on the societal impact of blockchain innovation is scant, and we cannot properly assess its risks and promises. In addition, crucial knowledge is missing on how blockchain technologies can and should be regulated by law.
The BlockchainSociety project focuses on three research questions. (1) What internal factors contribute to the success of a blockchain application? (2) How does society adopt blockchain? (3) How to regulate blockchain? It breaks new ground as it (1) maps the most important blockchain projects, their governance, and assesses their disruptive potential; (2) documents and analyses the social diffusion of the technology, and builds scenarios about the potential impact of blockchain diffusion; and (3) it creates an inventory of emerging policy responses, compares and assesses policy tools in terms of efficiency and impact. The project will (1) build the conceptual and methodological bridges between information law, the study of the self-governance of technological systems via Science and Technology Studies, and the study of collective control efforts of complex socio-technological assemblages via Internet Governance studies; (2) address the most pressing blockchain-specific regulatory challenges via the analysis of emerging policies, and the development of new proposals.
Max ERC Funding
1 499 631 €
Duration
Start date: 2018-01-01, End date: 2022-12-31
Project acronym BSP
Project Belief Systems Project
Researcher (PI) Mark BRANDT
Host Institution (HI) STICHTING KATHOLIEKE UNIVERSITEIT BRABANT
Call Details Starting Grant (StG), SH3, ERC-2017-STG
Summary Belief systems research is vital for understanding democratic politics, extremism, and political decision-making. What is the basic structure of belief systems? Clear answers to this fundamental question are not forthcoming. This is due to flaws in the conceptualization of belief systems. The state-of-the-art treats a belief system as a theoretical latent variable that causes people’s responses on attitudes and values relevant to the belief system. This approach cannot assess a belief system because it cannot assess the network of connections between the beliefs (attitudes and values) that make up the system; it collapses across them and the interrelationships are lost.
The Belief Systems Project conceptualizes belief systems as systems of interconnecting attitudes and values. I conceptualize attitudes and values as interactive nodes in a network that are analysed with network analyses. With these conceptual and empirical tools, I can understand the structure and dynamics of a belief system and avoid theoretical pitfalls common in belief system assessments. This project will move belief systems research beyond the state-of-the-art in four ways by:
1. Mapping the structure of systems of attitudes and values, something that is not possible using current methods.
2. Answering classic questions about central concepts and clustering of belief systems.
3. Modeling within-person belief systems and their variations, so that I can make accurate predictions about partisan motivated reasoning.
4. Testing how external and internal pressures (e.g., feelings of threat) change the underlying structure and dynamics of belief systems.
Using survey data from around the world, longitudinal panel studies, intensive longitudinal designs, experiments, and text analyses, I will triangulate on the structure of political belief systems over time, between countries, and within individuals.
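The network view sketched in points 1-4 can be illustrated with a toy partial-correlation network (synthetic data; the attitude labels are invented for illustration): attitudes are nodes, and edge weights are partial correlations read off the precision (inverse covariance) matrix.

```python
import numpy as np

# Toy belief-system network: attitudes/values as nodes, edges weighted by
# partial correlations. Data are synthetic; the two latent "clusters"
# (economic, cultural) and item names are hypothetical, for illustration only.
rng = np.random.default_rng(0)
n = 1000
latent_econ = rng.normal(size=n)
latent_cult = rng.normal(size=n)
data = np.column_stack([
    latent_econ + 0.5 * rng.normal(size=n),   # "taxes"
    latent_econ + 0.5 * rng.normal(size=n),   # "welfare"
    latent_cult + 0.5 * rng.normal(size=n),   # "immigration"
    latent_cult + 0.5 * rng.normal(size=n),   # "tradition"
])

# Partial correlations: rho_ij = -P_ij / sqrt(P_ii * P_jj), P = precision matrix
prec = np.linalg.inv(np.cov(data, rowvar=False))
d = np.sqrt(np.diag(prec))
partial = -prec / np.outer(d, d)
np.fill_diagonal(partial, 1.0)

# Within-cluster edges come out strong, between-cluster edges near zero,
# recovering the system's structure without positing a single latent variable.
print(np.round(partial, 2))
```

Unlike the latent-variable approach criticized above, the edge weights here are the object of study: centrality and clustering of nodes can be computed directly from this matrix.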
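The network conceptualization described in this summary can be illustrated with a minimal sketch (this is not the project's code; the data and item names below are invented): attitude and value items become nodes, pairwise correlations become weighted edges, and a node-strength measure flags the "central concepts" the project aims to identify.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical survey: 200 respondents x 4 attitude/value items.
# Items 0-2 share a latent factor; item 3 is unrelated.
latent = rng.normal(size=200)
responses = np.column_stack([
    latent + rng.normal(scale=0.5, size=200),  # attitude A
    latent + rng.normal(scale=0.5, size=200),  # attitude B
    latent + rng.normal(scale=0.5, size=200),  # value C
    rng.normal(size=200),                      # unrelated value D
])

# Edge weights = absolute pairwise correlations; no self-loops.
corr = np.abs(np.corrcoef(responses, rowvar=False))
np.fill_diagonal(corr, 0.0)

# Node "strength" (weighted degree): central items connect
# strongly to many other items in the belief network.
strength = corr.sum(axis=1)
central_item = int(np.argmax(strength))
print(strength.round(2), "most central item:", central_item)
```

In this toy network the latent-factor items emerge as central and the unrelated item as peripheral; the project's actual analyses use richer network models, but the node-and-edge representation is the same.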
Max ERC Funding
1 496 944 €
Duration
Start date: 2018-07-01, End date: 2023-06-30
Project acronym CAPE
Project Ghosts from the past: Consequences of Adolescent Peer Experiences across social contexts and generations
Researcher (PI) Tina KRETSCHMER
Host Institution (HI) RIJKSUNIVERSITEIT GRONINGEN
Call Details Starting Grant (StG), SH3, ERC-2017-STG
Summary Positive peer experiences are crucial for young people’s health and wellbeing. Accordingly, multiple studies (including my own) have described long-term negative psychological and behavioral consequences when adolescents’ peer relationships are dysfunctional. Paradoxically, knowledge of the adult social consequences of adolescent peer experiences (relationships with others a decade later) is much less extensive. Informed by social learning and attachment theory, I tackle this gap and investigate whether and how peer experiences are transmitted to other social contexts, and intergenerationally, i.e., passed on to the next generation. My aim is to shed light on how the “ghosts from peer past” affect young adults’ relationships and their children. To this end, I examine longitudinal links between adolescent peer and young adult close relationships and test whether parents’ peer experiences affect offspring’s peer experiences. Psychological functioning, parenting, temperament, genetic, and epigenetic transmission mechanisms are examined separately and in interplay, which 1) goes far beyond the current state-of-the-art in social development research, and 2) significantly broadens my biosocially oriented work on genetic effects in the peer context. My plans utilize data from the TRAILS (Tracking Adolescents’ Individual Lives’ Survey) cohort that has been followed from age 11 to 26. To study intergenerational transmission, the TRAILS NEXT sample of participants with children is substantially extended. This project uniquely studies adult social consequences of peer experiences and, at the same time, follows children’s first steps into the peer world. The intergenerational approach and provision for environmental, genetic, and epigenetic mediation put this project at the forefront of developmental research and equip it with the potential to generate the knowledge needed to chase away the ghosts from the peer past.
Max ERC Funding
1 464 846 €
Duration
Start date: 2018-02-01, End date: 2023-01-31
Project acronym CDMAN
Project Control of Spatially Distributed Complex Multi-Agent Networks
Researcher (PI) Ming Cao
Host Institution (HI) RIJKSUNIVERSITEIT GRONINGEN
Call Details Starting Grant (StG), PE7, ERC-2012-StG_20111012
Summary "Spatially distributed multi-agent networks have been used successfully to model a wide range of natural, social and engineered complex systems, such as animal groups, online communities and electric power grids. In various contexts, it is crucial to introduce control actions into such networks to either achieve desired collective dynamics or test the understanding of the systems’ behavior. However, controlling such systems is extremely challenging due to agents’ complicated sensing, communication and control interactions that are distributed in space. Systematic methodologies to attack this challenge are in urgent need, especially when vast efforts are being made in multiple disciplines to apply the model of complex multi-agent networks.
The goal of the project is twofold. First, understand whether a complex multi-agent network can be controlled effectively when the agents can only sense and communicate locally. Second, provide methodologies to implement distributed control in typical spatially distributed complex multi-agent networks. The project requires integrated skills since both rigorous theoretical analysis and novel empirical explorations are necessary.
The research methods that I plan to adopt have two distinguishing features. First, I use tools from algebraic graph theory and complex network theory to investigate the impact of network topologies on the systems’ controller performances characterized by mathematical control theory. Second, I utilize a homemade robotic-fish testbed to implement various multi-agent control algorithms. The unique combination of theoretical and empirical studies is expected to lead to breakthroughs in developing an integrated set of principles and techniques to control effectively spatially distributed multi-agent networks. The expected results will make original contributions to control engineering and robotics, and inspire innovative research methods in theoretical biology and theoretical sociology."
Max ERC Funding
1 495 444 €
Duration
Start date: 2013-01-01, End date: 2017-12-31