Project acronym ANTINEUTRINONOVA
Project Probing Fundamental Physics with Antineutrinos at the NOvA Experiment
Researcher (PI) Jeffrey Hartnell
Host Institution (HI) THE UNIVERSITY OF SUSSEX
Country United Kingdom
Call Details Starting Grant (StG), PE2, ERC-2012-StG_20111012
Summary This proposal addresses major questions in particle physics that are at the forefront of experimental and theoretical physics research today. The results offered would have far-reaching implications in other fields such as cosmology and could help answer some of the big questions, such as why the universe contains so much more matter than antimatter. The research objectives of this proposal are to (i) make world-leading tests of CPT symmetry and (ii) discover the neutrino mass hierarchy and search for indications of leptonic CP violation.
The NOvA long-baseline neutrino oscillation experiment will use a novel "totally active scintillator design" for the detector technology and will be exposed to the world's highest-power neutrino beam. Building on the first direct observation of muon antineutrino disappearance (made by a group founded and led by the PI at the MINOS experiment), tests of CPT symmetry will be performed by looking for differences in the mass-squared splittings and mixing angles between neutrinos and antineutrinos. The potential to discover the mass hierarchy is unique to NOvA on the timescale of this proposal due to the long 810 km baseline and the well-measured beam of neutrinos and antineutrinos.
This proposal addresses several key challenges in a long-baseline neutrino oscillation experiment with the following tasks: (i) development of a new approach to event energy reconstruction that is expected to have widespread applicability for future neutrino experiments; (ii) undertaking a comprehensive calibration project, exploiting a novel technique developed by the PI, that will be essential to achieving the physics goals; (iii) development of sophisticated statistical analyses.
The results promised in this proposal surpass the sensitivity to antineutrino oscillation parameters of current first-generation experiments by at least an order of magnitude, offering wide scope for profound discoveries with implications across disciplines.
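The mass-squared splittings and mixing angles referred to above enter the standard two-flavour oscillation formula. As a rough, textbook-level illustration only (the parameter values below are generic, not the proposal's), the muon (anti)neutrino survival probability at an NOvA-like baseline can be sketched as:

```python
import math

def numu_survival(sin2_2theta, dm2_eV2, L_km, E_GeV):
    """Two-flavour survival probability:
    P = 1 - sin^2(2*theta) * sin^2(1.27 * dm2[eV^2] * L[km] / E[GeV])."""
    return 1.0 - sin2_2theta * math.sin(1.27 * dm2_eV2 * L_km / E_GeV) ** 2

# Illustrative values near the atmospheric splitting, at NOvA's 810 km baseline
p = numu_survival(sin2_2theta=1.0, dm2_eV2=2.4e-3, L_km=810.0, E_GeV=2.0)
```

A difference between the (dm2, sin2_2theta) values fitted separately to neutrino and antineutrino data would be the kind of CPT-violating signature such an experiment targets.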
Max ERC Funding
1 415 848 €
Duration
Start date: 2012-10-01, End date: 2018-09-30
Project acronym BAYES OR BUST!
Project Bayes or Bust: Sensible Hypothesis Tests for Social Scientists
Researcher (PI) Eric-Jan Wagenmakers
Host Institution (HI) UNIVERSITEIT VAN AMSTERDAM
Country Netherlands
Call Details Starting Grant (StG), SH4, ERC-2011-StG_20101124
Summary The goal of this proposal is to develop and promote Bayesian hypothesis tests for social scientists. By and large, social scientists have ignored the Bayesian revolution in statistics, and, consequently, most social scientists still assess the veracity of experimental effects using the same methodology that was used by their advisors and the advisors before them. This state of affairs is undesirable: social scientists conduct groundbreaking, innovative research only to analyze their results using methods that are old-fashioned or even inappropriate. This imbalance between the science and the statistics has gradually increased the pressure on the field to change the way inferences are drawn from their data. However, three requirements need to be fulfilled before social scientists are ready to adopt Bayesian tests of hypotheses. First, the Bayesian tests need to be developed for problems that social scientists work with on a regular basis; second, the Bayesian tests need to be default or objective; and, third, the Bayesian tests need to be available in a user-friendly computer program. This proposal seeks to make major progress on all three fronts.
Concretely, the projects in this proposal build on recent developments in the field of statistics and use the default Jeffreys-Zellner-Siow priors to compute Bayesian hypothesis tests for regression, correlation, the t-test, and different versions of analysis of variance (ANOVA). A similar approach will be used to develop Bayesian hypothesis tests for logistic regression and the analysis of contingency tables, as well as for popular latent-process methods such as factor analysis and structural equation modeling. We aim to implement the various tests in a new computer program, Bayes-SPSS, with a look and feel similar to that of the frequentist spreadsheet program SPSS (i.e., the Statistical Package for the Social Sciences). Together, these projects may help revolutionize the way social scientists analyze their data.
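The default Jeffreys-Zellner-Siow test mentioned above reduces, for a one-sample t-test, to a one-dimensional integral over Zellner's g (Rouder et al., 2009). The sketch below illustrates that published formula with unit prior scale; it is not the proposal's software, and it uses a plain trapezoid rule to stay dependency-free:

```python
import math

def jzs_bf10(t, n, grid=20000):
    """JZS Bayes factor BF10 for a one-sample t-test (Cauchy prior, scale 1),
    following Rouder et al. (2009): the marginal likelihood under H1,
    integrated over Zellner's g, divided by the marginal under H0."""
    nu = n - 1
    null = (1.0 + t * t / nu) ** (-(nu + 1) / 2.0)

    def integrand(g):
        return ((1.0 + n * g) ** -0.5
                * (1.0 + t * t / ((1.0 + n * g) * nu)) ** (-(nu + 1) / 2.0)
                * (2.0 * math.pi) ** -0.5 * g ** -1.5
                * math.exp(-1.0 / (2.0 * g)))

    # Trapezoid rule on a log-spaced grid; the integrand vanishes at both ends.
    lo, hi = math.log(1e-6), math.log(1e6)
    xs = [math.exp(lo + (hi - lo) * i / grid) for i in range(grid + 1)]
    alt = sum((integrand(a) + integrand(b)) / 2.0 * (b - a)
              for a, b in zip(xs, xs[1:]))
    return alt / null

bf_large_effect = jzs_bf10(t=3.5, n=30)  # BF10 > 1: evidence for H1
bf_null_effect = jzs_bf10(t=0.2, n=30)   # BF10 < 1: evidence for H0
```

Unlike a p-value, this Bayes factor can quantify evidence in favour of the null, which is central to the proposal's case for default Bayesian tests.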
Max ERC Funding
1 498 286 €
Duration
Start date: 2012-05-01, End date: 2017-04-30
Project acronym CASAA
Project Catalytic asymmetric synthesis of amines and amides
Researcher (PI) Jeffrey William Bode
Host Institution (HI) EIDGENOESSISCHE TECHNISCHE HOCHSCHULE ZUERICH
Country Switzerland
Call Details Starting Grant (StG), PE5, ERC-2012-StG_20111012
Summary Amines and their acylated derivatives – amides – are among the most common chemical functional groups found in modern pharmaceuticals. Despite this, there are few methods for their efficient, environmentally sustainable production in enantiomerically pure form. This proposal seeks to provide new catalytic chemical methods, including 1) the catalytic, enantioselective synthesis of peptides and 2) catalytic methods for the preparation of enantiopure nitrogen-containing heterocycles. The proposed work features innovative chemistry, including novel reaction mechanisms and catalysts. These methods have far-reaching applications for the sustainable production of valuable compounds as well as for fundamental science.
Max ERC Funding
1 500 000 €
Duration
Start date: 2012-12-01, End date: 2017-11-30
Project acronym ESSOG
Project Extracting science from surveys of our Galaxy
Researcher (PI) James Jeffrey Binney
Host Institution (HI) THE CHANCELLOR, MASTERS AND SCHOLARS OF THE UNIVERSITY OF OXFORD
Country United Kingdom
Call Details Advanced Grant (AdG), PE9, ERC-2012-ADG_20120216
Summary The goal is to put in place the infrastructure required to extract the promised science from the large surveys of our Galaxy that are underway and will culminate in ESA's Cornerstone Mission Gaia. Dynamical models are fundamental to this process because surveys are heavily biased by the Sun's location in the Galaxy. Novel dynamical models will be built and novel methods of fitting them to the data developed. With their help we will be able to constrain the distribution of dark matter in the Galaxy. By modelling the chemical and dynamical evolution of the Galaxy we expect to be able to infer much about how the Galaxy was assembled, and thus test the prevailing cosmological paradigm. During the grant period we will apply our tools to ground-based surveys, but the first version of the Gaia Catalogue will become available at the end of the grant period, and our goal is to have everything ready and tested for its prompt exploitation.
Max ERC Funding
1 954 460 €
Duration
Start date: 2013-04-01, End date: 2018-03-31
Project acronym EURECA
Project Eukaryotic Regulated RNA Catabolism
Researcher (PI) Torben Heick Jensen
Host Institution (HI) AARHUS UNIVERSITET
Country Denmark
Call Details Advanced Grant (AdG), LS1, ERC-2013-ADG
Summary Regulation and fidelity of gene expression are fundamental to the differentiation and maintenance of all living organisms. While attention has historically been focused on the process of transcriptional activation, we predict that RNA turnover pathways are equally important for gene expression regulation. This has been implied for selected protein-coding RNAs (mRNAs) but is virtually unexplored for non-protein-coding RNAs (ncRNAs).
The intention of the EURECA proposal is to establish cutting-edge research to characterize mammalian nuclear RNA turnover: its factor utility, substrate specificity and regulatory capacity. We foresee that RNA turnover is at the core of gene expression regulation, forming intricate connections to RNA-productive systems and thus being centrally placed to determine RNA fate. EURECA seeks to dramatically improve our understanding of the cellular decision processes impacting RNA levels and to establish models for how regulated RNA turnover helps control key biological processes.
The realization that the number of ncRNA-producing genes was previously grossly underestimated foretells that ncRNA regulation will impact most aspects of cell biology. Consistently, aberrant ncRNA levels correlate with human disease phenotypes, and RNA turnover complexes are linked to disease biology. Still, solid models for how ncRNA turnover regulates biological processes in higher eukaryotes are not available. Moreover, which ncRNAs retain function and which are merely transcriptional by-products remains a major challenge to sort out. The circumstances and kinetics of ncRNA turnover are therefore important to delineate, as these will ultimately relate to the likelihood of molecular function. A fundamental challenge here is also to discern which protein complements of non-coding ribonucleoprotein particles (ncRNPs) are (in)compatible with function. Balancing single-transcript/factor analysis with high-throughput methodology, EURECA will address these questions.
Max ERC Funding
2 497 960 €
Duration
Start date: 2014-04-01, End date: 2019-03-31
Project acronym GENEWELL
Project Genetics and epigenetics of animal welfare
Researcher (PI) Per Ole Stokmann Jensen
Host Institution (HI) LINKOPINGS UNIVERSITET
Country Sweden
Call Details Advanced Grant (AdG), LS9, ERC-2012-ADG_20120314
Summary Animal welfare is a topic of highest societal and scientific priority. Here, I propose to use genomic and epigenetic tools to provide a new perspective on the biology of animal welfare. This will reveal mechanisms involved in modulating stress responses. Groundbreaking aspects include new insights into how environmental conditions shape the orchestration of the genome by means of epigenetic mechanisms, and how this in turn modulates coping patterns of animals. The flexible epigenome comprises the interface between the environment and the genome. It is involved in both short- and long-term, including transgenerational, adaptations of animals. Hence, populations may adapt to environmental conditions over generations, using epigenetic mechanisms. The project will primarily be based on chickens, but will also be extended to a novel species, the dog. We will generate congenic chicken strains, where interesting alleles and epialleles will be fixed against a common background of either RJF or domestic genotypes. In these, we will apply a broad phenotyping strategy, to characterize the effects on different welfare relevant behaviors. Furthermore, we will characterize how environmental stress affects the epigenome of birds, and tissue samples from more than 500 birds from an intercross between RJF and White Leghorn layers will be used to perform an extensive meth-QTL-analysis. This will reveal environmental and genetic mechanisms affecting gene-specific methylation. The dog is another highly interesting species in the context of behavior genetics, because of its high inter-breed variation in behavior, and its compact and sequenced genome. We will set up a large-scale F2-intercross experiment and phenotype about 400 dogs in standardized behavioral tests. All individuals will be genotyped on about 1000 genetic markers, and this will be used for performing an extensive QTL-analysis in order to find new loci and alleles associated with personalities and coping patterns.
Max ERC Funding
2 499 828 €
Duration
Start date: 2013-03-01, End date: 2018-02-28
Project acronym HBAR12
Project Spectroscopy of Trapped Antihydrogen
Researcher (PI) Jeffrey Scott Hangst
Host Institution (HI) AARHUS UNIVERSITET
Country Denmark
Call Details Advanced Grant (AdG), PE2, ERC-2012-ADG_20120216
Summary Antihydrogen is the only stable, neutral antimatter system available for laboratory study. Recently, the ALPHA Collaboration at CERN has succeeded in synthesizing and trapping antihydrogen atoms, storing them for up to 1000 s, and performing the first resonant spectroscopy, using microwaves, on trapped antihydrogen. This last, historic result paves the way for precision microwave and laser spectroscopic measurements using small numbers of trapped antihydrogen atoms. Because of the breakthroughs made in our collaboration, it is now possible, for the first time, to design antimatter spectroscopic experiments that have achievable milestones of precision. These measurements require a next-generation apparatus, known as ALPHA-2, which is the subject of this proposal. The items sought are hardware components and radiation sources to help us to test CPT (charge conjugation, parity, time reversal) symmetry invariance by comparing the spectrum of antihydrogen to that of hydrogen. More generally, we will address the very fundamental question: do matter and antimatter obey the same laws of physics? The Standard Model says that they must, but mystery continues to cloud our understanding of antimatter - as evidenced by the unexplained baryon asymmetry in the universe. ALPHA's experiments offer a unique, high precision, model-independent view into the internal workings of antimatter.
Max ERC Funding
2 136 888 €
Duration
Start date: 2013-05-01, End date: 2018-12-31
Project acronym HETMAT
Project Heterogeneity That Matters for Trade and Welfare
Researcher (PI) Thierry Mayer
Host Institution (HI) FONDATION NATIONALE DES SCIENCES POLITIQUES
Country France
Call Details Starting Grant (StG), SH1, ERC-2012-StG_20111124
Summary Accounting for firms' heterogeneity in trade patterns is probably one of the key innovations of international trade that occurred during the last decade. The impact of initial papers such as Melitz (2003) and Bernard and Jensen (1999) is so large in the field that it is considered to have introduced a new paradigm. Apart from providing a convincing framework for a set of empirical facts, the main motivation of this literature was that there are new gains to be expected from trade liberalization. Those come from a selection process, raising aggregate productivity through the reallocation of output among heterogeneous firms. It initially seemed that the information requirements for trade policy evaluations had become much more demanding, in particular requiring detailed micro data. However, the recent work of Arkolakis et al. (2011) suggests that two aggregate ``sufficient statistics'' may be all that is needed to compute the welfare changes associated with trade liberalization. More, they show that those statistics are the same when evaluating welfare changes in representative firm models. The project has three parts. The first one starts by showing that the sufficient statistics approach relies crucially on a specific distributional assumption on heterogeneity, the Pareto distribution. When distributed non-Pareto, heterogeneity does matter, i.e. aggregate statistics are not sufficient to evaluate welfare changes and predict trade patterns. The second part of the project specifies which type of firm-level heterogeneity matters. It shows how to identify which sectors are characterized by ``productivity sorting'' and in which ones ``quality sorting'' is more relevant. Extending the analysis to multiple product firms, the third part shows that heterogeneity inside the firm also matters for welfare changes following trade shocks. It considers how the change in the product mix of the firm following trade liberalization alters the measured productivity of the firm.
Max ERC Funding
1 119 040 €
Duration
Start date: 2012-11-01, End date: 2018-07-31
Project acronym INNODYN
Project Integrated Analysis & Design in Nonlinear Dynamics
Researcher (PI) Jakob Soendergaard Jensen
Host Institution (HI) DANMARKS TEKNISKE UNIVERSITET
Country Denmark
Call Details Starting Grant (StG), PE8, ERC-2011-StG_20101014
Summary Imagine lighter and more fuel economic cars with improved crashworthiness that help save lives, aircrafts and wind-turbine blades with significant weight reductions that lead to large savings in material costs and environmental impact, and light but efficient armour that helps to protect against potentially deadly blasts. These are the future perspectives with a new generation of advanced structures and micro-structured materials.
The goal of INNODYN is to bring current design procedures for structures and materials a significant step forward by developing new, efficient procedures for integrated analysis and design that take nonlinear dynamic performance into account. The assessment of nonlinear dynamic effects is essential for fully exploiting the vast potential of structural and material capabilities, but a focused endeavour is needed to develop the methodology required to reach these ambitious goals.
INNODYN will in two interacting work-packages develop the necessary computational analysis and design tools using
1) reduced-order models (WP1) that enable optimization of the overall topology of structures, which is today hindered by excessive computational costs when dealing with nonlinear dynamic systems
2) multi-scale models (WP2) that facilitate topological design of the material microstructure, including essential nonlinear geometrical effects currently not included in state-of-the-art methods.
The work will be carried out by a research group with two PhD-students and a postdoc, led by a PI with a track-record for original ground-breaking research in analysis and optimization of linear and nonlinear dynamics and hosted by one of the world's leading research groups on topology optimization, the TOPOPT group at the Technical University of Denmark.
Max ERC Funding
823 992 €
Duration
Start date: 2012-02-01, End date: 2016-01-31
Project acronym M and M
Project Generalization in Mind and Machine
Researcher (PI) Jeffrey BOWERS
Host Institution (HI) UNIVERSITY OF BRISTOL
Country United Kingdom
Call Details Advanced Grant (AdG), SH4, ERC-2016-ADG
Summary Is the human mind a symbolic computational device? This issue was at the core of Chomsky’s critique of Skinner in the 1960s, and motivated the debates regarding Parallel Distributed Processing models developed in the 1980s. The recent successes of “deep” networks make this issue topical for psychology and neuroscience, and raise the question of whether symbols are needed for artificial intelligence more generally.
One of the innovations of the current project is to identify simple empirical phenomena that will serve as a critical test-bed for both symbolic and non-symbolic neural networks. In order to make substantial progress on this issue, a series of empirical and computational investigations is organised as follows. First, studies focus on tasks that, according to proponents of symbolic systems, require symbols for the sake of generalisation. Accordingly, if non-symbolic networks succeed, it would undermine one of the main motivations for symbolic systems. Second, studies focus on generalisation in tasks in which human performance is well characterised. Accordingly, the research will provide important constraints for theories of cognition across a range of domains, including vision, memory, and reasoning. Third, studies develop new learning algorithms designed to make symbolic systems biologically plausible. One of the reasons why symbolic networks are often dismissed is the claim that they are not as biologically plausible as non-symbolic models. This last ambition is the most high-risk but also potentially the most important: introducing new computational principles may fundamentally advance our understanding of how the brain learns and computes, and furthermore, these principles may increase the computational powers of networks in ways that are important for engineering and artificial intelligence.
Max ERC Funding
2 495 578 €
Duration
Start date: 2017-09-01, End date: 2022-08-31