Project acronym COMPOSES
Project Compositional Operations in Semantic Space
Researcher (PI) Marco Baroni
Host Institution (HI) UNIVERSITA DEGLI STUDI DI TRENTO
Call Details Starting Grant (StG), SH4, ERC-2011-StG_20101124
Summary The ability to construct new meanings by combining words into larger constituents is one of the fundamental and peculiarly human characteristics of language. Systems that induce the meaning and combinatorial properties of linguistic symbols from data are highly desirable both from a theoretical perspective (modeling a core aspect of cognition) and for practical purposes (supporting human-computer interaction). COMPOSES tackles the meaning induction and composition problem from a new perspective that brings together corpus-based distributional semantics (which is very successful at inducing the meaning of single content words, but ignores functional elements and compositionality) and formal semantics (which focuses on functional elements and composition, but largely ignores lexical aspects of meaning and lacks methods to learn the proposed structures from data). As in distributional semantics, we represent some content words (such as nouns) by vectors recording their corpus contexts. Implementing ideas from formal semantics instead, we represent functional elements (such as determiners) by functions mapping expressions of one type onto composite expressions of the same or other types. These composition functions are induced from corpus data by statistical learning of mappings from observed context vectors of input arguments to observed context vectors of composite structures. We model a number of compositional processes in this way, developing a coherent fragment of the semantics of English in a data-driven, large-scale fashion. Given the novelty of the approach, we also propose new evaluation frameworks: On the one hand, we take inspiration from cognitive science and experimental linguistics to design elicitation methods measuring the perceived similarity and plausibility of sentences. On the other, specialized entailment tests will assess the semantic inference properties of our corpus-induced system.
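A minimal sketch of the composition-learning idea described above, assuming (purely for illustration) random stand-in vectors in place of real corpus counts: one fixed adjective is represented as a matrix, learned by ridge regression, that maps noun context vectors to the observed context vectors of adjective-noun phrases. Dimensions, the regularisation parameter and all data below are invented.

import numpy as np

# Hypothetical toy data: rows are context vectors harvested from a corpus.
# N[i] = vector of the i-th noun, P[i] = observed vector of "ADJ + noun_i"
# for one fixed adjective treated as a function (e.g. "red").
rng = np.random.default_rng(0)
N = rng.normal(size=(500, 300))               # 500 nouns, 300-dim context vectors
A_true = rng.normal(size=(300, 300)) / 300
P = N @ A_true.T + 0.01 * rng.normal(size=(500, 300))  # stand-in for phrase vectors

# Ridge regression: A minimises ||N A^T - P||^2 + lam ||A||^2, i.e. the adjective
# is represented as a linear map from noun space to phrase space.
lam = 1.0
A = np.linalg.solve(N.T @ N + lam * np.eye(N.shape[1]), N.T @ P).T

def compose(noun_vec):
    """Apply the learned adjective function to a noun vector."""
    return A @ noun_vec

print(compose(N[0])[:5])   # composed vector for the first noun (first 5 dims)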
Max ERC Funding
1 117 636 €
Duration
Start date: 2011-11-01, End date: 2016-10-31
Project acronym DECIDE
Project The impact of DEmographic Changes on Infectious DisEases transmission and control in middle/low income countries
Researcher (PI) Alessia Melegaro
Host Institution (HI) UNIVERSITA COMMERCIALE LUIGI BOCCONI
Call Details Starting Grant (StG), SH3, ERC-2011-StG_20101124
Summary Population structure, population change and social contact patterns are major determinants of the observed epidemiology of infectious diseases, including their health consequences. Demographic structure and the components of demographic dynamics are changing over time and differ substantially within countries and, most critically, between countries. However, some of the overall consequences of demographic changes remain unclear, even though urbanisation and fertility decline will certainly have a profound impact on social structures, family composition and, as a consequence, on disease spread and on the identification of effective public health measures.
DECIDE will explore the following questions:
1. What are the major short- and medium-term impacts of demographic changes on the patterns of infectious disease (morbidity and mortality)?
2. How are these demographic changes affecting contact patterns that are of fundamental importance to the spread of infectious diseases? Are there new and different modes of transmission within and between populations?
3. What are the implications of demographic changes for infection control strategies? What is the interplay between demographic changes and public health policies in shaping future trajectories of infectious diseases?
In order to answer these questions, DECIDE will use the following strategy: analyse harmonised demographic and health survey (DHS) data and health and demographic surveillance system (HDSS) data; develop new estimates of social contact patterns and other socio-demographic variables by collecting data from representative samples of both urban and rural settings in selected countries; develop a theoretical framework to predict the likely chains through which demographic change influences the burden of infectious diseases; and develop and parameterise mathematical population models for the transmission of infectious diseases to evaluate the impact of public health measures under changing demographic conditions.
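Purely as an illustration of the kind of transmission model referred to in the last step (not a DECIDE result), the sketch below runs a two-group SIR model in which a contact matrix couples an urban and a rural population; every number is an invented placeholder.

import numpy as np

# Illustrative two-group (urban/rural) SIR model with a contact matrix.
# All parameters are invented placeholders, not DECIDE estimates.
C = np.array([[12.0, 3.0],      # mean daily contacts: urban-urban, urban-rural
              [ 3.0, 7.0]])     #                      rural-urban, rural-rural
pop   = np.array([6e6, 4e6])    # group sizes
beta  = 0.03                    # transmission probability per contact
gamma = 1.0 / 7.0               # recovery rate (1 / infectious period in days)

S = pop.astype(float).copy()
I = np.array([100.0, 10.0])
R = np.zeros(2)
dt, days = 0.25, 365

for _ in range(int(days / dt)):
    # force of infection on group i: beta * sum_j C[i,j] * I_j / pop_j
    lam = beta * C @ (I / pop)
    new_inf = lam * S * dt
    new_rec = gamma * I * dt
    S, I, R = S - new_inf, I + new_inf - new_rec, R + new_rec

print("final attack rate per group:", R / pop)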
Max ERC Funding
1 210 000 €
Duration
Start date: 2012-04-01, End date: 2017-12-31
Project acronym DROEMU
Project DROPLETS AND EMULSIONS: DYNAMICS AND RHEOLOGY
Researcher (PI) Mauro Sbragaglia
Host Institution (HI) UNIVERSITA DEGLI STUDI DI ROMA TOR VERGATA
Call Details Starting Grant (StG), PE3, ERC-2011-StG_20101014
Summary The applications of micro- and nanofluidics are now numerous, including lab-on-chip systems based upon micro-manipulation of discrete droplets, emulsions of interest in the food and medical industries (drug delivery), analytical separation techniques for biomolecules such as proteins and DNA, and facile handling of mass-limited samples. The problems involved span diverse nano- and microstructures with a variety of lifetimes, ranging from atomistic scales (contact lines, thin films) through mesoscopic collective behaviour (emulsions, glassy, soft-jammed systems) to hydrodynamic spatio-temporal evolution (droplet and interface dynamics) with complex rheology and strong non-equilibrium properties. The interplay of the dynamics at the different scales involved still remains to be fully understood.
The fundamental research I address in this project aims to set up a unified framework for the characterization and modelling of interfaces in confined geometries by means of an innovative micro- and nanofluidic numerical platform.
The main challenging and ambitious questions I intend to address in my project are: How is the stability of micro- and nanodroplets affected by thermal gradients? Or by boundary corrugation and modulated wettability? Or by the complex rheological properties of the dispersed and/or continuous phases? How can these effects be tuned to design new, optimal devices for emulsion production? What are the rheological properties of these new soft materials? How does confinement in small structures change the bulk emulsion properties? What is the molecular-hydrodynamical mechanism at the origin of contact-line slippage? How can the fluid-particle interactions be realistically modelled on the molecular scale?
The strength of the project lies in an innovative and state-of-the-art numerical approach, based on mesoscopic Lattice Boltzmann Models, coupled to microscopic molecular physics, supported by theoretical modelling, lubrication theory and experimental validation.
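As a rough illustration of the mesoscopic Lattice Boltzmann approach mentioned above (and only of that: the multiphase forcing, wetting boundaries and non-Newtonian rheology the project actually targets are omitted), a minimal single-phase D2Q9 BGK update on a periodic box might look as follows; grid size, relaxation time and the initial density bump are arbitrary choices.

import numpy as np

# Minimal single-phase D2Q9 lattice Boltzmann (BGK) solver on a periodic box.
nx, ny, tau = 64, 64, 0.8
c = np.array([[0,0],[1,0],[0,1],[-1,0],[0,-1],[1,1],[-1,1],[-1,-1],[1,-1]])  # velocities
w = np.array([4/9] + [1/9]*4 + [1/36]*4)                                     # weights

f = np.ones((9, nx, ny)) * w[:, None, None]   # uniform rest fluid
f[:, nx // 2, ny // 2] *= 1.05                # small density bump (acoustic pulse)
mass0 = f.sum()

def equilibrium(rho, ux, uy):
    cu = c[:, 0, None, None] * ux + c[:, 1, None, None] * uy
    usq = ux**2 + uy**2
    return w[:, None, None] * rho * (1 + 3*cu + 4.5*cu**2 - 1.5*usq)

for step in range(100):
    rho = f.sum(axis=0)
    ux = (c[:, 0, None, None] * f).sum(axis=0) / rho
    uy = (c[:, 1, None, None] * f).sum(axis=0) / rho
    f += -(f - equilibrium(rho, ux, uy)) / tau            # BGK collision
    for k in range(9):                                     # streaming (periodic)
        f[k] = np.roll(np.roll(f[k], c[k, 0], axis=0), c[k, 1], axis=1)

print("mass conserved:", np.isclose(f.sum(), mass0))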
Max ERC Funding
1 170 924 €
Duration
Start date: 2011-12-01, End date: 2016-11-30
Project acronym DYNAMIC MODELS
Project Solving dynamic models: Theory and Applications
Researcher (PI) Felix Egbert Kübler
Host Institution (HI) UNIVERSITAT ZURICH
Call Details Starting Grant (StG), SH1, ERC-2011-StG_20101124
Summary The computation of equilibria in dynamic stochastic general equilibrium models with heterogeneous agents has become increasingly important in macroeconomics and public finance. For a given example economy, i.e. a given specification of preferences, technologies and market arrangements, these methods compute an (approximate) equilibrium and allow for quantitative statements about one equilibrium of the example economy. Through these so-called 'computational experiments' many economic insights can be obtained by analyzing quantitative features of realistically calibrated models. Unfortunately, economists often use ad hoc computational methods with poorly understood properties that produce approximate solutions of unknown quality.
The research project outlined in this proposal has three goals: building theoretical foundations for analyzing dynamic equilibrium models, developing efficient and stable algorithms for the computation of equilibria in large-scale models, and applying these algorithms to macroeconomic policy analysis.
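To make the problem class concrete, here is a toy sketch of one textbook solution method (value function iteration) for a single-agent stochastic growth model; all parameter values and grids are arbitrary, and the heterogeneous-agent economies targeted by the project require far more sophisticated algorithms than this.

import numpy as np

# Value function iteration for a stochastic growth model with log utility.
beta, alpha, delta = 0.96, 0.36, 0.08
z_grid = np.array([0.9, 1.1]); Pz = np.array([[0.9, 0.1], [0.1, 0.9]])
k_grid = np.linspace(0.5, 8.0, 200)

V = np.zeros((len(z_grid), len(k_grid)))
for it in range(2000):
    EV = Pz @ V                                     # expected continuation value
    V_new = np.empty_like(V)
    for iz, z in enumerate(z_grid):
        # consumption for every (k, k') pair; infeasible choices get -inf utility
        resources = z * k_grid[:, None] ** alpha + (1 - delta) * k_grid[:, None]
        cons = resources - k_grid[None, :]
        util = np.where(cons > 0, np.log(np.maximum(cons, 1e-12)), -np.inf)
        V_new[iz] = (util + beta * EV[iz][None, :]).max(axis=1)
    if np.max(np.abs(V_new - V)) < 1e-7:
        break
    V = V_new

print("converged after", it, "iterations")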
Max ERC Funding
1 114 800 €
Duration
Start date: 2011-10-01, End date: 2016-09-30
Project acronym DYNCORSYS
Project Real-time dynamics of correlated many-body systems
Researcher (PI) Philipp Werner
Host Institution (HI) UNIVERSITE DE FRIBOURG
Call Details Starting Grant (StG), PE3, ERC-2011-StG_20101014
Summary "Strongly correlated materials exhibit some of the most remarkable phenonomena found in condensed matter systems. They typically involve many active degrees of freedom (spin, charge, orbital), which leads to numerous competing states and complicated phase diagrams. A new perspective on correlated many-body systems is provided by the nonequilibrium dynamics, which is being explored in transport studies on nanostructures, pump-probe experiments on correlated solids, and in quench experiments on ultra-cold atomic gases.
An advanced theoretical framework for the study of correlated lattice models, which can be adapted to nonequilibrium situations, is dynamical mean field theory (DMFT). One aim of this proposal is to develop "nonequilibrium DMFT" into a powerful tool for the simulation of excitation and relaxation processes in interacting many-body systems. The big challenge in these simulations is the calculation of the real-time evolution of a quantum impurity model. Recently developed real-time impurity solvers have, however, opened the door to a wide range of applications. We will improve the efficiency and flexibility of these methods and develop complementary approaches, which will extend the accessible parameter regimes. This machinery will be used to study correlated lattice models under nonequilibrium conditions. The ultimate goal is to explore and qualitatively understand the nonequilibrium properties of "real" materials with active spin, charge, orbital and lattice degrees of freedom.
The ability to simulate the real-time dynamics of correlated many-body systems will be crucial for the interpretation of experiments and the discovery of correlation effects which manifest themselves only in the form of transient states. A proper understanding of the most basic nonequilibrium phenomena in correlated solids will help guide future experiments and hopefully lead to new technological applications such as ultra-fast switches or storage devices.
Max ERC Funding
1 493 178 €
Duration
Start date: 2012-02-01, End date: 2017-01-31
Project acronym EDEQS
Project ENTANGLING AND DISENTANGLING EXTENDED QUANTUM SYSTEMS IN AND OUT OF EQUILIBRIUM
Researcher (PI) Pasquale Calabrese
Host Institution (HI) SCUOLA INTERNAZIONALE SUPERIORE DI STUDI AVANZATI DI TRIESTE
Call Details Starting Grant (StG), PE2, ERC-2011-StG_20101014
Summary "It is nowadays well established that many-body quantum systems in one and two spatial dimensions exhibit unconventional collective behavior that gives rise to intriguing novel states of matter. Examples are topological states exhibiting nonabelian statistics in 2D and spin-charge separated metals and Mott insulators in 1D. An important focus of current research is to characterize both equilibrium and non-equilibrium dynamics of such systems. The latter has become experimentally accessible only during the last decade and constitutes one of the main frontiers of modern theoretical physics. In recent years it has become clear that entanglement is a useful concept for characterizing different states of matter as well as non-equilibrium time evolution.
One main aim of this proposal is to utilize entanglement measures to fully classify states of matter in low dimensional systems. This will be achieved by carrying out a systematic study of the entanglement of several disconnected regions in 1D quantum critical systems. In addition, entanglement measures will be used to benchmark the performance of numerical algorithms based on tensor network states (both in 1D and 2D) and identify the "optimal" algorithm for finding the ground state of a given strongly correlated many-body system.
The second main aim of this proposal is to utilize entanglement to identify the most important features of the non-equilibrium time evolution after a "quantum quench", with a view to solving exactly the quench dynamics in strongly interacting integrable models. A particular question we will address is which observables "thermalize", an issue of tremendous current experimental and theoretical interest. By combining analytic and numerical techniques we will then study the non-equilibrium dynamics of non-integrable models, in order to quantify the effects of integrability.
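As a toy illustration of the entanglement measures discussed above (and only a toy: the project concerns analytic results and several disconnected regions), the sketch below computes the half-chain von Neumann entanglement entropy of the ground state of a short critical transverse-field Ising chain by exact diagonalization; chain length and couplings are arbitrary choices.

import numpy as np

# Half-chain entanglement entropy of a small transverse-field Ising ground state.
L, h = 10, 1.0                                     # h = 1 is the critical point
sx = np.array([[0., 1.], [1., 0.]]); sz = np.array([[1., 0.], [0., -1.]])
I2 = np.eye(2)

def op_at(site_op, i):
    ops = [I2] * L; ops[i] = site_op
    out = ops[0]
    for o in ops[1:]:
        out = np.kron(out, o)
    return out

H = np.zeros((2**L, 2**L))
for i in range(L - 1):
    H -= op_at(sz, i) @ op_at(sz, i + 1)           # -sum_i sz_i sz_{i+1}
for i in range(L):
    H -= h * op_at(sx, i)                          # -h sum_i sx_i

evals, evecs = np.linalg.eigh(H)
psi = evecs[:, 0]                                  # ground state

# Schmidt decomposition across the middle bond: reshape the state into a
# (left block) x (right block) matrix and take its singular values.
M = psi.reshape(2**(L // 2), 2**(L - L // 2))
s = np.linalg.svd(M, compute_uv=False)
p = s**2; p = p[p > 1e-12]
print("half-chain entanglement entropy:", -np.sum(p * np.log(p)))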
Max ERC Funding
1 108 000 €
Duration
Start date: 2011-09-01, End date: 2016-08-31
Project acronym EINITE
Project "Economic Inequality across Italy and Europe, 1300-1800"
Researcher (PI) Guido Alfani
Host Institution (HI) UNIVERSITA COMMERCIALE LUIGI BOCCONI
Call Details Starting Grant (StG), SH6, ERC-2011-StG_20101124
Summary "The aim of EINITE is to clarify the dynamics of economic inequality in Europe from the late Middle Ages up until the beginning of the Industrial Revolution. Very little data about economic inequality during such an early period is available today. Apart from some studies focussed on single years and small areas (usually only one city or a village), the only European region which has been the object of a large research project is Holland.
The project will collect an extensive database about economic inequality, mainly of wealth (for which better documentation exists), focussing on Italy from a wider European perspective. Archival research will be concentrated on Italy where particularly good sources exist, but the Italian case will be placed in the varying European context. Published data and existing databases from all over the continent will be collected as terms of comparison. The final version of the project database will be made public.
The activity of EINITE will be organized around four main research questions:
1) What is the long-term relationship between economic growth and inequality?
This is the main question to which the others are all connected.
2) What were the effects of plagues and other severe mortality crises on property structures?
3) What is the underlying relationship between immigration and urban inequality?
4) How was economic inequality perceived in the past, and how did its perception change over time?
The project will also help to explain the origin of the property structures and inequality levels found on the eve of the Industrial Revolution, and it will provide information relevant to the 'Kuznets curve' debate. Overall, the project will lead to a better knowledge of economic inequality in the past, which is also expected to help in understanding recent developments in inequality levels in Europe and elsewhere.
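As a small illustration of the kind of statistic the planned database will support (hypothetical numbers only, not project data), a Gini coefficient can be computed from a list of household wealth values such as those recorded in early-modern property tax registers:

import numpy as np

def gini(wealth):
    """Gini coefficient of a 1-D array of non-negative wealth values."""
    w = np.sort(np.asarray(wealth, dtype=float))
    n = w.size
    # Standard formula for sorted data: G = 2*sum(i*w_i)/(n*sum(w_i)) - (n+1)/n
    return 2.0 * np.sum(np.arange(1, n + 1) * w) / (n * w.sum()) - (n + 1) / n

# Made-up example: 1 000 households with a heavy-tailed wealth distribution.
rng = np.random.default_rng(1)
sample = rng.lognormal(mean=3.0, sigma=1.2, size=1000)
print(f"Gini = {gini(sample):.3f}")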
Max ERC Funding
995 400 €
Duration
Start date: 2012-01-01, End date: 2016-12-31
Project acronym FINIMPMACRO
Project Financial Imperfections and Macroeconomic Implications
Researcher (PI) Tommaso Monacelli
Host Institution (HI) UNIVERSITA COMMERCIALE LUIGI BOCCONI
Call Details Starting Grant (StG), SH1, ERC-2011-StG_20101124
Summary We plan to study the implications of financial market imperfections for four main questions.
First, how do financial imperfections affect the optimal conduct of monetary and exchange rate policy in open economies? A key insight is that we characterize financial frictions as endogenous and only occasionally binding. This can have important implications for the optimal conduct of stabilization policy.
Second, how do financial and labor market imperfections interact? We extend the standard search-and-matching model to allow firms to issue debt. This feature affects the wage bargaining process endogenously, since firms, by leveraging, can pay lower wages. We study the ability of such a model to replicate the volatility and persistence of unemployment in the data, and the role of financial imperfections in affecting the transmission of productivity and financial shocks.
Third, does the effectiveness of tax policy depend on its redistributive content, and how is this affected by financial imperfections? We characterize the distributional features of several Tax Acts in the US, and investigate empirically whether tax changes that “favor the poor” are more expansionary than cuts that “favor the rich”. We then build a theoretical framework with heterogeneous agents and financial frictions to rationalize our evidence.
Fourth, how do financial intermediaries affect the transmission channel of monetary policy? We extend the current New Keynesian framework for monetary policy analysis to study the role of financial intermediaries. We emphasize the role of three features: (i) asymmetric information in interbank markets; (ii) maturity mismatch in the banks’ balance sheets; (iii) the “paradox of securitization”, whereby a deeper diversification of idiosyncratic risk leads to a simultaneous increase in the sensitivity of banks’ balance sheets to aggregate risk.
Max ERC Funding
778 800 €
Duration
Start date: 2012-01-01, End date: 2016-12-31
Project acronym LABORHETEROGENEITY
Project Labor Heterogeneity in Search Markets
Researcher (PI) Philipp Kircher
Host Institution (HI) EUROPEAN UNIVERSITY INSTITUTE
Call Details Starting Grant (StG), SH1, ERC-2011-StG_20101124
Summary The work laid out in this proposal aims to change our understanding of labor markets by viewing both the mobility and the frictions in the market as consequences of long-term worker heterogeneity. Despite advances in information technology, which substantially reduce the costs of sending information (job advertisements, job applications), extracting the relevant information about worker quality remains hard. Long-term differences in ability coupled with screening frictions are proposed as the main reason for mismatch, for mobility, and for the presence of unemployment.
The proposal is based on novel empirical observations on occupational mobility. Both low-paid workers as well as high-paid workers in an occupation tend to leave it. The former tend to move to occupations with lower average pay, while the opposite holds for the latter. This happens even within firms, and after excluding managerial positions.
Most work on selection assumes that low earners leave. These data suggest a novel angle: Workers have a long-term type that affects productivity in their current and in new occupations. They might accumulate human capital, but their baseline ability is also imperfectly known. Unexpectedly low performers (low-wage workers) have to move towards less demanding tasks, while high performers change to more demanding tasks. This consistently accounts for the observed selection patterns.
When workers know more about their ability than new firms, this also explains unemployment: firms spend efforts on screening, and impose costs on workers to induce them to self-select. The latter counteracts exogenous reductions in workers’ search costs. The aim is to develop a tractable model of screening unemployment that can serve as a building block in larger macro-labor models, and to assess the work of the government employment agency through the lens of a mechanism designer that facilitates match-making but relies on firms for additional screening of the unemployed.
Max ERC Funding
1 170 000 €
Duration
Start date: 2012-09-01, End date: 2017-08-31
Project acronym MEGA-XUV
Project Efficient megahertz coherent XUV light source
Researcher (PI) Thomas Südmeyer
Host Institution (HI) UNIVERSITE DE NEUCHATEL
Call Details Starting Grant (StG), PE3, ERC-2011-StG_20101014
Summary "Coherent extreme ultraviolet (XUV) light sources open up new opportunities for science and technology. Promising examples are attosecond metrology, spectroscopic and structural analysis of matter on a nanometer scale, high resolution XUV-microscopy and lithography. The most promising technique for table-top sources is femtosecond laser-driven high-harmonic generation (HHG) in gases. Unfortunately, their XUV photon flux is not sufficient for most applications. This is caused by the low average power of the kHz repetition rate driving lasers (<10 W) and the poor conversion efficiency (<10-6). Following the traditional path of increasing the power, numerous research teams are engineering larger and more complex femtosecond high-power amplifier systems, which are supposed to provide several kilowatts of average power in the next decade. However, it is questionable if such systems can easily serve as tool for further scientific studies with XUV light.
The goal of this proposal is the realization of a simpler and more efficient source of high-flux XUV radiation. Instead of amplifying a laser beam to several kW of power and dumping it after the HHG interaction, the generation of high harmonics is placed directly inside the intra-cavity multi-kilowatt beam of a femtosecond laser. Thus, the unconverted light is “recycled”, and the laser medium only needs to compensate for the low losses of the resonator. Achieving passive femtosecond pulse formation at these record-high power levels will require eliminating any destabilizing effects inside the resonator. This appears to be only feasible with ultrafast thin disk lasers, because all key components are used in reflection.
Exploiting the scientific opportunities of the resulting table-top multi-MHz coherent XUV light source in various interdisciplinary applications is the second major part of this project. The developed XUV source will be transportable, which will enable the fast implementation of joint measurements.
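A back-of-the-envelope sketch of why intracavity operation helps, using invented numbers (the output power, output-coupler transmission and loss figures below are assumptions, not project specifications): for a given oscillator output power, the circulating power scales roughly with the inverse of the output-coupler transmission, and the gain medium only has to replenish the round-trip losses.

# All figures are assumptions for illustration, not MEGA-XUV specifications.
P_out = 50.0          # average output power of the oscillator in W (assumed)
T_oc = 0.01           # output-coupler transmission, 1% (assumed)
other_loss = 0.005    # remaining round-trip losses, 0.5% (assumed)

P_circ = P_out / T_oc                      # circulating intracavity power
P_gain = P_circ * (T_oc + other_loss)      # power the gain medium must replace
print(f"circulating power ~ {P_circ/1e3:.1f} kW, "
      f"replenished per round trip ~ {P_gain:.0f} W")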
Max ERC Funding
1 500 000 €
Duration
Start date: 2012-03-01, End date: 2017-02-28