Research Papers and Computer Programs
Two monographs
· Robust Control and Economic Model Uncertainty (with Lars Hansen) (December 2003). This is a draft of a monograph on robust filtering and control. It adapts H_2, H_\infty, and entropy methods to handle discounted problems. Both single-agent and multiple-agent settings are studied. This version contains new chapters on recursive equilibria. [PDF file]
· Recursive Models of Dynamic Linear Economies (with Lars Hansen) (June 2004). This is a monograph on linear-quadratic stochastic dynamic equilibrium models: how to represent their equilibria and how to extract their econometric implications. [mbook2]
Working papers
- Robust Control and Misspecification (with Lars Peter Hansen, Gauhar Turmuhambetova, and Noah Williams) (April 17, 2004). This paper integrates a variety of results in robust control theory in the context of an approximating model that is a diffusion. The paper is partly a response to criticisms of Anderson, Hansen, and Sargent (see below) by Chen and Epstein. It formulates two robust control problems -- a multiplier problem from the literature on robust control and a constraint formulation that looks like Gilboa-Schmeidler's min-max expected utility theory (both are sketched below). The paper studies the connection between the two problems, states an observational equivalence result for them, links both problems to `risk-sensitive' optimal control, and discusses time consistency of the preference orderings associated with the two robust control problems. [triple34.pdf]
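In rough outline (the notation here is a generic reconstruction, not an excerpt from the paper): take an approximating diffusion dx_t = \mu(x_t, c_t)\,dt + \sigma(x_t, c_t)\,dW_t and let misspecification add a drift distortion h_t to the Brownian motion. The two problems are then

\[
\text{(multiplier)} \qquad \max_{c}\ \min_{h}\ E \int_0^\infty e^{-\delta t} \Big[ U(c_t, x_t) + \tfrac{\theta}{2}\,|h_t|^2 \Big]\,dt,
\]
\[
\text{(constraint)} \qquad \max_{c}\ \min_{h}\ E \int_0^\infty e^{-\delta t}\, U(c_t, x_t)\,dt \quad \text{subject to} \quad E \int_0^\infty e^{-\delta t}\,\tfrac{1}{2}|h_t|^2\,dt \le \eta,
\]

where expectations are taken under the distorted model, and the penalty parameter \theta or the entropy budget \eta calibrates distrust of the approximating model. The observational equivalence result pairs a \theta with each \eta so that the two problems select the same decision rule.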
- Lotteries for consumers versus lotteries for firms (with Lars Ljungqvist) (October 2003). A discussion of a paper by Edward Prescott for the Yale Cowles Commission conference volume on general equilibrium theory. Prescott emphasizes the similarities in lotteries that can be used to aggregate over nonconvexities for firms, on the one hand, and households, on the other. We emphasize their differences. [pred4.pdf]
- The Conquest of U.S. Inflation: Learning, Model Uncertainty, and Robustness (with Timothy Cogley) (November 2003). An adaptive Fed's model is a mixture of three models of the Phillips curve: a Samuelson-Solow model, a Solow-Tobin model, and a Lucas model. The Fed uses Bayesian methods to update its estimates of the three models. Each period, the central bank also updates the probability that it assigns to each model, then determines its first-period decision by solving a `Bayesian linear regulator problem' (the probability updating is sketched below). Although by the mid-1970s the U.S. data induce the Fed to assign very high probability to the Lucas model, the government refrains from adopting its low-inflation policy recommendation because that policy has very bad consequences under one of the other low (but not zero) probability models. The statistical model is thus able to explain the puzzling delay in the Fed's decision to disinflate after learning the natural rate hypothesis. [inflat13.pdf]
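A minimal sketch of the recursive model-probability updating described above, assuming each candidate model delivers a one-step predictive density evaluated at the period's realized data; the function and the numbers are illustrative, not taken from the paper:

    import numpy as np

    def update_model_probs(probs, pred_likelihoods):
        """One period of Bayesian updating over candidate models.

        probs            -- current probabilities over the models
        pred_likelihoods -- each model's predictive density at the
                            period's realized data
        """
        posterior = probs * pred_likelihoods   # Bayes' rule, unnormalized
        return posterior / posterior.sum()     # renormalize to sum to one

    # Illustrative: Samuelson-Solow, Solow-Tobin, and Lucas models with
    # equal priors and hypothetical predictive likelihoods for one period.
    probs = np.ones(3) / 3.0
    probs = update_model_probs(probs, np.array([0.10, 0.25, 0.65]))
    print(probs)   # mass shifts toward the best-fitting model

In the paper's setup, the resulting probabilities weight the three models inside the Bayesian linear regulator problem that picks each period's policy.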
- Bayesian Fan Charts for U.K. Inflation: Forecasting and Sources of Uncertainty in an Evolving Monetary System (with Timothy Cogley and Sergei Morozov) (September 2003). We use Bayesian methods to estimate a VAR with drifting coefficients and volatilities, then construct fan charts that we compare with those of the Bank of England's MPC. Our fan charts incorporate several sources of uncertainty, including a form of model uncertainty that is represented by drifting coefficients and volatilities (see the sketch below). [fantom2.pdf]
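Fan charts of this kind can be read as pointwise quantiles of a simulated posterior predictive distribution. A minimal sketch, assuming a simulate_path function that draws one future inflation path per posterior parameter draw (both names are hypothetical):

    import numpy as np

    def fan_chart(draws, horizon, simulate_path,
                  quantiles=(0.05, 0.25, 0.50, 0.75, 0.95)):
        """Build fan-chart bands from posterior predictive simulations.

        draws         -- posterior parameter draws (e.g. from an MCMC run)
        simulate_path -- maps one draw to a simulated path of length horizon
        """
        paths = np.array([simulate_path(d, horizon) for d in draws])
        # One band per quantile; each band has one value per forecast horizon.
        return {q: np.quantile(paths, q, axis=0) for q in quantiles}

The widening of the bands with the horizon then reflects shock uncertainty, parameter uncertainty, and the drift in coefficients and volatilities together.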
- Reactions to the Berkeley Story (October 2002). This paper is my discussion of a paper at the 2002 Jackson Hole Conference by Christina and David Romer. The Romers' paper uses narrative evidence to support and extend an interpretation of postwar Fed policy that has also been explored by Brad DeLong and others. The basic story is that the Fed had a pretty good model in the 1950s, forgot it under the influence of advocates of an exploitable Phillips curve in the late 1960s, then came to its senses by accepting Friedman and Phelps's version of the natural rate hypothesis in the 1970s. The Romers extend the story by picking up Orphanides's idea that the Fed misestimated potential GDP or the natural unemployment rate in the 1970s. The Romers' story is that the Fed needed both to accept the natural rate hypothesis (which it did by 1970, according to them) and to have good estimates of the natural rate (which, according to them, it didn't have until the late 1970s or early 1980s). The Romers' story is about the Fed's forgetting, then relearning, a good model. My comment features my own narration of a controversial paper by `Professors X and Y'. [romers3.pdf]
- European Unemployment and Turbulence Revisited in a Matching Model (with Lars Ljungqvist) (September 2003). This paper recalibrates a matching model of den Haan, Haefke, and Ramey and uses it to study how increased turbulence interacts with generous unemployment benefits to affect the equilibrium rate of unemployment. In contrast to den Haan, Haefke, and Ramey, we find that increased turbulence causes unemployment to rise. We trace the difference in outcomes to how we model the hazard of losing skills after a voluntary job change. [dhhr8.pdf]
- Robust control and filtering of forward-looking models (with Lars Hansen) (November 19, 2002). This is a comprehensive revision of an earlier paper with the same title. We describe an equilibrium concept for models with multiple agents who, as under rational expectations, share a common model, but who, unlike under rational expectations, all doubt that model. Agents fear model misspecification and perform their own worst-case analyses to construct robust decision rules. Although the agents share the approximating model, their differing preferences cause their worst-case models to diverge. We show how to compute Stackelberg (or Ramsey) plans in which both leaders and followers fear model misspecification. [king6.pdf]
- `Knowing the Forecasts of Others' (with Joseph G. Pearlman) (April 10, 2004). This paper uses recursive methods to compute an equilibrium of the notorious model in section 8 of Townsend's 1983 JPE paper `Forecasting the Forecasts of Others'. The equilibrium has finite (and even low) dimension. Market prices fully reveal traders' private information, making the equilibrium equivalent to one in which traders pool their information before trading. This means that the problem of forecasting the forecasts of others disappears in equilibrium: there is no need to keep track of higher-order beliefs. [main paper (pdf)]
- Drifts and Volatilities: Monetary Policies and Outcomes in the Post WWII U.S. (with Tim Cogley) (August 26, 2002). This paper answers criticisms of our 2001 NBER Macro Annual paper made by Sims and Stock. We enrich our specification of a drifting-coefficient VAR to allow stochastic volatility and study whether our earlier evidence for drifting coefficients survives this generalization. It does. We investigate the power of various tests that have been used to test time invariance of the autoregressive coefficients of VARs against alternatives like ours. All except one have low power. These results about power help reconcile our results with those of Sims and of Bernanke and Mihov. We also study evidence that monetary policy rules have drifted. (A generic state-space form for such a VAR is sketched below.) [main paper (pdf)]
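In broad strokes (the notation is a generic reconstruction, not copied from the paper), a VAR with drifting coefficients and stochastic volatility can be written in state-space form as

\[
y_t = X_t' \theta_t + \varepsilon_t, \qquad \varepsilon_t \sim N(0, \Sigma_t),
\]
\[
\theta_t = \theta_{t-1} + v_t, \qquad v_t \sim N(0, Q),
\]
\[
\ln \sigma_{i,t} = \ln \sigma_{i,t-1} + \eta_{i,t},
\]

where X_t stacks a constant and lags of y_t, the coefficient vector \theta_t drifts as a random walk, and the volatilities \sigma_{i,t} that build \Sigma_t drift as geometric random walks. Time invariance is the special case Q = 0 with constant volatilities.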
- `Certainty equivalence' and `model uncertainty' (with Lars Hansen) (July 2002). Prepared for an August 2002 conference in honor of Henri Theil. The paper reviews how the structure of the Simon-Theil certainty equivalence result extends to models that incorporate a preference for robustness to model uncertainty. A model of precautionary savings is used as an example. (The textbook form of the result is recalled below.) [main paper (pdf)]
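For reference, the Simon-Theil result in its textbook form (a standard statement, not an excerpt from the paper): in the linear-quadratic-Gaussian problem

\[
\max_{\{u_t\}} \; -E \sum_{t=0}^{\infty} \beta^t \left( x_t' R x_t + u_t' Q u_t \right)
\quad \text{subject to} \quad x_{t+1} = A x_t + B u_t + C w_{t+1},
\]

the optimal rule is u_t = -F \hat{x}_t, where \hat{x}_t is the conditional mean of the state and the feedback matrix F solves the corresponding nonstochastic problem, so F does not depend on the shock loading C. A preference for robustness breaks that separation: the robust F responds to C, because C governs the channels through which misspecification can operate.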
- A Quartet of Semi-Groups for Model Specification, Robustness, Prices of Risk, and Model Detection (with Evan Anderson and Lars Hansen) (April 2003). This paper supersedes `Risk and Robustness in Equilibrium', also on this web page. A representative agent fears that his model, a continuous-time Markov process with jump and diffusion components, is misspecified and therefore uses robust control theory to make decisions. Under the decision maker's approximating model, that cautious behavior puts adjustments for model misspecification into market prices for risk factors. We use a statistical theory of detection to quantify how much model misspecification the decision maker should fear, given his historical data record. A semigroup is a collection of objects connected by something like the law of iterated expectations. The law of iterated expectations defines the semigroup for a Markov process, while similar laws define other semigroups (a definition is recalled below). Related semigroups describe (1) an approximating model; (2) a model-misspecification adjustment to the continuation value in the decision maker's Bellman equation; (3) asset prices; and (4) the behavior of the model detection statistics that we use to calibrate how much robustness the decision maker prefers. Semigroups 2, 3, and 4 establish a tight link between the market price of uncertainty and a bound on the error in statistically discriminating between an approximating and a worst-case model. [main paper (pdf)]
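To fix the semigroup idea (a standard definition, with notation chosen here for illustration): for a Markov process \{X_t\}, the conditional expectation operators

\[
(\mathbb{T}_t f)(x) = E\left[ f(X_t) \mid X_0 = x \right]
\]

form a semigroup because the law of iterated expectations gives \mathbb{T}_{t+s} = \mathbb{T}_t \mathbb{T}_s. The paper's other semigroups obey the same composition rule, with the expectation replaced by distorted or price-weighted expectations.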
- The European Employment Experience (with Lars Ljungqvist) (August 13, 2002). A general equilibrium model of stochastically aging workers whose human capital depreciates during spells of unemployment and appreciates during spells of employment. There are layoff costs and government-supplied unemployment compensation with a replacement ratio tied to past earnings, the product of human capital and a wage draw. The wage draw changes on the job via a Markov chain, inspiring some quits. We use a common calibration of the model with “European” and “American” unemployment compensation to study the different unemployment experiences of Europe and the U.S. from the 1950s through 2000. [pdf file]
- European Unemployment: From a Worker's Perspective (with Lars Ljungqvist) (September 17, 2001). Prepared for an October 2001 conference in honor of Edmund Phelps. Within the environment of our JPE 1998 paper on European unemployment, this paper conducts artificial natural experiments that provoke ``conversations'' with two workers who experience identical shocks but make different decisions because they live on opposite sides of the Atlantic Ocean. [main paper (pdf)]
- Optimal Taxation without State Contingent Debt (with Rao Aiyagari, Albert Marcet, and Juha Seppala) (September 29, 2001). An extensively revised version of a paper recasting Lucas and Stokey's analysis of optimal taxation in a market setting where the government can issue only risk-free one-period debt. This setting moves the optimal tax and debt policy substantially in the direction posited by Barro. The paper works out two examples by hand and another by computer. [Postscript file] [pdf file]
- Time Inconsistency of Robust Control? (with Lars Peter Hansen) (October 1, 2001). Responding to criticisms by Larry Epstein and his coauthors, this paper describes senses in which various representations of preferences from robust control are or are not time consistent. [Postscript file] [pdf file]
- Robust Control and Model Uncertainty (with Lars Peter Hansen) (January 22, 2001). Paper prepared for presentation at the meetings of the American Economic Association in New Orleans, January 5, 2001. This paper is a summary of results presented in more detail in Hansen, Sargent, Turmuhambetova, and Williams (2001) -- see `Robust Control and Misspecification' above. That paper formulates two robust control problems -- a multiplier problem from the literature on robust control and a constraint formulation that looks like Gilboa-Schmeidler's min-max expected utility theory. [Postscript file] [pdf file]
- Robust Pricing with Uncertain Growth (with Marco Cagetti, Lars Peter Hansen, and Noah Williams) (January 2001). A continuous-time asset pricing model with robust nonlinear filtering of a hidden Markov state. [pdf file]
- Evolving Post-World War II U.S. Inflation Dynamics (with Timothy Cogley) (Final Version, June 2001). This paper uses Bayesian methods and post-World War II data on inflation, unemployment, and an interest rate to estimate a `drifting coefficients' model. The model is used to construct characterizations of the data that make contact with various points in Lucas's Critique and Sargent's The Conquest of American Inflation, published by Princeton University Press. The paper constructs various measures of posterior means and variances of inflation and unemployment and studies their relationships. This paper will appear in the 2001 NBER Macroeconomics Annual. [Postscript file] [pdf file]
- Escaping Nash Inflation (with In-Koo Cho and Noah Williams) (extensively revised, May 31, 2001). This paper analytically computes the `escape route' for a special case of the model in my Marshall lectures, The Conquest of American Inflation, published by Princeton University Press. We show that theoretical computations of the mean dynamics and escape dynamics do a good job of explaining simulations like those in The Conquest of American Inflation. The paper uses the mysterious insight of Michael Harrison: `If an unlikely event occurs, it is very likely to occur in the most likely way'. [Postscript file] [pdf file]
- Acknowledging Misspecification in Macroeconomic Theory (with Lars Peter Hansen) (December 2000). The text of Sargent's Frisch lecture at the 2000 World Congress of the Econometric Society; also the basis for Sargent's plenary lecture at the Society for Economic Dynamics in Costa Rica, June 2000. [costa3.ps] [costa3.pdf]
- Comment on Christopher Sims's `Fiscal Consequences for Mexico of Adopting the Dollar' (June 13, 2000). A comment prepared for a conference on dollarization at the Federal Reserve Bank of Cleveland, June 1-3, 2000. [sims10.ps] [sims10.pdf]
- Robust Permanent Income and Pricing with Filtering (with Lars Peter Hansen and Neng Wang) (August 25, 2000). This paper reformulates Hansen, Sargent, and Tallarini's 1999 (REStud) model by concealing elements of the state from the planner and the agents, forcing them to filter. The paper describes how to do robust filtering and control jointly, then computes the appropriate `market prices of Knightian uncertainty'. Detection error probabilities are used to discipline the one free parameter that robust decision making adds to the standard rational expectations paradigm (a sketch of such probabilities appears below). [hsw2003.ps] [hsw2003.pdf]
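Detection error probabilities of this kind can be approximated by simulation. A minimal sketch, assuming user-supplied functions that simulate a sample path under each model and evaluate a path's log likelihood under each model (all four names are hypothetical):

    def detection_error_prob(sim_A, sim_B, loglik_A, loglik_B, n_sims=1000):
        """Average frequency of misclassifying which model generated the data.

        sim_A, sim_B       -- draw one sample path under model A or model B
        loglik_A, loglik_B -- log likelihood of a path under model A or B
        """
        # Frequency of picking B when A actually generated the data.
        err_A = 0
        for _ in range(n_sims):
            path = sim_A()
            if loglik_B(path) > loglik_A(path):
                err_A += 1
        # Frequency of picking A when B actually generated the data.
        err_B = 0
        for _ in range(n_sims):
            path = sim_B()
            if loglik_A(path) > loglik_B(path):
                err_B += 1
        return 0.5 * (err_A + err_B) / n_sims

The free robustness parameter is then set so that this probability is not too small, meaning the worst-case model would be hard to distinguish from the approximating model in a sample of historical length.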
- Robustness, Detection, and the Price of Risk (with Evan Anderson and Lars Hansen) (March 27, 2000) (formerly known as `Risk and Robustness in Equilibrium'). This paper describes a preference for robust decision rules in discrete-time and continuous-time models. The paper extends earlier work of Hansen, Sargent, and Tallarini in several ways. It permits setups that are not linear-quadratic-Gaussian. It develops links between asset prices and preferences for robustness. It links premia in asset prices from Knightian uncertainty to detection error statistics for discriminating between models. [Postscript file] [PDF file]
- Wanting Robustness in Macroeconomics (with Lars Peter Hansen) (June 10, 2000). This paper is a `nontechnical' (according to Hansen) survey of an approach to building a preference for robust decision rules into macroeconomics. [wanting.ps] [wanting.pdf]
- Optimal Taxation without State Contingent Debt (with Albert Marcet and Juha Seppala) (June 5, 2000). An extensively revised version of a paper recasting Lucas and Stokey's analysis of optimal taxation in a market setting where the government can issue only risk-free one-period debt. This setting moves the optimal tax and debt policy substantially in the direction posited by Barro. The paper works out two examples by hand and another by computer. [Postscript file]
- Laboratory Experiments with an Expectational Phillips Curve (with Jasmina Arifovic) (August 22, 2001). Experiments with human subjects in a Kydland-Prescott Phillips curve economy. [Postscript file] [PDF file]
- Discussion of Can Market and Voting Institutions Generate Optimal Intergenerational Risk Sharing, by Antonio Rangel and Richard Zeckhauser (March 25, 1999). NBER Florida conference on social security. [Postscript file]
- Optimal Fiscal Policy in a Linear Stochastic Model (with Francois Velde) (April 29, 1998). [Postscript file] [PDF file]
- Policy Rules for Open Economies (January 1998) (Discussion of Laurence Ball) [Postscript file]
- Projected U.S. Demographics and Social Security (November 1, 1998) (with Mariacristina De Nardi and Selahattin Imrohoroglu) [Postscript file]
- The Big Problem of Small Change (August 1998) (with François Velde) [PDF file] [Postscript file]
- Accounting Properly for the Government's Interest Costs
(with George Hall) [ Postscript file ]
- Neural Networks for Encoding and Adapting in Dynamic Economics
(with In-Koo Cho) [ Postscript file ]
- Learning to be Credible
(with In-Koo Cho) [ Postscript file ]
- Robust Permanent Income and Pricing (April 30, 1999)
(with Lars Peter Hansen and Thomas Tallarini) [ Postscript file ]
- Robust Permanent Income and Pricing (April 1997) (with Lars Peter Hansen and Thomas Tallarini), old version with a preference-shock specification of the model. [Postscript file]
- Mechanics of Forming and Estimating Dynamic Linear Economies (with Evan Anderson, Lars P. Hansen, and Ellen McGrattan) [Postscript file]
- Two Computational Experiments to Fund Social Security (with He Huang and Selo Imrohoroglu) [Postscript file]
- Coinage, Debasements, and Gresham's Laws (with Bruce Smith) [Postscript file]
- The European Unemployment Dilemma (Revised May 1997) (with Lars Ljungqvist) [Postscript file]
- Alternative Monetary Policies in a Turnpike Economy (with Rodolfo Manuelli) [Postscript file]
- An Appreciation of A. W. Phillips (with Lars Peter Hansen) [Postscript file]
- Expectations and the Nonneutrality of Lucas [Postscript file]
- Discounted Linear Exponential Quadratic Gaussian Control (with Lars Peter Hansen) [Postscript file]