## Week of Sept. 25

### Introduction: Belief, reasoning, and probability. Graphical models and probabilistic programs.

Homework: Church / LISP basics.

Readings:

- ProbMods book: Historical background (feedback)
- How to grow a mind: structure, statistics, and abstraction. J. B. Tenenbaum, C. Kemp, T. L. Griffiths, and N. D. Goodman (2011). *Science.*
- Optional: Structure and Interpretation of Computer Programs. (This is an amazing intro to computer science, through Scheme.)
- Optional: Some Scheme tutorials.

## Week of Oct. 2

### Generative models and conditioning. Discussion on levels of analysis.

Homework: Exercises on Generative Models and Conditioning.

Readings:

- ProbMods wiki: Generative Models
- ProbMods wiki: Conditioning
- ProbMods book: Bayesian inference (feedback)
- Predicting the future. Griffiths and Tenenbaum (2006).
- Chapter 1 of "The adaptive character of thought." Anderson (1990).
- Optional: Chapter 1 of "Vision." Marr (1982).
- Optional: Ten Years of Rational Analysis. Chater, Oaksford (1999).
- Optional: Ping Pong in Church: Productive use of concepts in human probabilistic inference. Gerstenberg and Goodman (2012).
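As a taste of this week's central idea, conditioning a generative model can be sketched outside Church too. Below is a minimal Python analogue (illustrative only; the homework uses Church) of a rejection query: sample from the prior, discard runs that violate the condition, and read the posterior off the surviving samples.

```python
import random

def flip(p=0.5):
    # A Bernoulli draw, analogous to Church's (flip p).
    return random.random() < p

def model():
    # A tiny generative model: two independent fair coins
    # and their disjunction.
    a = flip()
    b = flip()
    return a, b, (a or b)

def rejection_query(n=20000, seed=0):
    # Conditioning by rejection sampling: run the model forward,
    # keep only runs where the condition (a or b) holds, and
    # report the frequency of the query variable a.
    random.seed(seed)
    kept = []
    while len(kept) < n:
        a, b, d = model()
        if d:  # the condition
            kept.append(a)
    return sum(kept) / len(kept)
```

Conditioned on the disjunction, the estimate of P(a) comes out near 2/3, above the prior of 1/2.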

## Week of Oct. 9

### Causal vs. statistical dependency. Patterns of inference.

Homework: Exercises on Patterns of Inference; also work on the mini-project.

Mini-project for class on Thursday.

Readings:

- ProbMods wiki: Patterns of Inference.
- ProbMods book: Graphical Models (feedback)
- Causal Reasoning Through Intervention. Hagmayer, Sloman, Lagnado, and Waldmann (2006).
- Optional: Bayesian models of object perception. Kersten and Yuille (2003).
- Optional: Sources of uncertainty in intuitive physics. Smith and Vul (2012).
- Optional: Internal physics models guide probabilistic judgments about object dynamics. Hamrick, Battaglia, Tenenbaum (2011).
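The week's key pattern of inference, explaining away, can be previewed with a small Monte Carlo sketch. This is a Python illustration only, not a model from the readings; the 0.3 base rates and the deterministic-OR structure are invented for the example.

```python
import random

def flip(p):
    return random.random() < p

def sample():
    # Two independent causes with a common effect (effect = A or B).
    cause_a = flip(0.3)
    cause_b = flip(0.3)
    effect = cause_a or cause_b
    return cause_a, cause_b, effect

def cond_prob(pred, cond, n=200000, seed=1):
    # Monte Carlo estimate of P(pred | cond).
    random.seed(seed)
    hits = total = 0
    for _ in range(n):
        s = sample()
        if cond(s):
            total += 1
            hits += pred(s)
    return hits / total

# Observing the effect raises belief in cause A above its 0.3 prior...
p_a_given_e = cond_prob(lambda s: s[0], lambda s: s[2])
# ...but additionally observing cause B "explains away" A,
# pushing it back down toward the prior.
p_a_given_eb = cond_prob(lambda s: s[0], lambda s: s[2] and s[1])
```

Here the two causes are independent a priori, yet become dependent once the common effect is observed: a statistical dependency induced by conditioning, not by a causal link between them.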

## Week of Oct. 16

### Learning as inference. Occam's razor.

Readings:

- ProbMods wiki: Learning as Conditional Inference.
- ProbMods wiki: Occam's Razor.
- Word learning as Bayesian inference. Tenenbaum and Xu (2000).
- Structure and strength in causal induction. Griffiths and Tenenbaum (2005).
- Optional: Bayesian modeling of human concept learning. Tenenbaum (1999).
- Optional: Word learning as Bayesian inference: Evidence from preschoolers. Xu and Tenenbaum (2005).
- Optional: Rules and similarity in concept learning. Tenenbaum (2000).
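The size principle at the heart of Bayesian Occam's razor (and of the word-learning reading) fits in a few lines of Python. The hypotheses and item names below are invented for the example; the exercises use Church.

```python
def posterior(hypotheses, data):
    # Bayes with a uniform prior and the "size principle" likelihood:
    # P(data | h) = (1/|h|)^n if every example falls in h, else 0.
    scores = {}
    for name, extension in hypotheses.items():
        if all(x in extension for x in data):
            scores[name] = (1.0 / len(extension)) ** len(data)
        else:
            scores[name] = 0.0
    z = sum(scores.values())
    return {name: s / z for name, s in scores.items()}

# Hypothetical extensions for a word-learning scenario.
hyps = {
    "dalmatians": {"dal1", "dal2", "dal3"},
    "dogs": {"dal1", "dal2", "dal3", "poodle1", "terrier1", "lab1"},
}
one = posterior(hyps, ["dal1"])                    # one example
three = posterior(hyps, ["dal1", "dal2", "dal1"])  # three examples
```

With one dalmatian example the posterior on "dalmatians" is 2/3; with three examples it rises to 8/9. Smaller hypotheses consistent with the data win as evidence accumulates, which is the Occam's razor effect.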

## Week of Oct. 23

### Hierarchical and mixture models.

(Work on project proposals!)

Readings:

- ProbMods wiki: Hierarchical Models
- ProbMods wiki: Mixture Models
- ProbMods book: Hierarchical Bayes (feedback)
- Learning overhypotheses. Kemp, Perfors, and Tenenbaum (2006).
- Optional: Object name learning provides on-the-job training for attention. Smith, Jones, Landau, Gershkoff-Stowe, and Samuelson (2002).

## Week of Oct. 30

### Non-parametrics and relational models.

Project proposals due Saturday!

Readings:

- ProbMods wiki: Non-Parametric Models
- ProbMods book: Infinite Models (feedback)
- Learning systems of concepts with an infinite relational model. Kemp, Tenenbaum, Griffiths, Yamada, and Ueda (2006).
- Learning a theory of causality. Goodman, Ullman, and Tenenbaum (2011).
- Optional: Learning to learn causal models. Kemp, Goodman, and Tenenbaum (2010).
- Optional: Infinite Relational Modeling of Functional Connectivity in Resting State fMRI. Morup, Madsen, Dogonowski, Siebner, and Hansen (2010).

## Week of Nov. 6

### Logic, recursion, and grammar-based induction.

Readings:

- ProbMods wiki: Recursive Models
- ProbMods book: Logical representations (feedback)
- ProbMods book: Grammars for cognition (feedback)
- A rational analysis of rule-based concept learning. Goodman, Tenenbaum, Feldman, and Griffiths (2008).
- Optional: Kinship categories across languages reflect general communicative principles. Kemp and Regier (2012).
- Optional: Learning a theory of causality. Goodman, Ullman, and Tenenbaum (2011).
- Optional: Learning Structured Generative Concepts. Stuhlmueller, Tenenbaum, and Goodman (2010).
- Optional: Probabilistic models of language processing and acquisition. Chater and Manning (2006).

## Week of Nov. 13

### Social cognition.

Readings:

- ProbMods wiki: Inference about inference: Nested query
- ProbMods book: Inverse decision making (feedback)
- Goal Inference as Inverse Planning. Baker, Tenenbaum, Saxe (2007).
- Quantifying pragmatic inference in language games. Frank and Goodman (2012).
- Optional: ProbMods book: Decision-making and reinforcement learning (feedback)
- Optional: Cause and intent: Social reasoning in causal learning. Goodman, Baker, Tenenbaum (2009).
- Optional: Knowledge and implicature: Modeling language understanding as social cognition. Goodman and Stuhlmueller (2013).
- Optional: Reasoning about Reasoning by Nested Conditioning: Modeling Theory of Mind with Probabilistic Programs. Stuhlmueller and Goodman (under review).
- Optional: Young children use statistical sampling to infer the preferences of other people. Kushnir, Xu, and Wellman (2010).

## Week of Nov. 20

### Thanksgiving break.

## Week of Nov. 27

### Inference algorithms and process models.

Readings:

- ProbMods wiki: Inference Algorithms
- ProbMods book: Rational process models (feedback)
- ProbMods book: Approximate inference (feedback)
- One and done: Globally optimal behavior from locally suboptimal decisions. Vul, Goodman, Griffiths, Tenenbaum (2009).
- Perceptual multistability as Markov chain Monte Carlo inference. Gershman, Vul, & Tenenbaum (2009).
- Optional: Burn-in, bias, and the rationality of anchoring. Lieder, Griffiths, and Goodman (2012).
- Optional: A more rational model of categorization. Sanborn, Griffiths, & Navarro (2006).
- Optional: Theory acquisition as stochastic search. Ullman, Goodman, and Tenenbaum (2010).
- Optional: Exemplar models as a mechanism for performing Bayesian inference. Shi, Griffiths, Feldman, Sanborn (2010).
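To make the idea of an inference algorithm concrete, here is a toy Metropolis-Hastings sketch in Python (not a model from any of the papers above): a two-flip state space conditioned on a disjunction. Because the prior is uniform over flips and the proposal (resample one flip from its prior) is symmetric, the acceptance rule reduces to keeping only proposals that still satisfy the condition.

```python
import random

def mh_query(n_steps=50000, seed=2):
    # A tiny Metropolis-Hastings chain over two fair coin flips,
    # conditioned on (a or b). Each step proposes resampling one
    # flip; proposals that violate the condition are rejected,
    # leaving the chain in its previous state.
    rng = random.Random(seed)
    a, b = True, True  # start in a state satisfying the condition
    count_a = 0
    for _ in range(n_steps):
        new_a, new_b = a, b
        if rng.randrange(2) == 0:
            new_a = rng.random() < 0.5
        else:
            new_b = rng.random() < 0.5
        if new_a or new_b:  # condition must still hold
            a, b = new_a, new_b
        count_a += a
    return count_a / n_steps
```

The chain's running frequency of `a` converges to the exact posterior P(a | a or b) = 2/3; the sequence of correlated states along the way is what the rational-process-model readings treat as a candidate account of moment-to-moment cognition.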

## Week of Dec. 4

### Project presentations!

Presentations will be held Dec. 4, 1:30-4:00pm.

Each project team will present a short summary. We'll go in reverse alphabetical order.

Project reports are due Saturday, Dec. 8, by midnight.