I remember Noah Goodman giving a tutorial on Bayesian models at ESSLLI 2010. The toolbox for that course was built around Church, a special-purpose probabilistic programming language. I see now that a number of video lectures by Goodman, as well as by Josh Tenenbaum and others, are available on the website of a 2011 UCLA summer school.
For my purposes, the most interesting parts of the paper are sections 2, 3, and 4, the ones most directly devoted to giving the reader intuitions about the ins and outs of hierarchical Bayesian models.
There, the basic idea is nicely explained with a bit of bean-bag statistics borrowed from philosopher Nelson Goodman's Fact, Fiction and Forecast (1955):
Suppose we have many bags of colored marbles and discover by drawing samples that some bags seem to have black marbles, others have white marbles, and still others have red or green marbles. Every bag is uniform in color; no bag contains marbles of more than one color. If we draw a single marble from a new bag in this population and observe a color never seen before – say, purple – it seems reasonable to expect that other draws from this same bag will also be purple. Before we started drawing from any of these bags, we had much less reason to expect that such a generalization would hold. The assumption that color is uniform within bags is a learned overhypothesis, an acquired inductive constraint. (p. 308 in the published version)

The paper appeared in a special issue of Cognition dedicated to probabilistic models of cognition. There are a number of other papers in the same issue that seem very interesting.
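To make the bean-bag example concrete, here is a minimal sketch in Python of how learning such an overhypothesis could work. It is my own simplification, not the paper's actual model: I assume each bag's color distribution is drawn from a symmetric Dirichlet with concentration a (small a means "bags are uniform in color") and put a grid posterior over a; the number of colors K, the grid values, and all names are illustrative choices.

import math

K = 8  # number of possible colors; an illustrative choice, not from the paper

def log_marginal(counts, a):
    # Log marginal likelihood of one bag's color counts under a symmetric
    # Dirichlet(a) prior on that bag's color distribution
    # (the standard Dirichlet-multinomial formula).
    n = sum(counts)
    out = math.lgamma(K * a) - math.lgamma(K * a + n)
    for c in counts:
        out += math.lgamma(a + c) - math.lgamma(a)
    return out

def uniform_bag(color, n=10):
    # A bag of n marbles, all of one color.
    counts = [0] * K
    counts[color] = n
    return counts

# Training data: four bags, each uniform in a different color.
bags = [uniform_bag(0), uniform_bag(1), uniform_bag(2), uniform_bag(3)]

# Grid posterior over the concentration a, starting from a flat prior.
grid = [0.01, 0.05, 0.1, 0.5, 1.0, 5.0, 10.0]
log_post = [sum(log_marginal(c, a) for c in bags) for a in grid]
m = max(log_post)
weights = [math.exp(lp - m) for lp in log_post]
z = sum(weights)
posterior = [w / z for w in weights]

def p_same(a):
    # Having drawn one purple marble from a NEW bag, the Dirichlet-
    # multinomial predictive probability that the next draw is also purple.
    return (a + 1) / (K * a + 1)

prior_pred = sum(p_same(a) for a in grid) / len(grid)
post_pred = sum(w * p_same(a) for w, a in zip(posterior, grid))
print(f"P(second marble purple), before any bags: {prior_pred:.2f}")
print(f"P(second marble purple), after uniform bags: {post_pred:.2f}")

With four uniform training bags, the posterior piles up on the smallest values of a, and the predictive probability that a second draw from the new bag is also purple rises from under one half (under the flat prior over the grid) to above 0.9, which is the overhypothesis at work.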