Saturday, March 1, 2014

Attneave: Applications of Information Theory to Psychology (1959)

Fred Attneave's book on information theory and psychology is a sober and careful overview of the various ways in which information theory had been applied to psychology (by people like George Miller) by 1959.

Attneave explicitly tries to stay clear of the information theory craze which followed the publication of Shannon's 1948 paper:
[Book cover of Applications of Information Theory to Psychology; image from Amazon.]
Thus presented with a shiny new tool kit and a somewhat esoteric new vocabulary to go with it, more than a few psychologists reacted with an excess of enthusiasm. During the early fifties some of the attempts to apply informational techniques to psychological problems were successful and illuminating, some were pointless, and some were downright bizarre. At present two generalizations may be stated with considerable confidence:
(1) Information theory is not going to provide a ready-made solution to all psychological problems; (2) Employed with intelligence, flexibility, and critical insight, information theory can have great value both in the formulation of certain psychological problems and in the analysis of certain psychological data. (pp. v–vi)
Or in other words: Information theory can provide the descriptive statistics, but there is no hiding from the fact that you, and you alone, are responsible for your model.

Language Only, Please

Chapter 2 of the book is about entropy rates, and about the entropy of English in particular. Attneave talks about various estimation methods, and he discusses Shannon's guessing game and a couple of related studies.
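
To make the "statistical" side of these estimates concrete, here is a minimal sketch (my own, not a procedure from the book) of estimating H_N, the entropy of letter N given the preceding N − 1 letters, from n-gram counts in a corpus. The corpus file name and the letters-plus-space alphabet are assumptions for illustration.

```python
# Sketch: estimate H_n = H(letter_n | letters_1..n-1) in bits per letter
# from n-gram counts. Corpus file and alphabet choice are illustrative.
from collections import Counter
from math import log2

def conditional_entropy(text: str, n: int) -> float:
    """Estimate the entropy of the n-th letter given the preceding n-1."""
    ngrams = Counter(text[i:i + n] for i in range(len(text) - n + 1))
    contexts = Counter(text[i:i + n - 1] for i in range(len(text) - n + 1))
    total = sum(ngrams.values())
    h = 0.0
    for gram, count in ngrams.items():
        p_gram = count / total                 # P(context, letter)
        p_cond = count / contexts[gram[:-1]]   # P(letter | context)
        h -= p_gram * log2(p_cond)
    return h

# Reduce a corpus to lowercase letters and single spaces, then print H_1..H_4.
raw = open("corpus.txt").read().lower()
text = " ".join("".join(c if c.isalpha() else " " for c in raw).split())
for n in range(1, 5):
    print(f"H_{n} = {conditional_entropy(text, n):.3f} bits/letter")
```

With larger n the estimate drops, but it also becomes increasingly unreliable on a small corpus, which is exactly why the guessing game was attractive as an alternative.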

In summing up the various mathematical estimation tricks, he notes that, for the first few letters of a text, predictions from statistical tables tend to be more reliable than predictions from human subjects. Estimates based on human predictions will therefore tend to overestimate the unpredictability of the first few letters of a string.
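
The "brackets" Attneave mentions in the quote below are, as I understand Shannon's guessing-game procedure, an upper and a lower bound on H_N computed from how often a subject's first, second, third, etc. guess turns out to be the correct letter. A minimal sketch with hypothetical guess counts:

```python
# Sketch (my reconstruction, not taken from the book): Shannon-style bounds
# on H_N from guessing-game data. guess_counts[i] is assumed to hold how
# often the subject's (i+1)-th guess was the correct letter.
from math import log2

def shannon_bounds(guess_counts: list[int]) -> tuple[float, float]:
    """Return (lower, upper) bounds in bits per letter from guess-rank counts."""
    total = sum(guess_counts)
    q = [c / total for c in guess_counts] + [0.0]   # q[i]: (i+1)-th guess correct
    lower = sum((i + 1) * (q[i] - q[i + 1]) * log2(i + 1)
                for i in range(len(guess_counts)))
    upper = -sum(p * log2(p) for p in q if p > 0)
    return lower, upper

# Hypothetical data: how often the 1st, 2nd, 3rd, ... guess was right.
counts = [790, 80, 40, 30, 20, 15, 10, 8, 7]
lo, hi = shannon_bounds(counts)
print(f"{lo:.2f} <= H_N <= {hi:.2f} bits per letter")
```

The bounds bracket the entropy an ideal predictor would achieve with the same guess-rank frequencies; the subject's actual performance only pins H_N down to this interval.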

He then comments:
What we are concerned with above is the obvious possibility that calculated values (or rather, brackets) of H_N [= the entropy of letter N given letters 1 through N – 1] will be too high because of the subject's incomplete appreciation of statistical regularities which are objectively present. On the other hand, there is the less obvious possibility that a subject's guesses may, in a certain sense, be too good. Shannon's intent is presumably to study statistical restraints which pertain to language. But a subject given a long sequence of letters which he has probably never encountered before, in that exact pattern, may be expected to base his prediction of the next letter not only upon language statistics, but also upon his general knowledge [p. 40] of the world to which language refers. A possible reply to this criticism is that all but the lowest orders of sequential dependency in language are in any case attributable to natural connections among the referents of words, and that it is entirely legitimate for a human predictor to take advantage of such natural connections to estimate transitional probabilities of language, even when no empirical frequencies corresponding to the probabilities exist. It is nevertheless important to realize that a human predictor is conceivably superior to a hypothetical "ideal predictor" who knows none of the connections between words and their referents, but who (with unlimited computational facilities) has analyzed all the English ever written and discovered all the statistical regularities residing therein. (pp. 39–40; emphases in original)
I'm not sure that was "Shannon's intent." Attneave seems to rely crucially on an objective interpretation of probability as well as an a priori belief in language as an autonomous object.

Just as Laplace's philosophical commitments became obvious when he started talking in hypothetical terms, so too it is the "ideal predictor" in this quote that reveals the philosophy of language informing Attneave's perspective.
