The essay is part of an anthology called Natural Language and the Computer, which (in addition to being quite wonderfully techno-camp) is one of the most boldly and beautifully typeset books I've seen in a while.
The title page, with gargantuan print and the editor's name in a spacy font.
Some aspects of its graphical design have an undeniable 1950s feel; others look like a kind of 1960s Space Odyssey futurism; and still others look like they were taken straight out of Blade Runner. It's all quite remarkable.
Title page of Part 1, with an unmistakably 1950s italic font and an "etched" figure 1.
And then of course there's the always charming prospect of reliving a lecture taking stock of the computer revolution anno 1960. One of the other books in the same series is even called – I kid you not – The Foundations of Future Electronics. How can you not love a 1961 book with that title?
At any rate, the essay that I'm interested in plods through the familiar details of Chomskyan grammar, pushing in particular the hard sell of transformational grammar over "immediate constituent" grammar (i.e., phrase structure grammars without any postprocessing).
The last section of the essay is called "Grammaticalness and Choice" and gets a bit into the issue of how sentences are selected, and what goes on in the head of the alleged "ideal speaker" of Chomskyan linguistics. This part contains a number of interesting quotes, and I'll provide some generous extracts below.
"Forced to Operate Empirically"
The first notion taken up in the section is that of grammaticality itself, which he seems to have some trouble making empirically precise:

Presumably the notion of "grammatical sentence" is characterized by the grammar itself, since in principle we formulate our rules in such a way as to generate only and always such sentences. It is a question of some interest whether there is a possibility of characterizing this notion independently of the grammar. It seems extremely unlikely that there is, and we will be forced to operate empirically with the machinery of the grammar, treating each sentence that it generates as a hypothesis to be tested for grammaticalness against the reaction of native speakers. For each sentence rejected, we either revise the grammar to exclude the sentence (if we believe the rejection is on proper grounds–that is, not motivated by school grammar and the like), or we revise the grammar to generate the sentence in some special status (i.e., as only partially well formed). Each sentence accepted is, of course, a confirmation of the validity of the rules up to that point. (p. 43)

I suppose what Stockwell has in mind here is that there might in principle exist some kind of objective test of grammaticality which could relieve us of having to trust laypeople to know the difference between "ungrammatical" and "nonsensical." (If you'll allow me a bit of self-citation, I've written a short paper on the idea of having such a distinction.)
Today, linguists might fantasize about such a test taking the form of an fMRI scan; in the 1980s, they would have imagined it as an EEG; and in the 1950s, a polygraph reading. But in the absence of such a test, we are forced to use live and conscious people even though
Informant reaction is difficult to handle, because such reactions involve much more than merely the question of grammaticalness. (p. 43)

We thus only have indirect access to the alleged grammatical engine of the brain.
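To make the procedure concrete, here is a minimal sketch of the generate-and-test loop Stockwell describes, with a toy grammar and a hard-coded stand-in for the informant (the grammar, rules, and function names are all my own invention, not from the essay):

```python
from itertools import product

# A toy phrase structure grammar that deliberately over-generates:
# it produces "the boy leave" alongside "the boy left".
GRAMMAR = {
    "S":  [["NP", "VP"]],
    "NP": [["the", "boy"], ["a", "boy"]],
    "VP": [["left"], ["leave"]],
}

def generate(symbol="S"):
    """Enumerate every sentence the grammar generates."""
    if symbol not in GRAMMAR:  # terminal symbol
        yield [symbol]
        return
    for expansion in GRAMMAR[symbol]:
        children = [list(generate(s)) for s in expansion]
        for combo in product(*children):
            yield [word for segment in combo for word in segment]

def informant_accepts(sentence):
    """Stand-in for a native speaker's reaction (here just a fixed list)."""
    return " ".join(sentence) not in {"the boy leave", "a boy leave"}

# Each generated sentence is a hypothesis to be tested for grammaticalness.
for sentence in generate():
    if informant_accepts(sentence):
        print("confirmed:", " ".join(sentence))
    else:
        print("revise grammar to exclude:", " ".join(sentence))
```

In Stockwell's terms, an accepted sentence confirms the rules so far, while a rejected one forces us either to revise the grammar to exclude it, or to regenerate it with some special, partially well-formed status.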
"The Rest Takes Care of Itself"
After briefly considering a couple of borderline grammatical cases, Stockwell continues:

One might consider the utilization of grammar by the speaker as follows. The essence of meaning is choice; every time an element is chosen in proceeding through the rules, that choice is either obligatory (in which case it was not really a choice at all, since there were no alternatives), or it is optional (in which case the choice represented simultaneously both the positive decision and the rejection of all alternatives–the meaning of the choice inheres in the constrastive [sic] value of the chosen element as compared with all the possible choices that were rejected). (p. 44)

Oddly enough, Stockwell's meditation on the actual role and implementation of Chomskyan grammar in a person's behavior brings him around to confirming not only de Saussure's picture of meaning, but also Shannon's. I wonder whether he is aware of the implications of this.
He then goes on to consider an example:
Thus these are the choices involved in a simple sentence such as

Did the boy leave.

NP + VP                   Obligatory
D + N                     Obligatory
D == the                  Optional
N == boy                  Optional
aux + VP1                 Obligatory
aux == past               Optional
VP1 == Vi                 Optional
Vi == leave               Optional
Tintrg                    Optional
Inversion of Te           Obligatory
Empty carrier for past    Obligatory
Rising intonation         Obligatory

Of the twelve choices, half are obligatory–either initiating the derivation, or following out obligatory consequences of optional choices. The additional rules of the phonetic component are nearly all obligatory. To include these would increase the obligatory choices to about twice the number of optional choices. In fact it is quite probable that in real discourse even the element the is obligatory (that is, the choice of the versus a seems quite predictable in a larger context). This would leave us with only five meaning-carrying (optional) choices. Everything else that goes into making up the sentence is in a valid sense perfectly mechanical, perfectly automatic. It can be argued that a grammar must maximize the obligatory elements and cut the optional choice to the barest minimum in order to get any reasonable understanding of how the human brain is capable of following complex discourse at all. That is, the hearer's attention is focused on matching, with his own generating machinery, the sequence of optional choices; since he has identical obligatory machinery, the rest takes care of itself. In this way, the same grammatical model accounts for both encoding and decoding. We do not need separate and distinct analogs for both sending and receiving messages. (pp. 44–45)

Again, this is oddly similar to the kind of generative model one would employ in information theory, and the notion of having a sparse language to minimize cognitive effort here takes the place of error-correction. But presumably, the philosophical difference is whether we need a source model (of the "optional choices") or only a channel model (of the "perfectly mechanical, perfectly automatic" choices).
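Stockwell's bookkeeping here is simple enough to replay. The following sketch (my own transcription of his table, with invented names) tallies the optional, meaning-carrying choices, in the Shannonesque spirit that a choice with no alternatives carries no information:

```python
# Stockwell's derivation of "Did the boy leave.", transcribed as
# (rule, kind) pairs. Only the optional choices carry meaning;
# the obligatory ones are "perfectly mechanical, perfectly automatic".
DERIVATION = [
    ("NP + VP",                "obligatory"),
    ("D + N",                  "obligatory"),
    ("D == the",               "optional"),
    ("N == boy",               "optional"),
    ("aux + VP1",              "obligatory"),
    ("aux == past",            "optional"),
    ("VP1 == Vi",              "optional"),
    ("Vi == leave",            "optional"),
    ("Tintrg",                 "optional"),
    ("Inversion of Te",        "obligatory"),
    ("Empty carrier for past", "obligatory"),
    ("Rising intonation",      "obligatory"),
]

optional = [rule for rule, kind in DERIVATION if kind == "optional"]
print(f"{len(DERIVATION)} choices, {len(optional)} meaning-carrying:")
print(", ".join(optional))
# Treating "the" as predictable in a larger context, as Stockwell
# suggests, would cut the meaning-carrying choices down to five.
```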
"From Which He Knowingly Deviates"
This reading of the differences is backed up by his elaboration:

Encoding and decoding does not imply that a speaker or hearer proceeds step by step in any literal sense through the choices characterized by the grammar in order to produce or understand sentences. The capacities characterized by the grammar are but one contributing factor of undetermined extent in the total performance of the user of language. The grammar enumerates only well-formed sentences and deviant sentences, which, recognized as ranging from slightly to extremely deviant by the language user, are interpreted somehow by comparison with well-formed ones. The grammar enumerates sentences at random; it does not select, as the user does, just that sentence appropriate to a context. The grammar clacks relentlessly through the possible choices; the user starts, restarts, jumps the grammatical traces, trails off. A generative grammar is not a speaker-writer analog. It is a logical analog of the regularities to which the language user conforms or from which he knowingly deviates. (p. 45)

I take it that this entails that grammars are essentially and necessarily logical in nature, since their purpose is to describe the set of available sentences of the language rather than to predict their occurrence. From such a perspective, a probabilistic context-free grammar would be something of an oxymoron.
A logical and a probabilistic conception of grammar.
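To make that contrast concrete, here is a toy sketch in which the same invented rules are read first logically (a yes/no membership question) and then probabilistically (a score over derivations); the rules and the numbers are made up purely for illustration:

```python
# Logical conception: the grammar is a set of licensed rule applications.
RULES = {("S", ("NP", "VP")), ("NP", ("the", "boy")), ("VP", ("left",))}

# Probabilistic conception: the same rules, now carrying weights.
PROBS = {
    ("S", ("NP", "VP")):    1.0,
    ("NP", ("the", "boy")): 0.7,
    ("VP", ("left",)):      0.4,
}

def grammatical(derivation):
    """Logical question: is every step licensed by the grammar?"""
    return all(step in RULES for step in derivation)

def probability(derivation):
    """Probabilistic question: how expected is this derivation?"""
    p = 1.0
    for step in derivation:
        p *= PROBS.get(step, 0.0)
    return p

d = [("S", ("NP", "VP")), ("NP", ("the", "boy")), ("VP", ("left",))]
print(grammatical(d))   # True: in the language or not, nothing more
print(probability(d))   # ~0.28: a prediction about occurrence
```

The first function describes the set of available sentences; the second tries to predict their occurrence, which is exactly the move that, on Stockwell's view, a grammar has no business making.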
"A Natural Extension of Scholarly Grammar"
Again in perfectly orthodox fashion, Stockwell finally tips his hat at the impossibility of formulating discovery procedures and makes a strange claim about the machine-unfriendly nature of transformational grammars:

Although the title of this book suggests the machine processing of natural-language data, it should not be assumed that the transformational model of the structure of language is in any way geared to machines of any kind, either historically or in current development. On the contrary, it is a natural extension of traditional scholarly grammar, an effort to make explicit the regularities to which speakers of a language conform, which has been the focus of grammatical studies for over 2,500 years. The effort to formulate discovery procedures for the systematic inference of grammatical structure is quite recent; few if any transformationalists believe such a goal has any possibility of success–at least, not until much more is known about the kinds of regularities which grammarians seek to discover and to formulate in explicit terms. (p. 45)

That seems a bit odd in the light of, e.g., Victor Yngve's recollection of how people were swayed by Chomsky's grammatical analyses because they "read like computer programs."