Dependence Chains
The most serious problem that Janssen points out is that IF logic may require a choice A to be independent of a choice C without ruling out that there is some intermediate choice B such that A depends on B and B depends on C. Such chains give rise to some quite strange examples (see the sketch after this list), e.g.:
- TRUE: ∀x: (x ≠ 2) ∨ (∃u/x: x = u).
- FALSE: ∀x: (x = 2) ∨ (∃u/x: x = u).
- TRUE: ∀x: (x ≠ 2) ∨ (∃u/x: x ≠ u).
- TRUE: ∀x: (x = 2) ∨ (∃u/x: x ≠ u).
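To see where the asymmetry comes from, here is a minimal brute-force sketch of the game-theoretic evaluation. The domain, the function names, and the reduction of the slashed witness u to a constant are my own illustrative choices, not Janssen's formalism: Eloise picks a disjunct as a function of x, but her witness u may not consult x, and since the only history that reaches u's node has the right disjunct chosen, u collapses to a constant.

```python
from itertools import product

D = [0, 1, 2, 3]  # small finite domain; at least two elements besides 2

def if_disjunction_true(left, right):
    """Brute-force check of  ∀x: left(x) ∨ (∃u/x: right(x, u)).

    Eloise's strategy is a disjunct choice per x together with a witness u
    that may not depend on x (the slash), so u is tried as a constant."""
    for d in product("LR", repeat=len(D)):   # one disjunct choice per x
        for u in D:                          # x-independent witness
            if all(left(x) if c == "L" else right(x, u)
                   for x, c in zip(D, d)):
                return True
    return False

print(if_disjunction_true(lambda x: x != 2, lambda x, u: x == u))  # True
print(if_disjunction_true(lambda x: x == 2, lambda x, u: x == u))  # False
print(if_disjunction_true(lambda x: x != 2, lambda x, u: x != u))  # True
print(if_disjunction_true(lambda x: x == 2, lambda x, u: x != u))  # True
```

The second formula fails because every x ≠ 2 forces the right disjunct, where one constant u would have to equal all of them at once; in the other three cases the disjunct choice smuggles in exactly the information about x that the constant witness needs. The choice of u is formally independent of x, yet whether u is chosen at all depends on the disjunct choice, which in turn depends on x.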
Similar examples exist in which the middle term that smuggles in the dependence is not a disjunction but another quantifier.
Naming Problems
Another problem occurs, according to Janssen, when "a variable is bound within the scope of a quantifier that binds the same variable" (p. 375). This occurs, for instance, in sentences like:
- ∀x∃x: R(x,x).
- ∀x∃y: R(x,y).
Solutions
To avoid the problems of Hintikka's system, Janssen defines a new game with explicit extra conditions such as "The strategy does not have variables in W as arguments" and "If the values of variables in W are changed, and there is a winning choice, then the same choice is a step towards winning" (p. 382). This solves the problem, but it doesn't bring much transparency, it seems to me. A better solution would probably be to describe the instantiation of the quantifiers and the selection of branches at the connectives as a probability distribution on a suitable power of the domain and of the branch options {L, R}. Then independence could be clearly described as statistical independence.
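As a minimal sketch of what I have in mind (the encoding of a strategy as a joint distribution over (x, branch, u), with x drawn uniformly, is a hypothetical choice of mine, not anything in Janssen's paper):

```python
from collections import Counter
from itertools import product

D = [0, 1, 2, 3]  # finite domain, as the proposal requires

def joint(branch, witness):
    """Joint distribution over (x, branch, u) induced by a deterministic
    strategy, with x drawn uniformly from D."""
    p = Counter()
    for x in D:
        p[(x, branch(x), witness(x))] += 1 / len(D)
    return p

def marginals(p, i, j):
    """Marginal and pairwise distributions of coordinates i and j."""
    pi, pj, pij = Counter(), Counter(), Counter()
    for outcome, prob in p.items():
        pi[outcome[i]] += prob
        pj[outcome[j]] += prob
        pij[(outcome[i], outcome[j])] += prob
    return pi, pj, pij

def independent(p, i, j):
    """Statistical independence: P(a, b) = P(a) * P(b) for all a, b."""
    pi, pj, pij = marginals(p, i, j)
    return all(abs(pij[(a, b)] - pi[a] * pj[b]) < 1e-12
               for a, b in product(pi, pj))

# The winning strategy for the first example above: branch on whether
# x = 2, witness constantly 2.
p = joint(lambda x: "L" if x != 2 else "R", lambda x: 2)
print(independent(p, 0, 2))  # True:  u is independent of x
print(independent(p, 0, 1))  # False: the branch choice depends on x
```

With deterministic strategies the distribution is degenerate, but the same definitions would carry over unchanged to mixed strategies.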
Such a system would require the domains to be finite, which is not good. However, within finite domains, results about the logical strength of solution concepts would be easy to extract, because they would simply correspond to different constraints on the dependencies between the choices, i.e., on the joint and marginal distributions. It would, in fact, allow us to quantify the amount of information transmitted from one choice to another by computing the mutual information between the two choices, which compares their joint distribution with the product of their marginals.
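Continuing the sketch above (reusing `marginals` and the joint `p` from the previous block), the mutual information is the standard I(A;B) computed from the pairwise distribution, and it is zero exactly when the two choices are statistically independent:

```python
from math import log2

def mutual_information(p, i, j):
    """I(A;B) in bits between coordinates i and j of the joint p."""
    pi, pj, pij = marginals(p, i, j)
    return sum(prob * log2(prob / (pi[a] * pj[b]))
               for (a, b), prob in pij.items() if prob > 0)

print(mutual_information(p, 0, 2))  # 0.0: nothing flows from x to u directly
print(mutual_information(p, 0, 1))  # ~0.81 bits: the branch leaks whether x = 2
```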