Carnegie Mellon University

Belief Revision

One of the central tenets of contemporary Bayesian methodology can be articulated epistemologically as follows: rational degrees of belief should obey Kolmogorov's axioms on pain of incoherence. Moreover, synchronic changes of belief (or suppositions) should be performed in accordance with the standard method of probability change called conditioning. 'Dutch Books' can always be made against agents who change their views hypothetically via methods other than conditioning.
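For concreteness (this is the standard textbook formulation, stated here only to fix notation): conditioning takes a prior P and an evidential or supposed proposition E with P(E) > 0 and returns the posterior

    P_E(A) = P(A | E) = P(A ∩ E) / P(E)

for every proposition A. The Dutch Book arguments alluded to above purport to show that any other hypothetical update rule exposes the agent to a system of bets guaranteeing a net loss.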

Much of contemporary work in methodology can be succinctly presented in terms of arguments for and against the tenability of this account of belief and belief change. Here are some examples. First, some authors have proposed extending the standard Bayesian view to diachronic changes of belief. The idea is to develop a diachronic Dutch Book argument, defending the claim that learning should proceed in accordance with conditioning on pain of incoherence. Second, some authors have challenged the idea that all forms of supposition (especially the ones involved in decision making) can be captured by conditioning. Finally, there are researchers who think that conditioning is correct, but essentially incomplete, given its inability to encode suppositions whose hypotheses carry probability zero. Some of the proposed solutions to the third problem appeal to non-standard (or infinitesimal) probability. A well-known alternative appeals to a notion of primitive conditional probability. Other solutions use a primitively defined notion of belief and construe synchronic changes of view as coherent degrees of credence on qualitatively defined suppositions. The latter strategy (initially advocated by Levi, Harper, Gärdenfors and others) opened the door to the study of non-probabilistic methods of belief change. Systematic research in this area has been conducted since at least the mid-1970s. Alchourrón, Makinson and Gärdenfors offered in 1985 an influential account of the theory of theory change, usually known as AGM. Independently, W. Spohn offered a general non-probabilistic theory of induction, where the objects of change are ordinal conditional functions rather than theories. This work has been extensively applied, extended and revised since the 1980s in many areas of Artificial Intelligence. Levi has also argued (since the early 1970s) in favor of using decision-theoretic methods to characterize the process of rational belief change. For example, Levi argues that the act of answering an empirical question can be modeled as an expansion of the agent's view (i.e. no held beliefs are withdrawn), and that the agent does so by rationally choosing among a set of (abductively identified) potential answers to the question. Here Levi deploys the decision-theoretic notion of informational value in order to articulate the inquirer's concern to obtain new, error-free information.
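As a minimal illustration of the non-probabilistic approach mentioned above, the following sketch implements Spohn-style ordinal conditional (ranking) functions over a finite set of worlds, together with conditionalization. The function names and the toy example are illustrative assumptions, not drawn from Spohn's texts; only the standard definitions are used.

    # Sketch of Spohn-style ranking functions (ordinal conditional functions).
    # A ranking function assigns a non-negative rank to each world; a
    # proposition is believed iff its complement has positive rank.

    INF = float("inf")

    def rank_of(kappa, prop):
        """Rank of a proposition: the minimum rank of its worlds (INF if empty)."""
        return min((kappa[w] for w in prop), default=INF)

    def believed(kappa, prop):
        """A proposition is believed iff its complement has positive rank."""
        return rank_of(kappa, set(kappa) - set(prop)) > 0

    def conditionalize(kappa, prop, n=1):
        """(prop, n)-conditionalization: shift prop-worlds so the best gets rank 0,
        and non-prop-worlds so the best gets rank n. Assumes prop and its
        complement are both non-empty."""
        r_in = rank_of(kappa, prop)
        r_out = rank_of(kappa, set(kappa) - set(prop))
        return {w: (kappa[w] - r_in if w in prop else kappa[w] - r_out + n)
                for w in kappa}

    # Toy example: three weather worlds, initially believing "not rain".
    kappa = {"rain": 1, "sun": 0, "snow": 2}
    assert believed(kappa, {"sun", "snow"})        # "not rain" is believed
    kappa2 = conditionalize(kappa, {"rain"}, n=2)  # learn "rain" with firmness 2
    assert believed(kappa2, {"rain"})              # belief is revised, not merely expanded

Unlike conditioning, this update is well defined even when the learned proposition was previously disbelieved, which is one reason such qualitative methods interest theorists concerned with probability-zero suppositions.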

A number of researchers at CMU have contributed actively to research on probabilistically, decision-theoretically, logically, or computationally construed theories of belief change. Teddy Seidenfeld has written various papers arguing against the normative force of diachronic Dutch books. He has also clarified the extent to which objective Bayesians rely on methods of belief change orthogonal to conditioning (and offered arguments against objective Bayesianism based on this clarification). It has been argued that an important limitation (at least for descriptive purposes) of existing methods of belief change is that they presuppose that agents are consistent and logically omniscient. This prompted research on how finite (and possibly incoherent) sets of beliefs change through inquiry. Seidenfeld (in collaboration with Kadane and Schervish) has recently tackled this problem by studying degrees of incoherence for inductive methods. Glymour has offered various well-known arguments against the use of Bayesian methodology (and in particular confirmation theory) to represent and understand scientific reasoning. More recent work, in collaboration with Scheines and Spirtes (as well as with C. Meek), has offered ways of reconciling Bayesian methodology and causal theories of decision by explaining the different roles of conditioning and intervening in theories of action (the contrast is illustrated in the sketch at the end of this section). Arlo-Costa has done extensive work on qualitative theories of supposition and learning. In a series of articles published in the 1990s he provided foundations for a general theory of conditional reasoning that delivers an account of defeasible (or non-monotonic) conditionals as a particular case. This theory is based on a qualitative account of belief change compatible with central tenets of Bayesian methodology. More recent work focuses on (1) the interest and limitations of using non-Archimedean probability to represent hypothetical reasoning and learning, (2) computational and probabilistic models of abduction, and (3) the role of supposing and learning in models of decision making and strategic interaction. Finally, Kelly has done extensive work classifying methods for iterated belief change (of the type advocated by Spohn) in terms of their reliability. This work appeals to the tools of learning theory.
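The contrast between conditioning and intervening mentioned above can be illustrated with a familiar toy causal model; the variable names and numbers below are assumptions chosen only for illustration and are not drawn from the work cited. Observing a barometer reading is evidence about rain, but setting the barometer by hand is not.

    # Tiny causal model Rain -> Barometer, illustrating conditioning vs. intervening.

    P_RAIN = 0.3
    P_BAR_LOW_GIVEN_RAIN = 0.9      # barometer reads "low" when it rains
    P_BAR_LOW_GIVEN_NO_RAIN = 0.1

    def joint(rain, bar_low):
        """Joint probability of the two binary variables under the causal model."""
        p_rain = P_RAIN if rain else 1 - P_RAIN
        p_bar = P_BAR_LOW_GIVEN_RAIN if rain else P_BAR_LOW_GIVEN_NO_RAIN
        if not bar_low:
            p_bar = 1 - p_bar
        return p_rain * p_bar

    # Conditioning: observe a low barometer reading and update by Bayes' rule.
    p_low = joint(True, True) + joint(False, True)
    p_rain_given_low = joint(True, True) / p_low   # roughly 0.79

    # Intervening: set the barometer to "low" by hand. The intervention breaks
    # the arrow from Rain to Barometer, so the probability of rain is unchanged.
    p_rain_do_low = P_RAIN                         # still 0.3

    print(p_rain_given_low, p_rain_do_low)

The divergence between the two quantities is what theories of action must track: rational deliberation about manipulating the barometer should use the interventional value, not the conditional one.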