Modal Logic for Ceteris Paribus Preferences

Ceteris paribus clauses ("everything else being equal") occur in many social sciences, such as economics and social choice theory, where laws are often taken to be defeasible. Philosophers like Nancy Cartwright go so far as to claim that even the pure sciences have at best ceteris paribus laws, and thus that their laws lie. In most cases, though, ceteris paribus clauses stand for normal conditions of evaluation and account for possible defeaters of laws. From this standpoint, it seems that "everything else being normal" would be a better rendering. I will disambiguate two readings of ceteris paribus, which I call the equality reading and the normality reading, using tools of modal logic. I will then develop in detail the logic for the equality reading, following the seminal work of von Wright in preference logic. Ceteris paribus logic (under the equality reading) finds interesting applications, for example in characterizing the Nash equilibrium, an important concept in game theory. It also has independent mathematical interest, which I will discuss.
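The game-theoretic application can be made concrete. In standard game-theoretic notation (mine, not the abstract's), a strategy profile is a Nash equilibrium exactly when each player's strategy is optimal while everyone else's strategies are held fixed, which is precisely an "everything else being equal" condition:

```latex
% Nash equilibrium as a ceteris paribus condition. Here \sigma is a
% strategy profile, \sigma_i player i's component, \sigma_{-i} the
% strategies of all other players, and u_i player i's utility function.
\sigma \text{ is a Nash equilibrium}
\quad\text{iff}\quad
\forall i\;\, \forall \sigma_i' :\;
u_i(\sigma_i, \sigma_{-i}) \;\geq\; u_i(\sigma_i', \sigma_{-i})
```

The ceteris paribus clause appears in the fixed term \(\sigma_{-i}\): each player compares alternatives only against profiles in which the other players' choices are kept equal.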

Information-Theoretic Logic: the classical deductive perspective

My talk mostly concerns the logical perspective on information that Johan van Benthem calls 'information-as-elucidation', the viewpoint he identifies as the classical deductive sense. Perhaps the most effective slogans of this conception are that propositions are 'carriers of information' and that deduction is 'unpacking' the premises. The conception goes back at least as far as the middle of the 19th century, to thinkers such as Boole (1847), Jevons (1870), and Venn (1881/1971), all of whom shared the intuition of an information-based consequence relation. Indeed, this was virtually the dominant conception of logical consequence until the emergence of the Bolzano-Tarski transformation-theoretic paradigm. There are vestiges of the information-theoretic conception in the 20th century in Cohen and Nagel (1962/93), but the articulation of its main ideas was accomplished by John Corcoran (1995, 1998, and 1999), who provides a modern, rigorous defense of what may be called the neo-Boolean or neo-Fregean view. In addition to its natural and pedagogical appeal, information-theoretic logic reconciles humanistic and scientific temperaments found in fields ranging from argumentation theory to deductive methodology. Moreover, it locates logic at the heart of formal epistemology, thereby granting a distinctive role to our deductive capacities in performing cogent reasoning in the service of knowledge, a point that is often left out of purely syntactic or semantic accounts of logic. I will discuss its philosophical traits in comparison with the model-theoretic stance by looking at the practice of establishing logical validity and logical invalidity. Information-theoretic logic acknowledges the fact that we routinely determine validity and invalidity by relying on our judgement of the information containment, or non-containment, of the conclusion in the premises of a given argument.

Three Logical Perspectives on Information

Information is not an official part of your standard logic textbook. But conceptions of information live behind the scenes of the field. I will distinguish three of these: information-as-range (Carnap, Hintikka), information-as-correlation (Dretske, Barwise & Perry), and information-as-elucidation (the classical deductive perspective; but also close to information flow as generated by computation). I will discuss the connections between these stances, and raise the question whether a unification is possible, or even desirable.

Information and How to Harness it

Events carry information relative to constraints: the information that the rest of the world is as it has to be for the event to have occurred, given the constraints. Actions have success conditions relative to constraints and goals: the conditions under which the action will achieve the goal, given the constraints. Basically, we harness information by making a state that carries the information that P the cause of an action whose success conditions are that P. I will develop these ideas, which can be found in the following papers.

Dependence Logic

In 1961, Henkin suggested a game-theoretic semantics for first-order logic and its extension by so-called partially ordered quantifiers. It remained an open problem whether this extension and other similar logics could be given a compositional semantics. In 1997, Wilfrid Hodges made a breakthrough in this area by giving a compositional semantics for these logics. In his semantics, satisfaction is defined as a relation between formulas and sets of assignments, rather than as a relation between formulas and individual assignments, as is customary in first-order logic. Based on this idea, we introduce Dependence Logic. This is an extension of first-order logic in which dependence of variables on one another is a basic atomic concept. We give an overview of this logic, its properties, and its applications, from database theory to set theory.
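To illustrate the shift from individual assignments to sets of assignments (using the notation of Väänänen's standard presentation, not the abstract itself): a set of assignments X, called a team, satisfies the dependence atom when the values of some variables functionally determine the value of another across the whole team.

```latex
% Team semantics for the dependence atom =(x_1, ..., x_n, y):
% a team X (a set of assignments) satisfies the atom iff any two
% assignments agreeing on x_1, ..., x_n also agree on y, i.e. the
% x_i functionally determine y across the team.
X \models\, =\!(x_1, \dots, x_n, y)
\quad\text{iff}\quad
\forall s, s' \in X :\;
\Big( \textstyle\bigwedge_{i=1}^{n} s(x_i) = s'(x_i) \Big)
\;\Rightarrow\; s(y) = s'(y)
```

Note that the condition compares pairs of assignments, so it is not expressible as a property of any single assignment; this is why satisfaction must be lifted from assignments to teams.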

TBA

Logical and other necessary truths present a challenge to both internalist and externalist views of knowledge, for different reasons. The internalist can’t avoid making it unrealistically difficult to know obvious logical truths, because he requires awareness of reasons. The externalist’s main problem is that the necessity of the logical truth combines with many of the usual externalist standards to make it far too easy to know a logical truth, even one of arbitrary complexity. I define an externalist account of knowledge of logical truth that avoids these problems and takes its bearings from an externalist account of knowledge of logical implication. Though the criteria offered for this type of knowledge are purely behavioral (dispositional), the account implies that logical truths cannot be known by authority, which corresponds to the traditional expectation that the subject in some sense “own” for herself the truth of such propositions.