A logical formula F(X,P) can be treated as an equation to be satisfied by the solutions X(P).

J. McCarthy considers the parameterization of the models of formulas, gives the general solution in the case of propositional logic, and states the problem for other logics. We find the general solution for formulas in the first-order language with monadic predicates and equality. The solutions are obtained via quantifier elimination and are parameterized by epsilon terms.
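As a toy illustration of what such a parameterization looks like in the propositional case (a hedged sketch, not McCarthy's own construction; the formula and the parameterizing map below are chosen purely for exposition), consider the formula F(x1, x2) = x1 ∨ x2:

```python
from itertools import product

# Illustrative example: parameterize the models of F(x1, x2) = x1 OR x2.
def F(x1, x2):
    return x1 or x2

# Candidate general solution: x1 = a1 OR (NOT a2), x2 = a2.
def solution(a1, a2):
    return (a1 or not a2, a2)

models = {(x1, x2) for x1, x2 in product([False, True], repeat=2) if F(x1, x2)}
image = {solution(a1, a2) for a1, a2 in product([False, True], repeat=2)}

# Every choice of parameters yields a model, and every model arises;
# moreover the map fixes models, i.e. it is a retraction onto the solution set.
assert image == models
assert all(solution(*m) == m for m in models)
```

The two assertions capture the defining property of a general solution: the parameter map is onto the set of models and acts as the identity on it.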

**References**

- John McCarthy, "Parameterizing the set of models of a propositional theory". Available at http://www-formal.stanford.edu/jmc/parameterize.html.

I start with a collection of results and viewpoints which were reported this May at seminars and conferences in three different universities in Europe: Bonn, Cambridge, and Leeds. Here is the abstract of my Leeds talk (the title was the same):

"We show how NF (Quine's "New Foundations") can be seen as a special model of ZF. First, we give a sufficient condition for [the famous long-standing open problem of] NF consistency. Next, in the ZF language, we will present a much simplified Specker's refutation of AC in NF (= in that model of ZF)."This is a material I feel I already know reasonably well; naturally, much more interesting for me would be to talk about things which are unknown and which at best for now I can only expect or conjecture. I have a research program for this academic year at Stanford, consisting of several stages, which, if eventually successful, should result in a model of NF. The title of the program, and the title of the project which gave me the money for visiting Stanford, is "Set Theory, Automorphisms and External Forcing", and this is very much what it is about.

On the other hand, I do realize that starting with my real present interests, without giving necessary background, would not do much good for the audience. So, I will start with known things, and hope to have a chance to discuss in some detail my Stanford research plans at some later point.

Euler's polyhedron formula asserts for a polyhedron *p* that
*V-E+F=2*, where *V*, *E*, and *F* are,
respectively, the numbers of vertices, edges, and faces of *p*.
The history of Euler's formula is bumpy because it turns out to be
difficult to say just what a polyhedron is, and, even when a
definition is given, it turns out to be difficult to specify the
subclass of polyhedra for which Euler's formula holds. A history of
Euler's formula is given in Imre Lakatos's *Proofs and
Refutations* [1]. Lakatos's history illustrates a critical process
for improving mathematical knowledge; he calls it the method of proofs
and refutations. The process consists of refining a conjecture (such
as Euler's polyhedron formula) in the light of counterexamples and
incomplete proofs. The goal of the method of proofs and refutations
is to produce a valid proof, which, by definition, has no
counterexamples.
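As a quick sanity check of the formula itself (a minimal sketch; the vertex, edge, and face counts below are the standard ones for the Platonic solids, and are not part of the formalization discussed here):

```python
# Check V - E + F = 2 for a few convex polyhedra (standard counts).
polyhedra = {
    "tetrahedron": (4, 6, 4),
    "cube": (8, 12, 6),
    "octahedron": (6, 12, 8),
    "dodecahedron": (20, 30, 12),
    "icosahedron": (12, 30, 20),
}

for name, (v, e, f) in polyhedra.items():
    assert v - e + f == 2, name
```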

Given its rocky history, then, formalizing Euler's formula presents an interesting challenge for formal proof checking. In this talk I discuss my recent work on a formalization of (a proof of) Euler's polyhedron formula. The proof was carried out in MIZAR [2], a proof-checking system based on classical first-order logic and set theory. I try to disarm Lakatos's antipathy toward formal mathematics by showing how the process of formalizing a proof mirrors or exemplifies the method of proofs and refutations.

**References**

- [1] Imre Lakatos, *Proofs and Refutations*, edited by John Worrall and Elie Zahar. Cambridge University Press, 1976.
- [2] MIZAR. Available at http://www.mizar.org.

Elga (2000) introduced the problem of Sleeping Beauty (SB). A conceptual paradox for the treatment of diachronic uncertainty in a Bayesian framework, SB seemed to show that, under certain circumstances, rationality dictates that one adjust one's probability assignment even when no new information has been introduced. Rather than address the philosophical literature which followed Elga's paper, I will investigate the game-theoretic literature which preceded it. In particular, a close investigation of the problem of the Forgetful Passenger (FP) (Aumann et al. (1997)) will demonstrate that, contra Elga's claims: (i) FP is fundamentally different from SB; (ii) the procedure for dealing with diachronic uncertainties recommended by Aumann does not give the analysis of SB that Elga claims for it; and (iii) (under Aumann's proposal) probability assignments do not change unless information states change. Our discussion will begin by formalizing an intuitive suggestion in Aumann et al. We will then explore the implications of this formalization, in particular its application to translating games in extensive form into Dynamic Epistemic Logic.
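To fix the arithmetic behind the standard "thirder" answer to SB (a minimal sketch of the familiar calculation, not part of the talk's argument): heads leads to one awakening, tails to two, and weighting each awakening equally gives P(heads | awake) = 1/3.

```python
from fractions import Fraction

# Heads -> Beauty is woken once; tails -> twice.  Conditioning on an
# awakening, with each awakening weighted equally, yields the "thirder" value.
p_heads, p_tails = Fraction(1, 2), Fraction(1, 2)
awakenings_heads, awakenings_tails = 1, 2

p_heads_given_awake = (p_heads * awakenings_heads) / (
    p_heads * awakenings_heads + p_tails * awakenings_tails
)
assert p_heads_given_awake == Fraction(1, 3)
```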

**References**

- Robert J. Aumann, Sergiu Hart, and Motty Perry, "The Forgetful Passenger", in *Games and Economic Behavior* **20**, 1997, pp. 117-120.
- Adam Elga, "Self-locating Belief and the Sleeping Beauty Problem", in *Analysis* **60**(2), April 2000, pp. 143-147.

Dynamic Topological Logic (DTL) is a tri-modal system for reasoning
about dynamical systems and their short- and long-term behavior. The
intended semantics are defined over dynamical systems, which are
pairs ⟨*X*, *f*⟩ composed of a topological space *X* and a
continuous function *f* : *X* → *X*.

Different logics arise depending on the assumptions we make of the
space *X* and the function *f*. Surprisingly, the
computational complexity of these logics may vary wildly as we modify
these assumptions. For example, it has been shown that if we consider
the class of systems where *X* is arbitrary but *f* is a
homeomorphism, the resulting DTL is not axiomatizable. It has also
been shown that the DTL of continuous functions is undecidable; however, I
will show that this logic is recursively enumerable. Further, I will
discuss minimal systems, that is, dynamical systems in which the orbit of
every point is dense; in this setting we get decidability. In this
talk I will compare the three systems, and show what makes them so
different despite their apparent similarities.
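As a concrete example of a minimal system (an illustrative sketch, not one of the systems treated in the talk): rotation of the circle by an irrational angle has a dense orbit from every point, which the following numerical check makes visible on a coarse partition of the circle.

```python
import math

# Rotation of the circle [0, 1) by the irrational golden-ratio angle.
# Every orbit is dense, so this dynamical system is minimal.
alpha = (math.sqrt(5) - 1) / 2

def f(x):
    return (x + alpha) % 1.0

# A finite piece of the orbit of 0 already meets every one of 10 equal
# subintervals of the circle -- numerical evidence of density.
x, visited = 0.0, set()
for _ in range(1000):
    visited.add(int(x * 10))
    x = f(x)

assert visited == set(range(10))
```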

Category theory deals in a mathematically natural way with certain
kinds of algebraic structures on possibly very large collections of
structures, such as the category *Grp* of all groups and the
category *Top* of all topological spaces, in terms of the
structure preserving maps ("morphisms") between such objects. From
this point of view, the category *Cat* of all categories is
itself such a structure whose morphisms are the so-called functors
between categories. *Grp*, *Top* and *Cat* are
examples of objects in *Cat*. Even more, if *A*
and *B* are two categories, no matter how large, there is a
"still larger" category whose objects are all the functors
from *A* to *B*, and whose morphisms are the so-called
natural transformations between functors. Existing set-theoretical
foundations account for these kinds of constructions only in terms of
certain kinds of restrictions, e.g. by making a distinction between
"small categories" and "large categories" in a theory of sets and
classes. Nevertheless, it is plausible that the foundation of an
unrestricted category theory can be established without invoking such
distinctions. I shall present several criteria for such a theory and
show how they can be met to a considerable extent in a strong
consistent extension of *NFU* (Quine's system *NF* with
urelements). However, a full foundation is blocked in stratified
systems (with or without urelements) as presently treated. The talk
will go over the article "Enriched stratified systems for the
foundations of category theory", available at
http://math.stanford.edu/~feferman/papers/ess.pdf

Domain theory was introduced by Scott around 1971, originally to provide denotational models of the untyped lambda-calculus. Since then many connections between domains and logic have been established.

I'll review some work of Zhang and myself, providing a domain-theoretic foundation for disjunctive logic programming. This foundation is built on clausal logic, a representation of the Smyth powerdomain of a so-called Scott domain. We indicate the completeness of a resolution rule for inference in such a clausal logic; we introduce a natural declarative semantics and an equivalent fixed-point semantics for disjunctive logic programs. At the end we mention an application to feature constraint systems, which gives a disjunctive logic programming semantics for constraint-based grammatical formalisms like head-driven phrase-structure grammar.
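To illustrate the flavor of a fixed-point semantics (a minimal sketch for a definite, non-disjunctive program of my own choosing; the disjunctive case discussed in the talk is more involved): the least model of a program is the least fixed point of its immediate-consequence operator, reached by iterating from the empty interpretation.

```python
# Immediate-consequence operator T_P for a tiny definite logic program,
# iterated to its least fixed point (the program's least model).
# Program (illustrative):  a.    b :- a.    c :- a, b.
program = [
    ("a", []),          # fact
    ("b", ["a"]),       # rule: b if a
    ("c", ["a", "b"]),  # rule: c if a and b
]

def T_P(interp):
    # A head is derivable when every atom in its body is already true.
    return {head for head, body in program if all(atom in interp for atom in body)}

# Iterate from the empty interpretation until a fixed point is reached.
interp = set()
while T_P(interp) != interp:
    interp = T_P(interp)

assert interp == {"a", "b", "c"}
```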

Many logical systems today describe intelligent interacting agents
over time. Frameworks include Interpreted Systems (*IS*),
Epistemic-Temporal Logic (*ETL*), and Dynamic Epistemic Logic
(*DEL*). This variety is an asset, as different modeling tools
can be fine-tuned to specific applications. But it may also be an
obstacle, when barriers between paradigms and schools go up. In this
talk, I will discuss some of the technical issues that arise when
comparing these systems. More specifically, I will

- Survey a number of decidability and undecidability results concerning Epistemic Temporal Logic. The goal is to provide a general picture drawing on ideas from different areas of modal logic.
- Rigorously compare Dynamic Epistemic Logic and Epistemic Temporal Logic.