After giving a brief overview of the renewal of interest in logic and the foundations of mathematics in Göttingen in the period 1914-1921, I shall give a detailed presentation of the approach to the foundations of mathematics found in Behmann's doctoral dissertation of 1918, "Die Antinomie der transfiniten Zahl und ihre Auflösung durch die Theorie von Russell und Whitehead." The dissertation was written under the guidance of David Hilbert and was primarily intended to give a clear exposition of the solution to the antinomies as found in Principia Mathematica. In the process of explaining the theory of Principia, Behmann also presented an original approach to the foundations of mathematics which saw in sense perception of concrete individuals the Archimedean point for a secure foundation of mathematical knowledge. In the last part of the talk I will argue for the importance of Behmann's investigations for the development of Hilbert's approach to the foundations of mathematics in the early twenties.
The year 1921 is considered a milestone in the history of logic: Wittgenstein popularized the truth-table method in his Tractatus, Emil Post proved the completeness of propositional logic, and he and Jan Łukasiewicz invented logics with more than two truth values. Three years earlier, David Hilbert and Paul Bernays had achieved all these results and more, in an (unpublished) lecture course and in Bernays's Habilitationsschrift. The talk will describe these achievements and trace some of the related advances of mathematical logic in the 1920s.
We define an operation inverse to cut elimination. Among the applications: assignment of ordinals in the epsilon substitution method and detailed description of continuous cut elimination.
The actor model of computation is a natural starting point for specifying, reasoning about, and implementing components of open distributed systems. The talk will begin with an intuitive introduction to actor computation and its semantic models. We will then present actor component algebras as a basis for a compositional approach to system design, and briefly survey a variety of examples, including the interaction path algebra that underlies an abstract compositional semantics called interaction semantics. Finally, we will introduce a general semantic framework called actor theories and show how this framework can be used in a variety of ways: for giving semantics to programming and specification languages; for reasoning about program transformations; for composing specifications; and for proving satisfaction and refinement relations and correctness properties of components.
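To make the basic model concrete, the following is a minimal toy sketch of actor computation, not the component algebras or actor theories of the talk: each actor is a behavior function plus a mailbox, and a round-robin scheduler delivers queued messages. All names here (`Actor`, `System`, `counter`, `sink`) are illustrative inventions.

```python
from collections import deque

class Actor:
    """An actor: a behavior (system, state, message) -> new state, plus a mailbox."""
    def __init__(self, behavior, state):
        self.behavior = behavior
        self.state = state
        self.mailbox = deque()

class System:
    """A toy round-robin scheduler: each pass, every actor handles at most one message."""
    def __init__(self):
        self.actors = {}

    def spawn(self, name, behavior, state):
        self.actors[name] = Actor(behavior, state)

    def send(self, name, msg):
        self.actors[name].mailbox.append(msg)

    def run(self):
        progress = True
        while progress:
            progress = False
            for actor in self.actors.values():
                if actor.mailbox:
                    msg = actor.mailbox.popleft()
                    actor.state = actor.behavior(self, actor.state, msg)
                    progress = True

def counter(system, count, msg):
    # "inc" bumps the count; ("get", target) reports the count to another actor.
    if msg == "inc":
        return count + 1
    if msg[0] == "get":
        system.send(msg[1], count)
    return count

def sink(system, log, msg):
    # Just records every message it receives.
    return log + [msg]

system = System()
system.spawn("c", counter, 0)
system.spawn("s", sink, [])
system.send("c", "inc")
system.send("c", "inc")
system.send("c", ("get", "s"))
system.run()
print(system.actors["s"].state)  # → [2]
```

Actors interact only by asynchronous message passing and local state update, which is what makes the model amenable to the compositional reasoning the abstract describes.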
For specifying abstract data types (and functional requirements for software), finitary equational Horn logic has some advantages over alternative logics: it is reasonably expressive, yet specifications always have initial models, and exploration of the consequences of specifications can be automated.
In this talk, we consider a novel framework that provides some support for set theory within Horn logic. This framework treats both sets and (partial) functions as values, includes set membership assertions as atomic formulae, and comes equipped with constructors for power sets, function spaces, and products. Since extensional equality of sets and functions cannot be specified in Horn logic, we are led to consider initial models where sets are distinguished by labels. The lack of extensionality is insignificant when sets are never tested for equality, but merely for membership. Moreover, when sets are regarded as auxiliary entities in a specification, introduced only to classify the values of interest, one may obtain standard algebraic models as reducts of the initial labelled-set models.
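The effect of dropping extensionality can be illustrated with a toy model of labelled sets (this is only an informal sketch, not the Horn-logic framework itself; the names `LabeledSet` and `member` are invented for illustration): two set values with the same elements but different labels are distinct, yet they are indistinguishable by membership tests.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class LabeledSet:
    """A set value carrying a label; equality compares labels, not just elements."""
    label: str
    elements: frozenset

def member(x, s):
    # Membership looks only at the elements, ignoring the label.
    return x in s.elements

# Same elements, different labels: distinct values, same membership behavior.
evens = LabeledSet("evens-below-4", frozenset({0, 2}))
other = LabeledSet("doubles-below-2", frozenset({0, 2}))

print(member(2, evens))   # → True
print(member(2, other))   # → True
print(evens == other)     # → False (no extensionality)
```

If a specification only ever asks membership questions of its sets, no observable consequence depends on the missing extensionality, which is why the auxiliary labelled sets can be discarded by taking reducts.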
The work reported here is in collaboration with Helene Kirchner, LORIA (France).