In this talk, the notion of a set algebra S satisfying a system E of equations is introduced. After defining a notion of freeness for such algebras, we show that, for any system E of equations, set algebras that are free in the class of structures satisfying E exist and are unique up to a bisimulation. Along the way, we point to analogues of classical set-theoretic and algebraic properties and begin to explore connections between the algebraic and set-theoretic viewpoints.
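For orientation, the classical universal property of freeness reads as follows; this is only a reminder of the standard algebraic notion, not necessarily the exact definition the talk adapts to set algebras:
\[
  F \text{ is free over } X \text{ in the class } \mathcal{K}
  \iff
  \forall A \in \mathcal{K}\;\forall f\colon X \to A\;
  \exists!\,\bar{f}\colon F \to A \text{ a homomorphism with } \bar{f}\restriction X = f.
\]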
In this talk I will consider the relationship between continuity and sequentiality for type two functionals. It is a well-known result that when inputs are restricted to total functions, continuous and sequential functionals coincide. However, the sequential ``algorithms'' which compute continuous functionals rely on unbounded search and so are in some sense ``inefficient''. Is a more efficient solution possible? I will consider some answers to this question by introducing notions of bounded continuity and sequentiality and presenting some results which give quantitative relationships between these notions.
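As a rough illustration of the sequential evaluation of a continuous type-two functional (the functional F and the query-counting wrapper below are invented for this sketch, not taken from the talk), one can query the argument function one value at a time; the number of queries made is the kind of quantity that bounded notions of continuity and sequentiality are meant to measure:

# A minimal sketch: a type-two functional F : (N -> N) -> N evaluated
# "sequentially", i.e. by querying its argument one value at a time.
# The query log shows that the output depends only on finitely many
# values of the input function (continuity).

def make_counting_oracle(f):
    """Wrap a total function f : N -> N so that queries can be recorded."""
    queries = []
    def oracle(n):
        queries.append(n)
        return f(n)
    return oracle, queries

def F(g):
    """Example continuous functional: F(f) = f(f(0)) + f(1)."""
    return g(g(0)) + g(1)

oracle, queries = make_counting_oracle(lambda n: n + 2)
print(F(oracle))   # value of the functional: 7
print(queries)     # the finite list of points inspected: [0, 2, 1]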
ACL2 is an interactive mechanical theorem prover designed for use in modeling hardware and software and proving properties about these models. ACL2 stands for "A Computational Logic for Applicative Common Lisp" and is in the "Boyer-Moore" tradition.
It has been used to prove theorems about a variety of algorithms and designs, including floating-point division and square root on both the AMD K5 (where the operations are done in microcode) and the K7 (where they are done using different algorithms in hardware). ACL2 has also been used in microprocessor modeling, including the Motorola CAP DSP chip and the Rockwell-Collins JEM1 Java Virtual Machine microprocessor.
The results of Statman [1979] and Orevkov [1982] show that ``fast'' cut-elimination is generally impossible; indeed the complexity of cut-elimination (independent of the particular procedure) is nonelementary. However, this result does not mean that the choice of a specific elimination procedure is irrelevant: we prove, for a wide class of end-sequents which contains all equational theories -- among them Statman's worst-case examples -- that the elimination of monotone cuts is of at most exponential expense if, as an additional elimination technique, direct projections of cut formulas to subformulas are admitted. Moreover, we show that, for this class of end-sequents and for monotone cuts, the worst-case complexity of the usual Gentzen-type elimination procedures is nonelementary. Furthermore, we describe other classes where cut-elimination is feasible and discuss applications to algebra and number theory.
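For reference, ``nonelementary'' can be made precise with the standard iterated-exponential (tower) function (this notation is not taken from the abstract):
\[
  2_0(n) = n, \qquad 2_{k+1}(n) = 2^{\,2_k(n)};
\]
a bound is nonelementary if it is not majorized by $2_k(n)$ for any fixed $k$, whereas the monotone-cut case above stays within a single exponential, i.e. within $2^{p(n)}$ for some polynomial $p$.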
This talk is a review of two recent works by Michael Freedman: "Limit, Logic, and Computation" and "K-SAT on Groups and Undecidability". In these works Freedman discusses his new approach to one of the central problems of computer science: is P equal to NP? One of the most studied NP-complete problems is deciding satisfiability of Boolean formulas (the SAT problem). Freedman's approach is based on introducing an "infinite limit" for a class of formulas. If this limit is algorithmic, this should imply that the class is polynomial (i.e., satisfiability of formulas from the class can be decided by an algorithm in a polynomial number of steps). If the infinite limit is undecidable, then there is no polynomial algorithm solving problems from the class. To support this approach, Freedman investigates an analogy between the P vs NP problem and the celebrated results of Michael Gromov on discrete groups of polynomial growth. Gromov's approach employs an infinite limit of groups regarded as metric spaces under the word metric (generated by minimal word length). In this talk we will discuss several of Freedman's examples of such infinite limits, which serve to support the general idea for attacking the P vs NP problem.
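As a reminder of the Gromov side of the analogy (a standard definition, not quoted from Freedman's papers): a finitely generated group $G$ with finite generating set $S$ has polynomial growth if the balls of the word metric satisfy
\[
  |B_S(n)| = |\{\, g \in G : |g|_S \le n \,\}| \le C\,n^d
\]
for some constants $C$ and $d$; Gromov's theorem says this happens exactly when $G$ is virtually nilpotent, and his proof passes through a limiting metric space obtained from rescaled word metrics.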
Security protocols use primitives like encryption and digital signatures to transmit messages or reach agreement in a way that is resilient to malicious attack. This talk will describe a framework based on Horn linear logic (or, equivalently, multiset rewriting) that may be used to describe common protocols and characterize a class of malicious attacks. A characteristic of the framework is the use of existential quantification to model the way common protocols choose fresh values, or "nonces." Two lower bounds will be described: protocol security is DEXPTIME-complete without nonces, and undecidable if nonces are allowed, even if we restrict the size of messages and the number of steps of each protocol to fixed finite bounds.
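A minimal sketch of the multiset-rewriting reading (the fact names, the toy rule, and the helper functions below are invented for illustration and are not the framework's actual syntax): the global state is a multiset of facts, a protocol step is a rule that consumes and produces facts, and choosing a fresh nonce stands in for the existential quantifier.

from collections import Counter
from itertools import count

_fresh = count()

def fresh_nonce():
    # Existential quantification is modelled by drawing a value that
    # has never appeared in the state before.
    return "n{}".format(next(_fresh))

def apply_rule(state, lhs, rhs_template):
    # Fire a rewrite rule: the facts in lhs must be present in the
    # multiset `state`; they are consumed and replaced by the facts
    # produced by rhs_template, which may mention a fresh nonce.
    needed = Counter(lhs)
    if any(state[f] < c for f, c in needed.items()):
        raise ValueError("rule not enabled in this state")
    new_state = state - needed
    new_state.update(rhs_template(fresh_nonce()))
    return new_state

# Toy initiator step (illustrative only): A moves from its initial
# role state A0 to A1, choosing a fresh nonce Na and emitting a
# message intended for B.
state = Counter({("A0",): 1})
state = apply_rule(
    state,
    lhs=[("A0",)],
    rhs_template=lambda na: [("A1", na), ("msg_for_B", "A", na)],
)
print(state)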
Imre Lakatos' Proofs and Refutations uses novel historiographic techniques to explain the emergence of new methods of mathematical proof during the 19th century. This talk will explain Lakatos' constructive approach, his historical claims, and their skeptical implications. The ideas will be used to characterize the heuristic structure of Goedel's incompleteness theorems and the evolution of the second incompleteness theorem from the 1930s to the 1960s.
In this talk I will describe a formal system of infinitesimal analysis that is weak in proof-theoretic terms, yet strong enough to formalize a significant fragment of classical real analysis. The system, called Elementary Infinitesimal Analysis (EIA), has the property that the computational time bounds of its provably computable functions on the natural numbers are given by exponential terms. On the other hand, EIA can prove important theorems of classical real analysis; we show, as an example, that the proof of the existence theorem for ordinary differential equations, stated in the language of infinitesimal analysis, can be carried out directly in EIA. Moreover, EIA uses infinitesimals and infinite numbers in a simple and natural way, so that proofs in this system are easy to comprehend.
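For reference, one standard classical (non-infinitesimal) form of the existence theorem in question is the Peano existence theorem; whether the talk works with this version or with the Picard-Lindeloef form is not stated in the abstract:
\[
  f \text{ continuous on } [x_0 - a, x_0 + a] \times [y_0 - b, y_0 + b]
  \;\Longrightarrow\;
  \exists\, y \text{ with } y(x_0) = y_0 \text{ and } y'(x) = f(x, y(x)) \text{ on } [x_0 - h, x_0 + h]
\]
for some $h > 0$.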