Seminar: 2012-2013

Unless otherwise indicated, all meetings take place on Tuesdays at 4:15 PM in 380-381T.

The seminar is led by Professors Solomon Feferman and Grigori Mints.

Autumn

  • Sept 27: Organizational Meeting
  • Oct 2: Balder ten Cate (UC Santa Cruz, Computer Science), Guarded Negation
    Materials: slides
    Abstract:
    Traditional approaches towards obtaining decidable fragments of first-order logic involve restricting quantifier alternation, number of variables, or patterns of quantification. I will present some recently proposed decidable fragments that are obtained by restricting the use of negation. In particular, "unary negation" logics only allow negation to be applied to subformulas with at most one free variable, and "guarded negation" logics only allow negation to be applied to subformulas whose free variables are guarded by an atomic relation. The guarded-negation fragments of first-order logic and of least fixed-point logic subsume many known decidable logical formalisms (such as modal logic, the modal mu-calculus, the guarded fragment, conjunctive queries, and monadic Datalog). Moreover, they turn out to be well-behaved, both from a computational and from a model-theoretic point of view. Based on joint work with Luc Segoufin, Vince Barany, and Martin Otto (STACS 2011, ICALP 2011, VLDB 2012).
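    For illustration, a sketch of the syntax (one standard presentation, which may differ from the talk's notation): guarded-negation first-order formulas are generated by $\varphi ::= R(\bar{x}) \mid x = y \mid \varphi \wedge \varphi \mid \varphi \vee \varphi \mid \exists x\, \varphi \mid \alpha(\bar{x},\bar{y}) \wedge \neg\varphi(\bar{y})$, where the atom $\alpha$ guards all free variables of the negated subformula; unary negation is the special case in which $\neg\varphi$ has at most one free variable, guarded by the trivial atom $x = x$.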
  • Oct 9: Shane Steinert-Threlkeld (Stanford), The Church-Turing Thesis: Then and Now
    Materials: slides
    Abstract:
    In this talk, I will first recount some of the well-known history of the development of the Church-Turing Thesis and examine some ideas about what counts as evidence for the Thesis as well as catalog a few early reactions. Following this survey, I will review and compare two relatively recent axiomatic approaches to computability and their impact on the Thesis, one by Sieg and another by Dershowitz and Gurevich.
  • Oct 16: Peter Hawke (Stanford), The Turing Test: Then and Now
    Materials: slides
    Abstract:
    Continuing the theme of Alan Turing's accomplishments, in this talk we broadly survey some interesting historical points and recent developments concerning the Turing Test. On the historical side, we briefly consider the precursors of the test, the origin of the term "Turing Test" and some of the content of Turing's famous 1950 paper. We then turn to the ongoing debate. First we review some very well-known objections to the test. On the one hand we consider objections to the claim that a thinking machine is possible at all: namely, Searle's Chinese Room thought experiment and the so-called Mathematical Objection. On the other, we consider objections to the claim that the test is a reasonable test for intelligence, namely the Lovelace Objection and the methodological objections of Ford and Hayes. Finally, we sketch some recent reactions to these objections (from papers written in the last decade), with a view to illustrating that the issues raised by the Turing Test have not been settled and visionary work is still being produced. In particular, we consider Rapaport's more recent reply to Searle's Chinese Room; Abramson's contention that Turing's account of the Mathematical Objection is frequently misunderstood; the contention of Bringsjord et al. that the Lovelace Objection indicates a better, alternative test for intelligence; and Moor's account of the continuing importance of the Turing Test for AI.
  • Oct 23: Bruno Woltzenlogel Paleo (Vienna University of Technology), Elimination and Introduction of Cuts by Resolution
    Materials: slides
    Abstract:
    The use of cuts in formal sequent calculus proofs corresponds to the use of lemmas in mathematical proofs. Therefore, algorithmic methods for the elimination and introduction of cuts might allow us to automate the process of simplifying and structuring proofs. In this talk, I will present the CERES (cut-elimination by resolution) method, which works by extracting an unsatisfiable set of clauses from a proof with cuts and refuting this clause set by resolution. The obtained refutation then serves as a skeleton for a proof with only atomic cuts. Afterwards, I will show how to modify CERES in order to obtain a method of cut-introduction by resolution: CIRES.
  • Oct 30: Dana Scott (Berkeley, visiting), The Lambda-Calculus and Enumeration Operators
    Materials: slides (1), slides (2), handout
    Abstract:
    The idea of enumeration operators, originally introduced by Myhill & Shepherdson and by Friedberg & Rogers, can be turned into a modeling of the lambda-calculus by developing the theory of an elementary notion of continuous operators on the power set of the integers. The easy construction of the model and the connection with recursion theory and enumeration degrees will be shown. On top of the lambda-calculus it is also possible to make a straightforward model of Martin-Löf Type Theory. All of this is -- in some sense -- well known, but the elementary nature of the modeling ought to be exploited more.
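    A minimal sketch of the underlying definition, in standard graph-model notation (which may differ from the speaker's): fix an enumeration $e_0, e_1, \ldots$ of the finite subsets of $\mathbb{N}$ and a pairing function $\langle n, m \rangle$; application of $u, v \subseteq \mathbb{N}$ is then $u(v) = \{ m \mid \exists n\, (e_n \subseteq v \text{ and } \langle n, m \rangle \in u) \}$. This is precisely the action of an enumeration operator, and such operators are exactly the maps on the power set that are continuous in the Scott topology.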
  • Nov 6: Dominic Hughes (Stanford, visiting), A Minimal Classical Sequent Calculus Free of Structural Rules
    Abstract:
    This talk summarizes a recent paper in Annals of Pure and Applied Logic. It presents a right-sided formulation M of Gentzen's classical sequent calculus LK, without explicit structural rules for contraction and weakening. A minimality theorem holds for the propositional fragment Mp: any propositional sequent calculus S (within a standard class of right-sided calculi) is complete if and only if S contains Mp (that is, each rule of Mp is derivable in S). Thus one can view M as a minimal complete core of Gentzen's LK.
  • Nov 13: Sam Sanders (Ghent), Reuniting the antipodes: bringing together Nonstandard Analysis and Constructive Analysis
    Materials: slides
    Abstract:

    Constructive Analysis (aka BISH) was introduced by Errett Bishop as a constructive redevelopment of Mathematics in the spirit of the intuitionistic tradition. As such, BISH is based on the BHK (Brouwer-Heyting-Kolmogorov) interpretation of logic, where the notions of 'proof' and 'algorithm' are central. Constructive Reverse Mathematics (CRM) is a spin-off from Harvey Friedman's 'Reverse Mathematics' program where the base theory is based on BISH.

    In this talk, we introduce an interpretation of BISH in classical Nonstandard Analysis. The role of 'proof' in this interpretation is played by the Transfer Principle of Nonstandard Analysis. The role of 'algorithm' is played by '$\Omega$-invariance'. Intuitively, an object is $\Omega$-invariant if it does not depend on the *choice* of infinitesimal used in its definition. Using this new interpretation, we obtain many of the well-known results from CRM. In particular, all non-constructive principles (like LPO, LLPO, MP, etc.) are interpreted as Transfer Principles, which are not available in our (nonstandard) base theory.

  • Nov 27: Grigori Mints (Stanford), Analysis + Direct Chaining for Proof Search
    Abstract:

    (Joint work with Shane Steinert-Threlkeld.)

    The ADC method of proof search in propositional natural deduction, proposed in 2000, proceeds bottom-up: first by Analyzing the goal formula into sub-goals by applying all possible introduction rules, and then by checking whether each of these sub-goals can be established using only elimination rules (Direct Chaining). This looks simpler than the derivability problem for intuitionistic propositional logic, which is PSPACE-complete in the worst case. We investigate the complexity of ADC for various fragments. ADC derivability is polynomially decidable for the &,->-fragment by a generalization of a familiar direct-chaining algorithm for Horn formulas. Adding disjunction leads to a CoNP-complete fragment. When the goal sequent is not ADC-derivable, a short counterexample is provided by witnessing all relevant disjunctions. Adding the constant "false" preserves CoNP-completeness.
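
    A rough illustration of the direct-chaining phase for the &,->-fragment, in the style of forward chaining for Horn formulas (a schematic sketch, not the authors' implementation; the clause representation is hypothetical):

        # Each clause (premises, conclusion) stands for p1 & ... & pn -> c;
        # a clause with no premises is a fact.
        def direct_chain(clauses, goal):
            """Check derivability of an atomic goal by elimination rules alone."""
            facts = {c for prems, c in clauses if not prems}
            changed = True
            while changed:
                changed = False
                for prems, c in clauses:
                    if c not in facts and all(p in facts for p in prems):
                        facts.add(c)   # one modus-ponens step
                        changed = True
            return goal in facts

        # From a, a -> b, and (a & b) -> c, the goal c is derivable:
        assert direct_chain([((), 'a'), (('a',), 'b'), (('a', 'b'), 'c')], 'c')

    Each clause fires at most once, so the loop runs in polynomial time, in line with the polynomial decidability of this fragment.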

  • Dec 4: Edward N Zalta (Stanford), Foundations for Mathematical Structuralism
    Materials: paper
    Abstract:

    (Joint work with Uri Nodelman, forthcoming in Mind)

    We investigate the form of mathematical structuralism that acknowledges the existence of structures and their distinctive structural elements. This form of structuralism has been subject to criticisms recently, and our view is that the problems raised are resolved by proper, mathematics-free theoretical foundations. Starting with an axiomatic theory of abstract objects, we identify a mathematical structure as an abstract object encoding the truths of a mathematical theory. From such foundations, we derive consequences that address the main questions and issues that have arisen. Namely, elements of different structures are different. A structure and its elements ontologically depend on each other. There are no haecceities and each element of a structure must be discernible within the theory. These consequences are not developed piecemeal but rather follow from our definitions of basic structuralist concepts.

  • Dec 11: Dag Westerståhl, Constants and Consequence Relations
    Materials: slides, paper
    Abstract:

    (Joint work with Dennis Bonnay)

    The main idea we pursue is that of extracting (logical) constants from (logical) consequence relations, one motivation being that speakers have a better intuitive grasp of what follows from what than they have of what makes a word logical. In B & W (2012) we described one such method of extraction, and showed that in a precise sense (that of a Galois connection), it is an inverse to the usual Bolzano-Tarski definition of consequence relative to a chosen set of symbols. The extraction method was syntactic: u is a constant wrt |- iff the validity of some |- inference is destroyed by replacing/reinterpreting u. Now we look at two semantic extraction methods, one of them being that u is a logical constant if its interpretation is fully determined by the rules for its use (relative to a class of structures), a more familiar idea. The second semantic method turns out to be a semantic counterpart to the syntactic method. The first semantic method allows a rough distinction between logical and analytic consequence. We study relations between these methods, and more generally among mappings between (a) consequence relations, (b) sets of symbols, (c) classes of structures.

Winter

  • Jan 8: Ulrik Buchholtz (Stanford), Univalent Foundations and the Structure Identity Principle
    Materials: slides
    Abstract:

    Vladimir Voevodsky's Univalent Foundations program, currently the focus of an IAS special year, aims to develop a comprehensive, computational foundation for mathematics based on the homotopy-theoretical interpretation of type theory. The proposed Univalence Axiom implies a strong Structure Identity Principle: isomorphic structures are identical.

    In my talk I will introduce the homotopy-theoretical interpretation of type theory, the univalence axiom and structure identity principle.
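
    In symbols, univalence asserts that for types $A, B$ in a universe $\mathcal{U}$, the canonical map $(A =_{\mathcal{U}} B) \to (A \simeq B)$, which sends the reflexivity proof to the identity equivalence, is itself an equivalence; the structure identity principle then allows any structure on $A$ to be transported along an equivalence $A \simeq B$.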

  • Jan 15: Maarten McKubre-Jordens (Canterbury), Solving Dirichlet's Problem Constructively
    Materials: slides
    Abstract:
    We critique commonly employed methods for solving problems in potential theory from a constructive perspective. To highlight the source of non-constructivity, we provide some Brouwerian examples in the general case of the Dirichlet problem, which show that the existence of solutions is essentially non-constructive. We also outline some extra hypotheses which, despite being classically trivial, provide sufficient conditions for the constructive existence of weak solutions to the Dirichlet problem.
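    For reference, the Dirichlet problem on a domain $D \subseteq \mathbb{R}^n$ with boundary data $f$ asks for a function $u$ with $\Delta u = 0$ in $D$ and $u = f$ on $\partial D$; it is the constructive existence of such a $u$, or of a weak solution, that the Brouwerian examples concern.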
  • Jan 22: Aran Nayebi (Stanford), On the elimination of the bounded universal quantifier for Diophantine predicates
    Materials: slides
    Abstract:
    The transformation of formulas containing bounded universal quantifiers into equivalent formulas containing only existential quantifiers in the theory of Diophantine equations generally introduces a large number of variables, resulting in a Diophantine equation that is often unwieldy and not practical to write down. Here, we develop several ways of economizing on the number of variables introduced, and demonstrate the applicability of these methods to unprovability theory, obtaining a relatively concise exponential Diophantine representation of the combinatorial principle of the Paris-Harrington theorem for pairs.
  • Jan 29: Grigori Mints (Stanford): Several Interesting Problems from Applied Logic
    Materials: slides
    Abstract:
    Problems discussed here arise in the context of "safe programming", where a program is required to be verified.
    1. Saving Propositional Proofs
    2. Exponential Lower Bounds for Proofs in QBF (second-order propositional logic) with Cut
    3. Solving Logical Equations in Arithmetic
  • Feb 5: Itay Neeman (UCLA): Forcing Axioms
    Abstract:

    Forcing axioms are statements about the existence of sufficiently generic filters over partially ordered sets (posets). Given a class ${\cal F}$ of posets, and a cardinal $\kappa$, the forcing axiom associated to ${\cal F}$ and $\kappa$ states that for any ${\mathbb P} \in {\cal F}$, and any collection ${\cal A}$ of $\kappa$ dense subsets of ${\mathbb P}$, there is a filter on ${\mathbb P}$ that meets all sets in ${\cal A}$.

    For several decades, forcing axioms have been used as center points for consistency proofs. A forcing axiom is shown consistent, typically through an iterated forcing construction, and then the consistency of other statements can be established by deriving them from the axiom.

    In my talk I will present some of the most important forcing axioms, including Martin's Axiom (MA), and the Proper Forcing Axiom (PFA). I will discuss some key points in their consistency proofs, and difficulties in adapting the proofs to obtain analogues of PFA that involve meeting more than $\aleph_1$ dense sets. I will end with recent work on new consistency proofs, and higher analogues of PFA.
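
    In the notation above, two landmark instances are $\mathrm{MA}_{\aleph_1}$, the forcing axiom for the class of ccc posets with $\kappa = \aleph_1$, and PFA, the forcing axiom for the class of proper posets with $\kappa = \aleph_1$.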

  • Feb 12: no seminar
  • Feb 19: Ulrik Buchholtz (Stanford), The Unfolding of ID1
    Abstract:
    I will relate the results of my investigations into the proof theory of the unfolding of ID1, the schematic system of one arithmetical inductive definition. The unfolding of a schematic system S seeks to answer the question: Given S, which operations and predicates, and which principles concerning them, ought to be accepted if one has accepted S?
  • Feb 26: Benno van den Berg (Utrecht), Nonstandard Analysis and Proof Theory
    Materials: slides
    Abstract:
    This talk will consist of two parts. First I will discuss some proof-theoretic work on systems for nonstandard analysis (with a particular emphasis on conservation results). After that I will talk about some work I have been doing with Eyvind Briseid and Pavol Safarik on functional interpretations of nonstandard systems.
  • Mar 5: Wes Holliday (Berkeley) and Thomas Icard (Stanford), Logic, Probability, and Epistemic Modality
    Abstract:
    What is the relation between ordinary talk using 'probably' and 'at least as likely as' and the mathematical theory of probability? Is Kolmogorovian probability implicated in the semantics of these expressions? The answer was long thought to be negative. Recently, however, several linguists and philosophers have suggested that using probability-based models is in fact necessary in order to make the right predictions about entailment between sentences involving such expressions. We address this question of whether probability is necessary from a logical point of view, connecting the current literature to an older literature on qualitative probability, and proving several new completeness theorems. In particular, we axiomatize the current proposals in the literature, as well as a number of alternative semantic models, including various types of fuzzy measures, qualitative orderings, and orderings on events derived from orderings on world-states. We believe the resulting logical landscape raises some foundational questions at the heart of the logical approach to natural language semantics, and we will briefly discuss these issues as well.
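    For orientation, one classical axiomatization from that older literature is de Finetti's notion of qualitative probability: a total preorder $\succsim$ on events such that $A \succsim \emptyset$ for every event $A$, $\Omega \succ \emptyset$, and, whenever $A \cap C = B \cap C = \emptyset$, $A \succsim B$ iff $A \cup C \succsim B \cup C$.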
  • Mar 12: Stanley Wainer (Leeds): Fast Growing Functions and Arithmetical Independence Results
    Materials: slides
    Abstract:
    We explore the role of the function $a+2^x$, and its generalisations to higher number classes, in supplying complexity bounds for the provably computable functions across a broad spectrum of (arithmetically based) theories. We show how the resulting "fast growing" subrecursive hierarchy forges direct links between proof theory and various combinatorial independence results - e.g. Goodstein's Theorem (for Peano Arithmetic) and Friedman's Miniaturised Kruskal Theorem for Labelled Trees (for $\Pi^1_1$-CA$_0$). Ref: Schwichtenberg and Wainer, "Proofs and Computations", Persp. in Logic, CUP 2012.
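    One standard formulation of the hierarchy (indexing conventions vary in the literature): $F_0(x) = x+1$, $F_{\alpha+1}(x) = F_\alpha^{x+1}(x)$ (the $(x+1)$-fold iterate), and $F_\lambda(x) = F_{\lambda[x]}(x)$ for limit $\lambda$ with fixed fundamental sequence $(\lambda[x])_{x \in \mathbb{N}}$; already $F_2$ majorizes $2^x$, and every provably computable function of Peano Arithmetic is bounded by some $F_\alpha$ with $\alpha < \varepsilon_0$.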
  • Mar 19: Matthias Schirn (LMU Munich): Frege: Logical Objects by Abstraction and Their Criteria of Identity
    Materials: handout
    Abstract:
    In Grundlagen, Frege suggests the following strategy for the envisaged introduction of the real and complex numbers as logical objects: it should proceed along the lines of his introduction of the cardinals, namely by starting with a tentative contextual definition of a suitable number operator in terms of an abstraction principle whose right-hand side (a second-order or higher-order equivalence relation) was couched in purely logical terms, and by finally defining these numbers as equivalence classes of the relevant equivalence relation. In this talk, I discuss Frege’s introduction of the cardinals as well as his projected introduction of the “higher” numbers with special emphasis on the so-called Julius Caesar problem. Moreover, I analyze the variant of this problem that Frege faces in Grundgesetze when he comes to introduce his prototype of logical objects, namely courses-of-values, by means of a stipulation later to be enshrined in Basic Law V. The problem is now clad in formal garb. A second main issue, closely related to the first, is the nature and role of Frege’s criteria of identity for logical objects. I argue, among other things, (i) that pursuing the above strategy for the introduction of the higher numbers would inevitably lead to a whole family of possibly intractable Caesar problems; (ii) that in Grundlagen Frege considers the domain of the first-order variables to be all-encompassing, while in Grundgesetze there is a serious conflict between (a) his practice of taking the first-order domain to be all-inclusive when he elucidates and defines first-level functions and (b) the assumption underlying his proof of referentiality in §31 that the domain is restricted to the two truth-values and courses-of-values; (iii) that the range of applicability of the criteria of identity inherent in Hume’s Principle and in Axiom V — equinumerosity and coextensiveness of first-level concepts (functions) — lacks the requisite unbounded generality even after having undergone considerable extension; (iv) that regarding the Caesar problem for cardinal numbers and its alleged solution the situation in Grundgesetze differs significantly from that in Grundlagen. If time allows, I shall also respond to the hypothetical question whether under certain conditions and especially in the aftermath of Russell’s Paradox Frege could have introduced the real numbers via an appropriate abstraction principle without offending against his logicist credo.
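    In modern notation, the two abstraction principles at issue read: Hume's Principle, $\#F = \#G \leftrightarrow F \approx G$, where $\approx$ is equinumerosity, and Basic Law V, $\varepsilon F = \varepsilon G \leftrightarrow \forall x (Fx \leftrightarrow Gx)$, where $\varepsilon F$ is the course-of-values of the concept $F$.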

Spring

  • Apr 9: Grigori Mints (Stanford and SRI): Kripke Models, Proof Search, and Cut-Elimination for Intuitionistic Logic
    Materials: slides
    Abstract:
    Existing Schütte-style completeness proofs for intuitionistic predicate logic with respect to Kripke models provide cut-elimination only for some semantic tableau formulations. Beth models extend this to the multiple-succedent Gentzen calculus, but a simple translation back to the familiar one-succedent Gentzen calculus LJ introduces cuts. We present a short (non-effective) proof of completeness for Kripke models and cut-elimination for LJ.
  • Apr 16: Leon van der Torre (Luxembourg): Input/Output Logic: Motivation, Theory, Challenges
    Materials: slides
    Abstract:
    In a range of contexts, one comes across processes resembling inference, but where input propositions are not in general included among outputs, and the operation is not in any way reversible. Examples arise in contexts of conditional obligations, goals, ideals, preferences, actions, and beliefs. Makinson and van der Torre's input/output logic was introduced in 2000 as a theory of such input/output operations. In this talk we review the motivation, theory, and current challenges.
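    As a concrete taste of the formal theory (the basic system of the 2000 paper): 'simple-minded output' is $\mathrm{out}_1(G, A) = \mathrm{Cn}(G(\mathrm{Cn}(A)))$, where $G$ is a set of norms $(a, x)$, $G(X) = \{ x : (a, x) \in G \text{ for some } a \in X \}$, and $\mathrm{Cn}$ is classical consequence. In general $A \not\subseteq \mathrm{out}_1(G, A)$: inputs are indeed not included among outputs.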
  • Apr 23: Brett Wines (Stanford), A Universal Approach to Self-Referential Paradoxes, Incompleteness, and Fixed Points
    Abstract:
    We present the ideas formulated in papers by Noson Yanofsky and F. William Lawvere: a generalized schema for describing diagonal proofs. We then apply it to several examples and paradoxes (Richard's Paradox, Berry's Paradox, etc.). Proofs making use of self-referential techniques - both familiar arguments (Cantor, Gödel-Rosser, Turing, Tarski, etc.) and simpler but lesser-known ones - are translated into this format.
  • Apr 30: Ryan Williams (Stanford CS), Recent Computational Lower Bounds in Complexity Theory via Diagonalization
    Abstract:
    For a long time, the general feeling in computational complexity theory has been that methods based on diagonalization would not be powerful enough to prove strong lower bounds, i.e., limitations on the ability of Turing machines to efficiently solve interesting problems. Recent work has shown that diagonalization can be fruitfully combined with other ideas to prove new lower bounds, such as time-space lower bounds for the satisfiability problem and circuit size lower bounds for nondeterministic exponential time. I will survey the general ideas and thought processes that are used to prove these lower bounds. Ultimately the ideas are algorithmic in nature: one assumes that too-good-to-be-true algorithms exist, and by designing new algorithms from these too-good algorithms, one eventually derives a contradiction to the assumption via diagonalization.
  • May 7: Solomon Feferman (Stanford), Bernays, Gödel, and Hilbert's consistency program
    Abstract:
    Paul Bernays was brought from Zürich to Göttingen in 1917 by David Hilbert - the leading mathematician of the time - to assist him in developing his consistency program for the foundations of mathematics. The major exposition of that work appeared in the 1930s in the two-volume opus by Hilbert and Bernays, Grundlagen der Mathematik, whose preparation was due entirely to Bernays. In the meantime, Kurt Gödel, a precocious young logician who had recently taken his doctorate in Vienna, had discovered his remarkable incompleteness theorems, which threatened to undermine Hilbert’s program. Though Hilbert refused to accept that, Bernays undertook to absorb the significance of those theorems through correspondence with Gödel. Thus began one of Gödel's most important intellectual and personal life-long relationships.
  • May 14: Gordon Plotkin (Edinburgh), A universal categorical account of the partial recursive functions
    Abstract:

    There is a well-known categorical account of the primitive recursive functions, where suitable universal conditions in cartesian categories ensure the representability of all primitive recursive functions. In general terms, one considers objects with a "zero element" and a "successor" morphism, and calls them "natural numbers objects" if they are the initial such structures. Initiality properties of this kind ensure the closure of the representable functions under the scheme of primitive recursion.
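
    Concretely, the standard definition reads: a natural numbers object in a cartesian category is a triple $(N, z : 1 \to N, s : N \to N)$ such that for every $(A, a : 1 \to A, f : A \to A)$ there is a unique $h : N \to A$ with $h \circ z = a$ and $h \circ s = f \circ h$; this unique $h$ represents the function defined from $a$ and $f$ by iteration.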

    However, it seems that there has been no similar categorical treatment of the partial recursive functions via universal conditions. We show that finality yields closure of the (weakly) representable partial functions under Kleene's scheme of mu-recursion. A technical difficulty presents itself: categories of partial functions are not cartesian. To overcome this we use ideas of Paré and Román, and work with natural numbers objects in monoidal categories. If these are also final, in a suitable sense, then all partial recursive functions are weakly representable.

    Things are not so pleasant as regards the strong representability of the partial recursive functions. It turns out that, assuming slightly stronger conditions, all such functions with recursive graphs are strongly representable, but those which do not have recursive graphs need not be. We give a counterexample demonstrating this; it is constructed as the syntactic category of an extension of Peano arithmetic obtained using an old idea of Jockusch and Soare.

  • May 17 (12:00-1:15): Jennifer Chubb Reimann (University of San Francisco), Natural relations and functions on computable structures
    Abstract:
    A mathematical structure is computable when its universe is computable as a set and all basic operations and relations in the structure are uniformly computable. In the study of such structures we are often interested in assessing the degree to which complex information can be "coded into" non-basic, but nevertheless natural, relations and functions on the structure. In this talk, we will see some interesting examples in this vein. In particular, we will consider natural relations (ordering relations) on computable algebraic structures and a natural function (the distance function) on computable graphs.
  • May 21: Shane Steinert-Threlkeld (Stanford), Iterating Semantic Automata
    Materials: slides, paper
    Abstract:
    The semantic automata framework, developed originally in the 1980s, provides computational interpretations of generalized quantifiers. While recent experimental results have aligned structural features of these automata with the neuroanatomical demands in processing sentences with quantifiers, the theoretical framework has remained largely unexplored. In this talk, after presenting some classic results on semantic automata in a modern style, we present the first application of semantic automata to polyadic quantification, exhibiting automata for iterated quantifiers. We will briefly discuss the role of semantic automata in linguistic theory and offer new empirical predictions for sentence processing with embedded quantifiers.
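    To give the flavor of the framework (a schematic sketch, not the paper's code): a determiner is evaluated on the binary string that records, for each element of the restrictor set, whether it also lies in the scope set, and "every" and "some" correspond to two-state finite automata whose acceptance conditions are:

        # 1 = "this element of A is also in B", 0 = "it is not".
        def every(s):   # "Every A is B": the string contains no 0
            return all(ch == '1' for ch in s)

        def some(s):    # "Some A is B": the string contains at least one 1
            return any(ch == '1' for ch in s)

        assert every('111') and not every('101')
        assert some('001') and not some('000')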
  • May 28: Dick de Jongh (Amsterdam), Heyting Arithmetic and Intermediate Logic
    Abstract:
    It is proved that all intermediate logics L (propositional logics between intuitionistic logic IPC and classical logic) that have the finite model property have the following maximality property: the logic of the propositional schemes valid in Heyting Arithmetic plus L is exactly L. This is a generalization of the old theorem that the logic of Heyting Arithmetic is exactly intuitionistic logic. The work is joint with Rineke Verbrugge and Albert Visser.
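    Schematically, writing $\mathcal{L}(T)$ for the set of propositional schemes all of whose arithmetical substitution instances are provable in $T$, the result states that $\mathcal{L}(\mathrm{HA} + L) = L$ for every intermediate logic $L$ with the finite model property, generalizing de Jongh's theorem that $\mathcal{L}(\mathrm{HA}) = \mathrm{IPC}$.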
