This seminar by vixey took place on 7 December 2008 at 20:00 UTC, in the #mathematics channel of irc.freenode.net.

The timestamps displayed in the IRC text are UTC-8.

Any explanations of, or errors found in, this seminar should be recorded on the discussion page.

## Seminar

[12:06] < vixey> so we are primarily looking at the mechanics of a family of proof checking systems for constructive math
[12:07] < vixey> You'll all have seen Set Theory and probably Category Theory as a foundation for mathematics
[12:07] < vixey> where we encode tuples as {{a},{a,b}}, functions as sets of tuples, etc..
[12:09] < vixey> I should give some motivation actually
[12:10] < vixey> Some theorems that have been formalized in type theory are the four color theorem (I'll give some reasons it's especially practical for things like this later), Gödel's theorems and some computer stuff, e.g. the Fast Fourier Transform and a proof checker for CoC with one universe
[12:11] < vixey> So I'll start with a couple of diversions, first untyped lambda calculus and then a little on combinatory logic
[12:11] < vixey> (And do just interrupt me if I say anything that doesn't make sense or make any mistakes)
[12:12] < vixey> so the syntax for untyped lambda calculus (I'll call it lam) is: <lam> ::= <var> | <lam> <lam> | \<var>, <lam>
[12:12] < vixey> so terms might look like: \x -> x, \x y -> y x, \u -> u u
[12:13] < vixey> \ is supposed to look like a lambda symbol, I just chose not to use unicode for this
[12:14] < vixey> I'm not going to be tediously formal because the idea is quite simple, but the first relation I need is alpha-equivalence
[12:14] < vixey> \x y -> y x is alpha equivalent to \u v -> v u; in general alpha equivalence means that I renamed some bound variables
[12:16] < vixey> oops, I've been typing -> instead of , (going to start using , instead of -> like I meant to..)
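The alpha equivalence just described can be decided mechanically; a minimal sketch in Python (the names `Var`, `App`, `Lam`, `to_db` and `alpha_eq` are introduced here for illustration), using the standard trick of converting names to de Bruijn indices:

```python
# Untyped lambda terms, following the syntax <lam> ::= <var> | <lam> <lam> | \<var>, <lam>.
# Alpha equivalence is decided by replacing each variable with the
# distance to its binder (a de Bruijn index), so that
# \x, \y, y x  and  \u, \v, v u  both become  \ \ (0 1).

def Var(x): return ("var", x)
def App(f, a): return ("app", f, a)
def Lam(x, b): return ("lam", x, b)

def to_db(t, env=()):
    """Convert a closed term to de Bruijn form (free variables raise)."""
    tag = t[0]
    if tag == "var":
        return ("var", env.index(t[1]))
    if tag == "app":
        return ("app", to_db(t[1], env), to_db(t[2], env))
    # lam case: push the bound name onto the front of the environment
    return ("lam", to_db(t[2], (t[1],) + env))

def alpha_eq(s, t):
    return to_db(s) == to_db(t)

# \x, \y, y x  is alpha equivalent to  \u, \v, v u
print(alpha_eq(Lam("x", Lam("y", App(Var("y"), Var("x")))),
               Lam("u", Lam("v", App(Var("v"), Var("u"))))))  # True
```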
[12:16] < vixey> so next I need beta reduction for terms, which is just substitution, for example (\x, \y, x y x) e ~beta~> \y, e y e
[12:17] < vixey> and I can identify terms as beta equivalent if there is some sequence of beta reductions under which they become equal
[12:18] < vixey> so for example \i, i is alpha-beta equivalent to (\o, (\u v, v) o o) because (\o, (\u v, v) o o) ~beta~> \o, o
[12:19] < vixey> There's also the notion of eta equivalence, that is, any function f is equivalent to \x, f x (and eta equivalence will be the transitive closure of that)
[12:19] < vixey> This untyped calculus is really not a useful proof system though, because 1) there's no meaning to the terms 2) alpha-beta-eta equality is not decidable
[12:20] < vixey> so diversion 2 is the S K combinatory logic
[12:20] < vixey> we take two axioms,
[12:20] < vixey> K: A => (B => A)
[12:20] < vixey> S: (A => (B => C)) => ((A => B) => (A => C))
[12:21] < vixey> and you can combine them (modus ponens)
[12:21] < vixey> so for example
[12:21] < vixey> SK: (A => B) => (A => A)
[12:21] < vixey> SKK: A => A
[12:22] < vixey> S & K is actually a complete basis for the implicational fragment of intuitionistic logic (Gentzen's LJ)
[12:23] < vixey> We can actually assign types to lambda terms in a very simple way though
[12:24] < vixey> There will be a rule for each var, app and lam production of the syntax earlier
[12:25] < vixey> in the natural deduction style,
[12:25] < vixey> Gamma |- M : A -> B
[12:25] < vixey> Gamma |- N : A
[12:25] < vixey> --------------------
[12:25] < vixey> Gamma |- M N : B
[12:25] < vixey> (I'd better give a syntax for simple types: <type> ::= <var> | <type> -> <type>)
[12:26] < vixey> this rule says that in a type context Gamma, given that M has the type A -> B and N has type A, the application M N has type B
[12:26] < vixey> for the lambda abstraction the rule is:
[12:26] < vixey> Gamma; x : A |- M : B
[12:26] < vixey> ----------------------
[12:26] < vixey> Gamma |-
\x, M : A -> B
[12:27] < vixey> which says that if M has the type B in the context Gamma extended with x given some assumed type A, then \x, M has type A -> B
[12:27] < vixey> the variable case means you just take the type from the context:
[12:27] < vixey> Gamma; x : T; Delta |- x : T (x not in Delta)
[12:28] < vixey> We can now define the S and K combinators using this language, called simply typed lambda calculus
[12:28] < vixey> K = \x,\y,x
[12:28] < vixey> S = \m,\n,\x,mx(nx)
[12:28] < vixey> I'll show the type derivation for S
[12:29] < vixey> m : A -> B -> C, n : A -> B, x : A |- mx : B -> C
[12:29] < vixey> m : A -> B -> C, n : A -> B, x : A |- nx : B
[12:29] < vixey> m : A -> B -> C, n : A -> B, x : A |- mx(nx) : C
[12:29] < vixey> m : A -> B -> C, n : A -> B |- \x, mx(nx) : A -> C
[12:29] < vixey> m : A -> B -> C |- \n,\x,mx(nx) : (A -> B) -> (A -> C)
[12:29] < vixey> |- \m,\n,\x,mx(nx) : (A -> B -> C) -> (A -> B) -> (A -> C)
[12:30] < vixey> So now this lambda notation is another way to write proofs, as the S and K Hilbert system was
[12:30] < vixey> This (simply typed lambda) calculus has lots of nice properties, the one that really matters is strong normalization,
[12:30] < vixey> meaning that every term reduces down to a unique normal form
[12:31] < vixey> Turing showed that this calculus is strongly normalizing by lexicographic induction on a list of types sorted by complexity, but you can also do induction on the size of the type derivation and various other ways
[12:31] < vixey> The consequence is that alpha-beta-eta equality is decidable.
[12:32] < vixey> viewed as a programming language, you can express booleans, pairs, numbers and addition, subtraction, exponentiation.
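Read through the typing above, the S and K axioms are exactly the types of two small programs, and the proof SKK of A => A really computes as the identity. A quick sketch in Python (untyped, with the intended types recorded in comments):

```python
# K: A => (B => A) -- takes two arguments, returns the first
K = lambda a: lambda b: a

# S: (A => (B => C)) => ((A => B) => (A => C))
# -- applies f to x and to (g x)
S = lambda f: lambda g: lambda x: f(x)(g(x))

# SKK : A => A, and as a program it is the identity function:
I = S(K)(K)
print(I(42))  # 42
```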
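The closing claim, that the calculus viewed as a programming language can express booleans, pairs and numbers, is the classic Church encoding; a minimal sketch in Python (the names `pair`, `fst`, `snd`, `add`, `to_int` are introduced here for illustration), writing each encoding as a plain lambda term:

```python
# Church booleans: true selects its first argument, false its second
true  = lambda a: lambda b: a
false = lambda a: lambda b: b

# Church pairs: a pair is a function awaiting a projection
pair = lambda a: lambda b: lambda f: f(a)(b)
fst  = lambda p: p(lambda a: lambda b: a)
snd  = lambda p: p(lambda a: lambda b: b)

# Church numerals: n = \f, \x, f^n x (apply f n times)
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))
add  = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

def to_int(n):
    # decode a numeral by counting how many times it applies f
    return n(lambda k: k + 1)(0)

two = succ(succ(zero))
print(to_int(add(two)(two)))  # 4
```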
[12:32] < vixey> in fact the proof of P -> P is the identity function
[12:33] < vixey> that is, given i : P -> P, then i x ~beta~> x
[12:33] < vixey> also a proof of (C <- B) -> (B <- A) -> (C <- A) is the 'o' combinator for function composition (you know "f o g")
[12:33] < vixey> This general idea of relating proofs with programs and their types (or meaning) with logical statements is called the Curry-Howard Isomorphism or BHK interpretation.
[12:34] < vixey> So this simply typed calculus <=> (implicational) LJ
[12:34] < vixey> But it's not very _useful_, you can't express second order logic, or even first order
[12:35] < vixey> if you want to prove statements like P(x) -> P(y) for example, judgements about the formation of sentences are required (things like P : * -> *; x : * |- P(x) : *)
[12:35] < vixey> (I'm using * as the type of types in this example)
[12:36] < vixey> That still wouldn't really be enough so I won't dwell on that idea
[12:36] < vixey> We want to express statements like f : R^n -> R^m, so we really need to allow values (or proofs) inside types/statements
[12:37] < vixey> (so (^) is a function * -> N+ -> *, X^1 = X, X^2 = (X,X))
[12:37] < vixey> That is, to introduce and eliminate arrows from not only proofs to proofs, but types to types, proofs to types and types to proofs
[12:39] < vixey> So to do that I have to generalize the simple typed calculus to something, this means figuring out 3 things
[12:39] < vixey> 1) generalize the X -> Y into the dependent product ||(x:X), Y
[12:39] < vixey> (I'm using || as ASCII notation for the conventional uppercase pi symbol)
[12:40] < vixey> 2) fold the value and the type level (either side of the : judgement) into one, so that the rules can apply to terms on either side
[12:40] < vixey> 3) make sure it still flies (i.e.
that there is no proof of False in the theory)
[12:41] < vixey> I should probably mention something about the weird terminology, the arrow was generalized into a 'product'
[12:41] < vixey> That term comes from looking at the product ||x, P(x) as an element of forall x, P(x), i.e. P(1) and P(2) and P(3) and ...
[12:42] < vixey> I'll just show how a couple of the rules for making the type judgement changed,
[12:42] < vixey> G; (x:A) |- M : B G |- ||x:A, B
[12:42] < vixey> ---------------------------------- lam
[12:42] < vixey> G |- (\x:A, M) : (||x:A, B)
[12:43] < vixey> the lambda case is very similar except that x might occur inside B now
[12:43] < vixey> also not every product '||x:A, B' is allowed, so the premise G |- ||x:A, B is there too now
[12:44] < vixey> and especially * : * is not allowed, this would allow a paradox similar to the famous set of all sets paradox.
[12:44] < vixey> really each * has a subscript index with *[0] : *[1], *[1] : *[2] but I'm going to leave all these implicit
[12:45] < vixey> the next interesting change is the app rule;
[12:45] < vixey> G |- f : (||x:A, B) G |- a : A' A =alpha-beta= A'
[12:45] < vixey> ------------------------------------------------- app
[12:45] < vixey> G |- f a : B[x := a]
[12:45] < vixey> A =alpha-beta= A' means that the terms A and A' have to be convertible
[12:46] < vixey> and B[x := a] means that the value a is substituted inside the type B
[12:46] < vixey> (it happens in the same way that beta reduction does)
[12:46] < vixey> When you think about Curry-Howard again, this app rule is actually the cut rule
[12:47] < vixey> in fact cut elimination is strong normalization
[12:47] < vixey> And I don't know if anybody is wondering why I left out eta,
[12:48] < vixey> it causes a problem though, this term \x:A,(\x:B,x)x reduces into two different normal forms with eta
[12:48] < vixey> \x:A,(\x:B,x)x ~beta~> \x:A,x
[12:48] < vixey> \x:A,(\x:B,x)x ~eta~> \x:B,x
[12:48] < vixey> so we've lost strong normalization (and more directly the
church-rosser property: different reductions of the same term can always be reduced further into equivalent forms)
[12:49] < vixey> Since that was all awfully abstract, I'll put it into use a bit.. and show how to build some blocks of logic up with those
[12:50] < vixey> Define False = ||(P:*), P, that is, the absurd statement: for any statement P, P is provable
[12:51] < vixey> The consistency argument for this calculus (CoC) is that, since every term has a normal form, there is clearly no normal form with type False
[12:51] < vixey> And negation can be defined: Not P = P -> False
[12:51] < vixey> (I'm using X -> Y as shorthand for ||(_:X),Y)
[12:52] < vixey> This is still a constructive logic so we won't get excluded middle or the equivalent, Not (Not P) -> P
[12:53] < vixey> but Not (Not (Not P)) -> Not P _is_ provable
[12:53] < vixey> You can actually embed classical reasoning in the intuitionistic logic by double negation
[12:53] < vixey> Here's the actual proof term:
[12:53] < vixey> \(N : Not (Not (Not P))), \(p : P), N (\(N': Not P), N' p) : Not (Not (Not P)) -> Not P
[12:54] < vixey> You can also express equality, Eq A x y = ||(P:A -> *), P(x) -> P(y) (by Leibniz)
[12:54] < vixey> logical conjunction, and A B = ||(C:*), (A -> B -> C) -> C (given A B : *)
[12:55] < vixey> To actually construct (or introduce) a conjunction: conj a b = \C:*, \z:A->B->C, z a b
[12:55] < vixey> and to take projections, pi1 p = p A (\a, \b, a) and pi2 p = p B (\a, \b, b)
[12:55] < vixey> Even the natural numbers are definable in this calculus, but there is a problem: we can't prove induction! :(
[12:56] < vixey> For a stronger theory (in which we can encode real mathematics like algebra, calculus, real analysis, ..)
[12:56] < vixey> you can allow defining new terms (as long as they fit some criteria) in the same way that you add new datatypes to a programming language
[12:57] < vixey> (and they each get an induction principle derived with them)
[12:57] < vixey> by Curry-Howard this means that data constructors are logical introduction rules and the induction principles/elimination rules correspond to recursion combinators
[12:57] < vixey> so in the style of Peano, declaring N : * with constructors
[12:57] < vixey> zero : N
[12:57] < vixey> succ : N -> N
[12:58] < vixey> the derived induction scheme (I'll call it N!) is: N! : ||(P:N->*), (P zero) -> (||(n:N), P n -> P (succ n)) -> ||n, P n
[12:58] < vixey> I'll try to illustrate the anatomy of it with an ASCII diagram..
[12:59] < vixey> N! <-- eliminator
[12:59] < vixey> : ||(P:N->*), <-- motive
[12:59] < vixey> (P zero) -> /___ methods
[12:59] < vixey> (||(n:N), P n -> P (succ n)) -> \
[12:59] < vixey> ||n, P n <-- target
[12:59] < vixey> (took that terminology from Elimination with a Motive)
[12:59] < vixey> I'm sort of running out of time (I thought I'd have 10 more mins..)
[12:59] < vixey> but you can define functions like addition: x + y := N! (\_, N) y (\_ m, succ m) x
[13:00] < vixey> All of the normal logical connectives can be defined similarly,
[13:00] < vixey> False with no constructors leads to the induction principle:
[13:00] < vixey> False! : ||(P:False -> *), ||(f : False), P f, i.e. for any P:*, False -> P
[13:01] < vixey> You can even reflect the alpha-beta conversion relation into a term of the type theory
[13:01] < vixey> Eq : ||(A:*), A -> A -> * with the constructor (or introduction rule) reflexivity : ||(A:*), ||(x:A), Eq A x x
[13:01] < vixey> Now we can prove that 1 + 1 = 2!
[13:01] < vixey> reflexivity N 2 : Eq N (1 + 1) 2
[13:02] < vixey> This (:) judgement is derivable because both (1 + 1) and 2 are alpha-beta convertible to 2
[13:02] < vixey> I'll just wrap up with an induction proof then
[13:03] < vixey> in the same way as 1 + 1 = 2, one can show forall x, 0 + x = x: \x, reflexivity N x : ||x, Eq N (0 + x) x
[13:03] < vixey> but not ||x, Eq N (x + 0) x
[13:03] < vixey> it has to be split into the case 0 + 0 = 0, and assuming x + 0 = x, then showing succ x + 0 = succ x
[13:04] < vixey> N! (\x, Eq N (x + 0) x)
[13:04] < vixey> (reflexivity N 0)
[13:04] < vixey> (\x eql, Eq! N (x + 0) (\x', Eq N (succ (x + 0)) (succ x')) (reflexivity N (succ (x + 0))) x eql) : ||x, Eq N (x + 0) x
[13:05] < vixey> that's the actual (quite complex) term, which is using the Eq elimination rule -- which corresponds to substitution (just what you'd use to prove symmetry and transitivity)
[13:05] < vixey> I was going to say a bit about proof by reflection but I think I'd better stop now, being 6 mins overtime :)
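The encodings and closing proofs can be checked in a modern proof assistant; here is a sketch in Lean 4 notation (`CFalse`, `CNot`, `CAnd`, `conj`, `pi1`, `pi2` are local names introduced to avoid clashing with Lean's built-ins, and `||` is written `∀`):

```lean
-- Church-style absurdity and negation, as in the seminar:
def CFalse : Prop := ∀ P : Prop, P
def CNot (A : Prop) : Prop := A → CFalse

-- conjunction with its introduction rule and projections:
def CAnd (A B : Prop) : Prop := ∀ C : Prop, (A → B → C) → C
def conj {A B : Prop} (a : A) (b : B) : CAnd A B := fun _ z => z a b
def pi1 {A B : Prop} (p : CAnd A B) : A := p A (fun a _ => a)
def pi2 {A B : Prop} (p : CAnd A B) : B := p B (fun _ b => b)

-- triple negation implies single negation (the proof term above):
example {P : Prop} : CNot (CNot (CNot P)) → CNot P :=
  fun N p => N (fun N' => N' p)

-- 1 + 1 = 2 holds by reflexivity, since both sides convert to 2:
example : 1 + 1 = 2 := rfl

-- Lean's + recurses on its second argument (the mirror image of the
-- seminar's definition), so here x + 0 = x is definitional while
-- 0 + x = x is the direction that needs the induction principle:
example (x : Nat) : x + 0 = x := rfl
example : ∀ x : Nat, 0 + x = x := by
  intro x
  induction x with
  | zero => rfl
  | succ n ih => rw [Nat.add_succ, ih]
```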
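The N! eliminator and the addition defined with it can be run as a program. A minimal Python sketch (the names `nat_elim` and `to_int` are introduced here for illustration), specialized to a constant motive so N! is just the recursion combinator on Peano numerals:

```python
# Peano numerals as tagged tuples: zero and succ n
zero = ("zero",)
def succ(n): return ("succ", n)

def nat_elim(base, step, target):
    """N! with a constant motive: fold over `target`.
    base  corresponds to the method for zero (P zero),
    step  to the method for succ (given n and P n, build P (succ n))."""
    if target[0] == "zero":
        return base
    n = target[1]
    return step(n, nat_elim(base, step, n))

def add(x, y):
    # x + y := N! (\_, N) y (\n m, succ m) x  -- recursion on x
    return nat_elim(y, lambda n, m: succ(m), x)

def to_int(n):
    return 0 if n[0] == "zero" else 1 + to_int(n[1])

one = succ(zero)
print(to_int(add(one, one)))  # 2
```

Note that because `add` recurses on its first argument, `0 + x` reduces to `x` by computation alone, while `x + 0 = x` needs induction, exactly as in the closing proof above.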