On this page:
1 Purpose
2 Outline
3 More on pattern matching
4 Back to And

Lecture 15: Propositional logic & programming

1 Purpose

More propositional logic, more practice programming, begin with tactics

2 Outline

As noted before, logical negation is defined in terms of False, and notation is provided as:

notation:max "¬" p:40 => Not p
#print Not -- def Not : Prop → Prop := fun a => a → False

That is, ¬P holds if P -> False. This is a constructive view of negation: the negation of P is true if, from P, you can construct a proof of False. It is also why we end up using False a lot in proofs, even while we rarely use True, since whenever we are reasoning about negation, we end up reasoning about False.
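
To see this concretely, here are two small examples that follow directly from unfolding Not (they are just illustrations, but they should typecheck as written):

-- ¬False unfolds to False → False, so the identity function proves it
example : ¬ False := fun h => h
-- from a proof of P we can refute ¬P: apply the assumed refutation to the proof
example (P : Prop) (hp : P) : ¬¬P := fun hnp => hnp hp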

One obvious consequence of this is that the statement And P (Not P) is nonsense: a statement can never be both true and false. If we unfold the definition of Not, we see that this statement claims we have both P -> False and P. Thus, we can project out the P -> False function and the P, apply the first to the second, and get a proof of False.

variable (P : Prop)
theorem F : (And P (Not P)) -> False :=
  fun andpnotp => andpnotp.2 andpnotp.1

Now that we have a proof of False (a term that has type False), we truly are in an impossible situation, since, if you recall, there are no terms of type False. It is defined as an inductive type with no constructors:

inductive False : Prop

As a result, there is an elimination rule (we will talk more about these soon) that says, if we have a proof of False, we can conclude anything.

#check False.elim -- False.elim : False → ?m.2
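
The ?m.2 is a metavariable standing in for the implicit conclusion. To see that implicit argument spelled out, we can check the fully explicit form (the exact universe name Lean prints may vary):

#check @False.elim -- something like: @False.elim : {C : Sort u_1} → False → C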

For example, if we wanted to prove the statement 1 = 2 (which is a perfectly well-formed statement, but not a provable one), and we knew And P (Not P), we could do that as follows:

theorem T : (And P (Not P)) -> 1 = 2 :=
  fun andpnotp => False.elim (andpnotp.2 andpnotp.1)

i.e., we first pull out the two components of the conjunction, applying the second to the first, which gives us a term of type False. We can then use False.elim, which takes something of type False and returns anything! In particular, it will produce a proof of 1 = 2 in this case.
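
There are other ways to write the same proof. The library function absurd packages up exactly this pattern (from a proof and its refutation, conclude anything), and nomatch lets us do the empty case analysis on False directly. As a sketch (T' and T'' are just fresh names for these variants):

theorem T' : (And P (Not P)) -> 1 = 2 :=
  fun andpnotp => absurd andpnotp.1 andpnotp.2

theorem T'' : (And P (Not P)) -> 1 = 2 :=
  fun andpnotp => nomatch (andpnotp.2 andpnotp.1)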

3 More on pattern matching

So far, the functions we’ve shown have done pattern matching on a single argument. While you could certainly nest that, more often, you want to match simultaneously on multiple values. You can do that by separating the patterns (and, in a match expression, the values) with commas, as:

def bool_and : Bool -> Bool -> Bool
| true, true => true
| true, false => false
| false, true => false
| false, false => false

def bool_and1 (b1 : Bool) (b2 : Bool) : Bool := match b1, b2 with
| true, true => true
| true, false => false
| false, true => false
| false, false => false

def bool_and2 (b1 b2 : Bool) : Bool := match b1, b2 with
| true, true => true
| true, false => false
| false, true => false
| false, false => false

def bool_and3 (b1 b2 : Bool) : Bool := match b1, b2 with
| true, true => true
| _,_ => false

def bool_and4 (b1 b2 : Bool) : Bool := match b1, b2 with
| true, true => true
| true, false 
| false, true 
| false, false => false
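
All of these definitions compute the same function, and we can spot-check a few cases (the expected results are in the comments):

#eval bool_and true true    -- true
#eval bool_and3 true false  -- false
#eval bool_and4 false true  -- false
-- since both sides compute, agreement on particular inputs is provable by rfl
example : bool_and1 true false = bool_and4 true false := rfl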

4 Back to And

We’ve been using logical conjunction, i.e., And P Q where P and Q are Props. We wrote a function and_1 that took in a pair of propositions and returned the first one:

variable (Q : Prop)
def and_1 (cn : P ∧ Q) : P := cn.1

Now, we could also have written a function that took a pair of booleans and returned the first one:

def band_1 (pr : Bool × Bool) :=  pr.1

Note a few differences: first, rather than a conjunction of two arbitrary logical statements (a pair of two Props), we have a pair of Bools. To do this, we use a different infix operator: the logical ∧ is for types of type Prop, while × is for types of type Type. If you aren’t sure what a notation means, you can right-click on it and select "Go to Definition"; in this case, that takes us to the syntax file, where we can see that × is notation for Prod. So we can write this definition equivalently as:

def band_2 (pr : Prod Bool Bool) :=  pr.1
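
Since × is just notation for Prod, band_1 and band_2 are the same function, which we can check on a concrete pair:

#eval band_1 (true, false) -- true
example : band_1 (true, false) = band_2 (true, false) := rfl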

And if we go to definition on Prod, we can see that it is defined as:

structure Prod (α : Type u) (β : Type v) where
  /-- The first projection out of a pair. if `p : α × β` then `p.1 : α`. -/
  fst : α
  /-- The second projection out of a pair. if `p : α × β` then `p.2 : β`. -/
  snd : β

As in, it is structurally identical to logical And, but defined on Type, not Prop.
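
Indeed, if we go to definition on And itself, we see essentially the same structure (modulo docstrings), just living in Prop, with fields named left and right instead of fst and snd; the projections .1 and .2 work for both:

structure And (a b : Prop) : Prop where
  intro ::
  left : a
  right : b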

We can use this, for example, to write a more interesting function, this time showing how to define local functions or definitions:

def fibFast (n : Nat) : Nat :=
  (loop n).2
where
  loop : Nat → Nat × Nat
    | 0   => (0, 1)
    | n+1 => let p := loop n
             (p.2, p.1 + p.2)

#eval fibFast 100

Here we have used yet another special syntax: (a,b) as a way to write product literals (as an alternative to Prod.mk a b).
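
That is, (a, b) and Prod.mk a b build the same term, which we can confirm:

#check (true, 3)       -- Bool × Nat
#check Prod.mk true 3  -- Bool × Nat
example : (true, 3) = Prod.mk true 3 := rfl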

Returning to our logical statement, there are different ways to write the same thing. For example, to make it read a little more like the logical statement ∀ P Q, P ∧ Q → Q, we could write:

def and_2 : forall {P Q : Prop}, P ∧ Q -> Q := fun x => x.2

First, we give a name to our theorem, and_2. This name is then given a type, which is the theorem statement: forall {P Q : Prop}, P ∧ Q -> Q. Unlike in the definitions we saw last time, we aren’t using variables: instead, we are taking the two Props as arguments. We have written them (P and Q) inside curly brackets instead of parentheses to tell Lean that they should be inferred from the rest of the arguments: in this case, from x. This is a convenience that makes calling and_2 easier: since the type of x has P and Q in it, Lean is perfectly capable of figuring out what those arguments should be. Finally, we have the term, which is the proof; it takes in a pair and projects out the second component. This difference in presentation doesn’t change what the underlying term is, but it may make it easier to read. Sometimes we will try to emphasize the programming part and sometimes the logical part: both are, indeed, the same thing.

Now, note that our term doesn’t take the P or Q as arguments. That is because we’ve indicated, with the curly brackets, that these arguments should be able to be inferred from the rest of the arguments. In particular, if we are given a pair of type P ∧ Q, Lean can figure out what P and Q are. We could have equally written the definition as:

def and_2' : forall (P Q : Prop), P ∧ Q -> Q := fun P Q x => x.2

Though if we do that, Lean will warn us that the term parameters P and Q are unused, which is a good indication that they might be able to be inferred. This is also an example (one of many) of features that exist in Lean to make it easier to write proofs. Indeed, you can see this going back all the way to the original idea in LCF to allow programming of tactics to make it easier to build proofs.
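
To see the difference at a call site: with and_2 the propositions are inferred from the argument, while with and_2' we must supply them explicitly (True and 1 = 1 below are just arbitrary example propositions):

example (h : True ∧ 1 = 1) : 1 = 1 := and_2 h
example (h : True ∧ 1 = 1) : 1 = 1 := and_2' True (1 = 1) h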