
COMP2600 Lectures in 2007
Hoare's Logic of Programming
Lectures 13-15, August 14-16, 2007
© Malcolm Newey, Aust. Natl. Univ.

The key concepts are Hoare's Precondition/Postcondition notation and the notion of partial correctness of programs.

Introduction
• Reliability is sometimes important.
• ... and sometimes very important:
  – Chemical plants,
  – Fly-by-wire aircraft,
  – Pacemakers,
  – Programming competitions.
• High reliability requires rigorous verification.

Verification
• What gets verified in Computing?
  – Hardware,
  – Compilers,
  – Application Programs,
  – Even Specifications.
• How do we do it?
  – Systematic testing,
  – Structured walkthroughs,
  – Assertion testing,
  – Formal verification.

V & V of Products
V & V stands for Validation and Verification. It is a standard term used in systems engineering.
• What is verification?
  – Checking that the system matches its specification.
• What is validation?
  – Checking that the specification reflects intentions.
Building the system right versus building the right system.

Prerequisites for Program Verification
• Need formal semantics of languages.
• Need to formally model the application domain.
• Need a vast library of mathematical theories.
• Need decision procedures.
• Need people.
• Need the infrastructure for managing such endeavours.

Formal Program Verification
(Theorems about what code should do)
• Program verification is about proving properties of programs;
• In particular, it is about proving they meet their specifications.
• Proofs guarantee correctness.
• They are mechanically checkable.

Specification Examples
A specification is made up of all the requirements. Here are requirements from 3 very different systems.
1. If Fib(n) is a function to compute the n'th Fibonacci number, and k is a non-negative integer, then Fib(k + 2) = Fib(k + 1) + Fib(k).
2. The routine avoid-melt-down should complete within 600 msec.
3. The OS process called FileSystem never terminates.

Preconditions and Postconditions
• Preconditions and postconditions are predicates that are deemed to hold before and after (respectively) the execution of a code fragment.
• We wish to say things like "If x > 2 before y := x*(x+1) is executed then y + x > 8 is true afterwards".
• In this example, "x > 2" is a precondition while "y + x > 8" is a postcondition.

Input-Output specifications
• Also called "black-box" specifications.
• A routine R has been specified once we say that its output (if there is one) will satisfy some postcondition Q whenever the input satisfies precondition P.
• We capture this situation by writing the notation {P} R {Q}.
• Example: We might specify a square root function by saying that its output, sqrt(x), must satisfy the postcondition
      |(sqrt(x))² − x| / x ≤ 10⁻⁶
  provided its input, x, satisfies the precondition x > 0.

Hoare's Notation
• Used for i/o specification of any code.
• The Hoare triple (so called), {P} A {Q}, means "If P is true in the initial state and A terminates then Q will hold in the final state."
• Examples:
  1. {x = 2} x := x+1 {x = 3}
  2. {x = a} x := x+1 {x = a + 1}
  3. {x > 2} y := x*(x+1) {y > 8}

Partial Correctness
• Hoare's notation expresses 'partial correctness'. We say a program is partially correct if it gives the right answer whenever it terminates. That is, it never gives a wrong answer.
• {P} A {Q} does NOT imply that A terminates, even if P holds initially.
• So {x = 1} while x=1 do y:=2 {x = 3} is a true statement.
• Surprising, perhaps, but the notion is sensible.
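A Hoare triple can be explored empirically by running the code fragment from every sampled initial state that satisfies the precondition and checking the postcondition on the resulting state. Below is a minimal Python sketch; the `check_triple` helper and the dictionary representation of states are illustrative inventions, not lecture material. Testing like this can refute a triple but never proves it.

```python
def check_triple(pre, prog, post, states):
    """Test {pre} prog {post} on the given sample of initial states.

    Only states satisfying the precondition matter; for each of those,
    run the program and check the postcondition on the final state.
    """
    for s in states:
        if pre(s):
            t = prog(dict(s))  # prog maps an initial state to a final state
            assert post(t), f"counterexample: {s} -> {t}"
    return True

# Example 3 from the slides: {x > 2} y := x*(x+1) {y > 8}
samples = [{"x": x, "y": 0} for x in range(-10, 11)]
ok = check_triple(
    pre=lambda s: s["x"] > 2,
    prog=lambda s: {**s, "y": s["x"] * (s["x"] + 1)},
    post=lambda s: s["y"] > 8,
    states=samples,
)
```

States where x ≤ 2 are skipped, mirroring the fact that the triple promises nothing about them.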
Notes on Hoare's Notation
• The original notation for propositions in this extension of predicate calculus was P {A} Q instead of {P} A {Q}.
• The now-accepted notation is nice because assertions in many languages are written as comments.
• The textbook is not so definite about what the meaning of the notation actually is.

Initial value convention
• In the following Hoare triple

      {True}
      while x<>y do
        if x>y then x:=x-y
        else y:=y-x
      {x = gcd(xα, yα)}

  xα and yα indicate the initial values of x and y.
• Usually, a subscript alpha in a postcondition is used to indicate the value that its variable had initially.
• The above statement is true even though the loop will not terminate for some initial (xα, yα) pairs.
• This is a typical initial value convention.

Partial Correctness is Good
• Why not insist on termination?
  – It may not be possible to achieve.
  – It is usually important not to have the wrong answer.
• Termination is a different issue to 'getting the right answer'.
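The gcd loop above can be run directly, saving the initial values to play the role of xα and yα; Python's `math.gcd` serves as an independent oracle for the postcondition. This is a sketch of the convention (the name `subtractive_gcd` is mine), and, as the slides note, the loop fails to terminate for some inputs (e.g. when an argument is zero), which is exactly why the triple is only partially correct.

```python
import math

def subtractive_gcd(x, y):
    """The loop from the slides: while x<>y do if x>y then x:=x-y else y:=y-x.

    Terminates only for suitable inputs (e.g. both positive).
    """
    x_alpha, y_alpha = x, y  # saved initial values: the subscript-alpha convention
    while x != y:
        if x > y:
            x = x - y
        else:
            y = y - x
    # Postcondition, stated in terms of the initial values: x = gcd(x_alpha, y_alpha)
    assert x == math.gcd(x_alpha, y_alpha)
    return x

result = subtractive_gcd(12, 18)  # 12,18 -> 12,6 -> 6,6: terminates with x = 6
```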
Weak and strong conditions
• The condition (x > 0) says less about a state than does the condition (x = 5). It is the weaker condition.
• The condition (x = 6) says more about a state than does the condition (x > 0). It is the stronger condition.
• The statement {x = 5} x:=x+1 {x = 6} says more about the code x:=x+1 than does {x = 5} x:=x+1 {x > 0}.
• We normally say P is stronger than Q in the cases where P ⇒ Q,
• but the statement {x > 0} x:=x+1 {x > 1} says more about the code x:=x+1 than does {x = 5} x:=x+1 {x > 1}.
• In general, if precondition P1 is weaker than P2 then {P1} A {Q} is stronger than {P2} A {Q}.
• In general, if postcondition Q1 is a stronger condition than Q2 then {P} A {Q1} is stronger than {P} A {Q2}.
• Usually we are interested in strong postconditions and weak preconditions (because they promote strong specifications).

Proof rule for Strengthening Preconditions
• It is OK to make a precondition more specific.
• The rule:

      {Pw} S {Q}    Ps ⇒ Pw
      ---------------------
           {Ps} S {Q}

• An instance:

      {x > 2} x:=x+1 {x > 3}    (x = 4) ⇒ (x > 2)
      -------------------------------------------
              {x = 4} x:=x+1 {x > 3}

Recap of Lecture 13
• Preconditions, postconditions and Input/Output specifications.
• The Hoare triple {P} A {Q} means "If P is true in the initial state and A terminates then Q will hold in the final state."
• We say a program is partially correct if it gives the right answer whenever it terminates. That is, it never gives a wrong answer.
• We say P is stronger than Q in the cases where P ⇒ Q.
• Usually we are interested in strong postconditions and weak preconditions.

Proof rule for Weakening Postconditions
• It is OK to weaken a postcondition so it says less.
• The rule:

      {P} S {Qs}    Qs ⇒ Qw
      ---------------------
           {P} S {Qw}

• An instance:

      {x > 2} x:=x+1 {x > 3}    (x > 3) ⇒ (x > 1)
      -------------------------------------------
              {x > 2} x:=x+1 {x > 1}

The Assignment Axiom
• Suppose Q(x) indicates a formula involving a variable, x, and that Q(z) indicates the same formula with all occurrences of x replaced by the term z.
• The assignment axiom of 'Hoare Logic' is:

      {Q(e)} x:=e {Q(x)}

Assignment Examples
1. Take the assignment x:=y+3 and the desired postcondition (x > 3). Then
       {y + 3 > 3} x:=y+3 {x > 3}
   is an instance of the assignment axiom. Thus an appropriate precondition is (y > 0).
2. The following is an instance of the assignment axiom:
       {(x + 1) > 5} x:=x+1 {x > 5}
   So (x > 4) is sufficient to guarantee (x > 5) after x:=x+1.
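The assignment axiom is mechanical: the precondition is obtained by substituting the assigned expression into the postcondition. Representing predicates as Python functions of a state makes that substitution a one-liner; the `wp_assign` name and the state encoding are my own sketch, not lecture material.

```python
def wp_assign(var, expr, post):
    """Precondition Q(e) for the assignment `var := expr` with postcondition Q(var):
    evaluate the postcondition in the state updated with the expression's value."""
    return lambda s: post({**s, var: expr(s)})

# The instance {(x + 1) > 5} x:=x+1 {x > 5}
post = lambda s: s["x"] > 5
pre = wp_assign("x", lambda s: s["x"] + 1, post)

hit = pre({"x": 5})    # (5 + 1) > 5: holds, since x > 4
miss = pre({"x": 4})   # (4 + 1) > 5: fails, so x = 4 is not enough
```

The computed `pre` is exactly the condition (x + 1) > 5, i.e. x > 4, matching the slide's conclusion.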
Another Assignment Example
1. Take the assignment x:=x+b+1 and the desired postcondition (the Q(x) of the rule):
       (a = x + y) ∧ (b = 2) ∧ (x = y + b)
2. The rule gives a precondition which is
       (a = x + b + 1 + y) ∧ (b = 2) ∧ (x + b + 1 = y + b)

This Makes Sense!
1. Suppose the value assigned (x + b + 1) is 12.
2. The postcondition in the final state is then
       (a = 12 + y) ∧ (b = 2) ∧ (12 = y + b)
3. The above precondition in the initial state is also
       (a = 12 + y) ∧ (b = 2) ∧ (12 = y + b)

Why Believe?
• The assignment axiom of 'Hoare Logic' is: {Q(e)} x:=e {Q(x)}
• Why is it so?
  – Let v be the value assigned.
  – If Q(e) is true initially, then so is Q(v). Since the variable x has value v after the assignment, Q(x) holds after that assignment.

The Assignment Axiom is Optimal
• The Hoare triple in the assignment axiom, {Q(e)} x:=e {Q(x)}, is as strong as possible (i.e. Q(e) is the weakest appropriate precondition).
• Why?
  – Again, let v be the value assigned.
  – If Q(e) is false initially, then so is Q(v). Since the variable x has value v after the assignment, Q(x) cannot hold after that assignment.
  – So any precondition weaker than Q(e) is too weak to guarantee the postcondition Q(x).

Proof rule for Sequencing
• The sequencing rule:

      {P} S1 {Q}    {Q} S2 {R}
      ------------------------
          {P} S1; S2 {R}

• An instance:

      {x > 2} x:=x+1 {x > 3}    {x > 3} x:=x+2 {x > 5}
      ------------------------------------------------
              {x > 2} x:=x+1; x:=x+2 {x > 5}
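The worked instance of the sequencing rule, {x > 2} x:=x+1; x:=x+2 {x > 5}, can be traced in code with the intermediate assertion x > 3 checked between the two assignments. A sketch; the assert placement mirrors the rule's intermediate condition Q.

```python
def sequenced(x):
    assert x > 2   # precondition P
    x = x + 1      # S1
    assert x > 3   # intermediate assertion Q, shared by both hypotheses
    x = x + 2      # S2
    assert x > 5   # postcondition R
    return x

val = sequenced(3)  # 3 -> 4 -> 6
```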
Validity of rule for sequences
• Take the hypotheses, H1 and H2, to be true.
• Let σ be an arbitrary state that satisfies P.
  H1 says that after S1 executes in state σ, Q will be true.
  H2 says that when S2 executes in this resultant state, a state will result which satisfies R.
• But S1; S2 just means execute S1 and then execute S2. So, when S1; S2 executes (starting in state σ) the resulting state will satisfy R.
• What about non-termination?

Proof rule for conditionals
• Conditional statement rule:

      {P ∧ b} S1 {Q}    {P ∧ ∼b} S2 {Q}
      ---------------------------------
        {P} if b then S1 else S2 {Q}

• This rule is in ideal form for goal-directed proving.
• If our goal is to prove something about a conditional statement, the rule gives exactly the right subgoals.

Why is it so?
• Assume the hypotheses of the rule, aiming to prove the conclusion.
• To prove the conclusion, assume P is true in an initial state, σ.
  If b is true in state σ, use the 1st hypothesis to show Q follows.
  If b is false in state σ, use the 2nd hypothesis to show Q follows.
• In either case Q holds. Thus the conclusion of the rule holds.

Example of conditionals, a la Hoare
• If we wish to prove
      {x > 2} if x>2 then y:=1 else y:=-1 {y > 0}
  then the proof rule for conditionals suggests we first prove
      {x > 2 ∧ x > 2} y:=1 {y > 0}  and  {x > 2 ∧ ∼(x > 2)} y:=-1 {y > 0}
  That is, {x > 2} y:=1 {y > 0} and {F} y:=-1 {y > 0}.
• For the first subgoal, the assignment axiom tells us that {T} y:=1 {y > 0}. This is good enough since (x > 2) ⇒ T.
• For the second subgoal, the assignment axiom tells us {F} y:=-1 {y > 0}. A perfect match!

A simpler proof rule for conditionals?
• Consider this rule:

      {P} S1 {Q}    {P} S2 {Q}
      ----------------------------
      {P} if b then S1 else S2 {Q}

• Is it valid?
• Why isn't it the standard rule? The 'subgoals' may not be true!

The problem illustrated
• Consider the following goal:
      {T} if x>y then max:=x else max:=y {max ≥ x}
• What subgoals does the simpler rule suggest?
      {T} max:=x {max ≥ x}  and  {T} max:=y {max ≥ x}
• The second proposition is not always true... but the first one is.
• So this streamlined rule is not effective for this program.
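The max example shows why each branch needs its branch condition as an extra hypothesis: the else-branch establishes max ≥ x only because there we know ∼(x > y). A runnable sketch of that argument:

```python
def branching_max(x, y):
    # Standard conditional rule: prove {P ∧ b} S1 {Q} and {P ∧ ∼b} S2 {Q}.
    if x > y:
        m = x   # here b holds: x > y, so m = x >= x trivially
    else:
        m = y   # here ∼b holds: not (x > y), i.e. y >= x, so m = y >= x
    assert m >= x and m >= y   # postcondition Q holds in either case
    return m

a = branching_max(7, 3)  # takes the then-branch
b = branching_max(2, 9)  # takes the else-branch
```

The else-branch comment is precisely the information the "simpler" rule throws away, which is why its subgoal {T} max:=y {max ≥ x} is unprovable.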
Proof rule for while loops
• The Loop Rule:

      {P ∧ b} S {P}
      --------------------------
      {P} while b do S {P ∧ ∼b}

• P is called the loop invariant.
• This rule suggests a subgoal when we know what we want to prove about a loop.
• The subgoal {P ∧ b} S {P} is easier to show than the simpler invariant property {P} S {P}.

Why is it so?
• Assume the invariant property holds: {P ∧ b} S {P}.
• To prove the conclusion, assume also that P holds in an arbitrary state, σ.
• If the loop never terminates, starting in state σ, the conclusion of the rule holds.
• Otherwise the loop terminates after n iterations. At the start of each iteration b holds, and by induction (using the fact that P is an invariant), P holds:
  – For the first iteration, P holds initially and at the end (using the fact that P is an invariant).
  – The same (P holds initially and finally) applies to subsequent iterations, by induction.
• Thus P ∧ ∼b holds on exit from the loop.

Example of while loops, a la Hoare
• Suppose we wish to prove {n ≥ 0} while n > 0 do n:=n-1 {n = 0}.
• Is the predicate (n ≥ 0) a suitable invariant?
• The rule suggests the subgoal {(n ≥ 0) ∧ (n > 0)} n:=n-1 {n ≥ 0}, which shows that it is!
• Since that subgoal is easily proved, the rule tells us:
      {n ≥ 0} while n > 0 do n:=n-1 {(n ≥ 0) ∧ ∼(n > 0)}
• Done!

Let's Prove a Program!
• Program (with specification):

      {True}
      i:=0; s:=0;
      while i<>n do
        i:=i+1;
        s:=s+(2*i-1)
      {s = n²}

• How does it work?

Mathematical Underpinnings
• Observation: 1 + 3 + 5 + 7 = 4²  and  1 + 3 + 5 + 7 + 9 + 11 + 13 = 7².
• General Rule (easily provable):
      ∀n. (n ≥ 1) ⇒ Σ_{i=1..n} (2i − 1) = n²
  – Base Case (n = 1): Trivial!
  – Step Case: Σ_{i=1..n} (2i − 1) = n²  ⇒  Σ_{i=1..n+1} (2i − 1) = (n + 1)².
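Before proving anything, the candidate invariant for the squaring program can be sanity-checked at run time by asserting s = i² at the top of every iteration; on exit the negated guard forces i = n, and hence s = n². A Python sketch: the asserts only test the invariant on particular runs, they do not prove it, and for n < 0 the loop would not terminate, matching the partial-correctness caveat.

```python
def square_by_odds(n):
    """Sum the first n odd numbers: 1 + 3 + ... + (2n - 1) = n**2."""
    i, s = 0, 0
    while i != n:
        assert s == i * i   # candidate loop invariant P, checked each iteration
        i = i + 1
        s = s + (2 * i - 1)
    # Exit: P ∧ ∼b, i.e. s == i*i and i == n, hence s == n*n
    assert s == n * n
    return s

sq = square_by_odds(7)  # 1 + 3 + ... + 13 = 49
```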
Can we Prove that Program?
First think about a loop invariant; {s = i²} seems plausible.
• {s + (2i − 1) = i²} s:=s+(2*i-1) {s = i²}
• {s + (2(i+1) − 1) = (i+1)²} i:=i+1 {s + (2i − 1) = i²}
• {s = i²} i:=i+1; s:=s+(2*i-1) {s = i²}
  (note that s = i² is equivalent to s + (2(i+1) − 1) = (i+1)², so the sequencing rule applies)
So far, so good. (It is an invariant.)

Completing the Proof
Strengthen the precondition to match the While rule hypothesis:
• {s = i² ∧ i ≠ n} i:=i+1; s:=s+(2*i-1) {s = i²}
Now use the While rule:
• {s = i²} while i<>n do ... s:=s+(2*i-1) {s = i² ∧ i = n}
and check that initialization establishes the invariant:
• {True} i:=0; s:=0 {s = i²}
• {True} Program {s = n²}

How did we Prove that Program?
• Coming up with the invariant, {s = i²}, required intuition.
• Proving it is an invariant was straightforward.
• Applying the While rule was automatic.
• So was pushing the loop precondition through the initialization.
• Note the program precondition does not guarantee termination.

Are the Rules Complete?
• We focussed on Soundness (i.e. every provable Hoare triple is true).
• Although we showed that each of the rules is sound, there were some hidden assumptions.
• With the same assumptions, the rules are also complete* for the language fragment (assignment, conditional, while, sequence).
  * A logic is complete if every true expression is provable in the logic.

What about Termination
• Remember: Hoare Logic is for Partial Correctness.
• There are separate techniques to show termination.
• A simple technique is to identify, for each loop, an integer expression which is always positive but which decreases each time around the loop.
• Partial Correctness + Termination = Total Correctness.

Assumptions? What Assumptions?
• The language of assertions is the same as the sub-language of expressions.
• We assumed no aliasing of variables.
  (In most real languages we can have multiple names for the one variable.)
How is aliasing a problem?
• Suppose x and y refer to the same variable.
  – We get {y + 1 = 5 ∧ y = 5} x:=y+1 {x = 5 ∧ y = 5}
  – i.e. {y = 4 ∧ y = 5} x:=y+1 {x = 5 ∧ y = 5}

How about Procedures
• There are proof rules for Procedures, but you won't find them in any textbook.
• Block structure and declarations are part of that story.
• The problem is the complexity of the procedure rules.
• Functions actually are easier!

References
The textbook has material on Hoare Logic:
• Grassman & Tremblay, "Logic and Discrete Mathematics: A Computer Science Perspective", Prentice-Hall, Chapter 9, pages 481-518.
The seminal paper by Tony Hoare is:
• Hoare, C.A.R., "An Axiomatic Basis for Computer Programming", Communications of the ACM, October 1969.
A comprehensive history of Hoare Logic appears in:
• Apt, K.R., "Ten Years of Hoare Logic: A Survey", ACM Transactions on Programming Languages and Systems, October 1981.

Conclusion
There are several other formal systems for reasoning about code. So, what is the point of Hoare logic?
• Hoare triples are pervasive, in textbooks and in connection with other formal systems.
• They compactly and rigorously capture what we often want to say about code.
