Formal Methods

UIUC CS463
Computer Security

• J. M. Wing, "A Symbiotic Relationship Between Formal Methods and Security," Proceedings of the NSF Workshop on Computer Security, Fault Tolerance, and Software Assurance: From Needs to Solutions, December 1998.
• Bishop, Chapter 20


• Types of formal methods
• Subtle errors in protocols
• Three illustrative case studies

• Formal method: an automated technique, based on mathematical logic, used to analyze a property of a system
• The National Security Agency was the major source of funding for formal methods research and development in the 70s and early 80s
   – Formal security models
   – Tools for reasoning about security
   – Applications of these tools to prove systems secure
• The growth of the Internet has brought security to the attention of a much broader community
   – What kinds of problems can formal methods help solve in security?
   – What problems will formal methods never help solve?

The Limits of Formal Methods
• Systems will never be 100% secure
   – Formal methods will not break this axiom
• Assumptions about the system's environment
   – Hard to state explicitly
   – The system could be deployed in an environment it was not originally designed for
      • For convenience or for lack of an alternative
   – Clever intruders find out how to violate these assumptions
• Security is not an either/or property
   – Pay more, gain more
   – e.g., passwords, certificates, and biometrics are measured in terms of degree of security for authentication

What Formal Methods Can Do
• Delimit the system's boundary: the system and its environment
• Characterize a system's behavior more precisely
• Define the system's desired properties
• Prove a system meets its specification
  – Tell under what circumstances a system cannot meet its specification

How They Can Help
• These capabilities of formal methods help practitioners in two ways
  – Through specification, focusing the designer's attention on questions such as:
     • What is the interface?
     • What are the assumptions about the system?
     • What is the system supposed to do under this condition and that condition?
     • What are the system's invariant properties?
  – Through verification
     • Prove a system meets its security goals
     • Find the weaknesses of the system


• Early formal methods research, funded by the National Security Agency, centered on proving systems secure
  – Bell-LaPadula model
  – Biba integrity model
  – Clark-Wilson model
• The systems of interest to prove secure were operating systems, more specifically, security kernels

Process of Proving

• The process of proving entails three parts
  – Formal specification
     • State the property of the system
     • For example: the *-property
  – Modeling
     • Model the system so that one can formally prove the property
     • The model might be a semantic structure like a state machine or a syntactic structure like a logical expression
  – Proof
     • Methods rely on induction over traces of the state-machine model, or on deduction to show that an implication holds
     • Proofs may be fully automatic or may require interactive human guidance
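The induction-over-traces idea can be illustrated with a toy model. A minimal sketch, not from the lecture (the counter machine and the function name `inductive_check` are hypothetical): prove an invariant by checking the base case and checking that every transition out of any state satisfying the invariant preserves it.

```python
def inductive_check(states, init, step, inv):
    """Prove inv by induction over traces: base case on the initial state,
    inductive step on every state satisfying inv (not just reachable ones)."""
    if not inv(init):
        return (False, ("base", init))
    for s in states:
        if inv(s):
            for t in step(s):
                if not inv(t):
                    return (False, ("step", s, t))
    return (True, None)

# Toy state machine: a counter modulo 8 that steps by 2.
states = range(8)
step = lambda s: [(s + 2) % 8]     # the machine's single transition rule
inv = lambda s: s % 2 == 0         # invariant: the counter stays even

print(inductive_check(states, 0, step, inv))  # (True, None)
```

If the inductive step fails, the returned pair names the offending transition, which plays the role of the counterexample a prover would report.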

The Orange Book
• US Trusted Computer System Evaluation Criteria – the Orange Book
  – Produced by the NCSC (National Computer Security Center) in 1985
  – Provides a standard metric for the NCSC to compare the security of different computer systems
  – Guides computer system vendors in the design and development of secure systems
  – Provides a means for specifying security requirements in Government contracts
     • Levels: D, C1, C2, B1, B2, B3, A1
     • Certification at A1 means that one has formally specified the system's security requirements, formally modeled the system, and formally proved that the model meets its specification

Tools Specific for Security
• Tools built specifically to reason about security
  – The user specifies an insecure state, and the tool searches backwards to determine whether that state is reachable
     • Interrogator
        – Based on Prolog; exhaustive search; fully automatic
     • NRL Protocol Analyzer
        – Based on Dolev and Yao's algebraic term-rewriting model for two-party cryptographic protocols
        – Less automatic

• BAN logic – a logic of authentication
  – Reasons in terms of belief logic
     • Beliefs accumulate during the run of the protocol
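As a rough illustration (a hypothetical encoding, not the actual BAN syntax), belief accumulation can be modeled as a growing set of facts, with an inference rule such as message-meaning adding new beliefs as messages are seen:

```python
# Beliefs held by principal A, seeded with one shared-key belief.
beliefs = {("shares", "A", "Kab", "B")}

def message_meaning(p, key, msg):
    """BAN message-meaning rule (sketch): if P believes P and Q share key K,
    and P sees {msg}K, then P comes to believe that Q once said msg."""
    new = set()
    for b in beliefs:
        if b[0] == "shares" and b[1] == p and b[2] == key:
            new.add(("said", b[3], msg))
    return new

# A sees {Nb}Kab during the run and accumulates a new belief.
beliefs |= message_meaning("A", "Kab", "Nb")
print(("said", "B", "Nb") in beliefs)  # True
```

A protocol analysis in this style replays the message sequence, firing rules at each step, and then checks whether the desired authentication beliefs are in the final set.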

Major Approaches to FM

• Model checking
  – Example: FDR (found the counterexample to the Needham-Schroeder protocol)
• Theorem proving
  – Example: Isabelle
• Software specification
  – Example: Z

Variant of Otway-Rees Protocol
A and B share keys Ka and Kb respectively with S

  1.  A → B:  Na, A, B, {| Na, A, B |}Ka
  2.  B → S:  Na, A, B, {| Na, A, B |}Ka, Nb, {| Na, A, B |}Kb
  3.  S → B:  Na, {| Na, Kab |}Ka, {| Nb, Kab |}Kb
  4.  B → A:  Na, {| Na, Kab |}Ka

Attack on the Example Protocol
• An attack on the protocol
  1.   A → B     :  Na, A, B, {| Na, A, B |}Ka
  1'.  C → A     :  Nc, C, A, {| Nc, C, A |}Kc
  2'.  A → S     :  Nc, C, A, {| Nc, C, A |}Kc, Na', {| Nc, C, A |}Ka
  2''. C(A) → S  :  Nc, C, A, {| Nc, C, A |}Kc, Na, {| Nc, C, A |}Ka
  3'.  S → A     :  Nc, {| Nc, Kca |}Kc, {| Na, Kca |}Ka
  4.   C(B) → A  :  Na, {| Na, Kca |}Ka

  Replacing Na' by A's original nonce Na eventually causes A to accept key Kca as a key shared with B
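Why the substitution works can be seen in A's acceptance check, sketched below (a hypothetical model, not from the lecture): A accepts message 4 whenever the blob under Ka carries a nonce from one of A's own pending runs, and then binds the key to the peer A intended, not the peer S actually issued the key for.

```python
def a_accepts(msg4, pending):
    """A's check on message 4: the nonce must match a run A started.
    pending maps each nonce A has issued to the peer A intended it for."""
    na, blob = msg4
    if na in pending and blob["nonce"] == na:
        # A binds the received key to the *intended* peer, with no
        # evidence in the message about whom S issued the key for.
        return ("accept", blob["key"], pending[na])
    return ("reject", None, None)

pending = {"Na": "B"}   # A sent message 1 to B using nonce Na
# In C's run with S, the intruder substituted A's nonce Na, so S emits
# {| Na, Kca |}Ka; C replays it to A as message 4:
print(a_accepts(("Na", {"nonce": "Na", "key": "Kca"}), pending))
# ('accept', 'Kca', 'B') -- A accepts Kca as a key shared with B
```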

A Correct Protocol

  1.  A → B:  Na, A, B, {| Na, A, B |}Ka
  2.  B → S:  Na, A, B, {| Na, A, B |}Ka, {| Na, Nb, A, B |}Kb
  3.  S → B:  Na, {| Na, Kab |}Ka, {| Nb, Kab |}Kb
  4.  B → A:  Na, {| Na, Kab |}Ka

Three Quick Case Studies

• Formal Simulation of L3A
  – Maude
• Formal Verification of On-Demand Attachments (WSEmail)
  – ProVerif and TulaFale
• Formal Analysis of HIPAA
  – Privacy APIs and SPIN

Study 1: Cramming Attacks

[Figure: an E2E security tunnel between client and server via a Network Access Server (NAS), with an unauthenticated ingress at the NAS.]

Goodloe, Gunter, Stehr 05
Tunnel as Countermeasure

  Challenge: Coordinate the creation of the tunnels

[Figure: L3A set-up. Message flow among Client, NAS, and Server; security policy database entries SPD CS:(CN) and SPD SC:(SN) are installed at the NAS.]

[Figure: L3A set-up with reuse. Entries SPD CS2:(CN) and SPD S2C:(S2N) are installed at the Client, the NAS, and the second server.]

• An English language description
  resembling an IETF RFC is produced.
• A formal specification is written in Maude.
• Systems are modeled using membership
  equational logic and rewriting logic.
• Symbolic simulation acts as a debugging
  aid for the design.


• Maude "logical simulations"
  – Exhaustive breadth-first search
  – Model checking
  – Must abstract things like packet size
  – Not good with timers
  – Not really useful for performance modeling
• OPNET, NS-2 discrete event simulation
  – Good at modeling details (time, etc.)
  – Often used to estimate network performance
  – Not good at finding the extreme cases
  – Probably wouldn't have found several of the L3A concurrency problems
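The exhaustive breadth-first search Maude performs can be sketched as follows (a toy model with hypothetical rule names; it illustrates the "packet slips into a partially set-up tunnel" class of bug rather than the real L3A rules):

```python
from collections import deque

def search(initial, rules, bad):
    """Breadth-first search over all rule interleavings, in the spirit of
    Maude's `search` command; returns a shortest counterexample trace."""
    seen = {initial}
    frontier = deque([(initial, [])])
    while frontier:
        state, trace = frontier.popleft()
        if bad(state):
            return trace
        for name, rule in rules:
            for nxt in rule(state):
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append((nxt, trace + [name]))
    return None

# Toy state: (client_side_up, server_side_up, packet_delivered)
def setup_client(s): yield (True, s[1], s[2])
def setup_server(s): yield (s[0], True, s[2])
def deliver(s):
    # The bug: delivery is guarded only on the client side being up.
    if s[0]:
        yield (s[0], s[1], True)

rules = [("setup_client", setup_client),
         ("setup_server", setup_server),
         ("deliver", deliver)]
bad = lambda s: s[2] and not s[1]   # packet delivered into a half-built tunnel
print(search((False, False, False), rules, bad))
# ['setup_client', 'deliver']
```

The returned trace is the interleaving that exposes the bug, which is exactly the debugging aid the slides describe symbolic simulation providing.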

Overview of Module Interaction

[Figure: module interaction diagram. Modules include L3A, SIKE, setkey, the test harnesses (SIKE Test, L3A Test Concrete, L3A Test Abstract), PKI, Security Policy, Security Association, Routing Table, IPSec, IP Message, and the underlying State and Message modules.]
Modeling Uncovered Problems

• Problems arose from interactions among the components
• Numerous iterations were required to resolve problems resulting from the timing of IPSec database updates
  – Maude's search feature was of great help here
• When things are not done right, packets can slip into partially set-up tunnels

Study Did Not Model

• Timers
• Lost messages
• Periodic updates to the secret used to generate the cookie
• Fragmentation
  – Can be a source of DoS attacks
• UDP layer: ports not mentioned at all in the model


• Getting the right level of abstraction was key
• Modeled the various components and their interactions
  – IP, IPSec, L3A, …
• Did not need to model the details of IPSec authentication and encryption, since the protocol is about setting up tunnels
• The first model of IP send/receive was too abstract:
  – Introduced messy concurrency problems
Study 2: WSEmail Protocol
• On-Demand Attachments Protocol
  – Nine messages, four parties
  – Complex messages
  – Want to prove that receiving an attachment means it was sent by the sender in the "from" field

Lux, May, Gunter, Bhattad 05
Proof Technique
• Reduce the complex and redundant messages
  – Eliminated headers and irrelevant to/from fields
• Choose a verifier that can evaluate the protocol
  – Used ProVerif by Bruno Blanchet
• Formalize the messages and parties for the verifier
  – Used TulaFale by MS Research
  – Compiled TulaFale scripts into ProVerif syntax
• The result was a smaller formal version in a machine-checkable format
  – Lost injectiveness in the process of translating down, since TulaFale cannot express timed nonces easily

Message Example
• The first message is from the Sending Client to the Sending Server
  – Email text, attachment, destination address, user name
  – Everything is signed using the user-name-token method
• Abstractly:
  – SC → SS: SS | (RC | RS) | Msg | Attachment
• Production and destruction rules in TulaFale:
     predicate mkMsg1(SC:item, nonce:bytes, creation:string, attachment:string,
        email:string, TOuser:string, TOdom:string, Msg1:item, Msg1Signed:item) :-
        destUserAtDomain = UName(TOuser, TOdom),
        isUserTokenKey(TokSC, SC, nonce, creation, KeySC),
        Msg1 = Message1(TokSC, attachment, email, destUserAtDomain),
        mkSignature(Sig, "hmacsha1", KeySC, <list>Msg1</>),
        Msg1Signed = <env>Sig Msg1</>.

     predicate isMsg1(Msg1Signed:item, SC:item, TOdom:string, TOuser:string,
        attachment:string, email:string, destUserAtDomain:item, Msg1:item) :-
       Msg1Signed = <env>Sig Msg1</>,
       Msg1 = Message1(TokSC, attachment, email, destUserAtDomain),
       isUserTokenKey(TokSC, SC, nonce, creation, KeySC),
       isSignature(Sig, "hmacsha1", KeySC, <list>Msg1</>),
       destUserAtDomain = UName(TOuser, TOdom).

• The first pass at the formalization had errors
   – We ended up with a trivially true theorem
• The second pass was more careful
   – Each message was checked for correctness and reachability
• Performance problems
   – ProVerif couldn't handle such a large protocol
   – Blanchet created a version of ProVerif that skipped some extra parsing steps that were performed after the theorem was proved
• We finished with a theorem shown to be true, but without a derivation tree justifying the theorem
   – The lack of a derivation tree would have made debugging hard
   – Future efforts will aim at making the prover more efficient

Case Study 3: HIPAA Verification

May, Gunter, Lee 06
Our Approach
• Formalize legal texts and use model checking to evaluate their static properties
• Compare to policy in practice to check compliance

  Full Text (English) → Selection (English) → Command set (Privacy APIs) → Model (Promela)

Reference Checking

• Comments on the 2000 version's consent rules led to a complete rework in the 2003 version.
• Examples:
  – Ambulance workers must obtain consent, after the fact, for services they performed on unconscious patients – satisfied trivially.
  – Hospitals that usually do pre-operation preparations before procedures cannot do so without the patient coming in to sign a special designator – not satisfied.
  – Doctors who render remote diagnoses cannot do so without having a special paper consent form sent or faxed to them first – satisfied (restricted depth needed to handle the 2000 case).

Model Example
• Modeled the rule set in Spin
• Trace the path that leads to specific valid and invalid states
• Valid and invalid states are input as invariants
  – Designated by experts in health care and by privacy activists
  – Mentioned explicitly in the text
  – Derived from comments by stakeholders in the law's design

The rule:

  Use506c1 (a, s, r, p, f, evidence)
    if AllowedAsIn506c1 (a, s, r, p, f, evidence)
    and r == a
    and own in (a, f)
    and isTPO(p)
    then EnterUse (a, p, f)
  end

Its Promela encoding:

  active proctype Use506c1 ()
  { bool result = false; bool temp;
    do
    :: Use506c1_chan?request(_) ->
       AllowedAsIn506c1_chan?response(temp);
       result = temp;
       result = result && (r == a);
       result = result && (m.mat[a].obj[f].own == 1);
       if
       :: result -> EnterUse_chan!request(true);
       :: else -> skip;
       fi;
    od }

• Property: Can a doctor see a patient record for treatment, payment, or health care operations without consent in a non-emergency situation?
• Invariant: No health care provider can access a patient record in a non-emergency situation without first gaining consent or obtaining it afterward
• File f is about Paula (a patient). Dan (a doctor) cannot gain any access permissions on f without getting consent from Paula first (or after the fact in case of inability to gain consent at first).

  /* initialize the matrix */
  /* Dan is a doctor */
  m.mat[Dan].obj[health_care_provider_group].member = 1;

  /* Paula is a patient and the subject of file1 */
  m.mat[Paula].obj[file1].subject = 1;

  /* Dan has the file in his system - he owns it */
  m.mat[Dan].obj[file1].own = 1;

  p.treatment = 1; p.payment = 1; p.healthcare_operations = 1;

  /* set evidences */
  evidence.emergency = 0; …

  /* check if Dan can get access to the file */
  invariant = (m.mat[Dan].obj[file1].treat == 0) &&
              (m.mat[Dan].obj[file1].pay == 0) &&
              (m.mat[Dan].obj[file1].healthops == 0) &&
              (m.mat[Dan].obj[f_new].treat == 0) &&
              (m.mat[Dan].obj[f_new].pay == 0) &&
              (m.mat[Dan].obj[f_new].healthops == 0);

• Security properties can be subtle, and it is easy to leave vulnerabilities
• Formal methods can aid assurance in important areas, although they are not a panacea for correctness
• Several types of techniques are available: formal specification, model checking, interactive verification
• Application features and correctness goals determine the appropriate technique