563.14.1 Tamper Resistant Architecture Decentralized Label Model

Document Sample
563.14.1 Tamper Resistant Architecture Decentralized Label Model Powered By Docstoc
					563.14.1 Tamper Resistant Architecture:
Decentralized Label Model for Information
Flow Control



         Presented by: Soumyadeb Mitra
 PISCES Group: Soumyadeb Mitra, Sruthi Bandhakavi, Ragib Hasan, Raman
                              Sharikyn
                       University of Illinois
                          Spring 2006
    Motivation for Decentralized Label Models

    • Security models have two goals
         – Prevent malicious destruction of information
         – Control release and propagation of
           information
    • Traditional security models:
         – Access control lists, capabilities
               • First goal supported
               • Second goal only partially supported
                  – Information release can be restricted
                  – Information propagation is not well supported

Myers & Liskov                                                      2
Motivating Example
• Java applet
  – Downloaded from remote site and run locally
  – Code not trustworthy
• Security assurances
  – Restrict malicious transfer of information
  – No way to control information propagation
• Current approach: Sandbox
  – Too restrictive
• Possible Solution
  – Control information flow
                                                  3
The Basic Idea
• Assign security labels to data
  – Who created it?
  – Who is allowed to see it?
• Track data flowing through the system
  – Check violations

                        Java Applet:
                            read(ƒ)             ƒ: {user: user}
                            z = ƒ
                            write(socket, z)    socket: {:anyone}


                                                        4
Main Entities
• Principals, representing users
  – They create the data
• Values
  – Computations manipulate values
• Slots
  – Variables/objects acting as source and sink of values
• Channels
  – Input/Output: values are obtained from input channels
    and written to output channels
• Values/Slots/Channels have security labels
  associated with them
                                                            5
Labels
• Label
  – L = {own: reader1, reader2}
• Owner
  – Principal who is the source of the information
• Readers
  – Principals to whom the owner is willing to
    release the data
• Values/Slots/Channels have labels
  – Restriction on the kind of assignment (x:=v)
     • Discussed later

                                                     6
Derived Labels
• During computation, values are derived
  from other values.
  – The derived value must contain “information”
    about its source
• Example:
  – x: { Alice: P, Q }
  – y: { Bob: Q, R }
  – z = x+y: { Alice: P, Q ; Bob : Q, R}
  – Both restrictions apply
     • Effective reader : Q
                                                   7
Derived Labels
• Label of z is a join of labels of x and y
• L1 U L2
   – owners(L1 U L2) = owners(L1) ∪ owners(L2)
   – readers(L1 U L2, o) = readers(L1, o) ∩ readers(L2, o)
• Example
  – x: { Alice: P, Q }
  – y: { Alice: P; Bob: P }
  – z = x+y: { Alice: P ; Bob : P}
                                             8
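The join rule above can be sketched in a few lines of Python (a hypothetical model, not the Jif implementation: a label is a dict mapping each owner to its reader set):

```python
# DLM join sketch: union of owners; for an owner appearing in both
# labels, the effective readers are the intersection of both reader sets.

def join(l1, l2):
    out = {}
    for owner in set(l1) | set(l2):
        if owner in l1 and owner in l2:
            out[owner] = l1[owner] & l2[owner]  # both restrictions apply
        else:
            out[owner] = l1[owner] if owner in l1 else l2[owner]
    return out

# The slide's example: x: {Alice: P, Q}, y: {Alice: P; Bob: P}
x = {"Alice": {"P", "Q"}}
y = {"Alice": {"P"}, "Bob": {"P"}}
assert join(x, y) == {"Alice": {"P"}, "Bob": {"P"}}
```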
Restriction on Assignment
• x:=v
  – We allow a value to be assigned to a slot only
    if the slot's label is a restriction of the value's label
  – A restriction, intuitively, means higher security
• Examples : The following are disallowed
  – Lv = { Alice: P,Q}, Lx = { Alice: P,Q,R}
     • R can get access to v through x
  – Lx = { Bob: P }, Lv = { Alice: Q}
      • Alice loses control over v
     • P gets access to data

                                                       9
Restriction
• Definition: Restriction
  – L1 ⊑ L2 iff
     • owners(L1) ⊆ owners(L2)
     • readers(L1, o) ⊇ readers(L2, o) for every owner o of L1
  – Examples
     • {Alice: X,Y,Z} ⊑ {Alice: X}
     • {Alice: X,Y,Z; Bob: X,Y} ⊑ {Alice: X; Bob: Y}
     • {Alice: X} ⊑ {Alice: X; Bob: Y}
• Assignment Rule
  – x:= v
       → Lv ⊑ Lx

                                                        10
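A minimal Python sketch of this check (same hypothetical dict representation of labels, owner → reader set; not Jif's actual implementation):

```python
# L1 ⊑ L2 ("L2 is a restriction of L1") holds iff every owner of L1
# is also an owner of L2, and for each such owner the reader set can
# only shrink: readers(L1, o) ⊇ readers(L2, o).

def flows_to(l1, l2):
    return all(o in l2 and l1[o] >= l2[o] for o in l1)

# The slide's allowed examples:
assert flows_to({"Alice": {"X", "Y", "Z"}}, {"Alice": {"X"}})
assert flows_to({"Alice": {"X"}}, {"Alice": {"X"}, "Bob": {"Y"}})

# The disallowed assignments from the previous slide:
assert not flows_to({"Alice": {"P", "Q"}}, {"Alice": {"P", "Q", "R"}})  # R gains access
assert not flows_to({"Alice": {"Q"}}, {"Bob": {"P"}})                   # Alice loses control
```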
Declassification
• Sometimes you want to reveal data

                 Alice's clear text → Encrypt → insecure channel

   Send(string cleartext {Alice: Alice}) authority(Alice) {
      ...
      encryptext {Alice: Alice} = Encrypt(cleartext);
      ...
      channel {:anyone} = encryptext;              // Violated!
      channel {:anyone} = declassify(encryptext);  // OK

   User must explicitly authorize Send to declassify

                                                                      11
Implicit Information Flow
                 x:=0
                 if (b)          (together these act like x:=b)
                  x:=1

• Implicit information flow

• The assignment x:=1 depends on the value of b
   – Extra constraint: Lb ⊑ Lx



                                                  12
Implicit Information Flow
• Define labels associated with Program Counter
  – Lpc = U { Lv : v was used to arrive at pc }
• x:=v
  – Lv U Lpc⊑ Lx
• Previous example
                     x:=0
                     if (b)
                      x:=1


           Lpc= Lb              Lb⊑ Lx


                                                  13
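The pc rule can be sketched in Python (a hypothetical model with minimal `join` and `flows_to` helpers over owner → reader-set dicts; not the Jif checker):

```python
def join(l1, l2):
    # Union of owners; intersect readers for shared owners.
    out = {}
    for o in set(l1) | set(l2):
        out[o] = l1[o] & l2[o] if o in l1 and o in l2 else (l1[o] if o in l1 else l2[o])
    return out

def flows_to(l1, l2):
    # L1 ⊑ L2: owners may only grow, readers may only shrink.
    return all(o in l2 and l1[o] >= l2[o] for o in l1)

def check_assign(l_v, l_pc, l_x):
    # x := v under program counter pc is legal iff Lv U Lpc ⊑ Lx.
    return flows_to(join(l_v, l_pc), l_x)

l_b = {"Bob": {"Bob"}}   # the branch condition b
l_lit = {}               # the constant 1 is public

# Inside "if (b)", Lpc = Lb, so even assigning the constant 1 to a
# public x would leak b:
assert not check_assign(l_lit, l_b, {})     # x public: rejected
assert check_assign(l_lit, l_b, l_b)        # Lb ⊑ Lx: accepted
```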
       Other Information Flows
   •   Termination Channels
   •   Timing Channels
   •   Resource Exhaustion Channels
   •   Power Channels




Sabelfeld and Myers                   14
Confidentiality Constraints

      Alice  →  System  →  Eve1, Eve2, Eve3, Eve4, Eve5


 Guarantee that Alice’s data is not released to Eve


                                                 15
Integrity Constraints

      Alice  →  System  →  Eve1, Eve2, Eve3, Eve4, Eve5



Guarantee that data Alice receives is not corrupted


                                                 16
Basic Idea
• Assign integrity labels to data
  – x: {? : a, b, c}
  – a, b and c trust the data x
• x:=v
  – Anyone trusting x must also trust v
  – TrustSet(x) ⊆ TrustSet(v)
  – Lv ⊑ Lx
• Define join of integrity constraints


                                          17
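The trust-set rule can be sketched directly with Python sets (hypothetical model; a trust set is just a set of principal names):

```python
# x := v is legal only if everyone who trusts x also trusts v:
# TrustSet(x) ⊆ TrustSet(v).

def integrity_ok(trust_x, trust_v):
    return trust_x <= trust_v   # subset test

assert integrity_ok({"a", "b"}, {"a", "b", "c"})      # v at least as trusted as x
assert not integrity_ok({"a", "b", "c"}, {"a", "b"})  # c would be handed untrusted data
```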
Details
• Inference and Verification
  – Compile time checking
  – Some data items assigned labels
  – Labels of others derived to satisfy constraints
  – Lpc depends on other variables' labels, which
    in turn might depend on Lpc
  – Formulate a set of equations and solve
    simultaneously



                                                  18
563.14.2 Tamper Resistant Architecture

Secure Program Partitioning



                   Sruthi Bandhakavi
 PISCES Group: Soumyadeb Mitra, Sruthi Bandhakavi, Ragib Hasan,
                      Raman Sharikyn

                  University of Illinois
PISCES
• Protocols and Implementation for Smart Card
  Enabled Software
• Focus on two technologies
  – Information flow
     • Can we split the code by taking the information
       flow in the programs into consideration?
  – Model-based design
     • Can we find a high-level model to represent the
       programs and use it to automatically split and
       produce code?



                                                         20
     Partitioning Jif Programs
   • Solve a constraint system to determine
     possible hosts
   • Use dynamic programming & heuristics to find
     an efficient solution
   • Rewrite program, inserting calls to runtime
     system
      – data forwarding and control transfers
   • Outputs: A collection of Java code fragments
     with host assignments


Zdancewic, Zheng, Nystrom, Myers                21
Secure Program Partitioning

                 Source Code Policy



                        Compiler
    Trust
     info               Splitter

subprograms


              runtime
     Host 1             Host 2        Host 3
                                               22
Secure Program Partitioning
  Describes the
computation and
 the principals'        Source Code Policy
security policies.

                               Compiler
      Trust
       info                    Splitter

subprograms


                     runtime
       Host 1                  Host 2        Host 3
                                                      23
Secure Program Partitioning

                 Source Code Policy
                                      Verifies that the
                                      program obeys
                                        the security
                        Compiler
    Trust                                 policies.
     info               Splitter

subprograms


              runtime
     Host 1             Host 2             Host 3
                                                    24
 Secure Program Partitioning
Describes the trust
   relationships
between principals    Source Code Policy
    and hosts.

                            Compiler
      Trust
       info                 Splitter

 subprograms


                  runtime
        Host 1              Host 2         Host 3
                                                    25
Secure Program Partitioning

                 Source Code Policy



                        Compiler     Partitions the data
    Trust                             and computation
     info               Splitter   among hosts, so that
                                    policies are obeyed.
subprograms


              runtime
     Host 1             Host 2              Host 3
                                                     26
Secure Program Partitioning

                 Source Code Policy



                        Compiler
    Trust
     info               Splitter   Performs dynamic
                                     access control
                                      checks and
subprograms                             encrypts
                                    communication.

              runtime
     Host 1             Host 2               Host 3
                                                      27
 Security Assurance

• Goal: Resulting distributed program performs the
  same computation as the source and also satisfies the
  security policies.
• Guarantee: Principal P's security policy is violated only
  if a host that P trusts fails or is subverted.
• Example:                          A               B

   "Alice trusts A & C"
   "Bob trusts B & C"                     C


  If B fails, Alice's policy is obeyed, Bob's policy may
                          be violated.
                                                           28
Secure Program Partitioning

                 Source Code Policy



                        Compiler
    Trust
     info               Splitter

subprograms


              runtime
     Host 1             Host 2        Host 3
                                               29
Confidentiality Policies in Jif

• Confidentiality labels:
      int{Alice} a;       "a is Alice's private int"
• Integrity labels:
      int{?Alice} a;       "Alice must trust a"
• Combined labels:
      int{Alice, ?Alice} a; (Both constraints)



int{Alice} a1, a2;
int{Bob}   b;

// Insecure:            // Secure:
a1 = b;                 a1 = a2;
b = a1;
                                                    30
 Policy Operations in Jif
• Declassification:
    int{Alice} a;
    declassify(a to Bob);

              "type-cast int{Alice} to int{Bob}"
• Endorse:
    int{?Bob} b;
    endorse(b by Alice);

• But (!) Alice must trust the integrity of the decision to perform
  the policy operation.
   – Compiler guarantees the integrity


                                                                 31
Example: Oblivious Transfer
                      request(n)
          Alice                          Bob

        int m1;
        int m2;      answer(mn)


• Alice has two integers: m1 and m2.
• Alice's Policy:
"Bob gets to choose exactly one of m 1 and m2."
• Bob's Policy:
"Alice doesn't get to know which item I request."
• Classic Result:
  "Impossible to solve using 2 principals,
  with perfect security."
                                                    32
Oblivious Transfer (Java)
int              m1, m2;   // Alice's data
boolean          accessed;
int              n, ans;   // Bob's data

n = choose();             // Bob's choice

if (!accessed) {           // Transfer
   accessed = true;
   if (n == 1)
        ans = m1;
   else ans = m2;
}


                                         33
Adding Confidentiality Labels
int{Alice}       m1, m2;   // Alice's data
boolean          accessed;
int{Bob}         n, ans;   // Bob's data

n = choose();             // Bob's choice

if (!accessed) {          // Transfer
   accessed = true;
   if (n == 1)
        ans = m1;         // <- Verification fails
   else ans = m2;
}


                                         34
Using Declassification
int{Alice}       m1, m2;   // Alice's data
boolean          accessed;
int{Bob}         n, ans;   // Bob's data

n = choose();             // Bob's choice   <- Verification fails

if (!accessed) {          // Transfer
   accessed = true;
   if (n == 1)
        ans = declassify(m1 to Bob);
   else ans = declassify(m2 to Bob);
}


                                          35
Integrity Constraints
int{Alice}        m1, m2;   // Alice's data
boolean{?Alice}   accessed;
int{Bob}          n, ans;   // Bob's data

n = choose();              // Bob's choice

if (!accessed) {           // Transfer   <- Verification fails
   accessed = true;
   if (n == 1)
        ans = declassify(m1 to Bob);
   else ans = declassify(m2 to Bob);
}


                                         36
Using Endorsement
int{Alice}        m1, m2;   // Alice's data
boolean{?Alice}   accessed;
int{Bob}          n, ans;   // Bob's data

n = choose();              // Bob's choice

if (!accessed) {            // Transfer
   accessed = true;
   if (endorse(n by Alice) == 1)
        ans = declassify(m1 to Bob);
   else ans = declassify(m2 to Bob);
}


                                          37
Secure Program Partitioning

                 Source Code Policy



                        Compiler
    Trust
     info               Splitter

subprograms


              runtime
     Host 1             Host 2        Host 3
                                               38
 Trust Configurations

• Labels describe the trust relationship between
  principals and the available hosts.
• Confidentiality:    Host A:{Alice}
  "Alice trusts host A not to leak her confidential
  data."

   int{Alice} m1;        m1 can be sent to A
   int{Bob} n;           n cannot be sent to A
• Integrity:          Host A:{?Alice}
  "Alice trusts host A not to corrupt her high-
  integrity data."

                                                      39
Host Selection

• Consider a field: int{Alice:; ?:Alice} f;
• Host H : confidentiality label Ch
           integrity label Ih
• Constraints:
       {Alice:} ⊑ Ch        Ih ⊑ {?:Alice}
• Generalize to program statements:
    C(values used by S) ⊑ Ch
    Ih ⊑ I(locations defined by S)
• Constraints on declassify()



                                             40
 A Secure Solution
       A                     T                    B
 bool accessed;       int m1, m2;         int n, ans;

                      goto(B);            int choose() {
                                            ...
                      goto(A);              return n;
 if (!accessed){                          }
                      int n'= get(n,B);
  accessed=true;      if (n' == 1)
  goto(T);                set(ans, m1);   n = choose();
 }                    else ...            goto(T);

{Alice, ?Alice}    {Alice, ?Alice, Bob}   {Bob, ?Bob}

                                                           41
Secure Program Partitioning
• Language-based Confidentiality Policies
   – Compiler splits a program among
     heterogeneously trusted hosts.
   – Guided by security policies
   – Resulting distributed program satisfies the
     policies
• Benefits:
   – End-to-end security
   – Decentralized
   – Automatic
   – Explicit Policies

                                                                        42
Our Project
• Extend the same concept to an implementation-
  independent model.
• EFSMs are very simple and can model a large
  number of systems
• Our model of EFSMs
  – Set of states
  – Each state is either
     • x:=v. GOTO nextstate
     • if (P) GOTO state1 else GOTO state2
  – Special variables
     • in, out

                                                  43
Example EFSM
    S0: n = in;                     GOTO S1
    S1: if (isAccessed) GOTO S2     else GOTO END
    S2: isAccessed = 0;             GOTO S3
    S3: if (n) GOTO S5              else GOTO S4
    S4: out = declassify(m1);       GOTO END
    S5: out = declassify(m2);       GOTO END


                                                                  44
Security Labels & Type Checking

• All variables
  – Confidentiality constraint Cx
  – Integrity constraint Ix
• States also have confidentiality and integrity
  constraints
  – Cstate = U { Cv : state depends on v }
  – Istate = U { Iv : state depends on v }
• x:=v
  – Cv U Cstate ⊑ Cx
  – Iv U Istate ⊑ Ix

                                             45
Mapping states to hosts
• Each state mapped to some host
  – x:=v can be mapped to h if
    • Cx ⊑ Ch
    • Ih ⊑ Ix
  – if (P) GOTO s0 ELSE s1
    • Can be mapped to h if
       – CP U Cstate ⊑ Ch
       – IP ⊑ Ih




                                   46
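The mapping rules can be sketched in Python (hypothetical model: confidentiality labels as owner → reader-set dicts, integrity as trust sets, where Ih ⊑ Ix is read as "everyone who trusts x must also trust h"; all helper names are illustrative):

```python
def flows_to(l1, l2):
    # L1 ⊑ L2 in the confidentiality ordering.
    return all(o in l2 and l1[o] >= l2[o] for o in l1)

def can_map_assign(c_x, c_h, trust_x, trust_h):
    # State "x := v" may run on host h iff Cx ⊑ Ch and Ih ⊑ Ix.
    return flows_to(c_x, c_h) and trust_x <= trust_h

# isAccessed is labeled {Alice: ?Alice}; host A is trusted by Alice,
# host B only by Bob (as in the annotated EFSM on the next slide):
c_isAccessed, t_isAccessed = {"Alice": set()}, {"Alice"}
c_A, t_A = {"Alice": set()}, {"Alice"}
c_B, t_B = {"Bob": set()}, {"Bob"}

assert can_map_assign(c_isAccessed, c_A, t_isAccessed, t_A)      # A may host it
assert not can_map_assign(c_isAccessed, c_B, t_isAccessed, t_B)  # B may not
```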
 Example EFSM annotated
CA= { Alice: ? Alice} CB= {Bob: ? Bob} CT= {Alice: Bob:}
m1,m2, isAccessed : {Alice: ? Alice}       S0                  B, T
n { Bob:}                                           n=in

in {Bob: }, out { Bob: }                                              A
                                  S1        if (isAccessed)
                                       0                                  1      A
                                                      S2       isAccessed=0
                                                                                T
                                                               S3     if (n)
                                                           0                        1
                                       S4       T                                       T
                                            out=declassify(m1)                 out=declassify(m2) S5


                                                                          END

                                                                                               47
Splitting of EFSM

  Host A: if (isAccessed);  isAccessed = 0;  END
  Host B: n = in
  Host T: if (n);  out = declassify(m1);  out = declassify(m2)

                                                                        48