An Automated Framework for Validating Firewall Policy Enforcement

Adel El-Atawy*, Taghrid Samak, Zein Wali, Ehab Al-Shaer
School of Computer Science, MNLAB
DePaul University, Chicago, IL

Frank Lin, Christopher Pham, Sheng Li
Cisco

3rd Midwest Security Workshop
West Lafayette, IN
April 21, 2007
Agenda

• Motivation and Challenges
• Architecture Overview
• System Components
  – Policy Segmentation: Traffic Generation
  – Augmented Grammar: Syntax Specification
  – Policy Generation
• Architecture Revisited
• System Features
• Conclusion

Motivation

• Testing security devices is very important (critical
  for quality assurance) and also challenging.
• Updates and patches are frequent, and various
  aspects need to be addressed → automated
  tools are always needed.
• Sources of bugs in security devices (e.g., firewalls):
       – New interfaces/features → parsing/CLI errors
       – Policy complexity evolving syntactically and
         semantically → policy refinement/translation errors
       – Matching optimization algorithms → filtering errors
       – Forwarding errors
Challenges of Testing Security Devices
• Policy Generation
      – Many different configuration parameters to consider
              • ACL keywords, protocols, header options, field values, rule
                overlap, wild-card vs. specific values, rule order, policy size,
                and combinations of these
      – The generation process needs to be very tunable.
              • The space from which field values are to be chosen is huge, and
                good/selective coverage is essential.
              • Certain dimensions can be selected to stress.


Challenges of Testing Security Devices
• Extensible ACL syntax:
       – The testing procedure should be totally transparent to
         changes in the supported syntaxes, keywords,
         predefined services, protocols, ...

• Traffic Generation
       – Given a policy, exhaustive testing requires 4×10^13 years
         when all tuples are used, and 4.5 years when the address
         domain and other dimensions are fixed → simply infeasible
       – Random sampling requires a number of samples exponential
         in the number of filtering fields and the required
         confidence → high false-negative rate and inefficient
Challenges of Testing Security Devices
• Device Interface
      – Devices have different logging capabilities. Moreover,
        logs in many cases lack the completeness needed
        to perform a thorough analysis.
      – Preparing the firewall for testing is, in most cases, very
        OS-specific.




Project Objectives

• Have an Automated Framework for Testing
  Network Security Devices.
• It should include:
   – Test Policy Generation
       • Syntax is flexible
       • Generation can be tuned
   – Test Case Generation
       • Efficient selection of packets


INSPEC Architecture

• The flow of the testing procedure is shown below:

[Figure: INSPEC architecture. The Firewall Grammar and the Policy
Generator (or a manually entered FW policy) supply a policy to the
ENGINE. The Segmentation Module produces Test Segments, which the
Segment Analyzer and the Test Packet Generator turn into Test Packets
injected into the Firewall. The Spy captures the FW output and reports
back to the engine; Post-Test Analysis produces the final Reports.]
Grammar Parser and Processor

• Using a flexible grammar specification model: future
  changes, modifications, and additions to ACL
  syntax/capabilities can be smoothly integrated.


• Most syntax changes can be enforced without
  changing the program/code.


• Adding features (e.g., extra protocols with their
  extra fields) can be achieved with minimal effort.
 Grammar Parser and Processor

  • Sample grammar (Standard IOS):
\\Rules
S := "access-list" acl-num action SrcAddr [opt]

acl-num\FieldID(100) := \number(1,99)
action\FieldID(0)    := "permit"\V(1) | "deny"\V(0)

SrcAddr\FieldID(2) := IPany | IPpair
IPany              := "any"\Translate("IP","operator","any")
IPpair             := \IPvalue\Translate("IP","operator","value") [Mask]
Mask               := \IPMask\Translate("IP","operator","mask")

opt\FieldID(80) := "log"\V(1)

\\EndRules
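To show how such annotated productions can be consumed mechanically, here is a minimal Python sketch (the regexes and the helper name parse_production are illustrative assumptions; the real grammar processor also handles \Translate, \Lookup, and \Cond annotations):

import re

# Head of a production like  action\FieldID(0) := <alternatives>
PROD = re.compile(r'^(?P<lhs>\w[\w-]*)(\\FieldID\((?P<fid>\d+)\))?\s*:=\s*(?P<rhs>.+)$')
# Quoted alternative with an optional value tag, e.g. "permit"\V(1)
ALT = re.compile(r'"(?P<tok>[^"]+)"(\\V\((?P<val>\d+)\))?')

def parse_production(line):
    m = PROD.match(line.strip())
    if m is None:
        return None
    alts = [(a.group('tok'), a.group('val')) for a in ALT.finditer(m.group('rhs'))]
    return m.group('lhs'), m.group('fid'), alts

print(parse_production(r'action\FieldID(0) := "permit"\V(1) | "deny"\V(0)'))
# -> ('action', '0', [('permit', '1'), ('deny', '0')])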

 Grammar Parser and Processor

  • Sample grammar (Standard IOS):
\\Rules
S := "access-list" acl-num action proto SrcAddr [opt]

acl-num\FieldID(100)       := \number(1,99)
action\FieldID(0)          := "permit"\V(1) | "deny"\V(0) | "allow"\V(1)
proto\FieldID(1)           := \Lookup("number","proto.txt")

SrcAddr\FieldID(2)   :=   IPany | IPpair
IPany                :=   "any"\Translate("IP","operator","any")
IPpair               :=   \IPvalue\Translate("IP","operator","value") [Mask]
Mask                 :=   \IPMask\Translate("IP","operator","mask")

opt\FieldID(80)      := "log"\V(1)

\\EndRules
  Grammar Parser/Processor
  • A more complex example (extended ACL):
S         := "access-list" Policy-Number action protocol SrcAddr [SrcPort] DestAddr [DestPort] [icmpquals] [igmpquals] REST
REST      := [ACK] [FIN] [PSH] [RST] [SYN] [URG] [Prec] [Tos] [Established] [Logging] [Fragments]
Policy-Number\FieldID(100)   := \number(100,199)
action\FieldID(0)            := "permit"\V(1) | "deny"\V(0)
protocol\FieldID(1)          := \number(0,255) | \Lookup("number","protocols.txt")
SrcAddr\FieldID(2)           := IPaddr
DestAddr\FieldID(3)          := IPaddr
SrcPort\FieldID(4)           := Port
DestPort\FieldID(5)          := Port

IPaddr    := IPany | IPhost | IPpair
IPany               := "any"\Translate("IP","operator","any")
IPhost              := "host" \IPvalue\Translate("IP","operator","IPhost")
IPpair              := \IPvalue\Translate("IP","operator","IPsubnet") \IPmask\Translate("IP","operator","IPmask")

Port\Cond(1,17)      :=   PortOp1|PortOp2|PortOp3|PortOp4|PortOp5
Port\Cond(1,6)       :=   PortOp1|PortOp2|PortOp3|PortOp4|PortOp5
PortOp1              :=   "eq" Y\Translate("port","operator","eq")
PortOp2              :=   "lt" Y\Translate("port","operator","lt")
PortOp3              :=   "range" Y\Translate("port","operator","ge") Y\Translate("port","operator","le")
PortOp4              :=   "gt" Y\Translate("port","operator","gt")
PortOp5              :=   "neq" Y\Translate("port","operator","ne")
Y\Cond(1,17)         :=   \number(1,65535) | \Lookup("number","udpports.txt")
Y\Cond(1,6)          :=   \number(1,65535) | \Lookup("number","tcpports.txt")

icmpquals\FieldID(6)\Cond(1,1)            :=   \Lookup("number","icmpquals.txt")
igmpquals\FieldID(19)\Cond(1,2)           :=   \Lookup("number","igmpquals.txt")
Prec\FieldID(7)                           :=   "precedence" \number(0,7)
Tos\FieldID(8)                            :=   "tos" \number(0,15)
Logging\FieldID(80)                       :=   "log"\V(1)
Established\Cond(1,6)\FieldID(9)          :=   "established"\V(1)
Fragments\FieldID(10)                     :=   "fragments"\V(1)
ACK\FieldID(11)\Cond(1,6)                 :=   "ack"\V(1)
FIN\FieldID(12)\Cond(1,6)                 :=   "fin"\V(1)
PSH .. RST .. SYN .. URG

Policy Generation
• Using the given BNF, rule-by-rule generation takes
  place, as sketched below.

• The edges of the graph representing the BNF carry the
  statistical guidelines for tunable policy structures; these
  embedded statistics guide the navigation.

• The statistics can be provided independently of a
  specific BNF.

[Figure: BNF graph of an ACE. A path S → 1 → 2 → … → 6 consumes
"access-list", a number, "Accept"/"Deny", a protocol ("TCP", "UDP",
"ICMP", "any"), SrcIP, SrcPort, DestIP, DestPort; a protocol-
qualification subgraph then offers "precedence", "tos", "time-range",
"established", "log"|"log-input", "fragments", or nil.]
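A minimal Python sketch of this statistics-guided, rule-by-rule expansion (the grammar fragment and the edge probabilities are illustrative assumptions, not the real IOS BNF):

import random

# Each nonterminal maps to its alternatives, tagged with the probability
# carried by the corresponding edge of the BNF graph.
grammar = {
    "S":       [(["access-list", "num", "action", "proto", "SrcAddr"], 1.0)],
    "num":     [(["101"], 1.0)],
    "action":  [(["permit"], 0.7), (["deny"], 0.3)],
    "proto":   [(["tcp"], 0.5), (["udp"], 0.3), (["icmp"], 0.2)],
    "SrcAddr": [(["any"], 0.4), (["host", "10.0.0.1"], 0.6)],
}

def expand(symbol):
    if symbol not in grammar:          # terminal: emit the token itself
        return [symbol]
    alts = [a for a, _ in grammar[symbol]]
    weights = [w for _, w in grammar[symbol]]
    chosen = random.choices(alts, weights=weights, k=1)[0]
    return [tok for s in chosen for tok in expand(s)]

for _ in range(3):                     # three random ACEs, rule by rule
    print(" ".join(expand("S")))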
Policy Segmentation

• How to identify the different decision paths
  a firewall will take?
      – Different rule interactions within the policy
        create different decision paths


• Testing every decision path will cover the
  whole operation space of the firewall.
Policy Segmentation
                     R1: tcp 121.63..: any 143.91.78.: any                      accept
                     R2: tcp 121.63.71.: any          143.91..: any             accept
                     R3: any ...: any ...: any           deny


                                   R3                                         S3

                           R2                                       S2
                                                                         S4
                                   R1                                         S1




                   (a) Policy address-space              (b) Segmented address-space

•Each segment will be tested independently with a set
of tailored packets.
•The action for such packets are dictated by the first
rule on top of the segment’s intersecting-rules list
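A minimal Python sketch of the idea: packets are grouped by the exact set of rules matching them, and each group is one segment whose first intersecting rule dictates the action. The real engine computes segments symbolically (with BDDs) rather than by enumeration; the toy policy and sampled packet space here are illustrative assumptions:

from itertools import product
from collections import defaultdict

# Toy policy: (name, proto, src prefix, dst prefix, action); "" matches any.
rules = [
    ("R1", "tcp", "121.63.",    "143.91.78.", "accept"),
    ("R2", "tcp", "121.63.71.", "143.91.",    "accept"),
    ("R3", "any", "",           "",           "deny"),    # default rule
]

def matches(rule, pkt):
    _, proto, src, dst, _ = rule
    p, s, d = pkt
    return proto in ("any", p) and s.startswith(src) and d.startswith(dst)

# A tiny sample of the packet space, enough to expose every segment.
protos = ["tcp", "udp"]
srcs = ["121.63.71.5", "121.63.9.9", "10.0.0.1"]
dsts = ["143.91.78.2", "143.91.5.5", "10.0.0.2"]

segments = defaultdict(list)
for pkt in product(protos, srcs, dsts):
    key = tuple(r[0] for r in rules if matches(r, pkt))   # intersecting rules
    segments[key].append(pkt)

for key, pkts in segments.items():
    owner = next(r for r in rules if r[0] == key[0])      # topmost rule wins
    print(key, "->", owner[4], f"({len(pkts)} sample packets)")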
Policy Segmentation

• The interaction between rules within a single policy
  partitions the space into different areas (i.e., Segments).
• Each segment is fully specified by the intersection of a
  specific subset of the policy rules.
• Segments are:
       – Disjoint
       – Covering the whole space of possible packet configurations
         (w.r.t. a given policy)
       – Not much more numerous than the policy size
         (experimentally: 2n … 5n segments for a policy of n rules)


Segment Weight Analysis

• The weight of a segment is our estimate of the
  probability of a fault (wrong firewall decision) occurring in
  its space.
• Factors affecting the weight include:
       –    The segment's area
       –    The number of intersecting rules over the segment's space
       –    The number of rules that affected the shape of this segment
       –    The complexity of the topmost rule (the owner rule) of this segment
       –    The complexity of each overlapping rule



Formulation of Segment Weights

   • The weight of a segment is a function of parameters that
   specify how critical this segment is w.r.t. testing, combining
   the factors from the previous slide:

        w_i = f(area of S_i, intersecting rules, owner/overlap complexity)

   • During a test interval of T seconds and using a rate of
   R packets/second, the number of generated packets n_i
   for segment S_i is proportional to its weight:

        n_i = (w_i / Σ_j w_j) · R · T
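A direct transcription of the allocation formula as code (a minimal Python sketch; the weights, rate, and duration values are illustrative):

# n_i = (w_i / sum_j w_j) * R * T, rounded down per segment
def allocate(weights, rate, duration):
    total = sum(weights)
    budget = rate * duration            # R * T packets overall
    return [int(budget * w / total) for w in weights]

print(allocate([5.0, 3.0, 2.0], rate=100, duration=2))   # -> [100, 60, 40]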
Packet Generation
 • Packets are selected independently from each segment.
   Each segment is allocated packets in proportion to its weight.

 • Packets are selected by varying the high-order bits (from
   optional fields) before those of the lower-order fields
   (IPs, protocol); see the sketch below.
         – Example: a segment with the tcp protocol will first select packets
           with all possible tcp control bits before changing the port number
           (if free); when this space is exhausted, the IP is then
           changed accordingly.
         – Example: generic ip segments will go directly to changing the
           DstIP field, as those are the first available bits.
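A minimal Python sketch of this exhaustion order (the field names and domains are illustrative assumptions):

from itertools import product

def packets_for_segment(free_fields, budget):
    """free_fields: (name, domain) pairs ordered from first-exhausted
    (option bits such as tcp flags) to last-exhausted (addresses)."""
    names = [n for n, _ in reversed(free_fields)]
    domains = [d for _, d in reversed(free_fields)]
    emitted = 0
    # product() advances its rightmost iterable fastest, so placing the
    # first-exhausted field last makes its values change most rapidly.
    for combo in product(*domains):
        yield dict(zip(names, combo))
        emitted += 1
        if emitted >= budget:
            return

seg_fields = [
    ("tcp_flags", range(64)),                            # exhausted first
    ("dst_port",  [22, 80, 443]),
    ("dst_host",  ["143.91.78.2", "143.91.78.3"]),       # changed last
]
for pkt in packets_for_segment(seg_fields, budget=5):
    print(pkt)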


Spy module

• Responsible for monitoring the output of the
  firewall and reporting it back to the engine.


• It can support multiple connecting Engines, and
  multiple simultaneous tests per Engine.


• Designed to withstand high-speed injections.
Spy module
1. The Engine requests a new
   test monitoring task.
2. The Main process of the spy
   creates a thread that will
   start processing this request.
3. The Capture process captures
   the packets and sends them to
   the DeMux process, which
   redirects each packet to the
   corresponding thread.
4. Once the thread decides the
   test is over, it destroys the
   pipe and returns any
   discrepancies to the Engine.
   (A sketch of this structure
   follows.)
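A minimal Python sketch of the capture/demux/thread structure (threads and queues stand in for the Spy's processes and pipes; all names and packet fields are illustrative assumptions):

import queue
import threading

test_queues = {}              # test id -> per-test packet queue ("pipe")

def demux(captured):          # called by the capture loop for every packet
    q = test_queues.get(captured["test_id"])
    if q is not None:
        q.put(captured)

def monitor(test_id, expected, report):
    q = test_queues[test_id]
    seen = set()
    while True:
        pkt = q.get()
        if pkt is None:       # sentinel: the engine ended this test
            break
        seen.add(pkt["probe_id"])
    del test_queues[test_id]  # destroy the pipe
    report(expected - seen)   # probes that never emerged = discrepancies

# Engine side: register a task, start its thread, let capture feed demux().
results = []
test_queues[7] = queue.Queue()
t = threading.Thread(target=monitor, args=(7, {1, 2, 3}, results.append))
t.start()
demux({"test_id": 7, "probe_id": 1})
demux({"test_id": 7, "probe_id": 3})
test_queues[7].put(None)
t.join()
print("missing probes:", results[0])   # -> {2}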



INSPEC Architecture

• The flow of the testing procedure, shown again before the
  detailed view:

[Figure: INSPEC architecture, repeated from earlier. Firewall Grammar
and Policy Generator (or a manual FW policy) → ENGINE: Segmentation
Module → Test Segments → Segment Analyzer / Test Packet Generator →
Test Packets → Firewall → FW output → Spy → report to engine →
Post-Test Analysis → Reports.]
INSPEC Architecture (detailed)

[Figure: detailed architecture. A BNF Configuration File is read by
the BNF Parser into a BNF Translation Graph (BNFGraph), alongside
Administration Information. Test Scenario Generation Options drive the
Policy Generator; the generated (or manually entered) policy passes
through the Policy Checker/Compiler to the FW Admin and on to
Segmentation. The Weight Analyzer (with its options) feeds the Test
Packet Generator (with Test Case Generation options), followed by
Compression and Injection into the FIREWALL; injection awaits success
of the configuration. The Spy's capture is decompressed and fed to
Post-test Analysis and Reporting; this block repeats per test.]
Key Features

• Flexible:
       – Customizable parsing
               • Testing engine (test and scenario generation) is
                 independent of CLI specifics
               • Highly flexible to accommodate future CLI
                 extensions
• Universal:
       – Intermediate representation (BDD Segments)
         abstracts the firewall model and filtering language
         capabilities

Key Features

• Efficient
       – Test Generation
               • Smart test packet selection based on segmentation
               • Controllable based on the test complexity/criticality
               • Generated packets cover all possible decision paths of the
                 filtering algorithm.
       – Scenario Generation
               • Testing uses a wide, tunable (e.g., distribution of fields)
                 selection of automatically generated policies.
               • To avoid using a huge number of packets for reasonable
                 coverage, packets are selected so as to exhaust as
                 many dimensions as possible.
Key Features
• Comprehensive/Coverage:
       – Scenario Generation
               • Generated policies cover a wide variety of ACEs and their relations.
               • All possible rule (ACE) interactions
               • The generation is guaranteed to produce legitimate ACEs for the
                 DUT's syntax.
       – Test Generation
               • Different values/combinations of tos/prec/tcp_flags are exhausted
                 before port numbers, and those before IP addresses.
               • Enumerating all "constants" (e.g., protocol port names)
               • Exhaustive testing where possible (reasonably sized segments)
               • Packets are generated such that all possible rule-rule interactions
                 are tested.
               • Controllable based on the test complexity/criticality

Key Features

• Fast:
       – The packet generation mechanism guarantees, in an
         extremely efficient way, that there will be no duplicates
         among the packets selected.
       – Tests can be terminated based on time or traffic volume.
       – To avoid using a huge number of packets for reasonable
         coverage, packets are selected so as to exhaust as
         many dimensions as possible.
          • Example: different values/combinations of
             tos/prec/tcp_flags are exhausted before port numbers,
             and those before IP addresses.


Value to the Industry
• No manual inspection can beat INSPEC
       –    Easy to use and extend
       –    Comprehensive: works in multiple dimensions
       –    Intelligent
       –    Fast
• Nothing out there does a similar thing, particularly
  automatic policy generation
• INSPEC business value: it decreases
       – Time-to-market for new products and new features
       – MTTR (mean time to repair)
       – Man-hours


                   Discussion

                     Q&A




Evaluation of the Segmentation-based Testing

[Figure: Effect of correlation on the technique's effectiveness.
Advantage over RANDOM sampling (0-100%) vs. average fault ratio
(1e-10 to 1e-06), for the series Cor0/RAND, Cor0.5/RAND, and
Cor1/RAND.]

[Figure: Effect of policy style on performance (segment sizes
normalized). Gain over random (0-60%) across policy styles:
Distinct 1, Distinct 2, Overlap 1, Overlap 2, Corel 1, Corel 2.]

[Figure: Effect of rule size variability (low sample size).
Gain (%) over random (0-100%) vs. skewness: Low, Mid, High.]
Evaluation of the Segmentation-based Testing

• In most problems with FW implementations, the fault
    appears over whole rules and segments.

• This makes the segmentation even more effective: a single
    packet per segment is theoretically enough to discover
    the problem.
      – This is guaranteed in our test-case selection model.

      – The probability of finding the error is 100%.
