Automated Testing of System Software (Virtual Machine Monitors)
Tao Xie
Department of Computer Science
North Carolina State University
http://www.csc.ncsu.edu/faculty/xie/
Automated System Software Testing

- Purpose: automated testing of system code bases (e.g., virtual machine monitors) for robustness, security, functionality, coverage, ...
- Such software is often highly environment-dependent
- Challenges
  - Code bases are complex
    - They interact heavily with system APIs, and system behavior depends on environment state, e.g., open("/dev/tty", O_WRONLY); see the sketch below
  - Testing requires sophisticated system setup
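To make the environment dependence concrete, here is a minimal C# sketch of our own (an analogue of the open("/dev/tty", O_WRONLY) case above): the same call succeeds or fails depending solely on the state of the file system, not on the argument value.

using System;
using System.IO;

static class EnvDependence
{
    // Same argument, different outcomes: the result depends on whether
    // the path exists and is writable, i.e., on environment state.
    static bool TryOpenForWrite(string path)
    {
        try
        {
            using (FileStream fs = File.OpenWrite(path))
                return true;
        }
        catch (IOException) { return false; }
        catch (UnauthorizedAccessException) { return false; }
    }
}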
Testing Environment-Dependent Software

- Test inputs: method arguments, receiver object state, and input environment state
- Test outputs: method return values, receiver object state, and output environment state (both roles are illustrated in the sketch below)
- Sufficient and safe testing of such software must
  - generate high-covering tests
  - pose no threat to the environment
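A hypothetical conventional test (MSTest style, against the real file system and standard BCL calls) makes both roles concrete: the file written up front is input environment state, and the existence checks at the end examine output environment state.

using System.IO;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class EnvironmentStateTest
{
    [TestMethod]
    public void MoveReplacesSourceWithDestination()
    {
        string src = Path.Combine(Path.GetTempPath(), "env-demo-src.txt");
        string dst = Path.Combine(Path.GetTempPath(), "env-demo-dst.txt");
        if (File.Exists(dst)) File.Delete(dst);
        File.WriteAllText(src, "payload");   // input environment state

        File.Move(src, dst);                 // BCL stand-in for the code under test

        Assert.IsFalse(File.Exists(src));    // output environment state...
        Assert.IsTrue(File.Exists(dst));     // ...is part of the expected result
        File.Delete(dst);                    // restore the environment
    }
}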
Application of Automated Testing Tools

- Dynamic symbolic execution tools generate input method arguments and receiver object states
  - Microsoft Pex (C#) and CREST (C)
- Empirical study: applied these tools to test Xen, CodePlex Client, and the NUnit framework
  - All interact heavily with the file-system environment
- Observed result: the test-generation tools failed to generate high-covering test inputs
- Identified cause: the code requires particular input environment states
- Problems
  - P1: Generating input environment state is beyond the scope of test-generation tools.
  - P2: Arbitrary program inputs generated by test-generation tools can pollute or threaten the environment state.

Example code under test

// Code coverage: many cases, driven by environment state
public void Add(string localPath, bool recursive, SourceItemCallback callback)
{
    Guard.ArgumentNotNullOrEmpty(localPath, "localPath");
    if (fileSystem.DirectoryExists(localPath))
        AddFolder(localPath, recursive, callback, true);
    else if (fileSystem.FileExists(localPath))
        AddFile(localPath, callback, true);
    ......
}

// Safe testing: this code deletes a real file or directory
bool OnBeforeAddItem(SourceItem item)
{
    ..........
    if (item.ItemType == ItemType.File)
        File.Delete(item.LocalName);
    else
        Directory.Delete(item.LocalName);
    ......
    return (answer == "y" || answer == "a");
}
Outline

- Mock objects as a solution to the identified problems
- Challenges
- Proposed approach
- Preliminary results
- Future work
Mock Objects

- Used to simulate the required environment, avoiding interaction with the real environment
- Benefits:
  - Enable unit testing
  - Increase code coverage
  - Ensure safe testing
- Challenge: non-trivial to implement a mock object (see the seam sketch after the code)

// Code under test in mock-object-based testing
public void Add(string localPath, bool recursive, SourceItemCallback callback)
{
    Guard.ArgumentNotNullOrEmpty(localPath, "localPath");
    if (mockFileSystem.DirectoryExists(localPath))
        AddFolder(localPath, recursive, callback, true);
    else if (mockFileSystem.FileExists(localPath))
        AddFile(localPath, callback, true);
    ......
}
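A minimal sketch of the seam that makes this substitution possible: the code under test programs against a file-system interface, and the test supplies an in-memory implementation. The interface and class below are illustrative, not CodePlex Client's actual types.

using System.Collections.Generic;

public interface IFileSystem
{
    bool DirectoryExists(string path);
    bool FileExists(string path);
    void CreateDirectory(string path);
}

// In-memory mock: records state instead of touching the real disk.
public class MockFileSystem : IFileSystem
{
    private readonly List<string> dirs = new List<string>();
    private readonly List<string> files = new List<string>();

    public bool DirectoryExists(string path) { return dirs.Contains(path); }
    public bool FileExists(string path)      { return files.Contains(path); }
    public void CreateDirectory(string path)
    {
        if (!dirs.Contains(path)) dirs.Add(path);
    }
}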
Mock Objects (cont.)

- An incomplete or incorrect implementation causes false alarms
- Sophisticated mock objects are non-trivial to implement
- A tedious task!
- Our solution: a systematic approach to build a mock object that passes given tests which fail due to an insufficient mock object

// Incorrect implementation: DirectoryExists ignores recorded state
public bool DirectoryExists(string path)
{ return false; }

public void CreateDirectory(string path)
{ listOfCreatedDir.Add(path); }

// Correct implementation: both methods consult a shared field,
// e.g., List<string> listOfCreatedDir = new List<string>();
public bool DirectoryExists(string path)
{
    if (listOfCreatedDir.Contains(path))
        return true;
    return false;
}

public void CreateDirectory(string path)
{
    if (listOfCreatedDir.Contains(path))
        return;
    else
        listOfCreatedDir.Add(path);
}
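A short usage check of the corrected mock (using the illustrative MockFileSystem from the earlier seam sketch): CreateDirectory records state that DirectoryExists later consults, mirroring the real file system's observable behavior for this pair of calls.

static void DemoMock()
{
    MockFileSystem fs = new MockFileSystem();
    fs.CreateDirectory(@"C:\work");
    // Correct implementation: both checks hold.
    // Incorrect implementation: the first check fails (a false alarm).
    System.Diagnostics.Debug.Assert(fs.DirectoryExists(@"C:\work"));
    System.Diagnostics.Debug.Assert(!fs.DirectoryExists(@"C:\other"));
}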
Approach

- Moca (MOCk Assistant) follows Test-Driven Development (TDD) to systematically build a high-quality mock object, one sufficient for effective testing of the code under test without causing any false alarms
- Moca makes use of Parameterized Unit Tests (PUTs)
  - PUTs are unit tests with parameters; they generalize traditional unit tests (TUTs) and can encode functional specifications
  - Pex, a Microsoft Research tool, accepts PUTs and generates high-covering tests; an example PUT follows
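For illustration, a PUT for the earlier Add example might look like the following sketch. The Pex attributes and PexAssume calls are real Pex API; FileSystemClient and MockFileSystem are hypothetical stand-ins for the code under test and its mock.

using Microsoft.Pex.Framework;

[PexClass]
public partial class AddPuts
{
    // Pex explores Add() symbolically and emits one concrete
    // (localPath, recursive) pair per covered path.
    [PexMethod]
    public void AddAnyPath(string localPath, bool recursive)
    {
        PexAssume.IsNotNull(localPath);           // filter out inputs the
        PexAssume.IsTrue(localPath.Length > 0);   // contract already rejects

        MockFileSystem fs = new MockFileSystem();
        fs.CreateDirectory(@"C:\work");           // seed mocked environment state
        new FileSystemClient(fs).Add(localPath, recursive, null);
        // For robustness testing no assertion is needed: Pex reports any
        // uncaught exception on a generated input as a failing test.
    }
}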
Approach (cont.)

- Input to Moca
  - A set of conventional unit tests (failing due to an insufficient mock object); an example follows this list
  - The environment that needs to be mocked
  - A set of PUTs
- Moca assists developers in building a mock object that can replace the real-environment interactions
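As an illustrative input of the first kind (names hypothetical, MSTest style), consider a conventional unit test that fails only because the mock is insufficient, assuming the incorrect mock from the Mock Objects (cont.) slide:

[TestMethod]
public void AddTakesFolderBranchForExistingDirectory()
{
    MockFileSystem fs = new MockFileSystem();
    fs.CreateDirectory(@"C:\work");
    new FileSystemClient(fs).Add(@"C:\work", false, null);
    // With the incorrect mock, DirectoryExists() always returns false,
    // so this assertion fails and Moca is asked to repair the mock.
    Assert.IsTrue(fs.DirectoryExists(@"C:\work"));
}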
Approach (cont.)

[Figure: Moca workflow]
Abbreviations:
  TUT – conventional unit test
  CUM – class to mock
  ENV – environment
  MUM – method to mock
  CUT – code under test
Preliminary Results

- Applied Moca to a real-world application, CodePlex Client
- Results: Moca can assist developers in building a mock object that is
  - effective in achieving high coverage, without false alarms
  - sufficient when compared to a naive implementation
  - less complex, and thus less effort to build, than a manually written sophisticated implementation
Summary

- Identified problems with automated testing of environment-dependent software
- Conducted an empirical study showing the benefits of using mock objects and identifying the challenges in building them
- Proposed an approach based on the TDD methodology to build mock objects
- Demonstrated the feasibility and benefits of the proposed approach
- Also developed new techniques for test generation
Key Outcomes

Relevance to military/DoD:
- An undergraduate student, Justin Gorham, is working as a summer intern with the Fort Hood Army Electronic Proving Ground (EPG) team, applying Pex and our extensions to Army code bases.
- A PhD student, Kunal Taneja, is working as a summer intern at the FDA, applying Pex and our extensions to a DoD code base for regulatory purposes (mocking databases).

Publications:
- [AST 09] Madhuri R Marri, Tao Xie, Nikolai Tillmann, Jonathan de Halleux, and Wolfram Schulte. An Empirical Study of Testing File-System-Dependent Software with Mock Objects. In Proceedings of the 4th International Workshop on Automation of Software Test (AST 2009), Business and Industry Case Studies, pp. 149-153, May 2009.
- [Mutation 09] Tao Xie, Nikolai Tillmann, Jonathan de Halleux, and Wolfram Schulte. Mutation Analysis of Parameterized Unit Tests. In Proceedings of the 4th International Workshop on Mutation Analysis (Mutation 2009), pp. 177-181, April 2009.
- [SUITE 09] Madhuri R Marri, Suresh Thummalapenta, and Tao Xie. Improving Software Quality via Code Searching and Mining. In Proceedings of the First International Workshop on Search-Driven Development – Users, Infrastructure, Tools and Evaluation (SUITE 2009), pp. 33-36, May 2009.
- [DSN 09] Tao Xie, Nikolai Tillmann, Jonathan de Halleux, and Wolfram Schulte. Fitness-Guided Path Exploration in Dynamic Symbolic Execution. To appear in Proceedings of the 39th Annual IEEE/IFIP International Conference on Dependable Systems and Networks (DSN 2009), June-July 2009.
- [ESEC/FSE 09] Suresh Thummalapenta, Tao Xie, Nikolai Tillmann, Jonathan de Halleux, and Wolfram Schulte. MSeqGen: Object-Oriented Unit-Test Generation via Mining Source Code. To appear in Proceedings of the 7th Joint Meeting of the European Software Engineering Conference and the ACM SIGSOFT Symposium on the Foundations of Software Engineering (ESEC/FSE 2009), August 2009.
Other Related Funding

Related new funding over the SOSI project period:

- NSF CAREER Award: 5 years (Aug 09 - July 14), $425,000
  “Cooperative Developer Testing with Test Intentions”

Other ongoing support:

- ARO Award: 3 years (Sept 08 - Aug 11), $300,000
  “Mining Program Source Code for Improving Software Quality”
- NSF SoD Award: 3 years (Jan 08 - Dec 10), $245,000
  “Collaborative Research: SoD-TEAM: Designing Tests for Evolving Software Systems”
- NSF CyberTrust Award: 3 years (Aug 07 - July 10), $227,275
  “CT-ISG: Collaborative Research: A New Approach to Testing and Verification of Security Policies”
Future Directions

- Test generation
  - Guided exploration of paths [DSN 09]
  - Method-sequence generation [ESEC/FSE 09]
  - Security (attack) and access-control test generation (with NIST)
  - Performance testing
  - Test generation for embedded, network, database, and SOA applications
- Dealing with environments
  - Fully automate Moca
  - Domain-specific mock-object tools/libraries (e.g., file system, database, network, and hardware environments)
- Test oracles
  - Detection of insufficient assertions
  - Inference of normal behavior as approximate oracles
Questions?

More info on the research of the NCSU Automated Software Engineering Group:
  http://www.csc.ncsu.edu/faculty/xie/research.htm
  http://www.csc.ncsu.edu/faculty/xie/publications.htm

- Recent industry impact
  - Our Fitnex strategy [DSN 09] has been integrated into Microsoft Research Pex as its default search strategy (second half of 2008; download count of 5,600)