Intelligent Environments


 Computer Science and Engineering
  University of Texas at Arlington

              Intelligent Environments   1
Security and Privacy
   Motivation
   Techniques
   Issues

Issues
   Physical security
   Data security
       Protect sensory data
       Wireless eavesdropping
       e-Intrusion
       Levels within environment
   Degree of autonomy

Techniques
   Physical security
   Law enforcement
   Encryption
   Firewalls
   Intrusion detection
   Biometrics
   Software safety
Physical Security
   Intrusion detection
   Video surveillance
   Metal detectors, X-ray scanners
   Motion detectors, infrared nets
   GPS tracking
   Access control (key, card, RF badge, …)

Law Enforcement and Privacy
   Conflict between an individual’s right to
    privacy and the government’s need to
    invade privacy to uphold the law
   Complicated by digital data, encryption
    and wireless communications

U.S. Constitution
   Fourth Amendment (abridged)
       The right of the people to be secure in
        their persons, houses, papers, and effects,
        against unreasonable searches and
        seizures, shall not be violated.
   Fifth Amendment (abridged)
       No person shall be compelled in any
        criminal case to be a witness against
        himself.

Computer Crime Laws
   Texas computer crimes law

Privacy and Cyber-Utopia
   Global, seamless and secure e-
   New encryption standard required
       Individual privacy preserved
       Law enforcement surveillance possible
       U.S. computer industry globally competitive
       Ability of national governments to regulate
        the nation preserved

Wiretapping
   Law enforcement eavesdropping on
    communication without informing the people
    who are communicating
   U.S. Supreme Court Olmstead v. U.S. (1928):
    wiretaps did not require special authorization
    if no trespassing necessary
   U.S. Supreme Court Katz v. U.S. (1967):
    wiretaps, even of public phone booths,
    require prior judicial authorization
Effectiveness of Wiretapping
   Activity since 1968 (EPIC)
   Each wiretap actually enabled monitoring
    many conversations
   Computerization complicates wiretapping
       Digital data
       Computer switching
       Optical fiber transmission
       Need to know data structures, formats and
        algorithms used in communication systems

Digital Telephony Standards
   1994 mandate that communications systems
    equipment be designed to allow practical
    wiretapping by law enforcement
   Isolate the communications stream of an
    individual
   $500M allocated for conversion
   Communications Assistance for Law
    Enforcement Act (CALEA)

Digital Telephony Standards: Issues
   Most effective way to fight crime?
   Increase government’s “big brother” power?
   Security problems?
   Hindering technological advance?
   Who pays for the cost?
   Effect on U.S. industry competitiveness?
   Mandated capabilities useful?

Encryption
   Wiretapping encrypted digital
    communication of no use
   Solutions
       Break encryption scheme
       Legislate encryption

Private-Key Encryption
   Also called secret key or symmetric
   Algorithm public; key private
   Easy to break if number of possible
    keys is small
   Problems
       How to securely distribute private key
       Ensuring authenticity of messages
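A toy sketch of the symmetric idea, with a repeating-XOR cipher standing in for a real algorithm (illustrative only, trivially breakable):

```python
# Symmetric (private-key) encryption: one shared secret key both
# encrypts and decrypts. Repeating-XOR stands in for a real cipher.

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # XOR is its own inverse, so the same call encrypts and decrypts.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

message = b"meet at noon"
key = b"secret"                          # must be shared securely in advance

ciphertext = xor_cipher(message, key)
recovered = xor_cipher(ciphertext, key)  # decrypt with the SAME key
assert recovered == message
```

Both of the slide's problems show up directly: the key must somehow reach both parties without interception, and nothing here proves who actually sent the message.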

Data Encryption Standard
   Developed at IBM; adopted as a U.S. standard in 1977
   Private-key encryption
       56-bit key (2^56 ≈ 72 × 10^15 keys)
   Key chosen randomly for each message
   Applies 56-bit key to each 64-bit block
    of data
   Multiple passes for stronger encryption
       Triple DES still in use (2^(56+56+56) keys)
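The key-space sizes above can be checked directly:

```python
# Exact key counts for the key lengths on the slide.
des_keys = 2 ** 56                 # single DES: about 7.2 x 10^16 keys
tdes_keys = 2 ** (56 + 56 + 56)    # Triple DES with three independent keys

assert des_keys == 72_057_594_037_927_936
print(f"DES: {des_keys:.1e} keys, Triple DES: {tdes_keys:.1e} keys")
```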

Public-key Encryption
   Also called asymmetric
   Each person generates a public and
    private key
   Everybody knows public keys
   Only individual A need know their own
    private key
   privateA(publicA(M)) = M
   publicA(privateA(M)) = M
   Digital signatures
   Person A encrypts message M with their
    private key to get M’
   Person A encrypts M’ with B’s public key to
    get M’’, which is sent to B
   Person B decrypts M’’ with their private key
    to get M’
   Person B decrypts M’ with A’s public key to
    get M, but only if from A
   publicA(privateB(publicB(privateA(M)))) = M
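The sign-then-encrypt chain can be traced with toy RSA numbers (tiny hand-picked primes, purely illustrative; note A's modulus must be smaller than B's for the chaining to work):

```python
# Toy RSA key pairs (real keys use huge random primes).
# A: p=11, q=23 -> nA=253;  B: p=61, q=53 -> nB=3233
eA, dA, nA = 3, 147, 253       # A's public (eA, nA) and private (dA, nA)
eB, dB, nB = 17, 2753, 3233    # B's public (eB, nB) and private (dB, nB)

M = 42                         # message, must be < nA (and nA < nB)

M1 = pow(M, dA, nA)     # A "signs": encrypt with A's PRIVATE key   -> M'
M2 = pow(M1, eB, nB)    # A encrypts M' with B's PUBLIC key         -> M'' (sent)
M3 = pow(M2, dB, nB)    # B decrypts with B's PRIVATE key           -> M'
M_out = pow(M3, eA, nA) # B decrypts with A's PUBLIC key -> M, only if A signed

assert M3 == M1 and M_out == M  # publicA(privateB(publicB(privateA(M)))) = M
```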
Generating Public/Private Key
   RSA algorithm (patented)
   encryptA(M) = M^e modulo n
   decryptA(M) = M^d modulo n
   Public key = (e,n)
   Private key = (d,n)
   n = p*q, where p and q are large random
    primes
       e and d chosen based on p and q
   Security rests on difficulty to factor product n
    of two large primes
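With small numbers the whole recipe fits in a few lines (the primes here are far too small for real use):

```python
from math import gcd

p, q = 61, 53                  # "large random primes" (toy-sized here)
n = p * q                      # public modulus, 3233
phi = (p - 1) * (q - 1)        # 3120; e and d are chosen relative to this

e = 17                         # public exponent, coprime with phi
assert gcd(e, phi) == 1
d = pow(e, -1, phi)            # private exponent: e*d = 1 (mod phi)

M = 65
C = pow(M, e, n)               # encrypt: M^e mod n
assert pow(C, d, n) == M       # decrypt: C^d mod n recovers M
```

Recovering d from the public pair (e, n) alone would require factoring n back into p and q, which is exactly the hard problem the slide's security claim rests on.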
Government Encryption Policy
   Government’s position
       Public-key encryption too difficult to wiretap
       Limit export of encryption
       Design own tap-able encryption scheme
   Industry’s position
       Use widely-accepted, strong encryption standard
       Freely export standard

Escrowed Encryption Standard
   EES developed by U.S. government in 1993
   Skipjack algorithm implemented on the
    Clipper and Capstone chips
   Private-key encryption
   Each chip has an 80-bit unit key U, which is
    escrowed in two parts with two different
    escrow agents
   Chip also includes a 30-bit serial number and
    an 80-bit family key F common to all Clipper
    chips
   Two devices agree on an 80-bit session key K
    to communicate
   Message is encrypted with key K and sent
   Law-Enforcement Access Field (LEAF)
    appended to message, including
       Session key K encrypted with unit key U
       Serial number of sender
       All encrypted with family key F

EES Wiretapping
   Use family key to obtain LEAF
   Now have serial number of sending
    device and encrypted session key
   Upon authorization, two agencies
    present their two escrowed portions of
    the unit key U
   Use unit key U to decrypt session key K
   Use K to decrypt message
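The construction and recovery steps on these two slides can be modeled end to end, with XOR standing in for Skipjack (key names and 80-bit lengths follow the slides; everything else is a toy):

```python
import os

def xor(data: bytes, key: bytes) -> bytes:
    # Stand-in for Skipjack: XOR with a repeating key (self-inverting).
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

F = b"family-key"              # 80-bit family key F, common to all chips
U = b"unit-key-A"              # this chip's 80-bit unit key U (escrowed)
serial = b"0001"               # chip serial number
K = os.urandom(10)             # fresh 80-bit session key K

# Sender: encrypt with K, attach LEAF = F( U(K) || serial )
message = b"attack at dawn"
ciphertext = xor(message, K)
leaf = xor(xor(K, U) + serial, F)

# Wiretap: the family key opens the LEAF, yielding the serial number...
inner = xor(leaf, F)
wrapped_K, got_serial = inner[:10], inner[10:]
# ...and the two escrowed halves of U (released on authorization) unwrap K.
recovered_K = xor(wrapped_K, U)
assert got_serial == serial
assert xor(ciphertext, recovered_K) == message
```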

EES Issues
   Circumvention
   Security
       Skipjack algorithm
       Escrowed keys
   Both escrow agents governmental
   U.S. industry competitiveness
   “Forgetting” unit keys
   EES dropped due to public opposition

Advanced Encryption Standard
   AES is U.S. government’s encryption
    standard as of 2001
   Rijndael algorithm selected from among
    several candidates
       Efficient block cipher
       Variable block and key length
             AES supports key lengths of 128, 192 and 256 bits

Current Issues
   Encryption export limitations
       Relaxed January 2000
   Key recovery (escrowed) encryption
   Any encrypted message must be decryptable
    by law enforcement with proper authorization
       Encrypter must provide means to decrypt message
       Fifth amendment issues
   Wireless communications

Points to Remember
   Law enforcement using new wiretap
    legislation to monitor email
   Escrowed-key approaches unlikely to ever
    catch on
   AES holds promise
   Law enforcement needs mechanism to
    decrypt information pertinent to criminal
    investigations
   There is no specific “right to privacy” in the
    U.S. Constitution
Privacy Law Resources
   Electronic Privacy Information Center
   Electronic Frontier Foundation

Firewalls
   Filter packets not meeting specified criteria
       IP number constraints
       Port constraints
       Connection-type constraints
   IETF IP Security Standard (IPSEC)
   Secure Shell (SSH)
   Good countermeasure, but complicated to
    configure
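A packet filter of this kind reduces to a few predicate checks per packet; a minimal sketch (field names and rule values are made up for illustration):

```python
# Permit a packet only if it satisfies every configured constraint.
ALLOWED_NETS = ("10.0.",)            # IP-number constraint (prefix match)
ALLOWED_PORTS = {22, 80, 443}        # port constraint
ALLOWED_TYPES = {"tcp"}              # connection-type constraint

def permit(packet: dict) -> bool:
    return (packet["src_ip"].startswith(ALLOWED_NETS)
            and packet["dst_port"] in ALLOWED_PORTS
            and packet["conn_type"] in ALLOWED_TYPES)

assert permit({"src_ip": "10.0.3.7", "dst_port": 443, "conn_type": "tcp"})
assert not permit({"src_ip": "192.168.1.5", "dst_port": 443, "conn_type": "tcp"})
assert not permit({"src_ip": "10.0.3.7", "dst_port": 23, "conn_type": "tcp"})
```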

Intrusion Detection
   Tripwire
   Firewall programming
   Pattern recognition
   Behavioral outliers
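The "behavioral outliers" idea can be sketched as a baseline-plus-threshold check (the data and the 3-sigma threshold are illustrative):

```python
from statistics import mean, stdev

logins_per_day = [3, 4, 2, 5, 3, 4, 3, 2, 4, 3]   # user's normal profile
mu, sigma = mean(logins_per_day), stdev(logins_per_day)

def is_anomalous(count: int, threshold: float = 3.0) -> bool:
    # Flag activity more than `threshold` standard deviations from normal.
    return abs(count - mu) / sigma > threshold

assert not is_anomalous(4)   # ordinary day
assert is_anomalous(40)      # burst of logins: possible intrusion
```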

Biometrics
       Automatically recognizing a person using
        distinguishing traits
   Modes
       Face
       Iris, retinal
       Vein
       Fingerprint, hand and finger geometry
       Handwriting
       Voice

Face Recognition
   Controlled background
   By color (skin)
   By motion (e.g., blinks)
   Mixture of above
   Unconstrained scenes
       Neural networks
       Model-based
   CMU:
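Detection "by color (skin)" reduces to a per-pixel rule; the RGB thresholds below are a rough illustration, not a published skin model:

```python
def is_skin(r: int, g: int, b: int) -> bool:
    # Skin tends to be red-dominant with moderate green and blue.
    return (r > 95 and g > 40 and b > 20
            and r > g and r > b
            and r - min(g, b) > 15)

assert is_skin(200, 120, 90)       # typical skin tone
assert not is_skin(30, 90, 200)    # sky blue
assert not is_skin(40, 40, 40)     # dark gray background
```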

Iris and Retinal Biometrics
   Identify iris
   Encode wavelet patterns
   100,000 comparisons per second on
    300MHz machine
   Vein mapping
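One way to see why iris comparison is so fast: once the wavelet patterns are encoded as bits, matching is just an XOR plus a bit count. The 16-bit codes and 0.32 threshold below are illustrative (real iris codes run to thousands of bits):

```python
def hamming_fraction(code_a: int, code_b: int, bits: int) -> float:
    # Fraction of bits on which the two iris codes disagree.
    return bin(code_a ^ code_b).count("1") / bits

enrolled  = 0b1011001110001011
same_eye  = 0b1011001110001111   # one noisy bit
other_eye = 0b0100110001110100   # unrelated pattern

assert hamming_fraction(enrolled, same_eye, 16) < 0.32    # match
assert hamming_fraction(enrolled, other_eye, 16) > 0.32   # non-match
```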

Vein ID

   Grayscale image from back of hand
   Image segmentation and edge detection
   Generate unique vein map
Fingerprint and Hand
   Approaches
       Transform image
   Transforms
       Fourier transform
            Time to frequency
       Wavelet transform
            Time to frequency/occurrence
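The "time to frequency" step can be written as a direct DFT in a few lines (real systems use an FFT; this naive O(N^2) version just shows the idea):

```python
import cmath
from math import cos, pi

def dft(signal):
    # X[k] = sum_n x[n] * e^(-2*pi*i*k*n/N): time samples -> frequency bins
    N = len(signal)
    return [sum(x * cmath.exp(-2j * pi * k * n / N)
                for n, x in enumerate(signal))
            for k in range(N)]

# A pure 2-cycles-per-window wave concentrates its energy in bin 2.
wave = [cos(2 * pi * 2 * n / 8) for n in range(8)]
spectrum = [abs(X) for X in dft(wave)]

assert abs(spectrum[2] - 4.0) < 1e-9   # energy at frequency 2
assert spectrum[1] < 1e-9              # (almost) none elsewhere
```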

   Hidden Markov models
   Neural networks

   Markov models

Software Safety
   Risk analysis is a difficult task that has a
    subjective component
   A software model of a real physical
    system can never perfectly represent all
    relevant aspects of the system
       Over-reliance on computer models that are
        not properly validated invites disaster

   No widely accepted standard for
    developing safety-critical software
   Resources
       The Risks Digest
       CMU Software Engineering Institute

 Degree of Autonomy

“2001: A Space Odyssey” Turner Entertainment, 1968

Security and Privacy
   Law enforcement
   Individual counter measures
       Encryption
       Firewalling
       Biometrics
   Degree of autonomy
