                     Digital Forensics and Privacy




                     Doug Tygar, Berkeley




  June 26-28, 2005           All Hands Meeting
                 Digital Forensics & Privacy in TRUST




Figure 4: Management Structure of TRUST. The eleven technical challenge areas are divided into three groups,
                with no hierarchy implicit among the technical areas in each of the groups.

               Digital Forensics & Privacy

•   Digital Forensics
    – Recording, searching, & tracking information about (mis)use


•   Privacy
    – Allowing & disallowing access to information


•   Digital Forensics & Privacy belong together
    – Issues from each reflect on the other in a “hall of mirrors”
    – How do we keep private information from being improperly
      disclosed in the name of forensics?
    – How do we monitor (and audit) digital forensics while keeping
      search information secret?
•   Related issue: usability
    – Case example: Spoofing (see talks by Dan Boneh and Rachna
      Dhamija)
      Key strategies for Digital Forensics
•   Selective revelation

•   Strong audit

•   Rule processing technologies




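Of the three strategies, strong audit lends itself most directly to a small illustration. One common construction (not specified in these slides) is a hash-chained, tamper-evident log of analyst queries: each entry's hash covers the previous entry's hash, so altering any record invalidates every later link. A minimal Python sketch, with all names hypothetical:

```python
import hashlib
import json

def append_entry(log, record):
    """Append a record, chaining it to the previous entry's hash."""
    prev_hash = log[-1]["hash"] if log else "0" * 64
    payload = json.dumps(record, sort_keys=True)
    entry_hash = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
    log.append({"record": record, "prev": prev_hash, "hash": entry_hash})

def verify_chain(log):
    """Recompute every hash; any tampered entry breaks the chain."""
    prev_hash = "0" * 64
    for entry in log:
        payload = json.dumps(entry["record"], sort_keys=True)
        expected = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
        if entry["prev"] != prev_hash or entry["hash"] != expected:
            return False
        prev_hash = entry["hash"]
    return True

log = []
append_entry(log, {"analyst": "a1", "query": "q-123"})
append_entry(log, {"analyst": "a1", "query": "q-456"})
assert verify_chain(log)
log[0]["record"]["query"] = "q-999"   # after-the-fact tampering...
assert not verify_chain(log)          # ...is detected by the auditor
```

Because the chain is append-only and each link commits to everything before it, an auditor can detect retroactive edits even by the party that wrote the log.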
                             Idealized architecture

[Diagram: an analyst outside a privacy/security barrier issues queries
against data repositories behind it. An initial revelation of sanitized
data, plus discovery via standing queries or real-time search, are the
only flows that cross the barrier.]

Core idea:
(1) Analyze data behind the security barrier; find critical relationships
(2) Reveal relationships selectively, only through a guarded interface
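As a toy illustration of the core idea, the following sketch (hypothetical, not from the slides) keeps analysis behind the barrier and varies what crosses it with the requester's clearance: sanitized aggregates by default, selected identities only through the guarded interface:

```python
# Hypothetical data behind the privacy/security barrier.
RECORDS = [
    {"name": "Alice", "city": "Oakland", "flagged": True},
    {"name": "Bob",   "city": "Oakland", "flagged": False},
    {"name": "Carol", "city": "Albany",  "flagged": True},
]

def guarded_query(city, clearance):
    """Analysis runs behind the barrier; revelation depends on clearance."""
    matches = [r for r in RECORDS if r["city"] == city]
    if clearance == "initial":
        # Initial revelation: sanitized data only (a count, no identities).
        return {"city": city, "matches": len(matches)}
    if clearance == "escalated":
        # Selective revelation: identities only for flagged relationships,
        # released through the guarded interface (and audited in practice).
        return {"city": city,
                "flagged": [r["name"] for r in matches if r["flagged"]]}
    raise PermissionError("clearance not recognized")

print(guarded_query("Oakland", "initial"))    # {'city': 'Oakland', 'matches': 2}
print(guarded_query("Oakland", "escalated"))  # {'city': 'Oakland', 'flagged': ['Alice']}
```

In a real deployment every escalated call would itself be logged under the strong-audit strategy, so use of the guarded interface is monitorable.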
                     Distributed architecture

•   Multiple repositories
•   Multiple privacy/security barriers

[Diagram: the idealized architecture repeated across several data
repositories, each behind its own privacy/security barrier.]
      Example technology: encrypted search

[Diagram: an intelligence analyst sends encrypted queries to a foreign or
private data repository and receives encrypted responses.]

• Queries are sent encrypted
• Queries are processed, but not decrypted, by the repository
• The repository prepares a response but does not learn what the search
  was or whether it was successful
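Real encrypted-search schemes (e.g., the searchable-encryption line of work by Song, Wagner, and Perrig) hide even whether a query matched. The sketch below is far weaker, keeping only the keyword itself opaque to the repository, but it shows the flavor of token-based matching; all names are hypothetical:

```python
import hmac
import hashlib
import os

def token(key, keyword):
    """Deterministic search token; opaque without the key."""
    return hmac.new(key, keyword.encode(), hashlib.sha256).hexdigest()

# Index built by the data owner, who holds the key; the repository
# stores only opaque tokens mapped to document ids.
key = os.urandom(32)
index = {
    token(key, "uranium"):  ["doc-17"],
    token(key, "shipping"): ["doc-17", "doc-42"],
}

def repository_search(index, query_token):
    """The repository matches the token blindly; it never sees the keyword."""
    return index.get(query_token, [])

print(repository_search(index, token(key, "shipping")))  # ['doc-17', 'doc-42']
```

The analyst (or data owner) derives the token client-side; the repository performs a plain lookup without ever holding the key or the plaintext keyword.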
      Sensor webs: experimental framework

•   Problems are real:
    – Li Zhuang's work (my group at Berkeley) on reconstructing keyboard
      presses with no learning phase
    – Markus Kuhn's work (Cambridge) on reconstructing monitor contents
      from indirectly leaked light
    – Differential power analysis, which infers computation from electrical
      current draw
    – Culler, Hellerstein, Mulligan, Samuelson, and Wagner on legal and
      technical approaches to privacy protection (this morning's talk)
    – RFIDs raise the prospect of near-universal surveillance (see Mulligan
      and Wagner on RFIDs in books in public libraries)
    – The Stanford Privacy Institute (this morning's talk by Boneh)
    – Perrig and Song's work at CMU




         Places to Enforce Privacy in sensor webs
•   Checkpoint 1: Disguising information
    – E.g., SANDIA style RFIDs that return PRNG ID values
•   Checkpoint 2: Restrict information collected by motes
    – Use an “emergency flag” to signal that information is now subject to
      collection by motes (e.g., TESLA secure broadcast)
•   Checkpoint 3: Restrict information passed on by motes
    – Potential for integration with secure communications protocols
•   Checkpoint 4: Restrict information stored in databases
    – Selective control of information – this draws on the same principles:
      selective revelation, strong audit, rule processing technologies.
•   Checkpoint 5: Restrict access to information
    –   Law and Policy
    –   Allow encrypted search of databases
    –   Private Matching techniques
    –   Classic rule-based access control techniques

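Checkpoint 1 can be made concrete: a tag that answers each read with a keyed pseudorandom value is unlinkable to an eavesdropper, while an authorized reader who knows the key can still identify it by searching a counter window. A minimal Python sketch (the SANDIA design itself may differ; all names here are hypothetical):

```python
import hmac
import hashlib

def tag_response(key, counter):
    """Each read returns a fresh pseudorandom value; successive responses
    are unlinkable without the tag's secret key."""
    return hmac.new(key, counter.to_bytes(8, "big"), hashlib.sha256).hexdigest()[:16]

def identify(tags, response, window=100):
    """An authorized reader recognizes a tag by searching a counter window."""
    for tag_id, (key, counter) in tags.items():
        for c in range(counter, counter + window):
            if tag_response(key, c) == response:
                return tag_id
    return None

tags = {"tag-A": (b"secret-A", 0), "tag-B": (b"secret-B", 0)}
resp = tag_response(b"secret-A", 3)        # fourth read of tag-A
assert identify(tags, resp) == "tag-A"     # reader with keys identifies it
assert identify(tags, tag_response(b"unknown", 0)) is None
```

The privacy property comes from the keyed PRF: without the key, two responses from the same tag look like unrelated random strings, defeating passive tracking.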
      Making DF and Privacy real in TRUST

•   Collaborations with Social Science, Law and Policy
•   Economic issues (e.g., appropriate liability rules)
•   Usability of Privacy
•   Evaluating mechanism strength (including metrics)
    for privacy
•   Integrating with usability and other research
    challenge areas
•   Testing on live data





				