
Phishing - A Social Disorder


                L. Jean Camp
       Alla Genkina   Allan Friedman

             www.ljean.com
A tangent

Workshop on the Economics of Information Security

                          2-3 June 2005
                            Reception June 1

         Early registration rates end tomorrow

           http://www.infosecon.net/workshop
 ROI in security investment, economics of spam, economics of identity,
    vulnerability markets, empirical investigation on privacy & security
                                investments
Phishing - A Social Disorder

   A lack of context

   Human trust behaviors

   Social engineering
     – Begin with the social problem not with the technical potential
     – Solve for human trust behaviors
     – Provide unique signals
 Human and Computer Trust

Social sciences
   Experiments designed to evaluate how people extend trust
   Game theory
   Common assumption: information exposure == trust

Philosophy
   Macro approach
   Examine societies and cultural practices

Computer Security
   Build devices to enable trust
Philosophy Suggests


   Trust is necessary to simplify life

     – People have an innate desire or need to trust

     – People will default to extending trust

   People will continue to extend trust - so create
    another source of trust; don't defeat trusting
    behaviors
Research on Humans Suggests...

   Humans may not differentiate between machines

   Humans become more trusting of 'the network'

   Humans begin with too much trust
     Confirmed by philosophical macro observation
     Confirmed by computer security incidents
     Validated
        • E-mail based
             Scams
             Viruses
             Hoaxes
    Three Observations

   Humans respond differently to human or computer
    "betrayals" in terms of forgiveness

   People interacting with a computer do not distinguish
    between computers as individuals but rather respond to their
    experience with "computers"

   The tendency to differentiate between remote machines
    decreases with computer experience
Response to Failure


   Humans respond differently to human or computer
    "betrayals" in terms of forgiveness
     – Attacks which are viewed as failures are 'ignored' or forgiven
     – Technical failures are seen as accidents rather than design
       decisions
         » May explain why people tolerate repeated security failures
     – May inform the balance between false positives and negatives in
       intrusion detection
         » Rarely identified malicious behavior taken more seriously
         » Technical failures easily forgiven
         » Failures expected
     Individuation


   People interacting with a computer do not
    distinguish between computers as individuals but
    rather respond to their experience with
    "computers"
     – People become more trusting
     – People differentiate less
     – People learn to trust
         » First observed by Sproull among computer scientists in 1991
         » Confirmed by subsequent privacy experiments
Differentiation


   The tendency to differentiate between remote
    machines decreases with computer experience

     – Explicit implication of second hypothesis
     – Explains common logon/passwords
         » along with cognitive limits
Observed Verification of
Hypotheses

   Users are bad security managers
      PGP, P3P,….

   Security should therefore be the default

   Can end-to-end security maximize autonomy without accounting for
    end-to-end human abilities and tendencies?

   Data currently being compiled on experiments

   Surveys illustrate a continuing confusion of privacy &
    security
Computer security is built for
machines

   Passwords
      Humans are a bad source of entropy
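The entropy gap can be made concrete. A minimal Python sketch, where the effective size of the human-chosen space (a dictionary word plus two digits) is an illustrative assumption, not a measured figure:

```python
import math
import secrets
import string

def charset_entropy_bits(length: int, charset_size: int) -> float:
    """Upper-bound entropy of a password drawn uniformly from a charset."""
    return length * math.log2(charset_size)

# A machine drawing uniformly at random reaches the theoretical maximum.
machine_pw = "".join(secrets.choice(string.ascii_letters + string.digits)
                     for _ in range(12))
max_bits = charset_entropy_bits(12, 62)   # ~71.5 bits for 12 alphanumerics

# Humans pick from a far smaller effective space: assume a word from a
# 2**15-entry vocabulary plus two digits (an illustrative assumption).
human_bits = math.log2(2**15 * 100)       # ~21.6 bits
```

Under these assumptions the machine-generated password carries more than three times the entropy of the human-chosen one.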

   SSL
        Two categories: secure and not secure
        Does not encourage differentiation
        Every site should include a unique graphic with the lock
        Computer security should seek to differentiate machines
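One way to read the "unique graphic" bullet: derive a stable visual cue deterministically from the site's identity, so every site the user visits looks visibly distinct. A hypothetical sketch; the hash-to-colour mapping is an assumption, not a deployed scheme:

```python
import hashlib

def site_visual_signature(domain: str) -> tuple:
    """Map a domain (or certificate fingerprint) to a stable visual cue:
    a background colour and a glyph, so each site looks visibly distinct.
    Hypothetical sketch, not a deployed scheme."""
    digest = hashlib.sha256(domain.encode()).digest()
    colour = "#{:02x}{:02x}{:02x}".format(digest[0], digest[1], digest[2])
    glyphs = "▲●■◆★✚☂♞"
    return colour, glyphs[digest[3] % len(glyphs)]
```

The same site always yields the same cue, while different sites collide only with negligible probability, which is exactly the differentiation the SSL lock icon fails to provide.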
Privacy standards are built for
machines

   P3P assumes
     – All merchants are trustworthy w.r.t. their own policies
     – An increasingly sophisticated user
     – One standard for all transactions

   PGP
     – Monotonic increase in trust
     – No reset
     – No decrease in rate of trust extension
         » To compensate for increasing trust
     – No global or local reset
         » E.g. change in status
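By contrast, a trust model fitted to human behavior would allow a slowing rate of extension, decreases, and resets. A minimal sketch; all parameter values are illustrative assumptions:

```python
class TrustRecord:
    """Trust score that, unlike PGP's monotonic model, slows its growth,
    can decrease on bad events, and can be reset on a change in status.
    All parameter values are illustrative assumptions."""

    def __init__(self):
        self.score = 0.0           # out of 10

    def good_interaction(self):
        # Diminishing returns: the rate of trust extension slows as trust grows.
        self.score += (10.0 - self.score) * 0.2

    def bad_interaction(self):
        self.score *= 0.5          # trust can go down, not only up

    def reset(self):
        self.score = 0.0           # e.g. the key holder's status changed
```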
Key revocation is built for
machines

   CRLs tend to be single-level

   Different levels of revocation are needed
     – Falsified initial credential
         » All past transactions suspect
     – Change in status
         » Future transactions prohibited
     – Refusal of renewal
         » Current systems adequate

   CRLs should reflect the entire system in which they work,
    including the social system

   CRLs are too simplistic and depend on active checking
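The three levels of revocation above could be made explicit in code. A sketch; the function name and integer-timestamp representation are assumptions for illustration:

```python
from enum import Enum, auto

class RevocationLevel(Enum):
    """Levels of revocation sketched from the slide's three cases."""
    FALSIFIED_CREDENTIAL = auto()  # falsified initial credential
    STATUS_CHANGE = auto()         # change in status
    RENEWAL_REFUSED = auto()       # refusal of renewal

def transactions_to_audit(level, transactions, revoked_at):
    """Which of a credential's transactions (given as timestamps) need
    re-examination under each level. Illustrative sketch only."""
    if level is RevocationLevel.FALSIFIED_CREDENTIAL:
        return list(transactions)          # all past transactions suspect
    if level is RevocationLevel.STATUS_CHANGE:
        return [t for t in transactions    # only future transactions prohibited
                if t >= revoked_at]
    return []                              # refusal of renewal: current systems adequate
```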
          WHAT TO DO?
   Computers
     –   Process data
     –   Store data
     –   Transmit data
     –   Distinguish
           » atomicity, privacy, availability,

   Humans
     – Understand context
     – Evaluate uncertainty
     – Make lumping decisions based on context

   Begin with the human as the basis of the design
     – Examine human interactions
     – Signal humans using pre-existing social capital
Context


   Trust is contextual

   Phillips on Zero Knowledge
     – Nyms had to be selected before the person engaged in
       interaction
     – The interaction in question is entering information
     – The information should be available before the interaction
Not Even Communicating with
Users

   Identity theft
     – unauthorized use of authenticating information to assert identity in the
       financial namespace
     – Internal process violation - Choicepoint (at least 145k records)
          » All access to the Choicepoint database was authorized
          » Subsequent misuse was enabled by the information obtained via
            Choicepoint
     – Security violation - Berkeley
     – Confidentiality violation - Bank of America backup data, 1.2M
       records

   Risk profile is similar for individuals in all three cases
Dominant Trust Communication
Equivalent Value
Cradle to Grave ID…. So What

   Authentication as what? For what?

   Identification as having what attributes?

   Scope of namespace
     – License to drive
         » requires driving test
     – SSN
         » taxpayer ID to assert right to benefits
     – Birth certificate
         » proof of age
     – Define a credit namespace that allows for large scale illegal employment
     – Require that credit and banking agencies handle their own risks and
       pay for their own identity schemes for all transactions but cash
     – Make video rental agencies accept their own risks
     – Cell phone requires that you have an account to pay for it
     – DL requires you know how to drive
Perfect Single ID

    … for every namespace
        … and every context
                 … for all people




 for definitions: http://www.ljean.com/
… or solve the problem at hand by
enabling contextually rational trust
behavior
Embedding Browsing in Social
Context

   First trust challenge
     – Enabling trust to allow entry onto the net
     – Enabling monetary flows

   Second trust challenge
     – Providing meaningful trust information
         » TrustE, BBB, Verisign
     – Namespaces for specific trust assertions
         » Christian, gay friendly, responsible merchants
     – Requires a common understanding of the limits of the namespaces
         » Transitivity
         » Automated trust decisions
         » Consistency across contexts or explicit definition of context
               • E.g., purchase a book
                      – On divorce
                      – On impotency
                      – On effective job searching
                      – On number theory
Enabling Trust Behavior

   Signal not to trust

   Combine trust perceptions for end users
     – Privacy
         » Based on personal experience
         » Or verification of centralized authority (BBB)
     – Reliability
         » Personal experience
         » Verification (Consumer reports)
     – Security
         » Is personal experience valuable here?
               • Q: what is the value of peer production for security information
         » Centralized verification (Counterpane)
     – ID theft vs. account misuse is distinguished by the bank but not by the
       customer
     – Loss of data from privacy or security is the same for the individual
         » For whom should we design
Context


   Selected context determines
     – Social network display
     – Shared information

   NOT certificate authorities

   Depends on homophily
Visual Trust




   Verisign will protect you from anyone who will not give
    them money
     – There has been no business plan of a single trusted root which
       aligns that root with the end user.

   There are competitive trust sources that align with the user

   Uses pop-up blocker
Centralized Elements

   No hierarchies
     – Trust is not inherently transitive
     – “Verisign is the trust company”

   Certificates
     – Signed green list

   Signer determines
     –   Frequency of update
     –   Business model
     –   Determinant of entry
     –   Potential examples
           » FDIC
           » Consumer reports
           » BBB
           » Phishguard
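A signed green list can be sketched in a few lines. A real signer such as the FDIC or BBB would publish with public-key signatures; the HMAC shared key here is a stand-in to keep the example self-contained:

```python
import hashlib
import hmac
import json

SIGNER_KEY = b"demo-key"  # stand-in; a real signer would use public-key signatures

def publish_green_list(sites):
    """Signer side: serialize the list of trusted sites and sign it."""
    payload = json.dumps(sorted(sites)).encode()
    tag = hmac.new(SIGNER_KEY, payload, hashlib.sha256).hexdigest()
    return payload, tag

def is_trusted(site, payload, tag):
    """Browser side: verify the list came from the signer, then check membership."""
    expected = hmac.new(SIGNER_KEY, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(tag, expected):
        return False               # tampered or unsigned list: no trust signal
    return site in json.loads(payload)
```

The signer alone controls frequency of update and criteria for entry; the browser only checks the signature and membership.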
Reputation

   Initial reputation of zero

   First visit goes to 1 (out of 10)

   After second visit it increases

   Each visit decreases rate of delay
     – Max of 10

   Explicit rating
     – Stays constant without alteration over time
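The visit rule above can be sketched directly; the per-visit growth rate (25% of the remaining gap to the maximum) is an illustrative assumption:

```python
class SiteReputation:
    """Visit-based reputation as on this slide: starts at 0, first visit
    sets it to 1, later visits grow it toward a maximum of 10, and an
    explicit rating stays constant until the user changes it. The growth
    rate is an illustrative assumption."""

    MAX = 10.0

    def __init__(self):
        self.score = 0.0
        self.pinned = False        # set once the user rates explicitly

    def visit(self):
        if self.pinned:
            return                 # explicit ratings override visit counts
        if self.score == 0.0:
            self.score = 1.0       # first visit
        else:
            # each visit closes part of the remaining gap to the maximum
            self.score += (self.MAX - self.score) * 0.25

    def rate(self, value):
        self.score = min(max(float(value), 0.0), self.MAX)
        self.pinned = True
```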
A New Paradigm for Design

   Design technology to conform to user behaviors

   Assume individuals will default to trust, then lump, and
    forgive

   Depends upon
          »   Usability
          »   Reputation system design
          »   Homophily
          »   Storage capacity
          »   Efficient search

   Provide signals NOT to trust
     – Do not assume that the absence of a signal means no trust
     – The absence of a signal will be interpreted as trust
Definitions

   Attribute. A characteristic associated with an entity, such as an individual. Examples of persistent attributes include height, eye color, and date of
    birth. Examples of temporary attributes include address, employer, and organizational role. A Social Security Number is an example of a long-
    lived attribute. Some biometric data are persistent; some change over time or can be changed (e.g., fingerprints and hair color, respectively).


    Personal identifier. A persistent identifier associated with an individual human, based on attributes that are difficult or impossible to alter. For
    example, date of birth, height, and genetic pattern.


    Anonym (as in anonymous). An identifier associated with no personal identifier, but only with a single-use attestation of an attribute. An anonymous
    identifier identifies an attribute, once. An anonymous identifier used more than once becomes a pseudonym.


    Pseudonym. An identifier associated with attributes or sets of transactions, but with no permanent identifier
More Definitions




   Identification. Association of a personal identifier with an individual presenting attributes. For example, accepting the association between a
    physical person and a claimed name; or determining an association between a medical record and a patient using physical attributes.


    Authentication. Proving an association between an entity and an identifier or attribute. For example, the association of an automobile with a license plate or a
    person with an account. The automobile is identified by the license plate; it is authenticated as legitimate by the database of cars that are not being
    sought for enforcement purposes.


    Identity Authentication. Proving an association between an entity and an identifier. For example, the association of a person with a credit or
    educational record.


    Attribute Authentication. Proving an association between an entity and an attribute. For example, the association of a painting with a certificate
    of authenticity. This is usually a two-step process: first the association between entity and identifier is established, and then the link between identifier
    and attribute is established.
Yielding




   Authorization. A decision to allow a particular action based on an identifier or attribute. Examples include the ability of a person to make claims on
    lines of credit; the right of an emergency vehicle to pass through a red light; or a certification of a radiation-hardened device to be attached to a
    satellite under construction.


    Identity. That set of permanent or long-lived temporal attributes associated with an entity.

								