An Appropriate Design for
Trusted Computing and
Digital Rights Management

Presentation to the State Services Commission of New Zealand

Prof. Clark Thomborson

8th December 2006
Topical Outline
   Requirements analysis of e-government and corporate
    DRM at three levels: static, dynamic, governance.
       Assessment of IRM v1.0 with TC support
   Compliance: New Zealand's Four Principles for TC/DRM
   Suggested design improvements
       IRM: Emphasise integrity and availability, not confidentiality
       TC: More support for audit
       Relationship Management: support for hierarchical, bridging, and
        peering trust with other systems and individuals
   Steps toward uniform “purchase requirements” with
    emphasis on interoperability and appropriate security.
       In progress at the Jericho Forum.
   Eventually: develop an appropriate audit standard for
    DRM, perhaps through ISO.
    Static Security for DRM
   CIA: confidentiality, integrity, and availability.
   Internally-authored documents fall in three categories:
       Integrity first: internal correspondence. Agency (or corporate
        division)-confidential by default, but keys are shared widely within the
        agency to ensure ready availability.
       Integrity and availability first: operational data, e.g. citizen (or
        customer) records. Agency-confidential except in cases where
        privacy laws or expectations require finer-grain protection.
        Provisions for 'bridging trust' allow efficient data sharing between
         agencies, where appropriate.
       Rarely: highly sensitive data, such as state (or corporate) secrets,
        requiring narrowly controlled access within the agency.
   Three categories of externally-authored documents:
       Integrity first: unsigned objects, e.g. downloads from the web.
       Integrity and availability first: signed objects, e.g. contracts and tax documents.
       Rarely: objects whose confidentiality is controlled by an external
        party, e.g. licensed software and media.
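The six default-protection categories above can be read as a lookup table. A minimal Python sketch of that reading (the `Protection` type, the category names, and the scope strings are illustrative, not from the talk):

```python
# Hypothetical encoding of the slide's document categories as a
# default-protection table. "Priority" lists CIA properties from most
# to least important; "default_scope" says who holds the keys.
from dataclasses import dataclass

@dataclass(frozen=True)
class Protection:
    priority: tuple       # e.g. ("I", "A") = integrity and availability first
    default_scope: str    # default key-holding arrangement

POLICY = {
    # Internally-authored documents
    "internal_correspondence": Protection(("I",), "agency-wide keys"),
    "operational_records":     Protection(("I", "A"),
                               "agency keys; finer grain where privacy law requires"),
    "state_secrets":           Protection(("C", "I", "A"), "narrowly controlled"),
    # Externally-authored documents
    "unsigned_download":       Protection(("I",), "agency-wide keys"),
    "signed_object":           Protection(("I", "A"), "agency-wide keys"),
    "externally_licensed":     Protection(("C",), "controlled by external party"),
}

def default_protection(category: str) -> Protection:
    """Look up the default protection for a document category."""
    return POLICY[category]
```

The table makes the slide's main point mechanical: confidentiality-first defaults are the rare case, not the common one.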
Dynamic Security

   The gold standard: Authentication, Authorisation, and Audit.
   If taken to an extreme, we'll have a "gold-plated"
    system design!
   Metaphorically, a security engineer should
       Seal all security perimeters with an authenticating gold seal,
       Sprinkle auditing gold-dust uniformly but very sparingly
        over the most important security areas, and
       Place an authorising golden seal on all of the most
        important accesses.
Security Governance
   Governance should be pro-active, not reactive.
   Governors should constantly be asking questions,
    considering the answers, and revising plans.
       Specification, or Policy (answering the question of
        what the system is supposed to do),
       Implementation (answering the question of how to
        make the system do what it is supposed to do), and
       Assurance (answering the question of whether the
        system is meeting its specifications).
   We're still in the early stages of DRM.
       The monumental failures of early systems were the
        result of poorly-conceived specifications, overly-ambitious
        implementations, and scant attention to assurance.
Microsoft's IRM v1.0
   Supported in Office 2003 for email and attachments.
   All protected documents are encrypted with individual,
    symmetric, keys.
   Rights-management information is held in the document
   Keys are held at a server, and are released only to authorised users.
   Workstations hold recently-used keys in a cache. This
    improves performance at a small cost in confidentiality:
       Reduced latency, when re-opening a document;
       Reduced load on the server; but
       Reduced ability to withdraw privileges when the status of the
        subject changes (e.g. a job re-assignment) or when the document
        is reclassified.
   This would be a good design for a secretive organisation:
       Strong confidentiality, strong integrity, weak availability.
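The caching trade-off described above can be made concrete. A hedged sketch (not Microsoft's implementation; the API is invented) of a workstation key cache whose time-to-live trades latency and server load against the speed of revocation:

```python
# Illustrative workstation key cache. A long TTL reduces re-open
# latency and server load; it also delays revocation, because the
# server only re-checks rights on a cache miss.
import time

class KeyCache:
    def __init__(self, ttl_seconds: float, fetch_from_server):
        self.ttl = ttl_seconds
        self.fetch = fetch_from_server       # callable: doc_id -> key
        self._cache = {}                     # doc_id -> (key, expiry time)

    def key_for(self, doc_id):
        entry = self._cache.get(doc_id)
        now = time.monotonic()
        if entry and entry[1] > now:         # cache hit: no server round-trip,
            return entry[0]                  # and no rights re-check either
        key = self.fetch(doc_id)             # miss: server can enforce revocation
        self._cache[doc_id] = (key, now + self.ttl)
        return key
```

With `ttl_seconds = 0` every access goes to the server (strong revocation, weak availability); a large TTL gives the opposite balance the slide describes.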
     Assessment of IRM v1.0 with TC,
     for Corporate DRM
   Microsoft's C = I > A design is a poor match to I = A > C.
       Availability could be improved with an independent key escrow, and a
        rights-management protocol which conforms to an open standard.
       An I = A > C design would use agency-level keys to encrypt
        documents. Signing keys might be distributed to individuals; but I think
        it would be better to use agency-level signatures, with each individual's
        "signing history" maintained in an audit record.
   Authentication and authorisation are acceptable in IRM v1.0,
    and could be improved with TC.
   An audit record of first-time document accesses can be
    maintained at the IRM v1.0 server.
       Improvement: All accesses could be auditable with platform TC.
   Current TC designs do not (as far as we know) support
    independent audits of all activities in the trusted partition.
       We believe all key-generation activity of a TPM must be auditable.
       We suggest requiring a birth-to-death TC-platform log which is both
        tamper-evident and tamper-resistant.
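The suggested I = A > C design, with agency-level signatures and a per-individual "signing history", could look roughly like this. HMAC stands in for a real signature scheme, and all names are illustrative assumptions, not the talk's design:

```python
# Hedged sketch: one agency-level signing key, with every individual's
# use of it recorded in an append-only audit log. A production system
# would use asymmetric signatures and a tamper-evident log.
import hashlib
import hmac
import time

class AgencySigner:
    def __init__(self, agency_key: bytes):
        self._key = agency_key
        self.audit_log = []                  # append-only signing history

    def sign(self, individual: str, document: bytes) -> bytes:
        """Sign with the agency key; record who invoked the signing."""
        sig = hmac.new(self._key, document, hashlib.sha256).digest()
        self.audit_log.append((time.time(), individual,
                               hashlib.sha256(document).hexdigest()))
        return sig

    def verify(self, document: bytes, sig: bytes) -> bool:
        expected = hmac.new(self._key, document, hashlib.sha256).digest()
        return hmac.compare_digest(expected, sig)
```

The point of the design is visible in the data: verification needs only the agency key, while accountability for individuals lives entirely in the audit record.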
NZ e-Government Principle #1
   “For as long as it has any business or statutory
    requirements to do so, government must be able to:
       use the information it owns/holds;
       provide access to its information to others, when they are
        entitled to access it.”
   A rallying flag! Other sovereign governments will surely
    require assurance that all of their protected documents
    will remain available, especially if the master keys are
    under the ultimate control of a single vendor.
       To lessen their dependence on a single vendor, governmental
        agencies might insist on an independent escrow of keys, an
        open standard for DRM, and a transition plan to a secondary vendor.
    NZ e-Government Principle #2
   “Government use of trusted computing and digital rights
    management technologies must not compromise the
    privacy rights accorded to individuals
       who use government systems, or
       about whom the government holds information.”
   Possibly contentious in an international standard.
       Can we specify the uses of a TC/DRM system which would
        constitute a "compromise of privacy rights" in at least one jurisdiction?
       Can we specify the jurisdictional differences in a way that can be
        supported by a standardised TC/DRM technology?
       This confidentiality requirement would, I believe, be within the range
        of feasibility of IRM v1.0 with TC, in NZ and in the USA.
         • I am not competent to comment on privacy rights in other jurisdictions,
           and I'm by no means an expert on privacy in NZ or in the USA.
       Operational requirements: Independent audit of the source code for
        the rights-management server, and an audit trail of its operations.
    NZ e-Government Principle #3
   “The use of trusted computing and digital rights management
    technologies must not endanger the integrity of government-held
    information, or the privacy of personal information, by
        permitting information to enter or leave government systems, or
        be amended while within them,
        without prior government awareness and explicit consent.”
   Another rallying flag!
        All sovereign governments (and corporations) have strong requirements
         for integrity, and for operational controls on confidentiality and integrity.
   Technical analysis:
        These requirements could be well-supported by IRM v2.0, although they
         would be problematic in a closed-source DRM system on an unauditable
         TC platform.
        By default, documents entering a governmental (or corporate) security
         boundary must be “owned” by the receiving agency, so they can be fully
         managed by a local rights server.
        Strong controls (e.g. a manager's over-ride authority) should be placed on
         any individual‟s importation of non-owned documents.
NZ e-Government Principle #4
   “The security of government systems and information
    must not be undermined by use of trusted computing
    and digital rights management technologies.”
   One of the supporting policies:
       “Agencies will reject the use of TC/DRM mechanisms, and
        information encumbered with externally imposed digital
        restrictions, unless they are able to satisfy themselves that the
        communications and information are free of harmful content,
        such as worms and viruses.”
   A “killer app” for the NZ principles!
       This requirement is surprisingly difficult to achieve with current
        TC/DRM technology.
       The e-Government unit has rendered an important service to the
        international community by identifying this security issue.
Malware Scans in TC/DRM

   An infected document may have been encrypted
    before its malware payload is recognisable by a scanner.
       An infected document may be opened at any time in the future.
   Adding a comprehensive, online, malware scan
    would significantly increase the multi-second
    latency of a first-time access in IRM v1.0.
       Third-party malware scans are problematic in a
        security-hardened kernel.
       The scanner must be highly privileged and trustworthy.
Shifting gears...

 There is great potential for confusion
  when using the words "trust" and "trustworthiness".
 We must develop operational definitions
  for these terms, if we wish to develop
  trustworthy computer systems.
    Technical and non-technical
    definitions of Trust
   In security engineering, placing trust in a system is a last resort.
       It's better to rely on an assurance (e.g. a proof, or a recourse
        mechanism), than on a trusting belief that "she'll be right".
   In non-technical circles, trust is a good thing: more trust is
    generally considered to be better.
   Trustworthiness (an assurance) implies that trust (a risk-
    aware basis for a decision) is well-placed.
       A completely trustworthy system (in hindsight) is one that has
        never violated the trust placed in it by its users.
       Just because some users trust a system, we cannot conclude that
        the system is trustworthy.
       A rational and well-informed person can estimate the
        trustworthiness of a system.
       Irrational or poorly-informed users will make poor decisions about
        whether or not, and under what circumstances, to trust a system.
Privilege in a Hierarchy
   Information flows upwards, toward the most powerful actor at the root:
    the King, President, Chief Justice, Pope, or ….
   Commands and trust flow downwards, toward the peons, illegal immigrants,
    felons, excommunicants, or ….
   The King is the most privileged.
   The peons are the most trusted.
   Information flowing up is privileged; information flowing down is trusted.
       Orange Book TCSEC, e.g. LOCKix.
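The default flow rule in such a hierarchy can be stated in a few lines. A sketch, with illustrative integer levels (higher means closer to the King; the function name and levels are mine, not from the talk):

```python
# Minimal sketch of the hierarchy's default rule: information may flow
# upwards (a "privileged" flow, allowed by default), while a downward
# flow is "trusted" and is not allowed without review.
def may_flow_by_default(sender_level: int, receiver_level: int) -> bool:
    """Allow a flow only if it does not move information downwards."""
    return receiver_level >= sender_level

# An illustrative three-level hierarchy.
PEON, MANAGER, KING = 0, 1, 2

may_flow_by_default(PEON, KING)   # upward: privileged, allowed by default
may_flow_by_default(KING, PEON)   # downward: trusted, needs filtering/audit
```

This is the same "no write down" shape that Bell-La Padula-style systems such as the Orange Book's TCSEC enforce for confidentiality.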
Trustworthiness in a Hierarchy
   Information flows upwards, toward the most powerful actor: the King,
    President, Chief Justice, Pope, or ….
   Commands and trust flow downwards, toward the peons, illegal immigrants,
    felons, excommunicants, or ….
   Peons must be trusted with some information!
   If the peons are not trustworthy, then the system is not secure.
   If the King does not show good leadership (by issuing appropriate
    commands), then the system will not work well. "Noblesse oblige"!
Email in a Hierarchy
 Information flows upwards, toward the leading actor: the King, President,
  Chief Justice, Pope, or ….
 Actors can send email to their superiors.
 Non-upwards email traffic is trusted:
      not allowed by default;
      should be filtered, audited, ….
 Email up: "privileged" (allowed by default).
 Email down: "trusted" (disallowed by default: a risk to confidentiality).
 Email across: privileged & trusted routing.
Email across Hierarchies
Q: How should we handle email between hierarchies, e.g. Company X and
Agency Y?
 1.   Merge into a single hierarchy, Merged X+Y.
        •   Not often desirable or even feasible.
        •   Cryptography doesn't protect X from Y, because the CEO/King of the
            merged company has the right to know all keys.
        •   Can an appropriate King(X+Y) be found?
 2.   Subsume
 3.   Bridge
Email across Hierarchies
Q: How can we manage email between hierarchies?
 1.   Merge
 2.   Subsume: one hierarchy (Company Y) becomes a subtree of the other
      (Agency X).
 3.   Bridge
Email across Hierarchies
Q: How can we manage email between Company X and Agency Y?
 1.   Merge
 2.   Subsume
 3.   Bridge!
        •   Bridging connection: trusted in both directions.
Bridging Trust
   We use "bridges" every time we send personal email from our workplace,
    e.g. from Agency X to Hotmail.
   We build a bridge by constructing a "bridging persona": C, acting as a
    governmental agent, bridged to C, acting as a hotmail client.
   Even Kings can form bridges.
   However, Kings are most likely to use an actual person, e.g. their personal
    secretary, rather than a bridging persona.
     • Bridging connection: trusted in both directions.
     • Used for all communication among an actor's personae.
     • C should encrypt all hotmail to avoid disclosure.
    Personae, Actors, and Agents
   I use "actor" to refer to
         an agent (a human, or a computer program),
         pursuing a goal (risk vs. reward),
         subject to some constraints (social, technical, ethical, …).
   In Freudian terms: ego, id, superego.
   Actors can act on behalf of another actor: "agency".
   When an agent takes on a secondary goal, or accepts a different set of
    constraints, they create an actor with a new "persona": for example,
    C, acting as an employee of Company X, and C, acting as a hotmail client.
   In this part of the talk, we are considering agency relationships.
Bridging Trust: B2B e-commerce
   Use case: employee C of X purchasing supplies through employee V of Y.
   Employee C creates a hotmail account for a "purchasing" persona:
    C, acting as an employee of Company X, bridges to C, acting as a
    purchaser dealing with employee V of Company Y.
   Purchaser C doesn't know any irrelevant information.
     • Most workflow systems have rigid personae definitions (= role assignments).
     • Current operating systems offer very little support for bridges.
       Important future work!
Why can't we trust our leaders?
    Commands and trust flow upwards (by majority vote, or by consensus).
    Information flows downwards by default.
    Upward information flows are "trusted" (filtered, audited, etc.).
    "Our leaders are but trusted servants…"
    In a peerage, the leading actors are trusted, have minimal privilege,
     don't know very much, and can safely act on anything they know.
        By contrast, the King of a hierarchy has an absolute right ("root"
         privilege) to know everything, is not trusted, and cannot act safely.
Turn the picture upside down!
    Information flows upwards by default, toward the peers, group members,
     citizens of an ideal democracy, ….
    Commands and trust flow downwards, toward the facilitator, moderator,
     democratic leader, ….
    Downward information flows are "trusted" (filtered, audited, etc.).
    A peerage can be modeled by Bell-La Padula, because there is a partial
     order on the actors' privileges.
    Equality of privilege is the default in a peerage, whereas inequality of
     privilege is the default in a hierarchy.
Peer trust vs. Hierarchical trust

   Trusting decisions in a peerage are made by peers,
    according to some fixed decision rule.
       There is no single root of peer trust.
       There are many possible decision rules, but simple majority
        and consensus are the most common.
       Weighted sums in a reputation scheme (e.g. eBay for goods,
        Poblano for documents) are a calculus of peer trust -- but “we”
        must all agree to abide by the scheme.
       “First come, first serve” (e.g. Wiki) can be an appropriate
        decision rule, if the cost per serving is sufficiently low.
   Trusting decisions in a hierarchy are made by its most
    powerful members.
       Ultimately, all hierarchical trust is rooted in the King.
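The decision rules listed above are easy to state precisely. An illustrative sketch (function names, parameters, and the threshold convention are mine, not from the talk):

```python
# Peer decision rules: each turns a list of peers' yes/no votes into a
# single trusting decision. There is no root of trust; the only shared
# commitment is the rule itself.

def majority(votes):
    """Simple majority of boolean votes."""
    return sum(votes) > len(votes) / 2

def consensus(votes):
    """Every peer must agree."""
    return all(votes)

def weighted_sum(votes, weights, threshold):
    """Reputation-style rule, as in eBay-like schemes: yes-votes are
    weighted by each peer's standing and compared to an agreed threshold."""
    score = sum(w for v, w in zip(votes, weights) if v)
    return score >= threshold
```

Note that `weighted_sum` only works as a calculus of peer trust if all peers have agreed in advance on the weights and the threshold, which is exactly the "we must all agree to abide by the scheme" caveat above.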
Legitimation and enforcement
   Hierarchies have difficulty with legitimation.
       Why should I swear fealty (give ultimate privilege) to this
        would-be King?
   Peerages have difficulty with enforcement.
       How could the least privileged actor possibly be an effective enforcer?
   This isn't Political Science 101!
       I won't argue whether ideal democracies are better than ideal
        hierarchies.
       I will argue that hierarchical trust is quite different to peer
        trust, that bridging trust is also distinct, and that all three
        forms are important in our world.
   My thesis: Because our applications software will help us handle all
    three forms of trust, our trusted operating systems should support
    all three forms.
Requirements for Relationship Management
   Orange-book security is hierarchical.
       This is a perfect match to a military or secret-
        service agency.
       This is a poor match to e-government and
        corporate applications.
   A general-purpose TC must support bridging
    and peering relationships.
   Rights-management languages must support
    bridges and peerages, as well as hierarchies.
   We cannot design an attractive, general
    purpose DRM system until we have designed
    the infrastructure properly!
   Closed-source methodology is appropriate for
    designing hierarchical systems.
      • These systems have trouble with legitimation.
      • Why should a user trust that the system designers (and
        administrators) won't abuse their privilege?
   Open-source methodology is appropriate for
    designing peerage systems.
      • These systems have trouble with enforcement.
      • Why should anyone trust a user not to abuse their privileges?
   Real-world peerages can legitimise hierarchies, and
    hierarchies can enforce peerages.
      • Can our next-generation OS use both design patterns?!?
 A Legitimised Hierarchy
 [Diagram: an OS Root Administrator at the root; Users below; assurance
  groups IG1 and IG2; an Auditor; an Inspector-General (an elected officer);
  and a Chair of the User Assurance Group.]
   • Each assurance group may want its own Auditor (different scope,
     objectives, Trust, …).
   • The OS Administrator may refuse to accept an auditor nominee.
   • The OS Administrator makes a Trusting appointment when granting
     auditor-level Privilege to a nominee.
   • Assurance organizations may be hierarchical, e.g. if the Users are
     governmental agencies or corporate divisions.
     Summary of Static Trust
   Three types of trust: hierarchical, bridging, peering.
        Information flows are either trusted or privileged.
   Hierarchical trust has been explored thoroughly in the Bell-La Padula
    and Biba models.
        A subordinate actor is trusted to act appropriately, if a superior actor
         delegates some privileges.
        Bell-La Padula, when the hierarchy is mostly concerned about
         confidentiality;
        Biba, when the hierarchy is mostly concerned about integrity.
        A general purpose TC OS must support all concerns of a hierarchy.
   Actors have multiple personae.
        Bridging trust connects all of an actor's personae.
        A general purpose TC OS must support personae.
   Peering trust is a shared decision to trust an actor with minimal privilege.
        Peerages have trouble with enforcement; hierarchies have trouble with
         legitimation.
        A trusted OS must be a legitimate enforcement agent!
     A Modest Proposal
   Let's convene a broadly-representative group of
    purchasers to act as "our" governance body!
       Large corporations and governmental agencies have similar
        requirements for interoperability, auditability, static security, and
        multiple vendors.
       First meeting at
   A first goal: develop buyer's requirements for DRM, TC,
    and relationship management.
       International agreement and political "buy-in" is required if we are to
        have a system that is broadly acceptable.
       Regulatory requirements, such as protection of individual privacy,
        must be addressed.
       The Jericho Forum is already doing this (but it's not a standards body).
       Work through ISO?
   A second goal: develop a trustworthy auditing process.
    Acknowledgements & Sources
   Privilege and Trust, LOCKix: Richard O'Brien, Clyde Rogers, “Developing
    Applications on LOCK”, 1991.
   Trust and Power: Niklas Luhmann, Wiley, 1979.
   Personae: Jihong Li, “A Fifth Generation Messaging System”, 2002; and
    Shelly Mutu-Grigg, “Examining Fifth Generation Messaging Systems”,
   Use case (WTC): Qiang Dong, “Workflow Simulation for International
    Trade”, 2002.
   Use case (P2P): Benjamin Lai, “Trust in Online Trading Systems”, 2004.
   Use case (ADLS): Matt Barrett, “Using NGSCB to Mitigate Existing
    Software Threats”, 2005.
   Use case (SOEI): Jinho Lee, “A survey-based analysis of HIPAA security
    requirements”, 2006.
   Trusted OS: Matt Barrett, “Towards an Open Trusted Computing
    Framework”, 2005; and Governance of Trusted Computing: Thomborson
    and Barrett, to appear, ITG 06, Auckland.
   Corporate DRM: "Enterprise Information Protection & Control", a position
    paper under development in the Jericho Forum.
