Headquarters U.S. Air Force
Integrity - Service - Excellence

Air Force Technology Readiness Assessment (TRA) Process

SAF/AQRE
30 Apr 08

What is a TRA?
(DoD TRA Deskbook, May 2005)

An objective, systematic, metrics-based process and report that assesses the maturity of Critical Technology Elements (CTEs)
- Uses Technology Readiness Levels (TRLs)
- Not a risk assessment; not a design review

Why do a TRA?

- Regulatory requirement for all acquisition programs
  - DoDD 5000.1 and DoDI 5000.2, 12 May 03
  - AFI 63-101, 29 Jul 05; NSS 03-01, 27 Dec 04 (space programs)
  - Submitted to DUSD(S&T) for DAE programs
- Ensure appropriate technology maturity at each phase of the acquisition lifecycle
  - The PM's job is to manage programs, not develop technology
- According to a GAO review* of 54 DoD programs in 2005, only 15% began development with all of their technologies mature
  - For programs with mature technology, RDT&E costs increased 9% and unit production cost increased less than 1%
  - For programs with immature technology, RDT&E costs increased 41% and unit production cost increased 21%

* Defense Acquisitions: Assessments of Selected Major Weapon Programs, GAO-05-301, March 2005

TRA Guidance

1. DoD TRA Deskbook, May 05 (https://acc.dau.mil/CommunityBrowser.aspx?id=18545)
2. Air Force TRA Guidelines (SAF/AQRE)
3. DAU Online Training (https://learn.dau.mil/html/clc/Clc.jsp): CLE 021 "Technology Readiness Assessment" (3 hrs)
4. DAG (http://akss.dau.mil/dag/, para 10.5.2)

Milestone TRL Requirements

TRL minimums** by milestone: MS A - TRL 4; MS B - TRL 6*; MS C - TRL 7; IOC - TRL 8/9

Acquisition framework: Concept Refinement (Concept Decision) -> MS A -> Technology Development -> MS B -> System Development & Demonstration (Design Readiness Review) -> MS C -> Production & Deployment (FRP Decision Review) -> IOC/FOC -> Operations & Support. TRAs are performed in support of MS B and MS C.

TRL scale:
- TRL 1-3: Proof of concept
- TRL 4: Validation in laboratory environment
- TRL 5: Validation in relevant environment
- TRL 6: Demonstration in relevant environment
- TRL 7: System prototype demonstration in operational environment
- TRL 8: System qualification
- TRL 9: Mission proven

* Title 10 MDA Certification
** Per DoD TRA Deskbook and discussions with DUSD(S&T)

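As a worked illustration of these floors, the sketch below encodes the chart's minimums and flags CTEs that fall short at a given milestone. It is a minimal sketch, not an official tool; the CTE names and TRL values are hypothetical, and the IOC floor of 8/9 is taken as 8 here.

```python
# Hypothetical sketch: milestone TRL floors from the chart above.
# IOC is shown as 8/9 on the chart; 8 is used as the floor here.
TRL_MINIMUMS = {"MS A": 4, "MS B": 6, "MS C": 7, "IOC": 8}

def immature_ctes(cte_trls, milestone):
    """Return the CTEs whose assessed TRL is below the milestone floor."""
    floor = TRL_MINIMUMS[milestone]
    return [cte for cte, trl in cte_trls.items() if trl < floor]

# Example with made-up CTEs heading into Milestone B:
assessed = {"Conformal antenna": 6, "Sensor fusion software": 5}
print(immature_ctes(assessed, "MS B"))  # ['Sensor fusion software']
```
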
Title 10 MDA Certification

- USC Title 10, Sec. 2366a, Major defense acquisition programs: certification required before Milestone B or Key Decision Point B approval (per FY06/07/08 NDAAs)
  - MDAPs may not receive MS B/KDP B approval until the MDA certifies that, among other items in Sec. 2366a, "the technology in the program has been demonstrated in a relevant environment"
  - A 2 Nov 07 SAF/AQR policy memo defines "demonstrated in a relevant environment" as all CTEs at TRL 6 or greater
- PMO options:
  - Demonstrate technology maturity at MS B
  - Seek alternate, mature technologies
  - Delay MS B to mature needed technologies
  - Use evolutionary acquisition: immature technologies transitioned into the program at a later date
  - Seek a National Security Waiver from the MDA (to be avoided)

Air Force TRA Process

1. Initiate TRA -> 2. Plan TRA -> 3. Identify IRP* -> 4. Train IRP -> 5. Candidate CTE List -> 6. Finalize CTE List* -> 7. Collect Data -> 8. Perform Assessment -> 9. Document TRA* -> 10. Staff TRA

CTE - Critical Technology Element
IRP - Independent Review Panel
PMO - Program Management Office
TRA - Technology Readiness Assessment

* SAF/AQR decision points (coordinate with DUSD(S&T) for DAE programs)

Step 1: Initiate TRA

- PMO contacts SAF/AQR (SAE/DAE programs) to initiate a TRA:
  - Phone: DSN 425-7777
  - Workflow email: SAFAQR.Workflow@pentagon.af.mil
- SAF/AQR identifies AQRE and AQRT action officers
  - SAF/AQRE: Engineering and Technical Mgmt Division (TRA process owner)
  - SAF/AQRT: Science and Technology Division (AFRL interface, technology expertise)

Step 2: Plan TRA

- PMO and SAF/AQR develop the TRA plan
  - Kick-off telecon is the common approach
  - Participants include PMO, SAF/AQR, SAF Capability Directorate, SAF/ACE, etc.
- TRA plan consists of:
  - Program description, acquisition strategy, etc.
  - Technologies under consideration
  - Planned TRA schedule
  - Funding considerations (TDY, labor, ...)
  - Proposed IRP participants
  (PMO briefs these areas during the kick-off telecon)

Step 3: Identify IRP

- PMO provides bios of candidate IRP members to SAF/AQR after the kick-off telecon
  - IRP Lead will be an experienced technical leader, outside of the PEO chain of command
  - IRP members have knowledge of the program's technologies and are independent of the program's technology development
  - Sources of IRP members: Center ENs, Air Force Research Laboratory (AFRL), FFRDCs, A&AS, other Services
- SAF/AQR modifies as necessary and approves the IRP Lead and members
  - Coordinates IRP membership with DUSD(S&T) for DAE programs

Step 4: Train IRP

- AQR and PMO train the IRP
  - AQR provides TRA process information
  - PMO provides a program description, including a potential list of technologies
- Best practice is to have the IRP convene at the PMO for briefings; VTC or telecon may be sufficient
- Good opportunity for the IRP to request artifacts (e.g., WBS, CDD, trade studies, s/w development plan) from the PMO to initiate the assessment

Step 5: Candidate CTE List

- PMO identifies a candidate CTE list
  - Develop a superset list of technologies by reviewing the WBS, system architecture, etc.
  - Define a subset list of critical technologies using the methodology in para 3.2.2 of the TRA Deskbook
  - Clarifying definitions:
    - New Technology: a technology whose performance characteristics have not yet been substantiated via operational test or other acceptable validation methods
    - Modified Technology: a change to a technology in which the resulting performance characteristics cannot be described or predicted by prior/accepted design implementations
    - Engineering: practical application of technology via design, analysis, and/or construction to solve specific technical problems
- PMO submits the candidate list of CTEs to the IRP and AQR

Step 6: Finalize CTE List

- IRP recommends to AQR the final CTE list to be assessed
  - Understand the superset of program technologies
  - Consider the PMO's candidate CTE list
  - Use IRP expert knowledge and the TRA Deskbook CTE identification methodology to finalize the CTE list
- SAF/AQR modifies as necessary and approves the final CTE list
  - Coordinates CTEs with DUSD(S&T) for DAE programs

Step 7: Collect Data

- PMO collects and provides data for the IRP to use in assessing CTE maturity
  - Data should include detailed results from tests or demonstrations
  - Burden of proof is on the PMO to provide evidence that a CTE has been successfully demonstrated
  - Provide documentation to the IRP well in advance of the formal assessment to support a timely review

Step 8: Perform Assessment

- IRP performs the assessment by reviewing data/artifacts to determine a TRL for each CTE, and documents the rationale and overall conclusion
  - IRP may consult SMEs to augment the IRP's skill set
- IRP scoring occurs in a closed-door session (attendance limited to IRP members and SAF/AQR POCs)
  - IRP members use score sheets to record the TRL value and rationale
  - Consulted SMEs do not participate in TRL scoring
  - Keep score sheets and personal notes unclassified
  - A copy of the score sheets is provided to the IRP Lead and to SAF/AQR
- IRP Lead strives for TRL consensus
  - Completeness of artifacts and IRP discussion is key to achieving consensus
  - Lack of consensus may necessitate AQR intervention
- IRP provides an opportunity for PMO feedback on any CTEs requiring Technology Maturation Plans (TMPs)
- PMO provides the facility for the IRP assessment and verifies that the facility and attendees are cleared to handle the classification level of the data presented/discussed

Sample IRP Score Sheet

CTE | TRL | Rationale
1.  |     |
2.  |     |
3.  |     |
4.  |     |
5.  |     |
6.  |     |
7.  |     |

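One way to picture the score sheet in data terms is the minimal sketch below. It is illustrative only, not an official form; the class, field, and CTE names are hypothetical. The consensus helper mirrors Step 8, where the IRP Lead strives for a single agreed TRL per CTE.

```python
# Hypothetical sketch of a score-sheet record and a consensus check.
from dataclasses import dataclass

@dataclass
class ScoreSheetEntry:
    cte: str        # CTE name
    member: str     # scoring IRP member (consulted SMEs do not score)
    trl: int        # assessed TRL, 1-9
    rationale: str  # basis for the score; keep unclassified

def consensus_trl(entries, cte):
    """Return the agreed TRL for a CTE, or None when members disagree
    (a case the IRP Lead works, with AQR intervention if needed)."""
    scores = {e.trl for e in entries if e.cte == cte}
    return scores.pop() if len(scores) == 1 else None

# Example with made-up entries:
sheet = [ScoreSheetEntry("Conformal antenna", "Member A", 6, "Flight demo data"),
         ScoreSheetEntry("Conformal antenna", "Member B", 6, "Relevant-env test")]
print(consensus_trl(sheet, "Conformal antenna"))  # 6
```
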
Step 9: Document TRA

- IRP documents its assessment IAW TRA Deskbook sections 3.0 thru 3.4 on pages 4-3 and 4-4
- IRP Lead submits the assessment to the PMO with a copy to AQR
- PMO builds the TRA final document around the IRP's assessment using the template on pages 4-1 thru 4-4 of the TRA Deskbook
- PMO (CE/Tech Director) and IRP Lead sign the TRA final document

TRA Final Document

DoD TRA Deskbook template (pgs 4-1 to 4-4), with who writes each section:

Who writes | Section
PMO | 1.0 Purpose of This Document
PMO | 2.0 Program Overview
PMO |   2.1 Program Objective
PMO |   2.2 Program Description
PMO |   2.3 System Description
IRP | 3.0 Technology Readiness Assessment
IRP |   3.1 Process Description
IRP |   3.2 CTEs
IRP |   3.3 Assessment of Maturity
IRP |     3.3.1 CTE #1
IRP |     3.3.2 CTE #2
IRP |   3.4 Summary of TRLs by Technology
PMO | 4.0 PMO Response to IRP Assessment (optional)

Additional Section 3.3 Detail

3.3.1 CTE #1 (Name)
  3.3.1.1 CTE Description
    - Technology description
    - Function performed
    - Synopsis of development history
  3.3.1.2 Environment description
  3.3.1.3 Criteria for TRL assigned, including rationale
  3.3.1.4 References to supporting data

3.3.2 CTE #2 (Name)
  3.3.2.1 ...

3.4 Summary of TRLs by Technology
  Table summarizing CTE #1, CTE #2, etc.

Step 10: Staff TRA

- PMO submits the TRA final document to SAF/AQR NLT 6 weeks prior to the Milestone
- SAF/AQR performs a peer review of the TRA final document
- IRP Lead and a PMO representative shall be available to brief the TRA assessment to SAF/AQR and DUSD(S&T) (for DAE programs)
- SAF/AQR endorses the TRA via memo and forwards it to SAF/AQ with an info copy to DUSD(S&T) (for DAE programs)
- Per DUSD(S&T), the components of a successful TRA are process, independence, and data
- SAF/AQ endorses the TRA via memo

Summary of Key Responsibilities

- PMO
  - Initiates TRA process with SAF/AQR
  - Proposes IRP Lead and members to SAF/AQR
  - Identifies candidate list of CTEs per the WBS
  - Provides artifacts and data to the IRP for assessment
  - Consolidates the IRP assessment into a final TRA document and coordinates with SAF/AQR
  - Funds TRA as necessary
- IRP
  - Completes DAU CLE 021 "Technology Readiness Assessment" (~3 hrs online)
  - Recommends to SAF/AQR a final CTE list to be assessed
  - Assesses a TRL for all CTEs, including the basis for the assessment
  - Documents and submits the assessment to the PMO and SAF/AQR
- IRP Lead
  - Manages the assessment effort IAW the TRA Deskbook and PMO schedule
  - Obtains additional expertise and artifact information as needed
  - Strives for consensus in the assessment
  - Signs the final TRA document
  - Communicates progress and outbriefs TRA results to SAF/AQR
- SAF/AQR
  - AF TRA process owner
  - Approves IRP Lead and membership
  - Approves the IRP's final list of CTEs
  - Endorses the final TRA document
  - Coordinates TRA activities with DUSD(S&T) for DAE programs

Source Selection Considerations

- TRA Plan
  - Write the RFP to request that KTRs provide technology assessment material, and identify how technology will be evaluated in source selection
  - IRP/AQR/DUSD(S&T) are in-briefed and sign source selection non-disclosure agreements
  - Source selection rules apply
    - Data handling and control
    - Restricted access and discussions
  - Consult the PMO contracting officer
- Performing the Assessment
  - IRP assessment of technologies occurs in source selection
  - IRP develops a TRA for each bidder
  - PMO facilitates evaluation notices between the IRP and the bidders to obtain additional information per PMO contracting officer instructions
  - SAF/AQR and DUSD(S&T) action officers are in-briefed and travel to the source selection site to facilitate the TRA process and to evaluate assessments; they are not formal members of the source selection team
- Staffing the Final TRA Document
  - TRA does not imply approval of one source over another
  - SAF/AQR and DUSD(S&T) decisions are accomplished in source selection
  - SAF/AQR endorsement memo to SAF/AQ and DUSD(S&T): TRA complete and results are available in source selection

Backup charts

Title 10 MDAP MDA Certification Prior to Milestone/KDP B Approval

Chapter 139 of Title 10, United States Code (USC), as amended by Section 801, FY2006 NDAA; Section 805, FY2007 NDAA; and Section 812, FY2008 NDAA.

Sec. 2366a. Major defense acquisition programs: certification required before Milestone B or Key Decision Point B approval

(a) Certification. A major defense acquisition program may not receive Milestone B approval, or Key Decision Point B approval in the case of a space program, until the milestone decision authority certifies that "the technology in the program has been demonstrated in a relevant environment...."

(c) Waiver for National Security. The milestone decision authority may waive the applicability to a major defense acquisition program of one or more components of the certification requirement if the milestone decision authority determines that, but for such a waiver, the Department would be unable to meet critical national security objectives. Whenever the milestone decision authority makes such a determination and authorizes such a waiver, the waiver, the determination, and the reasons for the determination shall be submitted in writing to the congressional defense committees within 30 days after the waiver is authorized.

"Old" AF TRA Process

- Previously
  - PM identifies an independent team, conducts and documents the TRA, and submits the TRA report to AQR in support of a major program decision, with little to no prior coordination
  - AQR involvement is inconsistent, depending on when and how the PM coordinates the TRA plan with AQR; AQR coordinates the final TRA with DUSD(S&T); AQR, as "independent reviewer," reviews the final report prior to the major program decision and submits an endorsement to SAF/AQ and DUSD(S&T)
- Results
  - Program delays due to immature technologies: unable to certify compliance with the Title 10 MDA Certification requirement, pushing out MS B (impacts cost and schedule)
  - Last-minute rush to complete; insufficient time allowed for a quality assessment
  - "Independent" TRA teams are led by or include program office personnel (perception of conflict of interest, lack of objectivity)
  - TRA reports lack a body of evidence supporting the appropriate TRL
  - Ad hoc, inconsistent AQR involvement and support; early involvement only if the PM initiates coordination
- Lessons Learned
  - Start earlier
  - Emphasize TRA training
  - Clarify the AF TRA process (currently not well defined or understood)
  - AQR needs to be more proactive in guiding the process, reviewing the plan, identifying the independent team, and ensuring TRA quality meets OSD criteria

TRA GOAL: Programs select mature technologies (MDA certified); OSD doesn't need to conduct an ITA on the program.

Why do a TRA?

According to a GAO review of 54 DoD programs:
- Only 15% of programs began SDD with mature technology
  - Programs with mature technologies averaged 9% cost growth and a 7-month schedule delay
  - Programs that did not have mature technologies averaged 41% cost growth and a 13-month schedule delay
- At critical design review, 42% of programs demonstrated design stability (90% of drawings releasable)
  - Design stability is not achievable with immature technologies
  - Programs with stable designs at CDR averaged 6% cost growth
  - Programs without stable designs at CDR averaged 46% cost growth and a 29-month schedule delay

Source: Defense Acquisitions: Assessments of Selected Major Weapon Programs, GAO-05-301, March 2005

Purpose of TRA

- Determine maturity of CTEs via an independent or objective assessment
- A report on what has been accomplished to date for an important subset of technologies in the program
- Part of the program's technical risk assessment; not the sole means for discovering technology risk
- Does not predict future performance nor assess quality of system architecture, design, or integration plan

Impact of Title 10 Certification Requirement to MDAPs

- MDAPs may not receive MS B approval until the MDA certifies that the technology in the program has been demonstrated in a relevant environment (TRL 6 or greater)
- Post MS/KDP B programs appear to be excluded
- Program Managers preparing for MS B have the following options:
  - Demonstrate technology maturity at MS B
  - Seek alternate, mature technologies
  - Delay MS B to mature needed technologies
  - Use an Evolutionary Acquisition approach
    - Baseline program uses mature technologies
    - Less mature technologies are transitioned into the program at a later date
  - Seek a National Security Waiver from the MDA (to be avoided)

TRA Core Team Members

Organization | Role (identify by name)
SPO          | Program Manager, Chief Engineer
SAF/AQRE     | Component S&T - TRA Process Owner
SAF/AQRT     | Component S&T - USAF SME
DUSD(S&T)    | AO
AFRL         | AF SME (IRP member)
ONR          | Navy SME (IRP member)
ARL          | Army SME (IRP member)

Identify the IRP Lead and voting members.

TRA Schedule
(time before major milestone decision; * for ACAT 1D or 1AM only)

- 12-24 months: PM requests TRA
- 12 months: AQR and PM develop TRA plan (kick-off meeting)
- 11 months: AQR coordinates on TRA IRP membership and panel lead
- 10 months: PM identifies candidate CTE list
- 9 months: AQR and IRP review CTE list and environment definition
- 8 months: AQR coordinates IRP composition, CTEs, and environment definition with DUSD(S&T)*
- 8-6 months: PM collects data in support of the CTE assessment
- 6 months: Conduct technology assessment; IRP assigns TRLs
- NLT 3 months: PM and IRP Lead document TRA findings and coordinate with AQR and DUSD(S&T)*
- NLT 2.5 months: PM submits final TRA to AQR
- NLT 2 months: AQR endorses TRA and submits to AFAE and DUSD(S&T)*
- Major Milestone Decision

Note: schedule is dependent on program complexity and contract strategy; include time to conduct technology demonstrations as appropriate.

Potential TRA Costs

- IRP member support
  - Travel
  - Contractor support (SEI, FFRDC, etc.)
  - Other (e.g., fee for service)
- Technology demonstrations (if needed)
- TRA assessment
  - Conference fees
  - CTE briefings presented by contractors

Identify and budget appropriate funding as part of the TRA planning effort.

Critical Technology Element (CTE)
(DoD TRA Deskbook, May 2005)

1. Does the technology directly impact an operational requirement?
2. Does the technology have a significant impact on an improved delivery schedule?
3. Does the technology have a significant impact on the affordability of the system?
4. If this is a spiral development, is the technology essential to meet the spiral deliverables?
5. Is the technology new or novel?
6. Has the technology been modified?
7. Has the technology been repackaged such that a new relevant environment is realized?
8. Is the technology expected to operate in an environment and/or achieve a performance beyond its original design intention or demonstrated capability?

For a technology to be critical, the answer to one of the first 4 questions must be "yes," and the answer to one of the second 4 must also be "yes." (A sketch of this screening logic, together with the manufacturing variant, follows the next chart.)

Manufacturing Technology CTE
(DoD TRA Deskbook, May 2005)

5. Has the manufacturing technology been successfully integrated into a product line?
6. Is the industrial base capable of design, development, production, maintenance and support, and disposal of the system?
7. Is the intended design producible?
8. Have the materials been characterized in a manufacturing environment?
9. Are the materials available to meet quantity and schedule demands?
10. Are the design-to-cost (DTC) goals achievable?
11. Are the key manufacturing processes characterized, capable, and controllable with respect to achieving the system requirements?

For a manufacturing technology to be critical, the answer to one of the first 4 questions (previous chart) must be "yes," and the answer to one of these 7 must be "no."

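The two screening rules reduce to simple boolean logic. The sketch below is a minimal illustration, not from the Deskbook; it assumes the yes/no answers are collected as booleans in question order.

```python
# Hypothetical sketch of the Deskbook screening rules above.
# Each list holds True ("yes") / False ("no") answers, in question order.

def is_critical(q1_4, q5_8):
    """General rule: at least one 'yes' in questions 1-4
    and at least one 'yes' in questions 5-8."""
    return any(q1_4) and any(q5_8)

def is_critical_manufacturing(q1_4, q5_11):
    """Manufacturing rule: at least one 'yes' in questions 1-4
    and at least one 'no' in manufacturing questions 5-11."""
    return any(q1_4) and not all(q5_11)

# Example: impacts an operational requirement (Q1) and is new or novel (Q5).
print(is_critical([True, False, False, False],
                  [True, False, False, False]))  # True
```
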
Critical Technology Element (CTE)
(DoD TRA Deskbook, May 2005)

- A technology element is "critical"
  - if the system being acquired depends on this technology element to meet operational requirements (with acceptable development cost and schedule and with acceptable production and operation costs), and
  - if the technology element or its application is either new or novel.
- Said another way, an element that is new or novel, or is being used in a new or novel way, is critical if it is necessary to achieve the successful development of a system, its acquisition, or its operational utility.

CTEs may be hardware, software, manufacturing, or life-cycle related, at the subsystem or component level.

Key References for TRLs and Assessment Criteria in the TRA Deskbook*

- Hardware TRLs: Tables 3-1 and 3-2
- Software TRLs: Table 3-3
- Manufacturing TRLs: Table 3-4
- Hardware assessment criteria examples: Section C.2, Assessing Hardware CTEs
- Software assessment criteria examples: Section C.3, Assessing Software CTEs
- Manufacturing assessment criteria examples: Section C.4, Assessing Manufacturing CTEs
- CTE identification: Appendix D, Guidance and Best Practices for Identifying Critical Technology Elements

* http://www.defenselink.mil/ddre/doc/tra_deskbook_2005.pdf

Corresponding Environment(s)

- Define corresponding environment(s)
  - Tied to the CDD and CONOPS
  - May be more than one environment, depending on system operations
  - MS B requirement: "relevant environment"
- Identify current technology demonstrations that reflect the corresponding environment
  - Modeling & simulation
  - Ongoing or completed demonstrations, exercises, or experiments
  - Plan for technology maturation efforts

Environment Considerations
(DoD TRA Deskbook, May 2005)

- Physical environment: mechanical components, processors, servers, and electronics; kinetic and kinematic; thermal and heat transfer; electrical and electromagnetic; climatic (weather, temperature, particulate); network infrastructure
- Data environment: data formats and databases; anticipated data rates; data delay and data throughput; data packaging and framing
- Logical environment: software (algorithm) interfaces; security interfaces; web-enablement
- Security environment: connection to firewalls; security appliques; rates and methods of attack
- User and use environment: scalability; upgradability; user behavior adjustments; user interfaces; organizational change/realignments with system impacts; implementation plans

Measuring Technology Readiness
(DoD TRA Deskbook, May 2005)

Technology Readiness Levels (TRLs), grouped by maturation phase:

System Test, Launch & Operations
- TRL 9: Actual system proven through successful mission operations (sw: mission-proven operational capabilities)
System/Subsystem Development
- TRL 8: Actual system completed and qualified (sw: mission qualified) through test and demonstration (sw: in an operational environment)
Technology Demonstration
- TRL 7: System prototype demonstration in an operational (sw: high-fidelity) environment
- TRL 6: System/subsystem model or prototype demonstration in a relevant environment (sw: module and/or subsystem validation in a relevant end-to-end environment)
Technology Development
- TRL 5: Component and/or breadboard (sw: module and/or subsystem) validation in a relevant environment
- TRL 4: Component and/or breadboard validation in a laboratory environment
Research to Prove Feasibility
- TRL 3: Analytical and experimental critical function and/or characteristic proof of concept
Basic Technology Research
- TRL 2: Technology concept and/or application formulated
- TRL 1: Basic principles observed and reported

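For reference while scoring, the scale above can be kept as a simple lookup table. A minimal sketch, with wording abridged from the chart:

```python
# Illustrative lookup of the TRL scale above (wording abridged), handy for
# annotating score sheets or TRA reports.
TRL_DESCRIPTIONS = {
    1: "Basic principles observed and reported",
    2: "Technology concept and/or application formulated",
    3: "Analytical and experimental proof of concept",
    4: "Component/breadboard validation in a laboratory environment",
    5: "Component/breadboard validation in a relevant environment",
    6: "System/subsystem model or prototype demo in a relevant environment",
    7: "System prototype demonstration in an operational environment",
    8: "Actual system completed and qualified through test and demonstration",
    9: "Actual system proven through successful mission operations",
}

print(TRL_DESCRIPTIONS[6])
```
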
TRA Template - Final Document

- Program Description (OPR: Chief Engineer)
  - Identify KPPs, CONOPS, program complexity, acquisition strategy, and the agreed-to environment definition in support of the TRA
- Methodology (OPR: Chief Engineer)
  - How CTEs were identified
  - How TRLs were assigned
- TRA Team (OPR: Chief Engineer, with info provided by IRP)
  - Identify IRP members; include credentials qualifying their participation and objectivity
- List of CTEs (OPR: Chief Engineer)
  - Include original candidates and justification for exclusion
- Technology Maturation Plans (for immature CTEs) (OPR: Chief Engineer)
- TRLs (OPR: IRP Lead)
  - Identify the body of evidence, or data provided, to justify TRLs
- TRA Summary and Conclusions (OPR: IRP Lead)

What We Have Learned

Don't wait until the TRA to address technology maturity
- Address technology maturity in the program's acquisition strategy
  - Use of mature technology is required by DoDD 5000.1
- TRA is the basis for the technology portion of the MDA certification required by Section 2366a of Title 10, USC
  - Address the Title 10 MS B technology maturity requirement in the RFP and source selection

Start early
- Obtain early agreement on CTEs and the "relevant environment"
- Develop a TRA team that can provide an objective assessment of the CTEs
- SAF/AQR coordinates the TRA effort with DUSD(S&T) for ACAT ID/IAM programs

Reminders
- ACAT I PMs should contact SAF/AQR at least 12 months before the MS to coordinate TRA strategy
- ACAT II and III PMs should coordinate TRA strategy with the MDA

TRA Policy and Guidance

DoDD 5000.1: The Defense Acquisition System (12 May 03)

2.2 The policies in this directive apply to all acquisition programs.

3.5 The Program Manager (PM) is the designated individual with responsibility for and authority to accomplish program objectives for development, production, and sustainment to meet the user's operational needs. The PM shall be accountable for credible cost, schedule, and performance reporting to the MDA.

4.3.2 Responsiveness. Advanced technology shall be integrated into producible systems and deployed in the shortest time practicable. Approved, time-phased capability needs matched with available technology and resources enable evolutionary acquisition strategies. Evolutionary acquisition strategies are the preferred approach to satisfying operational needs. Spiral development is the preferred process for executing such strategies.

E1.14 Knowledge-Based Acquisition. PMs shall provide knowledge about key aspects of a system at key points in the acquisition process. PMs shall reduce technology risk, demonstrate technologies in a relevant environment, and identify technology alternatives, prior to program initiation. They shall reduce integration risk and demonstrate product design prior to design readiness review. They shall reduce manufacturing risk and demonstrate producibility prior to full rate production.

DoDI 5000.2, Defense Acquisition Management Framework: Technology Development

3.6.1 The purpose of this phase is to reduce technology risk and to determine the appropriate set of technologies to be integrated into a full system. Technology Development is a continuous technology discovery and development process reflecting close collaboration between the S&T community, the user, and the system developer. It is an iterative process designed to assess the viability of technologies while simultaneously refining user requirements.

3.6.2 The project shall enter Technology Development at Milestone A when the MDA has approved the TDS...A favorable Milestone A decision DOES NOT mean that a new acquisition program has been initiated.

3.6.5 The ICD and the TDS shall guide this effort. Multiple technology development demonstrations may be necessary before the user and developer agree that a proposed technology solution is affordable, militarily useful, and based on mature technology. The TDS shall be reviewed and updated upon completion of each technology spiral and development increment. Updates shall be approved to support follow-on increments.

3.6.7 The project shall exit Technology Development when an affordable increment of militarily-useful capability has been identified, the technology for that increment has been demonstrated in a relevant environment, and a system can be developed for production within a short timeframe (normally less than five years); or when the MDA decides to terminate the effort...A Milestone B decision follows the completion of Technology Development.

DoDI 5000.2, Defense Acquisition Management Framework: System Development & Demonstration

3.7.1 The purpose of the SDD phase is to develop a system or an increment of capability; reduce integration and manufacturing risk (technology risk reduction occurs during Technology Development); ...and demonstrate system integration, interoperability, safety, and utility.

3.7.1.2 SDD has two major efforts: System Integration and System Demonstration. The entrance point is MS B, which is also the initiation of an acquisition program...Each increment of an evolutionary acquisition shall have its own MS B.

3.7.2 Entrance Criteria. Entrance into this phase depends on technology maturity (including software), approved requirements, and funding. Unless some other factor is overriding in its impact, the maturity of technology shall determine the path to be followed.

3.7.2.2 The management and mitigation of technology risk, which allows less costly and less time-consuming systems development, is a crucial part of overall program management, and is especially relevant to meeting cost and schedule goals. Objective assessment of technology maturity and risk shall be a routine aspect of DoD acquisition. Technology developed in S&T or procured from industry or other sources shall have been demonstrated in a relevant environment or, preferably, in an operational environment to be considered mature enough to use for product development in systems integration. Technology readiness assessments, and where necessary independent assessments, shall be conducted. If technology is not mature, the DoD Component shall use alternative technology that is mature and that can meet the user's needs.

Regulatory Requirements (Non-Space)

TRA required for MS B and C approval per DoDI 5000.2, Enclosure 3 (E3), Regulatory Information Requirements.

DoDI 5000.2, 12 May 2003 | Section | When Required | Comment
Technology Readiness Assessment | Table E3.T.2 | MS B & C | Assessments required for all programs (TRA required at program initiation for ships)
Independent Technology Readiness Assessment (ACAT ID only, as required by DUSD(S&T)) | Table E3.T.2 | MS B & C |

NSS Acquisition Policy 03-01 (27 Dec 04)

AP1.1.11 Technology Readiness Assessment (TRA)
- SPD/PM identifies critical technologies and conducts the TRA
- Component S&T Executive [SAF/AQR for the Air Force] conducts an independent assessment of the TRA at KDP B and C

E4.1 ENCLOSURE 4: INTEGRATED PROGRAM SUMMARY, E4.9, Risk Management
- At each KDP and Build Approval, the program office should identify the key technology components of the system and provide their assessment of the maturity of each key component using the Technology Readiness Level (TRL) method identified in the DoD Acquisition Guidebook
- The IPAT will review the program office assessment and determine if, in their view, all key technology components of the program have been identified. The IPA will also provide its own independent assessment of the maturity of the key components using the TRL method

Regulatory Requirements (Space)

TRA required at KDP B and C for space programs.

NSS Acquisition Policy 03-01*, 27 Dec 2004 | Section | When Required | Comment
SPD/PM conducts Technology Readiness Assessment (TRA) | AP1.1.11 | KDP B & C | Component S&T Executive assesses TRA; submits to IPAT
SPD/PM assesses maturity of key technology components | E4.9 | Each KDP and Build Approval | Submitted to IPAT

* NSS Acquisition Policy 03-01 provides a "streamlined decision making framework for all DoD space system MDAPs."

AFI 63-101: Operations of Capabilities Based Acquisition System (29 Jul 05)

This instruction applies to defense technology projects and acquisition programs procured under DoD 5000.2.

1.2 Capabilities Based Acquisition is a process...There are five mutually supporting tenets that comprise Capabilities Based Acquisition...[including] Technology Transition Process...[and] Robust Systems Engineering...

2.1.3 Technology Transition Process. One of the fundamentals that makes EA [evolutionary acquisition] work is the rapid and streamlined incorporation of mature, high pay-off technology into each increment...AFRL will support the development of phased capabilities requirements by helping acquisition program offices and operators assess the maturity and viability of technologies being considered for incorporation in EA programs and assist, when appropriate, in the preparation of a Technology Development Strategy (TDS) for Milestones A, B, and C.

2.1.4.5 Systems Engineering Planning and the Systems Engineering Plan (SEP)...[The SEP] should incorporate the planning that is consistent with the Technology Readiness Assessment (TRA) and successfully execute the Technology Development Strategy (TDS).

AFI 63-101: Chapter 3 Responsibilities

Deputy Assistant Secretary (Science, Technology and Engineering). SAF/AQR will:
3.11.5 Review MDAP TRA plans for Milestones B and C, to include the Program Office's identification of critical technologies and technical experts to perform the TRA.
3.11.6 Review and validate MDAP TRAs one month prior to the milestone decision date and forward an endorsement to the CAE for Milestones B and C. Additionally, transmit ACAT ID [and IAM] endorsements through the CAE to DUSD(S&T).
(Also, 4.3.5.2: SAF/AQR reviews MDAP TRAs, and ACAT II TRAs for which the CAE has retained responsibility as the MDA, one month prior to the Milestone review date, and forwards a recommendation to the CAE.)

Capabilities Directors (CD). SAF/AQI will:
3.7.2 Review MDAP TRA plans for Milestones B and C, to include the program office identification of critical technologies and technical experts to perform the TRA.
3.7.3 Review MDAP TRAs one month prior to the scheduled Milestone decision date.

AFI 63-101: Chapter 3 Responsibilities (continued)

Program Executive Officers (PEO). PEOs will:
3.14.10 Ensure use of mature technologies demonstrated in relevant environments at Milestone B and Milestone C.

Program Manager (PM) Responsibilities. PMs will:
3.16.5 Use mature technology demonstrated in operationally relevant environments for product development and production of each increment of capability. Coordinate plans prior to starting an objective assessment of critical technologies for MDAPs...with SAF/AQR six months prior to Milestones B and C to avoid schedule delays...

AFI 63-101: Chapter 4

4.3.5.2 Assessing Technology Readiness
- All acquisition programs must complete an objective technology readiness assessment (TRA) for MDA consideration at Milestones B and C.
- The assessment [TRA] determines whether or not critical technologies are sufficiently mature for product development and low-rate initial production.
- A critical technology should have been demonstrated in an operationally relevant environment (or, more preferably, in an operational environment) to be considered mature enough to use in systems integration during product development.
- TRAs for ACAT II and III programs are reviewed by the applicable MDA. TRAs should be accomplished in an efficient and timely manner to prevent a delay to a Milestone decision.

Defense Acquisition Guidebook (DAG): ACAT ID and IAM TRA Submission

- The DoD Component Science and Technology (S&T) Executive [AQR] directs the technology readiness assessment and, for Acquisition Category ID and Acquisition Category IAM programs, submits the findings to the CAE, who should submit his or her report to the DUSD(S&T) with a recommended technology readiness level (TRL) (or some equivalent assessment) for each critical technology.
- When the DoD Component S&T Executive [AQR] submits his or her findings to the CAE, he or she should provide the DUSD(S&T) an information copy of those findings.
- In cooperation with the DoD Component S&T Executive [AQR] and the program office, the DUSD(S&T) should evaluate the technology readiness assessment and, if he/she concurs, forward the findings to the OIPT leader and the DAB.
- If the DUSD(S&T) does not concur with the technology readiness assessment findings, an independent technology readiness assessment, under the direction of the DUSD(S&T), should be required.

(DAG, Section 10.5.2)

DoD TRA Deskbook*
Section 3, The TRA Process

3.2 Identifying CTEs
- 3.2.1 TRA Schedule Established
  a) AQR** / PM establish schedule for conducting the TRA
  b) AQR provides training and support to the SPO as needed
- 3.2.2 The CTE Identification Process
  a) PM develops candidate list of CTEs using the WBS or system architecture
  b) AQR / PM form an independent team to review CTEs
  c) Independent team recommends which CTEs should be assessed in the TRA
- 3.2.3 Data Collection
  a) PM collects data for the TRL assessment
- 3.2.4 CTEs Coordinated
  a) PM submits final CTE list to AQR
  b) AQR reviews CTEs and coordinates with the PM

* Prepared by DUSD(S&T), May 2005
** AQR is "Component (S&T)"

DoD TRA Deskbook
Section 3, The TRA Process (continued)

3.3 Assessing CTE Readiness
- 3.3.1 TRA Performed
  a) AQR appoints and trains an independent team to make assessments. May or may not be the same team as in CTE identification, 3.2.2
  b) Independent team assesses TRLs for each CTE and prepares the TRA for submission
- 3.3.2 TRA Coordination
  a) AQR approves the TRA, submits it to the CAE, and info-copies DUSD(S&T)
  b) CAE submits report to DUSD(S&T)
- 3.3.3 DUSD(S&T) TRA Review and Evaluation
  a) DUSD(S&T) evaluates the TRA in cooperation with AQR and the PM
  b) If DUSD(S&T) does not concur with the TRA, an independent technical assessment can be conducted

DoD TRA Deskbook
Section 4, Submitting a TRA

4.2 Annotated Template for a TRA Submission, "3.1 Process Description"
- Who led the TRA and what organizations or individuals performed the TRA (identifies the special expertise, establishing the competence and independence of the TRA)
- How CTEs were identified (process and criteria used, and who identified them); describes the scale used for the assessments (TRLs)
- What analyses and investigations were performed when making the assessment

DoD TRA Deskbook
Section 4, Submitting a TRA (continued)

4.2 Annotated Template for a TRA Submission, "3.2 CTEs"
- Shows the WBS or system architecture and the CTEs
- Explains the criteria for the technologies that were included
- Describes the environment surrounding each CTE
- A table that lists each technology name and includes words that describe the technology, its function, and the environment is appropriate
- Any additional technology elements that AQR considers critical should be included

DoD TRA Deskbook
Section 4, Submitting a TRA (continued)

4.2 Annotated Template for a TRA Submission, "3.3 Assessment of Maturity"
- Describes the technology (subsystem, component, or technology)
- Describes the function it performs and, if needed, how it relates to other parts of the system
- Provides a synopsis of development history and status (facts about related uses, hours of testing, prototyping, relevance of test conditions, and results achieved)
- Describes the environment in which the technology has been demonstrated (include a brief analysis comparing the demonstrated environment to the intended operational environment)
- Gives the criteria for the TRL and assigns a readiness level, including rationale
- Provides extensive references to papers, presentations, data, and facts that support the assessment
- Repeat for each CTE

DAU Online TRA Training
"TRA Process" (https://learn.dau.mil/html/clc/Clc.jsp)

7 steps and responsibilities:
1. Set Schedule - PM responsibility (integrate into IMS); coordinate with AQR; keep DUSD(S&T) informed
2. Identify CTEs - PM responsibility; AQR verifies
3. Coordinate CTEs - PM responsibility; coordinate with AQR; keep DUSD(S&T) informed
4. Assess CTEs; prepare TRA - AQR responsibility; appoint independent review team; PM funds
5. Collect Data - PM responsibility
6. Coordinate and Submit TRA - AQR coordinates; AE submits
7. OSD Review - DUSD(S&T) responsibility

Document CTE Maturity Assessment

- CTE #1 - Name
- CTE Description
  - Describe the technology (subsystem, component, or technology)
  - Describe the function performed and, if needed, how it relates to other parts of the system
  - Provide a synopsis of development history and status (facts about related uses, hours of testing, prototyping, relevance of test conditions, and results achieved)
  - Describe the environment in which the technology has been demonstrated (include a brief analysis comparing the demonstrated environment to the intended operational environment)
- TRL (now and at next milestone)
  - Criteria for the TRL and assigned readiness level; include rationale
  - Provide extensive references to papers, presentations, data, and facts supporting the assessment

Repeat for each CTE; use the next chart for immature CTEs.

Document CTE Maturity Assessment (continued)

- CTE #2
- CTE Description
- TRL (now and at next milestone) and rationale
- Summarize the Technology Maturation Plan, appended (e.g., if TRL < 6 for MS B)
  - Schedule showing technology demonstration events (past and future)