					ELECTRONIC RECORDS ARCHIVES


          METRICS PLAN
            (MP v4.0)

                   (WBS # 1.8.1.1.2)


                        for the


       NATIONAL ARCHIVES AND
      RECORDS ADMINISTRATION

    ELECTRONIC RECORDS ARCHIVES
    PROGRAM MANAGEMENT OFFICE
          (NARA ERA PMO)

                         Final

                   January 26, 2006



                     Prepared by:


    Integrated Computer Engineering (ICE)
                A Subsidiary of
     American Systems Corporation (ASC)


             Contract Number: GS-35F-0673K
     Delivery Order Number: NAMA-01-F-0031/05-036

                           ERA METRICS PLAN (MP)

                                     Signature Page

Program Director,


I recommend approval of the Metrics Plan (MP).


__________________________________                          _______________________
Ira Harley                                                  Date
ERA Program




Accepted,



__________________________________                          _______________________
Carmen Colon                                                Date
ERA Program




Approved,



__________________________________                          _______________________
Kenneth Thibodeau                                           Date
ERA Program Director





                        Document Change Control Sheet
Document Title: Metrics Plan (MP)

   Date        Filename/version #              Author               Revision Description
05/29/02       ERA.DC.MP.1.0.doc          Beth Shoults       Baseline MP
07/18/03       ERA.DC.MP.2.0.doc          David Harold       Incorporation of Government
                                                             comments from ERA
                                                             Documentation Review Comment
                                                             Form
09/07/04     ERA.DC.MP.2.1.DOC            David Harold       Annual Update,
                                                             See CR-ERA-PMO-DCMT-87
11/03/04       ERA.DC.MP.3.0.DOC          David Harold       Incorporation of Government
                                                             comments from ERA
                                                             Documentation Review Comment
                                                             Form
12/23/05       ERA.DC.MP.3.1.DOC          John Dean          Annual Update. Reflects the
                                                             changeover from Systems Analysis
                                                             and Design to System Development.
                                                             See CR ERA00000624.
01/26/06       ERA.DC.MP.4.0.DOC          David Harold       Incorporation of Government
                                                             comments from ERA Deliverable
                                                             Review Comment Form





Signature Page
Document Change Control Sheet

1.0        INTRODUCTION
   1.1     PURPOSE
   1.2     ERA PROGRAM OVERVIEW
   1.3     SCOPE
      1.3.1   Metrics Characteristics
      1.3.2   Assumptions
      1.3.3   Limitations
   1.4     ACRONYMS AND DEFINITIONS
   1.5     REFERENCES
      1.5.1   Standards and Guidelines
      1.5.2   NARA and ERA PMO Documentation
2.0        ORGANIZATION
   2.1     ROLES AND RESPONSIBILITIES
   2.2     SCHEDULE/INCREMENTAL APPROACH
   2.3     PLANNED TASKS AND ACTIVITIES
   2.4     TASK ESTIMATION AND COST
3.0        METRICS COLLECTION AND USE
   3.1     METRICS DEFINITION AND METHODOLOGY
      3.1.1   ERA PMO Metrics
      3.1.2   Development Contractor Metrics
   3.2     METRICS ENVIRONMENT INFRASTRUCTURE
      3.2.1   Metrics Collection
      3.2.2   Metrics Reporting
      3.2.3   Metrics Storage
4.0        RESOURCES
   4.1     RESOURCES FOR METRICS
   4.2     TOOLS FOR METRICS
   4.3     TRAINING
5.0        RISKS
6.0        QUALITY CONTROL MEASURES
7.0        PLAN MAINTENANCE

APPENDIX A: ERA PMO METRICS DESCRIPTIONS
APPENDIX B: ERA IV&V CONTRACTOR METRICS DESCRIPTIONS
APPENDIX C: ERA DEVELOPMENT CONTRACTOR METRICS DESCRIPTIONS





                                                LIST OF FIGURES

Figure 3-1: Goal-Question-Metric Paradigm


                                                 LIST OF TABLES

Table 1-1: Acronyms List
Table A-1: Metric Set Definition
Table B-1: Metric Set Definition





                                     ERA METRICS PLAN
1.0        Introduction

This section describes the purpose of the Metrics Plan (MP), provides background information
on the program, provides the scope including assumptions and limitations, defines terminology
used in the plan, and lists documents used as reference materials during plan development.

1.1        Purpose

The ERA MP is a program-level document that plans metrics activities for the Electronic
Records Archives (ERA) Program for use throughout the ERA system lifecycle. It
describes the schedules, functions, responsibilities, and procedures for all metrics activities
within ERA.

The audience for this document is the ERA Program Management Office (PMO), as well as
National Archives and Records Administration (NARA) management responsible for oversight
of the ERA program. The collected metrics provide insight into the achievement of the ERA
vision through completion of system development and other program activities. Additionally,
the metrics provide input to NARA’s technical, quality, and product performance goals as
described in The Strategic Plan of the National Archives and Records Administration and the
Annual Performance Plan.

1.2        ERA Program Overview

ERA will be a comprehensive, systematic, and dynamic means for preserving virtually any kind
of electronic record, free from dependence on any specific hardware or software. The ERA
system, when operational, will make it easy for NARA users to find records they want and easy
for NARA to deliver those records in formats suited to the needs of its users. The success of the
ERA PMO in building and deploying the ERA system will depend on professional program and
project management with an emphasis on satisfying NARA requirements for a viable system.

1.3        Scope

Metrics provide visibility into the status and ongoing progress of the ERA program. Metrics to be
collected during the Systems Development Phase of the ERA system lifecycle (Increments 1–5)
as identified in this plan will be used to track the size, effort, budget, and schedule of the ERA
program. This plan applies to all ERA metrics that are required to be collected by the ERA PMO
and Independent Verification and Validation (IV&V) contractor as well as selected metrics
collected by the Development Contractor as documented herein. The ERA MP provides the
following:

      •    Definition and usage of the metrics;
      •    Identification of the roles and responsibilities for metrics collection, reporting, storage,
           and tracking processes; and
      •    Procedures, tools, and resources required for metrics collection and reporting.

1.3.1      Metrics Characteristics

All metrics must adhere to the following characteristics of software life cycle data as defined in
IEEE/EIA 12207.1-1997, Standard for Information Technology – Software life cycle processes –
Life cycle data.

      •    Unambiguous: Data is described in terms that only allow a single interpretation.
      •    Complete: Data includes necessary, relevant requirements with defined units of measure.
      •    Verifiable: A person or a tool can check the data for accuracy or correctness.
      •    Consistent: There are no conflicts within the data.
      •    Traceable: The origin of the data can be determined.
      •    Presentable: The data can be retrieved and viewed.

1.3.2      Assumptions

The ability to manage metrics assumes that in instances where metrics for the ERA PMO and
Development Contractor will be merged, the tools used by the Development Contractor will be
compatible with those used by the ERA PMO, and that other metrics will be provided by the
Development Contractor in a form directly usable by the PMO, such as Microsoft Excel
spreadsheets.

1.3.3      Limitations

It may be difficult to acquire Development Contractor metrics in time for inclusion in the
PMO’s monthly Metrics Report.

1.4        Acronyms and Definitions

The terms used in this plan are defined in IEEE Std. 610.12-1990, IEEE Standard Glossary of
Software Engineering Terminology. Table 1-1, Acronyms List, contains a list of acronyms used
herein.

      ACRONYM                                  DEFINITION
 AC                      Actual Cost
 ACWP                    Actual Cost of Work Performed
 AI                      Action Item
 AS                      Acquisition Strategy
 BAC                     Budget At Completion
 BCWP                    Budgeted Cost of Work Performed (earned value)
 BCWS                    Budgeted Cost of Work Scheduled (planned value)
 CCB                     Configuration Control Board
 CI                      Configuration Item
 CM                      Configuration Management
 CMG                     Configuration Management Guidance
 CMP                     Configuration Management Plan
 CONOPS              Concept of Operations
 COTP                Contractor Oversight and Tracking Plan
 COTS                Commercial Off-The-Shelf
 CP                  Communications Plan
 CPI                 Cost Performance Index
 CPR                 Cost Performance Report
 CR                  Change Request
 C/SSR               Cost/Schedule Status Report
 CUG                 Change Request Database Users Guide
 CV                  Cost Variance
 EAC                 Estimate At Completion
 ELC                 ERA Life Cycle
 ERA                 Electronic Records Archives
 ESLOC               Effective Source Lines of Code
 ETC                 Estimate To Complete
 EV                  Earned Value
 FTE                 Full Time Equivalent
 GQM                 Goal-Question-Metric
 IEEE                Institute of Electrical and Electronics Engineers, Inc.
 IMS                 Integrated Master Schedule
 IPT                 Integrated Product Team
 IT                  Information Technology
 I&T                 Integration and Test
 ITD                 Inception To Date
 IV&V                Independent Verification and Validation
 KSLOC               Thousand Source Lines of Code
 LOE                 Level Of Effort
 LMC                 Lockheed Martin Corporation
 MP                  Metrics Plan
 MR                  Metrics Report
 MRP                 Metrics Report Process
 NARA                National Archives and Records Administration
 OBS                 Organizational Breakdown Structure
 PD                  Program Director
 PF                  Performance Factor
 PMBOK               Project Management Body of Knowledge
 PMD                 Program Management Division
 PMI                     Project Management Institute
 PMO                 Program Management Office
 PMP                 Program Management Plan
 PO                  Program Office
 POST                Program Office Support Team
 PRP                    Peer Review Process
 PSD                    Program Support Division
 PTR                    Problem Trouble Report
 PV                     Planned Value
 QM                     Quality Management
 QMP                    Quality Management Plan
 RFP                    Request For Proposal
 RKG                    Risk Management Guidance
 RKM                    Risk Management Plan
 RO                     Risk Officer
 RQM                    Requirements Management Plan
 RW                     Remaining Work
 SAD                    Systems Analysis and Design
 SDLC                   Systems Development Life Cycle
 SED                    System Engineering Division
 SLIM                   Software Lifecycle Management
 SLOC                   Source Lines of Code
 SPI                    Schedule Performance Index
 STD                    Standard
 SV                     Schedule Variance
 TAB                    Total Allocated Budget
 TCPI                   To Complete Performance Index
 TRA                    Training Needs Assessment
 TRP                    PMO Training Plan
 TSP                    Testing Management Plan
 VAC                    Variance At Completion
 VAR                    Variance Analysis Report
 WBS                    Work Breakdown Structure
 WR                     Work Remaining
 XO                     Executive Officer
                                     Table 1-1: Acronyms List

1.5        References

The standards, guidelines, and documentation used to develop the ERA MP are described in the
sections that follow.

1.5.1      Standards and Guidelines

The standards and guidelines used in preparation of this document are listed below.

      •    American National Standards Institute (ANSI) 748-A
      •    IEEE/EIA Guide, Industry Implementation of International Standard ISO/IEC
           12207:1995 (ISO/IEC 12207), Standard for Information Technology – Software life
           cycle processes – Implementation Considerations, April 1998
      •    IEEE/EIA Guide, Industry Implementation of International Standard ISO/IEC
           12207:1995 (ISO/IEC 12207), Standard for Information Technology – Software life
           cycle processes – Life cycle data, April 1998
      •    IEEE Standard 1061-1998, Software Quality Metrics Methodology; December 8, 1998
      •    IEEE Standard for Software Productivity Metrics, Software Engineering Standards
           Subcommittee of the Technical Committee on Software Engineering of the IEEE
           Computer Society, March 22, 1993
      •    Project Management Institute’s (PMI) Project Management Body of Knowledge
           (PMBOK) 2004 Edition
      •    Government Performance and Results Act of 1993 (GPRA)

1.5.2      NARA and ERA PMO Documentation

The following NARA and ERA PMO or Development Contractor documentation, unless
superseded by the current version, was used to support the generation of this document.

      •    Fiscal Year 2004 Annual Performance Plan, Revised Final
      •    The Strategic Plan of the National Archives and Records Administration, 1997-2008,
           Revised 2003
      •    ERA Change Request Database User Guide (CUG) Version 1.4
      •    ERA Configuration Management Plan (CMP) Version 2.3
      •    ERA Metrics Report (MR)
      •    ERA Metrics Report Process (MRP) Version 1.0
      •    ERA Peer Review Process (PRP) Version 1.1
      •    ERA Program Management Plan (PMP) Version 2.3
      •    ERA Quality Management Plan (QMP) Version 2.6
      •    ERA Requirements Management Plan (RQM) Version 2.2
      •    ERA Risk Management Plan (RKM) Version 3.0
      •    ERA Testing Management Plan (TSP) Version 2.1
      •    ERA Training Needs Assessment (TRA) Version 2.1
      •    ERA PMO Training Plan (TRP) Version 3.0
      •    LMC Quantitative Management and Quality Plan, Version 1.2

2.0        Organization

The ERA PMO Organization is depicted in the ERA Program Management Plan (PMP). Please
refer to this document for the most recent ERA PMO organization chart.





2.1        Roles and Responsibilities

Roles and responsibilities for the ERA PMO are described in the ERA PMP. Please refer to this
document for details.

2.2        Schedule/Incremental Approach

The source selection process has been completed, and a single Development Contractor,
Lockheed Martin Corporation (LMC), has been down-selected. LMC has been awarded an
option to develop ERA Increment One. Options for subsequent increments will be awarded
subject to the availability of funding and adequate Development Contractor performance on the
preceding increment.

2.3        Planned Tasks and Activities

The metrics task is identified in the ERA PMP. Metrics activities comprising this task, including
the collection, storage, and reporting of metrics using the ERA Metrics Report (MR) and other
means, are identified and scheduled in accordance with the ERA Work Breakdown Structure
(WBS) and Schedule, which is controlled as part of the ERA PMP.

Metrics are collected from a variety of sources that include the ERA PMO, Program Office
Support Team (POST), the Development Contractor, and IV&V contractor. Metrics will be
collected and reported on a monthly basis in the ERA MR.

As the reporting period nears completion, the ERA Metrics Task Leader transmits an e-mail to
all PMO metrics providers, e.g., Configuration Management (CM), the ERA PMO POST
Program Support Division Manager, Risk Officer, and IV&V Lead, requesting that metrics for
the reporting period just completed be collected/generated and reported. The e-mail contains a
desired due date for the metrics data.

Development contractor metrics are not submitted to the ERA PMO Contracting Officer’s
Representative (COR) as a formal CDRL delivery. The Development Contractor does, however,
provide metrics to the ERA PMO in various forums, such as monthly Reviews, and also places
them on its collaboration portal. Each month, the ERA Metrics Task Analyst accumulates
metrics of interest from the provided sources for incorporation into the ERA MR.

Upon receipt of the metrics data, the ERA Metrics Analyst saves the data to his/her account on
the working drive (i.e., ‘H’ drive). The ERA Metrics Task Leader obtains a copy of the previous
month’s ERA Metrics Report and the corresponding set of Microsoft Excel Workbooks. The
workbooks contain the tables and charts that will subsequently be copied to the ERA MR (which
is a Microsoft Word document). Each workbook will either contain a single metric or will
contain closely related sets of metrics. For instance, all CM metrics or all EVMS metrics could
be in a single workbook. Using the recently provided metrics data, the task leader analyzes the
data and populates the tables that are contained in individual worksheets in the Microsoft Excel
Workbooks. Once a table is updated with the reporting period’s data, the ERA Metrics Task
Leader updates the range of the source data to create an updated chart. This process is repeated
for each metric.
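
The workbook update described above is performed by hand in Microsoft Excel. Purely as an
illustration, the monthly append step could be scripted; the sketch below uses the openpyxl
Python library with hypothetical file, worksheet, and column names (the actual ERA workbook
layout is defined by the Metrics Task Leader, and chart source ranges would still be extended
as described above).

    from openpyxl import load_workbook

    # Hypothetical names; each real ERA workbook holds one metric or a
    # closely related set of metrics (e.g., all CM or all EVMS metrics).
    WORKBOOK = "EVMS_Metrics.xlsx"
    SHEET = "BudgetTracking"

    def append_reporting_period(period, budget, outlays, obligations):
        """Append one reporting period's data as a new worksheet row."""
        wb = load_workbook(WORKBOOK)
        ws = wb[SHEET]
        # Assumed row layout: period, FY budget, outlays, obligations,
        # and percent of budget obligated to date.
        ws.append([period, budget, outlays, obligations,
                   obligations / budget])
        wb.save(WORKBOOK)

    # Illustrative numbers only.
    append_reporting_period("Jan-06", 36_000_000, 4_200_000, 9_500_000)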


Once the Microsoft Excel Workbooks have been updated, the ERA Metrics Task Leader makes a
copy of the previous reporting period’s metrics report. Using the data that has been provided, the
ERA Metrics Task Leader populates the report with the latest metrics data (from the reporting
period just concluded). The front matter in the ERA MR (i.e., the text that precedes the graphical
representations of the data) consists of a summary that describes the actual performance
for each metric. For example, Fiscal Year cost numbers are reported, e.g., budget, outlays, and
obligations, and the percentage under or over budget. The corresponding Microsoft Excel chart
provides a pictorial of the same Fiscal Year cost numbers.

Once this process is complete, the ERA Metrics Task Leader sends the ERA MR (i.e., the
Microsoft Word document) and the embedded spreadsheet data in the Microsoft Excel file(s)
electronically to the metrics data providers, requesting an informal review for the purpose of
ensuring that the information has been accurately portrayed in the charts. When complete, the
ERA MR is submitted to the Government by the ERA PMO POST Program Manager. Once
submitted, the ERA MR is subject to the ERA Document Review process. In addition, the
Metrics Task Lead will extract the graphics portraying each of the metrics and create a
PowerPoint presentation that can be directly presented to ERA PMO decision makers. This
presentation will be placed on the S: Drive in the Metrics folder.

2.4        Task Estimation and Cost

The ERA WBS and Schedule, part of the ERA PMP, defines metrics activities/tasks. WBS task
estimation and costs will be developed from the lowest WBS element. Please refer to the ERA
PMP for more detailed information.

3.0        Metrics Collection and Use

This section provides details regarding metrics definition, collection, and reporting. Application
of the measurement approach provides all program stakeholders with a common and quantitative
means to monitor risk and program success in a timeframe that avoids or minimizes program
impacts and the cost of correction.

Section 3.1 defines the methodology used to determine the metrics to be collected and reported
during the ERA system lifecycle. Metrics are subject to periodic review and update as program
activities are completed. Descriptions, definitions of data items, computations, additional data,
and examples of each metric are provided in Appendix A, ERA Metrics Descriptions.

Section 3.2 provides the detailed collection and storage procedures for the metrics as well as the
reporting requirements.

3.1        Metrics Definition and Methodology

The ERA MP defines a set of metrics that provide insight into system quality and productivity as
well as product characteristics and program management. The metrics collected and reported
based on this plan will allow the ERA PD to monitor the status of the ERA program from a
quantitative perspective, and will help in making more informed programmatic decisions.



              Note that any metric in isolation is not sufficient to determine program status. A
              set of metrics and their trends are usually needed to make a good judgment.

As an example, when a metric such as “Requirements Coverage” indicates unacceptable
coverage of requirements for a given reporting period, other measures may be evaluated in order
to isolate the specific cause(s) of the problem and/or use the data to analyze trends. In this way,
the corrective action taken addresses the actual problem, not just the symptoms. The key to
successful use of the metrics defined in this plan is the frequency of reporting and data analysis.

In trying to determine what to measure in order to achieve the goals of the ERA program, the
Goal-Question-Metric (GQM) paradigm was used. Figure 3-1, Goal-Question-Metric
Paradigm, illustrates the relationship of the GQM components.

               [Figure: hierarchy in which a Goal gives rise to Questions, and each
               Question is answered by one or more Metrics]

                         Figure 3-1: Goal-Question-Metric Paradigm

The GQM paradigm is based on the theory that all measurement should be goal-oriented, i.e.,
there has to be some rationale and need for collecting measurements, rather than collecting
metrics just to collect metrics. Each metric collected is stated in terms of the major goals of the
ERA development project. Questions are then derived from the goals and help to refine,
articulate, and determine if the goals can be achieved. The metrics or measurements that are
collected are then used to answer the questions in a quantifiable manner.
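
To make the paradigm concrete, the GQM chain for one metric in this plan can be written out
directly. The sketch below is illustrative only; the structure is an assumption, but the goal,
question, and metric text is taken from the Action Item Aging description in Appendix A.

    # One GQM chain, using the Action Item Aging metric from Appendix A.
    gqm_action_item_aging = {
        "goal": "Monitor action item closure",
        "questions": [
            "How many action items have been generated?",
            "What is the status (Open/Closed) of the Action Items?",
            "What is the potential impact to schedule and cost due to "
            "lack of action item resolution?",
        ],
        "metric": "Action Item Aging",
    }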

Additionally, ERA program metrics provide input to NARA’s technical, quality, and product
performance goals as described in The Strategic Plan of the National Archives and Records
Administration and the Annual Performance Plan.

   3.1.1      ERA PMO Metrics

   Metrics will be collected and reported by the ERA PMO (including POST and IV&V) and the
development contractor. Metrics reported by the development contractor will be reported
separately and by different means from the ERA PMO metrics. Metrics to be collected by the
ERA PMO include the following.

    •      Configuration Management
           – Action Item Aging
           – Corrective Action Aging
           – Change Request Inventory
           – Requirements Rate of Change
    •      Budget Tracking
    •      Earned Value (see the computation sketch following this list)
           − Budget At Completion (BAC)
           − Budgeted Cost of Work Scheduled (BCWS)
           − Budgeted Cost of Work Performed (BCWP)
           − Actual Cost of Work Performed (ACWP)
           − Estimate At Completion (EAC)
           − Variance at Completion (VAC)
           − Cost Variance
           − Schedule Variance
           − Schedule Performance Index (SPI)
           − Cost Performance Index (CPI)
           − To Complete Performance Index (TCPI)
    •      Program Staffing Profile
    •      Risk Containment Summary
    •      Work Product Completion
    •      Defect Management
    •      IV&V
           − IV&V Anomaly Density
           − IV&V Efficiency
           − IV&V Effectiveness
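
The earned value quantities listed above are related by the standard ANSI/EIA-748 earned
value formulas. The following is a minimal sketch of those computations (the function and
variable names are illustrative; on the ERA program the EV data itself comes from Microsoft
Project and wInsight, per Section 4.2):

    def earned_value_indicators(bac, bcws, bcwp, acwp, eac):
        """Standard EVM derived measures from BAC, BCWS (planned value),
        BCWP (earned value), ACWP (actual cost), and EAC."""
        return {
            "CV": bcwp - acwp,                    # Cost Variance
            "SV": bcwp - bcws,                    # Schedule Variance
            "CPI": bcwp / acwp,                   # Cost Performance Index
            "SPI": bcwp / bcws,                   # Schedule Performance Index
            "VAC": bac - eac,                     # Variance At Completion
            "ETC": eac - acwp,                    # Estimate To Complete
            "TCPI": (bac - bcwp) / (eac - acwp),  # TCPI (relative to EAC)
        }

A CPI or SPI below 1.0 indicates the program is over cost or behind schedule, respectively.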

3.1.2      Development Contractor Metrics

Development contractor metrics will be provided to the ERA PMO in accordance with the
reporting frequencies stated in LMC’s Quantitative Management and Quality Plan (QMQP).
These will generally be in graphical format, and will be provided in the most appropriate forum,
such as Engineering Critical Thread Reviews, Monthly Status Reports and Reviews, and Major
Design Reviews. It should be noted that these metrics are not formal CDRL deliverables.
Metrics to be provided by the development contractor include:

    •      Action Item Aging,
    •      Change Request Inventory,
    •      Data Acceptance Status,
    •      Data Delivery Timeliness,
    •      Defect Management,
      •    Earned Value Management,
      •    Help Desk Success Rate,
      •    Labor Months by CSCI,
      •    Requirements Growth,
      •    Requirements Volatility,
      •    Risk Containment Summary,
      •    Software Development Productivity,
      •    I&T ESLOC Productivity,
      •    Software Size,
      •    Source Lines of Code (SLOC) Growth,
      •    Schedule Performance, and
      •    Test Coverage.

The Development Contractor captures numerous other metrics, primarily to support its
continuous process improvement efforts.

3.2        Metrics Environment Infrastructure

The sections below describe the metrics collection, reporting, and storage requirements.

3.2.1      Metrics Collection

Various ERA PMO Organization team members are responsible for ensuring that metrics data is
collected and reported in a timely manner. In some cases, this effort may require using tools to
extract the metrics data from a database at the appropriate time. Other data, e.g., number of
personnel/staff changes, is compiled manually. Tools used for the collection and reporting of
ERA metrics are defined in Section 4.2, Tools for Metrics. The data source used to collect the
data is provided in the metrics tables in Appendix A. Where possible, data is extracted
automatically from other sources. The collection and reporting for subsequent ERA system
lifecycle phases will be defined in future updates to the ERA MP.

3.2.2      Metrics Reporting

The metric data that is collected will be used for both monthly and quarterly metrics reporting.
The ERA MR will be generated on a monthly basis. The metric data to be used will be as of the
end of the reporting period. In addition, metrics graphics will be extracted from the ERA MR and
made available on the S: drive in briefing chart format to facilitate review by managers at all
levels.

3.2.3      Metrics Storage

Metrics data can be produced via simple queries using the toolsets (e.g., Rational Suite
AnalystStudio and wInsight) that will be used on the ERA program. The periodic reports that
are generated are stored in a program repository managed through PVCS, to include the
Microsoft Excel file containing the metrics data.


The ERA MR is submitted in Microsoft Word format. This document contains charts that have
been copied from the Microsoft Excel files corresponding to specific metrics. All data used in
the compilation of the metrics report is stored in the Microsoft Excel file. The Microsoft Excel
file is collocated with the Microsoft Word file on the S: drive.

4.0        Resources

This section describes the ERA PMO metrics resources that will be required during the course
of the ERA system lifecycle.

4.1        Resources for Metrics

The resources needed for metrics are those provided by the ERA PMO to collect, enter, and
validate the data and provide the reports. For Increments 1-5, it is anticipated that the total level
of staffing required for the metrics collection and reporting effort will be one half (0.5) Full Time
Equivalent (FTE).

4.2        Tools for Metrics

Metrics collected, generated, and provided during the ERA system development lifecycle will be
gathered from various sources including, but not limited to, those listed below.

      •    ERA Deliverables Tracking Status – Microsoft Word table, extracted from the WBS, that
           tracks documentation submitted during the reporting period
      •    ERA Risk Radar - For a summary of all risks identified and tracked
      •    Microsoft Excel - For generation, storage, and reporting of metrics data including EV
      •    Microsoft PowerPoint - For the latest ERA Organizational Charts
      •    Microsoft Office Project Web Access and C/S Solutions’ wInsight for EV data
      •    Microsoft Word - For actual generation of the Metrics Report to include presentation of
           the Microsoft Excel spreadsheets including description of the findings as of the end of the
           reporting period for each metric contained in the report
      •    Novell GroupWise – E-mail application to be used in conjunction with Rational Suite
           AnalystStudio applications that require automatic notifications to users
      •    PRISM – Tracks obligations and expenditures of appropriated funds for the ERA Program
      •    Project Connect and wInsight (C/S Solutions) – Utility applications that export EV data
           from Microsoft Project to wInsight in compliance with the ANSI 748-A standard
      •    Rational Suite AnalystStudio – Includes the following software applications:
           – Rational ClearQuest – Software application that is used in conjunction with Oracle
               for the tracking of the following:
                 – Action Items,
                 – Change Requests, and
                 – Corrective Actions.
           – Rational RequisitePro – Software application that will be used for requirements
               management
           – Rational ClearCase – Software application that will be used for management and
               version control of configuration items to include change history


If the volume of metrics increases, other tools may be evaluated for a match with the needs of the
program.

4.3        Training

Training will be provided on the metrics collection process as specific training needs are
identified. Training will be conducted in accordance with the ERA Training Needs
Assessment (TRA) and the PMO Training Plan (TRP).

5.0        Risks

According to the IEEE Std. 1061-1998, Standard for Software Quality Metrics Methodology, the
purpose of measurement is to help management achieve project objectives, identify and track
risks, satisfy constraints, and recognize problems early. A system of ERA’s magnitude will not
be devoid of risk. Using metrics in support of the risk activities outlined in the ERA Risk
Management Plan (RKM) may facilitate mitigation efforts that reduce the probability and/or
severity of risks, or eliminate them when they are encountered.

6.0        Quality Control Measures

Updates made to the ERA MP will be subject to peer review. Anomalies identified in a quality
peer review meeting will be recorded during the meeting. The author will make the
corrections, and Quality Management (QM) will verify compliance.

The ERA PMO QM team will conduct process improvement reviews to review and evaluate
metrics generated during the ERA system development effort. Findings from these reviews may
indicate that the metrics processes need to be modified to be more effective.
Process improvement recommendations will be an output of these reviews.

The ERA MR will be submitted in accordance with the information provided in Section 3.1.1 and
is subject to QM review in accordance with the ERA Quality Management Plan (QMP).

7.0        Plan Maintenance

The ERA PD is responsible for this plan. As a part of process improvement (e.g., IV&V
assessments, lessons learned, QM assessments), the ERA MP and the overall quality
management approach will continue to evolve. The plan will be updated as needed to maintain
current and sufficient metrics management activities. The ERA MP was placed under CM
control following its initial approval by the ERA PMO and updates will be controlled by the
Configuration Control Board (CCB).





                  Appendix A: ERA PMO Metrics Descriptions
The following are detailed descriptions of the metrics collected and reported by the ERA PMO.

Table A-1, Metric Set Definition, provides an explanation of the items that appear in each
metric description.

           Item                                        Description
   Name               Name given to the metric
   Program Goals      List of program goals (measurements are goal-oriented)
   Questions          Questions derived from goals that must be answered in order to determine if
                      the goals are achieved
   Impact             Indication of whether a metric can be used to alter or halt the project.
   Target value       Numerical value of the metric that is to be achieved in order to meet the
                      planned objective. Includes the critical value and the range of the metric.
   Benefits           Provides examples of the benefits derived from using the metric.
   Tools              Software or hardware tools that are used to gather and store data, compute
                      the metric, and analyze the results.
   Application        Description of how the metric is used and what its area of application is.
   Data items         Input values that are necessary for computing the metric values.
   Computation        Explanation of the steps involved in the metrics computation.
   Interpretation     Interpretation of the results of the metrics computation.
   Considerations     Provides examples of the considerations as to the appropriateness of the
                      metric (e.g., Can data be collected for this metric? Is the metric appropriate
                      for this application?).
   Example            An example of applying the metric.
   Data Source        Location of where the data is kept
                              Table A-1: Metric Set Definition







     Item                                                Description
Name              Action Item Aging
Program Goals     • Monitor action item closure
                  • Monitor risk exposure due to incomplete action items
Questions         • How many action items have been generated?
                  • What is the status (Open/Closed) of the Action Items?
                  • What is the potential impact to schedule and cost due to lack of action item
                      resolution?
Impact            This metric has the potential to alter the project if it is determined that the action
                  item will cause a redesign and/or cause schedule delays.
Threshold         < 30 days old
Benefits          This metric shows the age of each open action item by severity (Critical, Major,
                  Average, and Minor). The data provides visibility to all open action items including
                  those that have been outstanding for an extended period of time so that effort may
                  be applied to ensure resolution.
Tools             Rational ClearQuest
Application       This is a program management metric used to measure the PMO’s ability to
                  effectively respond to action items assigned to it.
Data Items        • Number of Action Items – Total number of action items from start of
                      Increment 1 up through and including the reporting period, which is normally
                      monthly
                  • Number of Action Items by Severity Level – Total number of action items from
                      start of Increment 1 up through and including the reporting period filtered on
                      Severity levels, i.e., Critical (situation calls for quick or drastic action and is
                      especially important), Major (elevated or advanced in importance), Average
                      (middle or intermediate degree of importance), and Minor (reflective of little
                      or diminished importance)
                  • Number of Open Action Items – Total number of Open action items as of the
                      end of the reporting period. “Open” means an item that has been submitted and
                      has any processing state other than Closed
                  • Number of Open Action Items by Severity Level – Total number of Open
                      action items filtered by Severity level as of the end of the reporting period.
                  • Action Item Age – Used to ensure all Action Items are resolved in a timely
                      manner. It is calculated as the number of days between the date an open action
                      item was submitted and the end of the reporting period.
                  • Number of Open Action Items Based on Time Interval – Number of Action
                      Items open 0-30 days, 31-60 days, 61-90 days, and > 90 days
                  • Number of Action Items Open Per Severity and Time Interval – Number of
                      Action Items open 0-30 days, 31-60 days, 61-90 days, and > 90 days filtered
                      by Critical, Major, Average, and Minor Severity levels
Computation       See Data Items Section above for computations
Interpretation    Action items that have been open for more than 30 days need to be followed up to
                  ensure closure. Action items having the greatest severity and greatest age have the
                  most potential to become risks that can affect cost and schedule.
Considerations    The higher the severity the more emphasis that should be placed on bringing the
                  action item to closure.
Example           [Figure: ERA PMO Action Item Aging Report – bar chart of the number
                  of open action items in each age interval (0-30, 31-60, 61-90, and
                  >90 days open), broken out by severity (Critical, Major, Average,
                  Minor).]
Data Source       Action Item Database
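
The aging computation defined in the Data Items above (age in days from submission to the
end of the reporting period, binned by interval and severity) is simple enough to sketch
directly. The code below is illustrative only and assumes the open action items have already
been exported from the Action Item Database as (severity, date submitted) pairs.

    from collections import Counter
    from datetime import date

    INTERVALS = [(0, 30, "0-30"), (31, 60, "31-60"), (61, 90, "61-90")]

    def aging_report(open_items, period_end):
        """Count open action items by severity and age interval.

        open_items: iterable of (severity, date_submitted) pairs for items
        in any processing state other than Closed.
        """
        counts = Counter()
        for severity, submitted in open_items:
            age = (period_end - submitted).days
            label = next((name for lo, hi, name in INTERVALS
                          if lo <= age <= hi), ">90")
            counts[(severity, label)] += 1
        return counts

    # Example (illustrative dates): two items open at the end of a period.
    items = [("Critical", date(2005, 10, 14)), ("Minor", date(2006, 1, 9))]
    print(aging_report(items, date(2006, 1, 31)))
    # Counter({('Critical', '>90'): 1, ('Minor', '0-30'): 1})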







      Item                                              Description
Name              Corrective Action Aging
Program Goals     • Monitor corrective action closure
                  • Monitor risk exposure due to incomplete corrective actions
Questions         • How many corrective actions have been generated?
                  • What is the status (Open/Closed) of the corrective actions?
                  • What is the potential impact to schedule and cost due to lack of corrective
                     action implementation?
Impact            This metric cannot be used to alter or halt a project.
Threshold         < 30 days old
Benefits          This metric shows the age of each open corrective action. The data provides
                  visibility to all open corrective actions including those that have been outstanding
                  for an extended period of time so that effort may be applied to ensure resolution.
Tools             Rational ClearQuest
Application       This is a program management metric used to measure the PMO’s ability to
                  effectively resolve corrective actions assigned.
Data Items        • Total Number of Open Corrective Actions – Total number of open corrective
                      actions as of the end of the reporting period. “Open” means a corrective action
                      that has been submitted and has any processing state other than Closed
                  • Corrective Action Age – Used to ensure all corrective actions are implemented
                      in a timely manner. It is calculated as the number of days between the date a
                      corrective action was submitted and the end of the reporting period.
                  • Cumulative Number of Open Corrective Actions Based on Time Interval –
                      Number of Corrective Actions open 0-30 days, 31-60 days, 61-90 days, and
                      > 90 days
Computation       See Data Items Section above for computations
Interpretation    Corrective actions that have been open for more than 30 days need to be followed
                  up to ensure closure.
Considerations    Reinforces formal QM review and audit processes to ensure processes are
                  “corrected.”
Example           [Figure: ERA PMO Corrective Action Aging Report – bar chart of the
                  number of open corrective actions in each age interval (0-30, 31-60,
                  61-90, and >90 days open).]
Data Source       Corrective Action Database







           Item                                        Description
   Name              Change Request Inventory
   Program Goals     • Identify trends early in the lifecycle in order to reduce, eliminate, or avoid
                         cost and schedule implications.
                     • Identify trends in receipt and handling of Change Requests (CRs), and
                         prevent perpetual CRs.
   Questions         • What documents/software/hardware are impacted based on the change?
                     • What is the impact of the required change in terms of cost and schedule?
                     • What is the status of CRs?
   Impact            This metric can be used to alter or halt a project.
   Threshold         • A change that will cause a schedule delay to the Test Readiness Review
                          (TRR).
                     • A change that will result in over $100,000 of additional cost to the
                          program.
                     • Any change where an approved increment requirement allocation is
                          shifted to another increment.
   Benefits          Enables the identification of trends such as significant scope creep that could
                     have negative effects on cost, schedule, or performance.
   Tools             Rational ClearQuest
   Application       This metric lists the ERA change requests that are open as of the end of the
                     reporting period or have been approved or disapproved during the reporting
                     period. The data provides management with insight to the trend in new
                     change requests and resolution as the program progresses. This is a program
                     management metric used to measure the amount of change in order to
                     determine potential negative trends.
   Data Items        • Change Request – A request for modification of ERA configuration item
                         (i.e., document, hardware, software, or requirement) made prior to the end
                         of the reporting period, which is normally monthly.
                     • Number of Change Requests – Total number of change requests in the
                         system in any state.
                     • Number of Change Requests by State – Total number of change requests
                         in the system in each state (Assigned, Closed, Opened, Rejected,
                         Resolved, Submitted, CR Reviewed, CP Reviewed, Reviewed, Cancelled,
                         and CP Submitted).
   Computation       Count the number of change requests submitted, approved, or disapproved
                     during the reporting period, then chart the counts using a standard bar
                     graph.
   Interpretation    Provides an overview of the processing state of change requests and can
                     help identify processing bottlenecks.
   Considerations    Reinforces formal configuration control (of configuration items), i.e., no
                     changes can be made and incorporated into the configuration baseline without
                     approval of the Change Request.
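
As a sketch of the computation, assuming change requests have been exported from ClearQuest as simple records (the field names here are illustrative), the per-state counts for the bar graph reduce to a tally:

```python
from collections import Counter

# Illustrative change request export: one record per CR with its state.
change_requests = [
    {"id": "CR-101", "state": "Submitted"},
    {"id": "CR-102", "state": "Assigned"},
    {"id": "CR-103", "state": "Closed"},
    {"id": "CR-104", "state": "Submitted"},
]

total = len(change_requests)                               # CRs in any state
by_state = Counter(cr["state"] for cr in change_requests)  # CRs per state

print(total)     # 4
print(by_state)  # Counter({'Submitted': 2, 'Assigned': 1, 'Closed': 1})
```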




           Item                                                                                       Description
   Name              Change Request Inventory
   Example
                    [Figure: Change Request Inventory – stacked bar chart of the Number of Change
                    Requests in each state (Submitted, CR Reviewed, CP Submitted, CP Reviewed,
                    Assigned, Rejected, Opened, Resolved, Reviewed, Closed, Canceled) per
                    Reporting Period, May-05 through Sep-05]

                                        Change Request Inventory Example
   Data Source       Change Request Tracking Database








           Item                                             Description
   Name              Requirements Rate of Change
   Program Goals     Monitor the number of requirements that have been modified during the
                     reporting period to assess general stability and completeness for the
                     requirements.
   Questions         • How many requirements exist in the requirements repository?
                     • How many requirements have been modified?
                     • How many new requirements have been added?
                     • How many requirements have been deleted?
   Impact            This metric cannot be used to alter or halt a project.
   Threshold         N/A, there is no established target threshold
   Benefits          • The Requirements Rate of Change metric provides a measure of technical
                         flux as it relates to the system requirements.
                      • A key indicator of the status of the requirements is the number of new or
                          changed requirements per month.
                      • Lends insight into how effective the requirements elicitation and
                          generation process was, i.e., it is an indicator of how well defined the
                          baselined requirements were.
   Tools             Rational ClearCase, Rational RequisitePro
   Application       The metric indicates how many of the requirements were modified or added
                     during the reporting period. Note that this report applies to RD requirements
                     only. Derived requirements sets will be managed and reported by the
                     development contractor.
   Data Items        • Total Number of Changes in Requirements - An approved modification
                         to an ERA requirement that has been placed under CM
                     • Total Number of Requirements - The count of approved ERA
                         requirements that have been placed under CM
   Computation       Requirements Rate of Change = (Total Number of Modified Requirements /
                     Total Number of Baselined Requirements) × 100
   Interpretation    If the rate of change begins to affect schedule and/or cost performance, then
                     it can be inferred that scope creep is occurring and/or the original requirements
                     were poorly defined.
   Considerations    N/A
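
A one-line sketch of the computation above; the counts are illustrative values, not actual RequisitePro data.

```python
def requirements_rate_of_change(modified: int, baselined: int) -> float:
    """(Modified requirements / baselined requirements) x 100."""
    return modified / baselined * 100

# Illustrative counts pulled from the requirements repository.
print(requirements_rate_of_change(modified=12, baselined=850))  # ~1.41 (%)
```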






           Item                                        Description
   Name              Requirements Rate of Change
   Example




                    [Figure: Requirements Rate of Change trend chart]
                                        Requirements Rate of Change Example
   Data Source       Rational Suite ClearQuest Change Request Database, Rational Suite
                     RequisitePro Database








           Item                                                Description
   Name              Budget Tracking
   Program Goals     • Track the original spend plan for the reporting period
                     • Track the cost outlays for the reporting period
                     • Track the amount of obligations for the reporting period
   Questions         • What is the total budget allocation for the reporting period?
                     • What are the total cost outlays for the reporting period?
                     • What are the total obligations for the reporting period?
                     • What are the total labor costs for the reporting period?
   Impact            This metric may be used to alter or halt a project when overruns are
                     anticipated or have occurred.
   Threshold         ±10% over/under budgeted costs
   Benefits          Alerts Management attention to potential budget issues.
   Tools             PRISM
   Application       This is a Microsoft Excel-based tool used to track obligations and expenditures.
   Data Items        • Original Spend Plan - The budget allocated to perform work on the
                         program for the reporting period.
                      • Outlays - Total expenditures incurred to perform the work through the end
                          of the reporting period.
                      • Obligations - Total monies obligated for the reporting period.
   Computation       Sum the totals for the reporting month and display them using a bar graph.
   Interpretation    If cumulative outlays exceed the cumulative obligations by more than 10%
                     in a quarter, then an overrun is imminent.
   Considerations    Reprogramming of the funds may be required.
   Example
                    [Figure: Monthly Actual Cost – bar chart of Budget, Obligations, and Outlays
                    (dollars) for Jul-05, Aug-05, and Sep-05]

                                        Monthly Cost Example
   Data Source       NARA Budget Office supplies data to ERA Program Budget Analyst
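
A minimal sketch of the ±10% threshold check, assuming cumulative obligations and outlays are supplied by the Budget Office; the figures are illustrative.

```python
def budget_variance(outlays: float, obligations: float) -> float:
    """Fractional variance of cumulative outlays against cumulative obligations."""
    return (outlays - obligations) / obligations

# Illustrative quarterly figures (dollars).
variance = budget_variance(outlays=33_500_000, obligations=29_000_000)
print(f"{variance:+.1%}")  # +15.5%
print("Potential overrun" if variance > 0.10 else "Within threshold")
```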






           Item                                        Description
   Name              Earned Value Management (EVM)
   Program Goals     • Monitor performance, cost, and schedule using a timeline
                     • Monitor the schedule and completion of work products relative to their
                        scheduled and actual completion times
                     • Ensure the project has sufficient resources
                     • Determine how much of the planned work has been done
                     • Forecast the final spending and completion date
                     • Provide an early warning when the project starts to go off-track
                     • Discover which areas/tasks are causing the problems, and where
                        anomalies are occurring
                     • Demonstrate and keep the project/development under control
                     • Track total number of hours per task (cumulative), both budgeted and
                        actual during the reporting period
                     • Track total number of hours spent to complete a task
   Questions         • How is the project performing with respect to cost?
                     • How is the project performing with respect to schedule?
                     • Is the work force sufficient to complete the work and how well are they
                        performing?
                     • What are the staffing levels: Actual, Planned, Variance?
                     • Is the correct labor mix being utilized?
                     • Is project performance increasing?
                     • How much work and how many tasks have been completed as compared
                        to the plan?
                     • Will the project complete on time?
                     • Is scheduled work being completed on time?
                     • Is scheduled work being completed within cost parameters?
                     • Is the total number of hours (actual) spent working on a task more than
                        the budgeted amount?
                     • Is a pattern emerging where it is taking longer than planned to complete
                        particular tasks?
                     • Is the overrun of hours required to complete a task in a particular
                        component area?
                     • Was the prepared budget inadequate for the amount of work to be
                        performed?
                     • Is the component area more technically challenging than originally
                        anticipated?
   Impact            These metrics can be used to monitor progress, provide early warning of
                     problems and trends, enable process improvement, and support the decision
                     whether to continue work on the project.




           Item                                        Description
   Name              Earned Value Management (EVM)
   Threshold         • ±10% variance analysis, with reporting on the current period where
                         WBS dollars > $25K, on cumulative where WBS dollars > $50K, and at
                         completion where WBS dollars > $125K.
                     • Cost Performance Index (CPI) or Schedule Performance Index (SPI):
                         anything outside the 1.0 ± 0.1 range indicates a potential productivity
                         problem.
   Benefits          • Positive SV indicates “ahead” of schedule status
                     • Negative SV indicates “behind” schedule status
                     • Positive CV indicates “under” budget status
                      • Negative CV indicates “over” budget status
                     • CPI represents how much work was performed for each dollar spent.
                     • SPI represents how much work was performed for each dollar planned to
                        be spent if work was accomplished as planned in the schedule.
   Tools             • Microsoft Excel
                     • Microsoft Project
                     • C/S Solutions’ wInsight
   Application       • This is a program management metric used to monitor cost, performance,
                        and schedule.
                      • The SPI compares performance to the schedule. The indices of CPI and
                         SPI are the standard cost and schedule performance measures for both
                         government and industry. The CPI shows how efficiently the team has
                         turned costs into progress to date; it represents how much work was
                         performed for each dollar spent.
                     • The primary reports used for analysis of performance in an EV system are
                        the Cost/Schedule Status Report and the Cost Performance Report (CPR).
                        The CPR includes BCWS, ACWP, BCWP, and EAC in addition to
                        calculated cost and schedule variances for each WBS element from the
                        cost account level up to the project level.
                      • Variance Analysis Reports (VARs) provide current period, cumulative,
                         and at-completion data. A VAR contains a description of the cause of the
                         variance, its impact on the project (including other elements of the
                         project), corrective action to be taken, and follow-up on previous action
                         taken. Variance thresholds may be reported as a percentage, a dollar
                         amount, or a combination of the two.
   Data Items        • Budget At Completion (BAC) - The total value assigned to the program
                        and, if all goes as planned, the total cost. The planned value accounts for
                        all direct and indirect labor (expressed in dollars) that the work is
                        expected to cost.
                     • Planned Value (PV), also known as Budgeted Cost of Work Scheduled
                        (BCWS) - The sum of budgets allocated to time-phased elements of work
                        (Work Packages (WP)) on the program.
                     • Earned Value (EV), also known as Budgeted Cost of Work Performed
                        (BCWP) – The sum of the values calculated by taking the percent
                        complete of each work package and multiplying it by the budget for that
                        work package
                     • Actual Cost (AC), also known as Actual Cost of Work Performed
                        (ACWP) – The actual cost of the work performed to-date as captured
                        from timesheets and other expense reports.
                     • Work Package - The lowest level of effort in the ERA WBS
   Computations      • Cost Variance (CV) = The difference between BCWP and ACWP given
                        by the formula:
                            Cost Variance (CV) = EV – AC
                            OR
                            Cost Variance Percentage = CV / EV x 100

                     •   Schedule Variance (SV) = The difference between BCWP and PV given
                         by the formula:
                             Schedule Variance = EV – PV
                             OR
                             Schedule Variance Percentage = SV / PV x 100

                      •   Cost Performance Index (CPI) = EV divided by AC, given by the
                          formula:
                             Cost Performance Index = EV / AC

                            A CPI of less than 1.0 indicates a potential productivity problem

                     •   Schedule Performance Index (SPI) = EV divided by PV as given by the
                         formula:
                            Schedule Performance Index = EV / PV

                      •   Estimate at Completion (EAC) = The projected total cost, computed as
                          actual cost plus the remaining (unearned) work adjusted by a
                          performance factor, given by the formula:
                          Estimate at Completion (EAC) = AC + WR / PF
                             Where:
                                     Work Remaining (WR) = BAC – EV and Performance Factor
                                     (PF) depends on the analysis. For example:
                                     Least Likely: PF = CPI, or
                                    Most Likely: PF is a weighted balance of SPI and CPI =
                                    .5(CPI) + .5(SPI), or
                                    Worst Case: PF = CPI x SPI
                          A poor performance, or CPI less than 1, results in an EAC that is greater
                          than the BAC.
                     • Variance at Completion (VAC) = The difference between the EAC and
                       the BAC given by the following formula:
                            Variance at Completion (VAC) = EAC - BAC
                         When the projected final cost exceeds the budget, this is effectively
                         predicting an overrun, an Unfavorable Variance at Completion.

                      •   Estimate To Complete (ETC) = The estimated cost of the remaining
                          work, given by the formula:

                             Estimate To Complete (ETC) = EAC - AC

                     •   To Complete Performance Index (TCPI) shows the future projection of
                         the average productivity needed to complete the program within an
                         estimated budget. It is calculated by the following formula:

                            To Complete Performance Index (BAC) =
                                  Work Remaining / Budget Remaining
                            or
                                  (BAC – EV) / (BAC - AC)

                             To Complete Performance Index (EAC) =
                                     Work Remaining / Budget Remaining
                             or
                                     (BAC - EV) / (EAC – AC)
   Interpretation    • The closer the CPI and SPI are to a value of 1.00, the more successful the
                         program can be considered, at least in terms of cost and schedule.
                      • TCPI is compared with CPI to determine how realistic the most recent
                          EAC is for the program. If TCPI is greater than CPI (i.e., CPI/TCPI < 1),
                          the team is anticipating an efficiency improvement. The estimated total
                          cost of the program (EAC) can therefore be calibrated by comparing
                          TCPI with CPI. If TCPI is 20 percent above the current value of the CPI,
                          both indices require closer examination.
   Considerations    In order to use the metrics, the program/project must:
                     • Have produced a WBS, Organizational Breakdown Structure (OBS), and
                         Integrated Master Schedule (IMS); and
                     • Prepare ETCs for individual work packages, which will be used to
                         determine percent complete of each work package and subsequently to
                         calculate the EV. The following items should be considered:
                         − Actual cost incurred,
                         − Schedule status,
                        − EV to-date,
                        − Remaining scope of work,
                        − Previous ETC,
                        − Historical data,
                        − Required resources by type,
                        − Projected cost and schedule efficiency improvement,
                        − Future actions, and
                        − Approved contract changes.
   Example
                    [Figure: monthly SPI, CPI, and TCPI values plotted from Sep 03 through Dec 05]

                                        Performance Index Example

                    [Figure: cumulative BCWS, BCWP, and ACWP (dollars) plotted monthly from
                    Sep 03 through Dec 05]

                                        Performance Measurement Example
   Data Source       Export EV data from Microsoft Project to wInsight Utility applications
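
The EVM computations above reduce to a few arithmetic identities over four base quantities (BAC, PV, EV, AC). A minimal sketch follows; the month-end dollar figures are illustrative, and the default performance factor uses the plan's "most likely" weighting of .5(CPI) + .5(SPI).

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class EvmSnapshot:
    bac: float  # Budget At Completion
    pv: float   # Planned Value (BCWS)
    ev: float   # Earned Value (BCWP)
    ac: float   # Actual Cost (ACWP)

    @property
    def cv(self) -> float:  return self.ev - self.ac  # Cost Variance
    @property
    def sv(self) -> float:  return self.ev - self.pv  # Schedule Variance
    @property
    def cpi(self) -> float: return self.ev / self.ac  # Cost Performance Index
    @property
    def spi(self) -> float: return self.ev / self.pv  # Schedule Performance Index

    def eac(self, pf: Optional[float] = None) -> float:
        """EAC = AC + WR / PF, where WR = BAC - EV."""
        if pf is None:  # "most likely" performance factor per the plan
            pf = 0.5 * self.cpi + 0.5 * self.spi
        return self.ac + (self.bac - self.ev) / pf

    def tcpi(self, budget: Optional[float] = None) -> float:
        """TCPI = work remaining / budget remaining (BAC or EAC basis)."""
        budget = self.bac if budget is None else budget
        return (self.bac - self.ev) / (budget - self.ac)

# Illustrative month-end figures (dollars).
m = EvmSnapshot(bac=25_000_000, pv=17_175_559, ev=16_295_080, ac=15_989_955)
print(f"CPI={m.cpi:.2f} SPI={m.spi:.2f} "
      f"EAC=${m.eac():,.0f} TCPI(BAC)={m.tcpi():.2f}")
# CPI=1.02 SPI=0.95 EAC=$24,837,230 (approx.) TCPI(BAC)=0.97
```

CPI or SPI values outside the 1.0 ± 0.1 band would trip the productivity threshold stated above.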








           Item                                        Description
   Name              Program Staffing Profile
   Program Goals     Monitor the staffing levels required to perform program tasks against
                     projected staffing levels
   Questions         • What is the projected number of ERA PMO Government Staff required to
                         perform designated tasks?
                     • What is the actual number of ERA PMO Government Staff required to
                         perform designated tasks?
                     • What is the projected number of ERA PMO POST Staff required to
                         perform designated tasks?
                     • What is the actual number of ERA PMO POST Staff required to perform
                         designated tasks?
   Impact            Lack of resources could result in schedule slippage due to work overload.
   Threshold         < 90% of the projected staffing level could impact tasks being completed on
                     time, which translates into a potential schedule slip.
   Benefits          Staffing levels above the threshold indicate that sufficient resources are
                     available to perform required tasks.
   Tools             Microsoft Excel, Microsoft PowerPoint
   Application       This is a program management metric used to monitor resources and cost.
   Data Items        • Projected Staffing Level - Identification of ERA staffing required for
                         completing program activities by reporting period. Includes Staffing
                         Category and for each Staffing Category, the Number of Staff Members,
                         and Staffing Scheduled Finish Date.
                     • Program Staffing Level - Actual ERA staffing by Staffing Category as of
                         the end of the reporting period. Includes Staffing for each Staffing
                         Category, the Number of Staff Members, Staff Member Names, and
                         Reporting Period.
                     • Number of Projected ERA Staff – Total number of staff for the ERA
                         project, includes both Government and POST staff combined cumulative
                         up to and including the reporting period.
                     • Actual Number of ERA Staff – Actual number of staff for the ERA
                         project, includes both Government and POST staff combined cumulative
                         up to and including the reporting period.
                     • Number of Projected Government Staff – Total number of projected
                         Government staff required to complete program activities up to and
                         including the reporting period.
                     • Actual Number of Government Staff – Actual number of Government
                         staff to-date.
                     • Number of Projected POST Staff – Total number of projected POST staff
                         required to complete program activities up to and including the reporting
                         period.
                     • Actual Number of POST Staff – Actual number of POST staff to-date
01/26/06                                  A-16                                 ERA.DC.MP.4.0.doc

                         ♦ National Archives and Records Administration ♦
Electronic Records Archive (ERA)                                                                                      Metrics Plan (MP)
ERA Program Management Office (ERA PMO)                                                                                      Appendix A
                                                                         Final


           Item                                                                     Description
   Name              Program Staffing Profile
                      • Number of Projected Government Staff by Division (i.e., PMO Total,
                         PMO PO, PMO PSD, PMO SED) – Total number of projected
                         Government staff by division required to complete program activities up
                         to and including the reporting period.
                      • Actual Number of Government Staff by Division – Actual number of
                         Government staff to-date by division.
                     • Number of Projected POST Staff by Division (i.e., POST Total, POST
                        PO, POST PMD, POST SED) – Total number of projected POST staff by
                        division required to complete program activities up to and including the
                        reporting period.
                     • Actual Number of POST Staff by Division – Actual number of POST
                        staff to-date by division.
   Computation       Staffing Profile % Rate = (Total Number of Actual Staff / Total Number of
                     Projected Staff) × 100
   Interpretation    If staffing is too low, then there is the potential for schedule slippage as tasks
                     may not be completed as scheduled.
   Considerations    Can be used in conjunction with, or to help support, level-of-effort analysis.
   Example
                    [Figure: Program Staffing Profile – Number of Staff per monthly Reporting
                    Period (2004–2005), plotting Total Combined Projected Staff, Total Combined
                    Actual Staff, Total Actual Government Staffing, and Total Actual POST
                    Staffing]

                                        Program Staffing Level Example






           Item                                                                                     Description
   Name              Program Staffing Profile
                    [Figure: Program Staffing Profile Breakdown, April 2003 – bar chart comparing
                    Total Projected Staff against Total Actual Staff to Date by organization (PMO
                    Total, PMO Program Office, PMO PMD, PMO SED; POST Total, POST
                    Program Office, POST PMD, POST SED)]

                                 Program Staffing Profile Breakdown Example
   Data Source       ERA Organization Charts
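
A minimal sketch of the staffing computation and its 90% threshold; the head counts are illustrative, not actual organization-chart data.

```python
def staffing_profile_rate(actual: int, projected: int) -> float:
    """(Total actual staff / total projected staff) x 100."""
    return actual / projected * 100

# Illustrative month-end head counts.
rate = staffing_profile_rate(actual=53, projected=60)
print(f"{rate:.1f}%")  # 88.3%
print("Potential schedule slip" if rate < 90 else "Adequately staffed")
```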








           Item                                        Description
   Name              Risk Containment Summary
   Program Goals     • Track risks by risk exposure
                     • Identify trends
                     • Develop risk strategies to mitigate, reduce, or eliminate potential risks
   Questions         • What is the total number of risks that have been identified?
                     • What is the total number of high exposure risks?
                     • What is the total number of moderate exposure risks?
                     • What is the total number of low exposure risks?
   Impact            Can be used to mitigate the consequences of a failure to the project,
                     depending on the severity of the risk.
   Threshold         N/A, there is no threshold
   Benefits          This measure provides a useful summary for management to identify trends
                     in risk identification, monitor the risks, and develop strategies to mitigate,
                     reduce, or eliminate them.
   Tools             Risk Radar
   Application       This is a program management metric used to monitor all risk items.
   Data Items        • Cumulative Number of Open Risk Items – Cumulative number of open
                         risk items up to and including the reporting period.
                     • Cumulative Number of Open Risk Items by Risk Exposure (i.e., High,
                         Moderate, Low) – Total number of open risk items by risk exposure level
                         that are open as of the end of the reporting period.
                         − High Exposure: Risks that have a significant impact on cost, schedule,
                             or performance. Significant action required.
                         − Moderate Exposure: Risks that have some impact. Special action may
                             be required. Additional management attention may be required.
                         − Low Exposure: Risks that have minimum impact. Normal oversight
                             needed to ensure risk remains low.






           Item                                                       Description
   Name              Risk Containment Summary
   Computation       Risk Exposure is determined using: Impact multiplied by
                     Likelihood/Probability.
                     Risk Impact Level and Likelihood/Probability are determined using the
                     following:
                      Level | Technical Performance | Schedule | Cost | Impact on Other Teams
                      1 | Minimal or no impact | Minimal or no impact. | Minimal or no impact | None
                      2 | Acceptable with some reduction in margin | Additional resources required. Able to meet need dates. | <5% | Some impact
                      3 | Acceptable with significant reduction in margin | Minor slip in key milestone. Not able to meet need dates. | 5–7% | Moderate impact
                      4 | Acceptable – no remaining margin | Major slip in key milestone or critical path impacted. | >7–<10% | Major impact
                      5 | Unacceptable | Can’t achieve key team or major program milestone. | >10% | Unacceptable

                                          Risk Impact Chart Example
                      Level | Translated Probability | Likelihood of Occurrence | Potential for Mitigation | Approach
                      a | 1–20% | Remote | Mitigation is almost always possible. | It is not necessary to develop a contingency plan.
                      b | 21–40% | Unlikely | Mitigation is usually possible. | Continue current mitigation plan.
                      c | 41–60% | Likely | Mitigation is possible but difficult. | Continue execution of mitigation plan; develop contingency plan.
                      d | 61–80% | Highly Likely | Mitigation is unlikely or difficult. | Prepare to enact contingency plan.
                      e | 81–99% | Near Certainty | Mitigation is not possible. | Look to minimize impacts; enact contingency plan.

                                   Risk Probability/Likelihood Chart Example
                     Using the above tables the data is then plotted. See Risk Containment
                     Summary Example below.
   Interpretation    An increase in the number of open high-exposure risks indicates that
                     mitigation strategies require additional management attention.






           Item                                                   Description
   Name              Risk Containment Summary
   Considerations    Additional risk management data, including strategies, can be found in the
                     ERA Risk Management Plan (RKM). The metric data presented here is a
                     subset of that data. Lastly, risk management reports containing additional
                     metric data are produced at various times, providing more detail than what is
                     reported here.
   Example
                    [Figure: Risk Containment Summary – 5×5 matrix plotting Impact (1–5) against
                    Likelihood/Probability bins (1–20, 21–40, 41–60, 61–80, 81–99%); each cell
                    shows the number of open risks in that bin, shaded as High, Moderate, or Low
                    exposure]

                                      Risk Containment Summary Example
   Data Source       Risk Radar
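
A sketch of the exposure computation (Impact × Likelihood) using the two charts above; the probability midpoints and the High/Moderate/Low cut-offs below are illustrative assumptions, not values taken from the plan or the ERA RKM.

```python
# Assumed midpoint probability for each likelihood level (a-e) above.
LIKELIHOOD = {"a": 0.10, "b": 0.30, "c": 0.50, "d": 0.70, "e": 0.90}

def risk_exposure(impact: int, likelihood: str) -> float:
    """Exposure = Impact level (1-5) x Likelihood/Probability."""
    return impact * LIKELIHOOD[likelihood]

def exposure_band(score: float) -> str:
    # Assumed banding for the summary chart; tune to the program's risk plan.
    if score >= 3.0:
        return "High"
    if score >= 1.5:
        return "Moderate"
    return "Low"

print(exposure_band(risk_exposure(impact=5, likelihood="e")))  # High
print(exposure_band(risk_exposure(impact=2, likelihood="b")))  # Low
```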








           Item                                               Description
   Name              Work Product Completion Summary
   Program Goals     Track, on a cumulative basis, the number of Critical and Non-Critical Work
                     Products (e.g., Government Furnished Information provided to the System
                     Developer to complete one of its critical path activities) that are scheduled
                     for completion and the number actually completed. A PMO work product is a
                     discrete deliverable called out in the WBS/IS.
   Questions         • What is the number of work products, e.g., documents, scheduled for
                         completion during the reporting period?
                     • What is the number of actual work products completed during the
                         reporting period?
   Impact             • Can be used to alter or halt a project if it is determined that the schedule is
                         not being met.
                      • Can be used to alter or halt a project if the work product(s) is of
                         significance and tied to completion of a program milestone.
   Threshold         • < 90% completed on time for Critical Work Products
                     • N/A for Non-Critical Work Products
   Benefits          Can determine if a program is on schedule or if milestones tied to the work
                     products are going to be met.
   Tools             • Microsoft Word table for work product list for the reporting period to be
                         used in conjunction with the ERA WBS and Schedule (Microsoft Project
                         Scheduler)
                     • Microsoft Excel to chart metric data
   Application       The metric presents the Cumulative Number of Work Products Completed
                     and the Cumulative Number of Work Products Scheduled for completion.
   Data Items        • Cumulative Number of Work Products Scheduled - Cumulative number
                         of ERA work products that are scheduled for completion by the end of the
                         reporting period in the program schedule. Includes Work Product Name,
                         Work Product Type, Work Product Scheduled Finish Date, and Actual #
                         of deliverables submitted.
                     • Cumulative Number Work Products Completed - Cumulative number of
                         ERA work products that were completed as of the end of the reporting
                         period. Includes Work Product Name, Work Product Type, Work Product
                         Scheduled Finish Date, and Work Product Actual Finish Date.
   Computation       Work Product Completion Rate = (Cumulative Number of Work Products
                     Completed / Cumulative Number of Work Products Scheduled) x 100
                     (A computational sketch follows this table.)

   Interpretation    A completion rate of less than 95% could indicate that a schedule slip is imminent.
   Considerations    None



   Example
                    [Chart: Work Product Completion. Number of Work Products (0–70, vertical
                    axis) by Reporting Period (Jun-05 through Oct-05, horizontal axis), with one
                    series for Cumulative Scheduled and one for Cumulative Completed.]

                                 Work Product Completion Summary Example
   Data Source       Work Breakdown Structure (WBS)
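
A minimal Python sketch of the completion-rate computation defined above, with
the 95% interpretation threshold applied; the input counts are illustrative.

def completion_rate(cumulative_completed, cumulative_scheduled):
    """Work Product Completion Rate = (completed / scheduled) x 100."""
    if cumulative_scheduled == 0:
        return 100.0  # nothing scheduled yet, so nothing can be late
    return 100.0 * cumulative_completed / cumulative_scheduled

rate = completion_rate(cumulative_completed=55, cumulative_scheduled=60)
print(f"Completion rate: {rate:.1f}%")
if rate < 95.0:
    print("Completion rate below 95% -- a schedule slip may be imminent")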




           Item                                        Description
   Name              Defect Management
   Program Goals     • Monitor defects during development in order to avoid re-design that
                         translates into performance, cost, and schedule impacts.
                     • Monitor defects during test in order to determine the technical
                         competency of the system.
                     • Monitor defects in deliverables and/or deliverables presented during
                         program technical reviews in order to demonstrate competency of design.
   Questions         • What is the total number of defects?
                     • What is the total number of defects per Severity level?
                     • Are the defects found concentrated in any one area?
                     • What is the defect closure rate?
                     • What is the impact to cost and schedule?
   Impact            This metric can be used to alter or halt a project.
   Threshold         N/A, there is no target value
   Benefits          Enables the identification of trends that could have deleterious effects on cost,
                     schedule, or performance.
   Tools             Rational ClearQuest and Rational TestManager or other Rational-compatible
                     tools
   Application       Tracks the persistence of software defects through the ERA lifecycle to
                     measure the effectiveness of development and verification activities. This is
                     a program management metric used to identify and categorize defects that are
                     found during development that may impact schedule, cost, and performance.
   Data Items        • Defect - Any flaw in the specification, design, coding, implementation,
                         or testing of a work product which, if not removed, would cause a
                         program or system to fail or to produce incorrect results; any
                         occurrence in a work product that is determined to be incomplete or
                         incorrect relative to the standards applicable to that work product; an
                         instance, recorded as of the end of the reporting period, where the
                         product does not meet a specified characteristic.
                     • Total Number of Defects Found – Total number of all defects found
                         during the reporting period.
                     • Cumulative Number of Defects Found – Cumulative number of defects
                         found during all reporting periods combined.
                     • Total Number of Defects Found Per Defect Severity Level – Total
                         number of defects found per severity level (i.e., Critical, High,
                         Intermediate, or Low).
                      • Percentage of Defects Found Per Severity Level – Calculated as the
                         number of defects for a severity level divided by the total number of
                         defects. X-axis = severity level, Y-axis = number or percentage of
                         defects. (A sketch of this computation follows the table.)


                     • Total Number of Defects Found Per Origin (i.e., requirements,
                         architecture, design, code, test) - To see where most of the defects are
                         coming from so that corrective action can be taken in those areas to
                         reduce the number of defects. X-axis = defect origin or phase, Y-axis =
                         number of defects.
   Computation       See Data Items Section
   Interpretation    During the Development and Operations and Support phases, the actual number
                     of defects detected is tracked, as well as the phase in which each defect
                     was created (e.g., Requirements, Architecture, Design, Code, or Test).
                     These can be further subdivided; for example, defects found in an
                     integration test could be broken down into the number found per
                     Configuration Item.
   Considerations    When analyzing defects, cost, schedule, and performance impacts will be
                     provided.
   Example           Defects, by phase originated in (rows) versus phase found in (columns):

                                       Found In:
                     Originated in:    Requirements  Architecture  Design  Code  Test  Total
                     Requirements           22            4           8      2    12     48
                     Architecture            0           17           9      2     7     35
                     Design                  0            0          12      9     5     26
                     Code                    0            0           0      7    16     23
                     Test                    0            0           0      0    28     28
                     Total                  22           21          29     20    85

                                          Defect Management Example
   Data Source       Rational ClearQuest Defect Database
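
A hedged Python sketch of the severity-percentage computation described under
Data Items, applied to records exported from the defect database; the record
layout and sample values are illustrative assumptions.

from collections import Counter

defects = [
    {"severity": "Critical", "origin": "Requirements", "found_in": "Test"},
    {"severity": "High", "origin": "Design", "found_in": "Code"},
    {"severity": "High", "origin": "Code", "found_in": "Test"},
    {"severity": "Low", "origin": "Test", "found_in": "Test"},
]

total = len(defects)
by_severity = Counter(d["severity"] for d in defects)
for severity in ("Critical", "High", "Intermediate", "Low"):
    count = by_severity.get(severity, 0)
    print(f"{severity}: {count} defect(s), {100.0 * count / total:.0f}% of total")

# The same records support the found-in/originated-in matrix in the example.
by_origin = Counter(d["origin"] for d in defects)
print("Defects per origin:", dict(by_origin))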





       Appendix B: ERA IV&V Contractor Metrics Descriptions
The following are detailed descriptions of the metrics collected and reported by the IV&V
Contractor.

Table B-1, Metric Set Definition, provides an explanation of the metric items and descriptions
to enhance reader comprehension.

           Item                                        Description
   Name               Name given to the metric
   Program Goals      List of program goals (measurements are goal-oriented)
   Questions          Questions derived from goals that must be answered in order to determine if
                      the goals are achieved
   Impact             Indication of whether a metric can be used to alter or halt the project.
   Target value       Numerical value of the metric that is to be achieved in order to meet the
                      planned objective. Includes the critical value and the range of the metric.
   Benefits           Provides examples of the benefits derived from using the metric.
   Tools              Software or hardware tools that are used to gather and store data, compute
                      the metric, and analyze the results.
   Application        Description of how the metric is used and what its area of application is.
   Data items         Input values that are necessary for computing the metric values.
   Computation        Explanation of the steps involved in the metrics computation.
   Interpretation     Interpretation of the results of the metrics computation.
   Considerations     Provides examples of the considerations as to the appropriateness of the
                      metric (e.g., Can data be collected for this metric? Is the metric appropriate
                      for this application?).
   Example            An example of applying the metric.
   Data Source        Location where the data is kept
                               Table B-1: Metric Set Definition





      Item                                                  Description
Name              IV&V Anomaly Density
Program Goals     • Discover anomalies in the system/software
                  • Facilitate correction of the anomalies.
Questions         What is the relative number of anomalies found versus items reviewed?
Impact            Anomaly density measures can provide insightful information on the software
                  product quality, the quality of the software development processes, and the quality
                  of the IV&V effort
Threshold         High value suggests that the IV&V processes are effective
Benefits          Measures can be analyzed to gain insight into the interdependencies between the
                  development efforts and the IV&V efforts.
Tools             Microsoft Excel Spreadsheet
Application       Anomaly measures and trends can be used to improve the planning and execution
                  of IV&V processes
Data Items        • # Requirements anomalies found by IV&V effort
                  • # Requirements reviewed by IV&V effort
                  • # Design statement anomalies found by IV&V effort
                  • # Design statements reviewed by IV&V effort
                  • # Code anomalies found by IV&V effort
                  • # Code volume reviewed by IV&V effort
                  • # Test anomalies found by IV&V effort
                  • # Tests reviewed by IV&V effort
 Computation      (1) Requirements Anomaly Density = (# Requirements anomalies found by IV&V
                      effort) / (# Requirements reviewed by IV&V effort)

                  (2) Design Anomaly Density = (# Design statement anomalies found by IV&V
                      effort) / (# Design statements reviewed by IV&V effort)

                  (3) Code Anomaly Density = (# Code anomalies found by IV&V effort) /
                      (# Code volume reviewed by IV&V effort)

                  (4) Test Anomaly Density = (# Test anomalies found by IV&V effort) /
                      (# Tests reviewed by IV&V effort)

                  (A computational sketch follows this table.)

Interpretation    If the IV&V anomaly density measure value is low, this suggests that the program
                  development quality is high, that the IV&V processes need to be improved, or a
                  combination of both. If the measure value is high, then this suggests that the
                  program development quality is low, that the IV&V processes are effective, or a
                  combination of both.
Considerations    Regardless of the measure value, the next step is to evaluate related software
                  program development measures to further clarify and discern the measure trends to
                  determine the need for process improvements.


Example
                  [Chart: IV&V Anomaly Density. Bar chart of the density measure (0 to 0.25,
                  vertical axis) for the Requirements, Design, Code, and Test activities.]

                                      IV&V Anomaly Density Example
Data Source       Anomaly Reporting System
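
The Python sketch below illustrates the four anomaly-density ratios defined
above. The counts are hypothetical; in practice they would come from the
Anomaly Reporting System.

def anomaly_density(anomalies_found, items_reviewed):
    """Anomalies found by the IV&V effort divided by items reviewed by it."""
    if items_reviewed == 0:
        raise ValueError("no items reviewed; density is undefined")
    return anomalies_found / items_reviewed

# (anomalies found, items reviewed) per activity -- illustrative values only.
counts = {
    "Requirements": (24, 120),
    "Design": (18, 150),
    "Code": (9, 90),    # code volume reviewed, e.g., in KSLOC
    "Test": (6, 60),
}
for activity, (found, reviewed) in counts.items():
    print(f"{activity} anomaly density: {anomaly_density(found, reviewed):.2f}")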





     Item                                                      Description
Name             IV&V Efficiency
Program Goals    • Discover anomalies in the software products and processes in the development
                     activity
                 • Facilitate correction of the anomalies
 Questions        What is the relative number of anomalies found by IV&V in a particular activity
                  versus anomalies found by IV&V in all activities?
Impact           IV&V Efficiency measures can provide insightful information on the software
                 product quality, the quality of the software development processes, and the quality
                 of the IV&V effort
Threshold        High value suggests that the IV&V effort is discovering anomalies in the earliest
                 possible activity or that the software development products are mature or a
                 combination of both.
Benefits         Measures can be analyzed to gain insight into the interdependencies between the
                 development efforts and the IV&V efforts.
Tools            Microsoft Excel Spreadsheet
Application      Efficiency measures and trends can be used to improve the planning and execution
                 of IV&V processes
 Data Items       • # Requirements anomalies found by IV&V in requirements activity
                  • # Requirements anomalies found by IV&V in all activities
                  • # Design statement anomalies found by IV&V in design activity
                  • # Design statement anomalies found by IV&V in all activities
                  • # Code anomalies found by IV&V in implementation activity
                  • # Code anomalies found by IV&V in all activities
                  • # Test anomalies found by IV&V in test activity
                  • # Test anomalies found by IV&V in all activities
 Computation      (1) Requirements IV&V Efficiency = (# Requirements anomalies found by IV&V
                      in requirements activity) / (# Requirements anomalies found by IV&V in
                      all activities)

                  (2) Design IV&V Efficiency = (# Design statement anomalies found by IV&V in
                      design activity) / (# Design statement anomalies found by IV&V in all
                      activities)

                  (3) Code IV&V Efficiency = (# Code anomalies found by IV&V in implementation
                      activity) / (# Code anomalies found by IV&V in all activities)

                  (4) Test IV&V Efficiency = (# Test anomalies found by IV&V in test activity) /
                      (# Test anomalies found by IV&V in all activities)

                  (A computational sketch follows this table.)

Interpretation   If the IV&V efficiency measure value is low, this suggests that the software IV&V
                 effort is not discovering anomalies in the earliest possible activity, or that the
                 software development products are immature, or a combination of both.

                 If the measure value is high, then this suggests that the IV&V effort is discovering
                 anomalies in the earliest possible activity, or that the software development products
                 are mature, or a combination of both.
Considerations   Regardless of the measure value, the next step is to evaluate related software
                 program development measures to further clarify and discern the measure trends to
                 determine the need for process improvements.
Example
                  [Chart: IV&V Efficiency. Bar chart of the efficiency measure (0 to 0.4,
                  vertical axis) for the Requirements, Design, Code, and Test activities.]

                                        IV&V Efficiency Example
Data Source      Anomaly Reporting System
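
Because efficiency compares where an anomaly was found against the activity in
which it should have been found, a per-activity breakdown is needed. The
Python sketch below illustrates that computation; the nested-count layout and
its sample values are illustrative assumptions.

# Anomaly kind -> {IV&V activity in which it was found: count} (hypothetical).
found = {
    "Requirements": {"requirements": 14, "design": 4, "test": 2},
    "Design": {"design": 11, "code": 3, "test": 5},
    "Code": {"implementation": 8, "test": 6},
    "Test": {"test": 9},
}
# Earliest possible activity for discovering each kind of anomaly.
home_activity = {
    "Requirements": "requirements",
    "Design": "design",
    "Code": "implementation",
    "Test": "test",
}

for kind, per_activity in found.items():
    total = sum(per_activity.values())
    in_home = per_activity.get(home_activity[kind], 0)
    print(f"{kind} IV&V efficiency: {in_home / total:.2f}")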






      Item                                                     Description
Name              IV&V Effectiveness
Program Goals     • Discover anomalies in the system/software
                  • Facilitate correction of the anomalies
 Questions        What is the relative number of anomalies found by IV&V versus the number of
                  anomalies found by all sources?
 Impact           IV&V effectiveness measures can provide insightful information on the added
                  benefit of IV&V in discovering anomalies in software products and processes,
                  and on the quality of the IV&V effort
Threshold         High value suggests that the IV&V processes are effective
Benefits          Measures can be analyzed to gain insight into the interdependencies between the
                  development efforts and the IV&V efforts.
Tools             Microsoft Excel Spreadsheet
Application       Effectiveness measures and trends can be used to improve the planning and
                  execution of IV&V processes
 Data Items       • # Requirements anomalies found by IV&V effort
                  • # Requirements anomalies found by all sources
                  • # Design statement anomalies found by IV&V effort
                  • # Design statement anomalies found by all sources
                  • # Code anomalies found by IV&V effort
                  • # Code anomalies found by all sources
                  • # Test anomalies found by IV&V effort
                  • # Test anomalies found by all sources
 Computation      (1) Requirements IV&V Effectiveness = (# Requirements anomalies found by
                      IV&V effort) / (# Requirements anomalies found by all sources)

                  (2) Design IV&V Effectiveness = (# Design statement anomalies found by IV&V
                      effort) / (# Design statement anomalies found by all sources)

                  (3) Code IV&V Effectiveness = (# Code anomalies found by IV&V effort) /
                      (# Code anomalies found by all sources)

                  (4) Test Execution IV&V Effectiveness = (# Test anomalies found by IV&V
                      effort) / (# Test anomalies found by all sources)

Interpretation    If the IV&V effectiveness measure value is low, this suggests that the program
                  development effort is effective, or that the IV&V effort may require improvement,
                  or a combination of both.

                  If the measure value is high, then this suggests that the software development
                  processes may require improvement, or that the IV&V processes are effective, or
                  that only incremental changes to the IV&V processes may be required.



Considerations    Regardless of the measure value, the next step is to evaluate related software
                  program development measures to further clarify and discern the measure trends to
                  determine the need for process improvements.
Example
                  [Chart: IV&V Effectiveness. Bar chart of the effectiveness measure (0 to
                  0.45, vertical axis) for the Requirements, Design, Code, and Test
                  activities.]

                                       IV&V Effectiveness Example
Data Source       Anomaly Reporting System





  Appendix C: ERA Development Contractor Metrics Descriptions
The following are detailed descriptions of the metrics collected and made available to the ERA
PMO by the development contractor, LMC. These metrics are captured by LMC in support of its
own management processes related to development of the ERA system. The descriptions are
copied, with a few slight modifications, from LMC's QMPM document; the QMPM is not a
CDRL item. The metrics shown below represent those of interest to the ERA PMO, but exclude
those metrics whose primary purpose relates to LMC's own process productivity and process
improvement efforts. LMC is in control of its metrics and may modify them at any time.
Examples of the listed metrics are not shown, as none were provided in the QMPM.

                                  ERA01 - Action Item Aging
                  This set of metrics shows the age of each open action item by severity. The data
Description       provides visibility into action items that have been outstanding for an extended
                  period of time so that effort may be applied to ensure resolution.
                  This metric has the potential to alter the program if it is determined that the
Rationale
                  action item will cause a redesign and/or cause schedule delays.
                     All action Items closed on schedule
Goals
                     Closure issues identified and escalated quickly
                  As a minimum, the following data items are recorded for each action item:
                   Date Opened,
                   Initiator,
                   Action,
                   Severity,
                   Function,
                   Process,
                   Phase,
                   Product,
Data Items         Release,
Collected
                   Action Owner,
                   Action Resolution History, and
                   Closure Date.

                  As a minimum, the following action item metric values are calculated and
                  reported periodically:
                   Cumulative Number of Action Items submitted (opened) by Severity
                     Level, Function, Product, and in total.
                   Total Number of Open Action Items by Severity Level, Function, Product,
                     and in total at end of a reporting period.
                     Cumulative Number of Closed Action Items by Severity Level, Function,
                      Product, and in total as of the end of a reporting period.
                     Average Age of Open Action Items (Aging).
                     Average Time Taken to Close Action Items.
                   Cumulative Number of Action Items Open Per Severity Level, Function,
                     Product, and in total per Age Interval – Number of Action Items open 0–30
                     days, 31–60 days, 61–90 days, and > 90 days (see the bucketing sketch
                     after this table).
                  Action Items are recorded in the Program Level Action Item database
Collection        available through the ERA Team Portal. The Action Item Database
Procedure         automatically generates Action Item Closure metrics and provides them to the
                  ERA Program Measures Repository.
Thresholds        < 30 days old
Frequency         Action Item Metrics queries will be generated weekly.
Baseline          No baseline procedure is necessary. The data are collected from the action
Procedure         item database on an established schedule.
Controlled
                  Retained in the ERA Team Portal Measurements Repository.
Location
                  The ERA PMO reviews and approves closure of those Action Items that have
                  direct visibility or impact to the ERA PMO. All other Action Items are closed
                  at the lowest organizational level possible with concurrence of initiator and
Assumptions       action item owner that the action is complete.
                  Action Items from the ERA Action Item Database may be promoted to the
                  ERA PMO Database, but the two databases are otherwise separate.
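
A hypothetical Python sketch of the age-interval bucketing listed above; the
dates, record layout, and as-of date are illustrative, and real values would
come from the Program Level Action Item database.

from collections import Counter
from datetime import date

def age_bucket(opened, as_of):
    """Classify an open action item into the age intervals reported above."""
    days = (as_of - opened).days
    if days <= 30:
        return "0-30 days"
    if days <= 60:
        return "31-60 days"
    if days <= 90:
        return "61-90 days"
    return "> 90 days"

open_items = [date(2005, 10, 3), date(2005, 11, 21), date(2005, 12, 30)]
buckets = Counter(age_bucket(opened, as_of=date(2006, 1, 26)) for opened in open_items)
for interval in ("0-30 days", "31-60 days", "61-90 days", "> 90 days"):
    print(f"{interval}: {buckets.get(interval, 0)} open action item(s)")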

                             ERA02 - Change Request Inventory
                  This metric lists the ERA change requests that are open as of the end of the
                  reporting period, or those that have been approved or disapproved during the
                  reporting period. The data provides management with insight to the trend in
                  new change requests and resolution as the program progresses. This is also a
Description       program management metric used to measure the rate of change in order to
                  determine potential negative trends.
                  A Change Request is a request for modification of an ERA component (i.e.,
                  document, hardware, or software) and can be associated with an initiation
                  time and a status at any point in time.
                  Change is a natural state of all programs; however, when the change activity is
                  large or takes a long time (both qualitative considerations with respect to the
Rationale         program), the program can fall behind because it is reacting to the changes or
                  because the changes reflect a set of expectations other than that which the
                  program is providing.

                  Knowing the rate at which change requests are being initiated, how long it is
                  taking to resolve them, the specific areas of the program affected, and the
                  impact on program cost and schedule establishes the foundation for containing
                  program risks and for improving program efficiency.
                     Identify trends early in their life cycle in order to reduce, eliminate, or
Goals                 avoid cost and schedule implications.
                     Identify relationships between CRs and prevent CRs that never close.
                  As a minimum, the following data items are recorded for each change request:
                   Initiation Date,
                   Description of change,
                   Type of change,
                   Product,
                   Phase,
                   Owner,
                   Function,
                   Process,
                   Release,
                   Priority,
                   Cost Impact of implementing the change,
Data Items         Schedule Impact due to implementing the changes,
Collected          Risk associated with implementing or NOT implementing the change,
                   Collateral product and process impacts,
                   Approve/Disapprove Status,
                   Disposition History, and
                   Closure Date.
                  As a minimum, the following change request metric values are calculated and
                  reported periodically:
                   Total Number of Change Requests Submitted by Type, Priority, and in
                     total during reporting period and cumulative.
                   Total Number of Change Requests Approved by Type, Priority, and in
                     total during reporting period and cumulative.
                   Total Number of Change Requests Disapproved by Type, Priority, and in
                     total during reporting period and cumulative.
                   Total Number of Change Requests Open by Type, Priority, and in total
                     (see the counting sketch after this table).


                  Change requests are initiated in the program change request database. The
Collection        metric data are extracted automatically from the database.
Procedure         Using the data analysis tool, the collected data items are analyzed to identify
                  trends and issues that require attention.
Frequency         Reported monthly, although reports can be generated at virtually any time.
                  Representations are workload indicators and will not be subject to thresholds
Thresholds
                  or quantitative controls.
                  Individual CR status is baselined through the CCB review procedures. The
Baseline
                  reporting is automated and represents the status at the point in time that the
Procedure
                  CCB determines.
Controlled
                  Retained in the ERA Team Portal Measurements Repository.
Location
                  The ERA PMO reviews and approves all Change Requests which have direct
                  visibility or impact to the ERA PMO or which require contract or ERA PMO
                  specification changes. All other Change Requests are handled at the lowest
Assumptions       organizational level possible.
                  ERA Change Requests that require ERA PMO visibility are submitted to and
                  maintained by the ERA PMO, and contain as much information as possible to
                  enable the ERA PMO to make an informed decision.
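
A minimal Python sketch of the per-period and cumulative change-request counts
listed above; the status values, period encoding, and record layout are
illustrative assumptions about a change request database export.

from collections import Counter

change_requests = [
    {"period": "2005-12", "status": "Approved", "priority": "High"},
    {"period": "2005-12", "status": "Open", "priority": "Low"},
    {"period": "2006-01", "status": "Disapproved", "priority": "High"},
    {"period": "2006-01", "status": "Open", "priority": "High"},
]

period = "2006-01"
in_period = [cr for cr in change_requests if cr["period"] == period]
print(f"{period} by status:", Counter(cr["status"] for cr in in_period))
# Cumulative counts span all reporting periods to date.
print("Cumulative by status:", Counter(cr["status"] for cr in change_requests))
print("Open by priority:",
      Counter(cr["priority"] for cr in change_requests if cr["status"] == "Open"))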

                               ERA05 - Data Acceptance Status
                  There are specific data items that require NARA’s approval. These are
Description       indicated on the CDRL. This metric tracks the time to approve and the number
                  of outstanding documents at any point in time.
                  Delays in securing approval can impact the program and can indicate an
                  excessive workload, poor quality, or poor coordination prior to delivery. The
Rationale         review of the metric should assist the program team and the ERA Program
                  Management Office with allocating resources and priorities to meet common
                  goals.
                     Timely review and acceptance of Lockheed Martin deliverables.
Goals
                     Minimize re-work of deliverables.
                     CDRL Item,
                     Due Date,
Data Items           Delivery Date,
Collected            Resolution of Issues History (Issue, Response, Acceptance, associated
                      dates), and
                     ERA PMO Acceptance Date.
Collection        Items that require NARA approval are tracked by Data Management (DM)
Procedure         and appear as line items in the Integrated Schedule (IS). Weekly reports
                  convey the number of documents submitted that require approval, the number
                  still pending approval, and the number for which the Lockheed Martin Team is
                  addressing comments to update the deliverable and secure approval. The data
                  will be provided to the ERA measurements database so that acceptance times
                  and acceptance rates can be stratified by responsible CWBS and data item.
Thresholds        None.
Frequency         Data will be collected from the data management repository at least weekly.
                  No specific baseline procedure is required. The date delivered and date
Baseline
                  received will be the dates that data management transmits data to or receives
Procedure
                  feedback from the ERA Program Management Office.
Controlled
                  Retained in the ERA Team Portal Measurements Repository.
Location
Assumptions       CDRL Items are delivered to the ERA PMO on schedule.

                              ERA06 - Data Delivery Timeliness
                  This metric reflects the planned and actual CDRL Item deliveries that the
Description
                  Lockheed Martin Team is committed to make on the program.
                  Prompt data delivery is necessary for the ERA PMO to plan their internal
                  activities. It is also an indication of the schedule performance by the ERA
Rationale
                  Team and reflects the degree of ownership and accountability that the IPTs
                  feel.
Goals                100% of CDRL Items delivered on time.
                     CDRL Item,
Data Items
                     Due Date, and
Collected
                     Delivery Date.
                  CDRL Item delivery dates are reflected in the Integrated Schedule (IS). When
Collection
                  they are delivered, the Integrated Schedule is updated accordingly. The weekly
Procedure
                  status is collected automatically from the IS.
Thresholds        No CDRL Items are delivered late.
                  Weekly drops are taken from the Integrated Schedule and ported to the ERA
Frequency
                  Team Portal Measurements Repository.
Baseline          The baseline plan is that which is reflected in the Integrated Schedule
Procedure         controlled by the Chief Engineer.
Controlled        The delivery dates and history will be recorded in the Integrated Schedule.
Location          Historical data is retained in the ERA Team Portal Measurements Repository.
Assumptions       None.


                                 ERA07 - Defect Management
                  This metric indicates the status of Problem Trouble Reports (PTRs) against the
                  ERA system. It reports the numbers identified and resolved. It is used to assess
                  the maturity of the system under test and its preparedness for deployment, and
                  to measure the effectiveness of development and verification activities. The
Description       data collection allows the reports to be stratified by severity and affected area,
                  among other characteristics.
                  This set of metrics supports the LMTSS Cross-Program metrics requirement.
                  In particular, Defect Detection per Phase and PTR Productivity are LMTSS
                  Cross-Program metrics.
                  The open Problem Trouble Report (PTR) status, combined with the
                  requirements verification status, indicates the maturity of the system and the
                  preparedness for deployment. It is also used to focus resources and to ensure
Rationale
                  that the program is applying resources to the appropriate areas. The program
                  team uses the estimates to refine plans for future releases and increments to
                  increase the accuracy of the plans.
                     Monitor defects during development in order to avoid re-design that
                      translates into performance, cost, and schedule impacts.
                     Monitor defects during test in order to determine the technical competency
                      of the system.
Goals
                     Provide a means for assessing quality of system at a given point in time
                      (number of unresolved PTRs in relation to total number of PTRs written).
                     Provide a means for assessing potential schedule impacts based on time
                      required to close as called out in the found/resolved/closed plan.
                  As a minimum, the following data items are recorded for each PTR:
                   Discovery Date,
                   Description of problem,
                   Type of problem (computation, configuration, data, functionality),
                   Origin of problem (Product, Phase, Release, Function, Process),
                   Priority,
Data Items         Owner,
Collected          Disposition History,
                   Defect Correction Testing History, and
                   Closure Date.

                  As a minimum, the following defect management metric values are calculated
                  and reported periodically:
                   Total Number of Defects Found per Severity Level, Origin, Type, and in
                     total, per period and cumulative, and expressed as both percentage and
                      actual values.
                     Total Number of Defects Closed per Severity Level, Origin, Type, and in
                      total, per period and cumulative, and expressed as both percentage and
                      actual values.
                     Average Time to Fix Defect per Severity Level, Origin, Type, and in total.
                     Average Age of Open Defects per Severity Level, Origin, Type, and in
                      total.
                     Defect Detection/Removal Efficiency - This metric tracks the history of
                      defect removal. Each defect should be corrected effectively, requiring only
                      one re-inspection or regression test to verify removal. The data includes:
                          o Total inspections to be conducted or tests to be run,
                          o Inspections or tests completed, and
                          o Cumulative inspections or tests failed.
                  This metric is derived from data contained in the PTR database. During the
                  test phase, defects (PTRs) will continue to be documented, tracked, and
Collection
                  reported. The I&T PTR Lead, who has intellectual control and ownership of
Procedure
                  the PTR database, will be responsible for providing appropriate defect data
                  reports to the program measurement database on a weekly basis.
                  Representations are workload indicators and will not be subject to thresholds
                  or quantitative controls. Throughout the test phase, the trends of opening
 Thresholds       versus closing of PTRs should be monitored for out-of-control conditions (see
                  the backlog sketch after this table). If out-of-control conditions exist,
                  analysis will be performed to determine whether the Found/Resolved/Closed
                  Plan needs to be re-planned.
                  Collected data will be presented weekly at the Engineering CTR/System IPT
 Frequency        meeting, and monthly at the Program Management Review. Data collection
                  and reporting will not begin until SW handoff to I&T.
                  This metric is presented by the I&T Lead to the Chief Engineer at the ERA
Baseline          System IPT. This data is also presented to functional management at the IT
Procedure         Functional Metrics Review. This metric data is considered baselined after it
                  has been presented at the System IPT presentation.
Controlled        Source data is retained in the program defect database. Historical data is
Location          retained in the ERA Team Portal Measurements Repository.
Assumptions       None.
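
A hedged Python sketch of the found-versus-closed monitoring described under
Thresholds: the open PTR backlog is cumulative found minus cumulative closed.
The weekly counts are hypothetical.

from itertools import accumulate

found_per_week = [12, 18, 15, 9, 6]
closed_per_week = [4, 10, 14, 13, 11]

cumulative_found = list(accumulate(found_per_week))
cumulative_closed = list(accumulate(closed_per_week))
for week, (f, c) in enumerate(zip(cumulative_found, cumulative_closed), start=1):
    print(f"Week {week}: found={f}, closed={c}, open backlog={f - c}")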






                            ERA08 – Earned Value Management
                  EVM metrics are used to monitor cost, performance, and schedule. The
                  primary report used for analysis of performance in an EVMS is the
Description       Cost/Schedule Status Report. It includes calculated cost and schedule variance
                  for each Work Breakdown Structure (WBS) element from the control account
                  level up to the Program level.
                  These metrics can be used to monitor progress and trends, provide early
Rationale         warning of problems, enable process improvement, and support the decision on
                  whether to continue work on the Program.
                     Monitor performance, cost, and schedule.
                     Monitor the schedule and completion of work products relative to their
                      scheduled and actual completion times.
                     Determine how much of the planned work has been done.
                     Forecast the final spending and completion date.
Goals
                     Provide an early warning when the Program starts to go off-track.
                     Discover which areas/tasks are causing the problems, and where anomalies
                      are occurring.
                     IPTs status control accounts on schedule, ensuring current and credible
                      cost status.
                  Five basic data elements are collected and analyzed:
                  (1) Budgeted Cost of Work Scheduled (BCWS) or Planned Value (PV) – At
                  any given point in time, the BCWS is the amount of budget allocated to
                  complete the work that was scheduled to be completed by that given time.
                  (2) Budgeted Cost of Work Performed (BCWP) or Earned Value (EV) – At
                  any given point in time, the BCWP is the amount of budget allocated to
                  complete the work that has been completed by that given time.
                  (3) Actual Cost of Work Performed (ACWP) or Actual Cost (AC) or Inception
                  To Date (ITD) Cost – At any given point in time, the ACWP is the actual
Data Items        amount spent to complete the work that has been completed by that given
Collected         time.
                  (4) Budget At Completion (BAC) – The total budget to complete the program.
                  (5) Estimate at Completion (EAC) – Current outlook for what program is
                  likely to cost at completion.

                  Several support data elements are also collected and analyzed:
                   Estimate to Complete (ETC) Cost – The expected value of the cost to
                     complete the program from any given point in time until the end.
                   Level of Effort (LOE) – Labor hours associated with Work Package or

                      Control Account that DO NOT use Earned Value.
                  • Cumulative Budgeted Labor Hours – Total number of hours to be
                    worked on a task through the end of the reporting period as
                    defined in the contract. Includes Task Name and Task Cumulative
                    Estimated Labor Hours.
                  • Total Budgeted Labor Hours – Total number of hours to be expended
                    to complete a task as defined in the contract. Includes Task Name
                    and Task Total Estimated Labor Hours.
                  • Cumulative Actual Labor Hours – Total number of hours spent
                    working on a task through the end of the reporting period.
                    Includes Task Name, Task Cumulative Estimated Labor Hours, Task
                    Actual Labor Hours, and Task Cumulative Actual Labor Hours.
Collection
Procedure
                  Collected data is used to calculate the following cost and schedule
                  metrics.
                  Schedule Variance:
                          SV$ = BCWP – BCWS
                  SV Percentage:
                          SV% = 100% * (BCWP – BCWS)/BCWS = 100% * SV$/BCWS
                  Cost Variance:
                          CV$ = BCWP – ACWP
                  CV Percentage:
                          CV% = 100% * (BCWP – ACWP)/BCWP = 100% * CV$/BCWP
                  Variance at Completion:
                          VAC$ = BAC – EAC
                  VAC Percentage:
                          VAC% = 100% * (BAC – EAC)/BAC
                  Cost Performance Index:
                          CPI = BCWP/ACWP
                  Schedule Performance Index:
                          SPI = BCWP/BCWS
                  Percent Complete:
                          PC = ACWP/BAC
                  [Unless otherwise qualified, Percent Complete is the BUDGET percent
                  complete, not the schedule percent complete.]
                  To Complete Performance Index:
                          TCPI = (BAC – BCWP) / (BAC – ACWP)
                  [Unless otherwise qualified, TCPI is calculated with respect to the
                  remaining BUDGET rather than with respect to the EAC. TCPI
                  represents an efficiency
                  factor that must be attained if the remaining work is to be
                  completed within the available budget: TCPI = (Remaining
                  Work)/(Remaining Budget).]
                  Other metrics are defined and reviewed as directed by the Program
                  Director and the IPT leads: for example, the EAC trend, the
                  month-to-month EAC change, the rate and trend of late control
                  account statusing, and the greatest monthly change.
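
Note: for illustration only, the following minimal Python sketch computes the
metrics above from the five basic data elements. The function and variable
names are ours, not part of the EVMS tooling, and the sample figures are
hypothetical.

    def evm_metrics(bcws, bcwp, acwp, bac, eac):
        """Compute the ERA08 cost and schedule metrics from the five basic data elements."""
        sv = bcwp - bcws    # Schedule Variance ($)
        cv = bcwp - acwp    # Cost Variance ($)
        vac = bac - eac     # Variance at Completion ($)
        return {
            "SV$": sv,   "SV%": 100.0 * sv / bcws,
            "CV$": cv,   "CV%": 100.0 * cv / bcwp,
            "VAC$": vac, "VAC%": 100.0 * vac / bac,
            "CPI": bcwp / acwp,                   # cost efficiency to date
            "SPI": bcwp / bcws,                   # schedule efficiency to date
            "PC": acwp / bac,                     # budget percent complete
            "TCPI": (bac - bcwp) / (bac - acwp),  # remaining work / remaining budget
        }

    # Hypothetical month: $0.90M earned against $1.00M scheduled, $0.95M spent
    print(evm_metrics(bcws=1_000_000, bcwp=900_000, acwp=950_000,
                      bac=10_000_000, eac=10_400_000))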
Frequency         Monthly reports, weekly statusing and analysis.
Thresholds        CV%, SV%, and VAC% magnitudes less than 5%.
                  CPI, SPI, and TCPI greater than 0.95.
                  EAC change less than 5% per reporting period.
Baseline
Procedure
                  Budgets are baselined within the first 60 days of the contract. An
                  Integrated Baseline Review (IBR) is then conducted. Following the
                  IBR and closure of any material actions, the budget is baselined.
Controlled
Location          The ERA Team Earned Value Management System.
Assumptions       Budgets are realistic and time-phased.
                  LOE must be quantified and assigned to work packages separately
                  from measurable work.

                                ERA10 – Help Desk Success Rate
Description       The Technical Help Desk is responsible for handling Tier II and
                  Tier III Help Requests.
                  Tier II Help Requests deal with questions on system functionality
                  or problems accessing needed information. They cover problems that
                  do not require physically touching a component and that may be
                  handled remotely/electronically. Tier III support rectifies
                  performance, software, or hardware problems, such as hardware
                  replacements, cold rebooting of a component that cannot be
                  performed remotely, or any action that involves a configuration
                  change that must be approved through the CCB.
                  This metric measures several Help Desk success factors:
                  • Customer satisfaction with the Help Request resolution process,
                  • Time to Respond during the normal business day (0600-2200),
                  • Time to Respond during non-business hours (2200-0600), and
                  • Time to Resolve or Escalate Help Requests.
                  It also provides insight into the effectiveness of the process and
                  reveals areas for process improvement.
Rationale         Identify instances and trends of customer dissatisfaction with the
                  Help Desk process.
                  Identify problems and areas for improvement before issues arise.
Goals             • Rapid response to Help Requests.
                  • Rapid closure of open Help Requests.
                  • High customer satisfaction with Help Request resolutions.
Data Items
Collected
                  • Help Request Description, including dates and qualifying
                    parameters.
                  • Help Request Resolution History, including dates and qualifying
                    parameters.
                  • Customer satisfaction with resolution.
Collection
Procedure
                  The Help Desk receives questions and problems via telephone and
                  email, and opens a trouble ticket to record each problem and its
                  resolution. As necessary, the Help Desk routes the trouble ticket
                  to hardware or software specialists.
                  The Help Desk records all information relevant to solving and
                  closing the ticket, and asks the customer to rate satisfaction with
                  the resolution.
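
Note: as a sketch only, a ticket's time-to-respond can be classified by
reporting window as follows. The 0600-2200 business window comes from the
description above; the function and field names are hypothetical.

    from datetime import datetime

    BUSINESS_START, BUSINESS_END = 6, 22  # normal business day, 0600-2200

    def time_to_respond(opened: datetime, first_response: datetime):
        """Return (window, minutes) for a ticket, classified by when it was opened."""
        in_business = BUSINESS_START <= opened.hour < BUSINESS_END
        window = "business" if in_business else "non-business"
        minutes = (first_response - opened).total_seconds() / 60.0
        return window, minutes

    # Hypothetical ticket: opened 21:50, first response 22:15
    print(time_to_respond(datetime(2006, 1, 26, 21, 50),
                          datetime(2006, 1, 26, 22, 15)))  # ('business', 25.0)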
Thresholds     None.
Frequency      Continuous data collection. Weekly reporting.
Baseline
Procedure         N/A
Controlled
Location          Trouble ticket database.
Assumptions       This activity does not start until Initial Operating Capability
                  (IOC) of the ERA system.
                             ERA12 - I&T ESLOC Productivity
Description       This LMTSS Cross-Program metric captures test program productivity.
                  The data is presented in a tabular or graphical display of planned
                  vs. actual effective source lines of code (ESLOC) per labor-month.
Rationale         Assesses actual test productivity against the plan. Indicates
                  opportunities for improving the planning of future test sequences.
                  Used by LMTSS to improve the fidelity of test estimates for future
                  proposals.
Goals             Achieve a System Integration productivity rate on Operational
                  Software of 800 ESLOC/LM.
Thresholds        +/- 4.3% of the Goal.
Collection
Procedure
                  Upon completion of System Integration and Test, the Integration and
                  Test (I&T) Lead collects the inception-to-date (ITD) engineering
                  labor-months for the specific test phase work packages, along with
                  the official SLOC counts from the CM organization.
                  ESLOC is then calculated as
                      ESLOC = New SLOC + 0.8 * Modified SLOC + 0.28 * Carry SLOC
                  The result is divided by the total number of labor-months to arrive
                  at the total integrated ESLOC per labor-month.
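
Note: a minimal sketch of this calculation using the weights above; the SLOC
counts and labor-months in the example are hypothetical.

    def esloc(new_sloc, modified_sloc, carry_sloc):
        """Effective SLOC per the ERA12 weighting."""
        return new_sloc + 0.8 * modified_sloc + 0.28 * carry_sloc

    def esloc_per_labor_month(new_sloc, modified_sloc, carry_sloc, labor_months):
        """Integration and test productivity in ESLOC per labor-month."""
        return esloc(new_sloc, modified_sloc, carry_sloc) / labor_months

    # Hypothetical release: 40K new, 10K modified, 50K carry SLOC in 78 labor-months
    print(esloc_per_labor_month(40_000, 10_000, 50_000, 78))  # ~795, within 4.3% of the 800 goal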
Frequency         This information is presented following the completion of the
                  System Test Phase, which normally occurs once during a product's
                  (or increment's) development life cycle. While testing is in
                  progress, productivity outlooks are presented in place of actuals.
Baseline
Procedure         N/A
Controlled
Location          Historical data is retained in the ERA Team Portal Measurements
                  Repository.
Assumptions       As this is one of the principal metrics used in the LMTSS cost
                  estimation process, no presentation of productivity rates should be
                  made without accompanying analysis. Correlation studies can be used
                  to determine the causes of significant deviations from planned
                  rates.
                               ERA14 - Requirements Growth
Description       The requirements growth metric is a ratio that measures the amount
                  of change in the A and B specification requirements over time and
                  against the growth estimated at proposal time. The baseline or
                  original count of requirements is established when the requirements
                  document goes under control and is approved by the customer (i.e.,
                  the A specification at SRR). After establishment of the baseline,
                  the requirements growth is plotted graphically to show the amount
                  of change over time and within the schedule milestones. The
                  graph should also plot the proposed A and B Specification
                  requirement levels. The data is shown by CSCI, discipline, and
                  major subsystem, and by A-level and B-level SHALLs as appropriate.
Rationale         This metric is a critical indicator for measuring requirements
                  “creep”. If requirements growth is too high or deviates from the
                  plan by more than the program variance, it may indicate that the
                  system was inaccurately estimated, or that it was placed under
                  baseline control too quickly.
Goals             A-Level:
                  • Less than 5% between SRR and SDR, and
                  • 0% after SDR.
                  B-Level:
                  • Less than 5% for the release between the Increment-level and
                    Release-level SRR, and
                  • 0% for the release after the Release-level SRR.
Data Items
Collected
                  Current Requirements Count, qualified by level, CSCI, discipline,
                  and subsystem.
                  Baselined Requirements Count, qualified by level, CSCI, discipline,
                  and subsystem.
Collection
Procedure
                  Requirements Growth metrics are based on the change in the overall
                  number of SHALLs as compared to an original baseline. The baseline
                  data for the A-level requirements is the proposal baseline, which
                  consists of the requirements in the ERA PMO Requirements Document
                  and an expected expansion ratio. Since a new cost proposal will be
                  submitted concurrently with the System Design Review, the System
                  IPT will reset the SHALL count at the System Design Review for the
                  subsequent Increments. The source data will be the Systems
                  Engineering requirements database. The collection will allow
                  stratification of the volatility by:
                  • Affected Increment or Release,
                  • Affected product area, and
                  • Qualitative assessment of the changes, e.g., whether a change is
                    substantive content or editorial.
                  During A&D, only the A-level metric will be reported. The B-Level
                  metrics baseline will be established at SDR and will be the basis
                  for subsequent collection.
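
Note: as a sketch under the definitions above (the function name and the
counts in the example are hypothetical), the growth percentage for any
stratum is simply:

    def requirements_growth(baseline_count: int, current_count: int) -> float:
        """Percent growth in the SHALL count relative to the controlled baseline."""
        return 100.0 * (current_count - baseline_count) / baseline_count

    # Hypothetical A-level counts: 850 SHALLs baselined at SRR, 884 today
    print(requirements_growth(850, 884))  # 4.0 -- within the <5% SRR-to-SDR goal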
Thresholds        A-Level:
                  • Less than 5% between SRR and SDR,
                  • 1% between SDR and CDR, and
                  • 1% between CDR and system acceptance test complete.
                  B-Level:
                  • Less than 5% between the Increment-level and Release-level SRR,
                    and
                  • 1% for the Release after the Release-level System Requirements
                    Review.
Frequency         Updated monthly. Each change in requirements counts in the month
                  in which the corresponding CR receives ERB approval.
Baseline
Procedure         Baselined at the System IPT.
Controlled
Location          Historical data is retained in the ERA Team Portal Measurements
                  Repository.
Assumptions       None.
                               ERA15 - Requirements Volatility
Description       Requirements Volatility is similar to Requirements Growth in that
                  it tracks changes to the system's requirements. However, the
                  Requirements Volatility metric measures the overall stability and
                  understanding of the system with respect to requirements, whereas
                  Requirements Growth is more a measure of how well the requirements
                  are estimated and managed.
                  This metric is a ratio that measures the amount of change in the
                  requirements over time. It is used to indicate how well the ERA
                  Team uses and controls the requirements analysis process.
                  Volatility of the A Specification identifies either a poorly
                  developed specification or an impact to the proposed baseline that
                  requires immediate action and response from management.
                  Volatility of the B Specification identifies how well the ERA Team
                  understands the tasks and efforts at hand and how effectively the
                  requirements analysis process is being followed.
                  For the A&D phase, the metric addresses only the A-Level
                  requirements. B-Level requirements volatility tracking will begin
                  at the Increment 1 System Requirements Review.
Rationale         This metric is a critical indicator for measuring requirements
                  churn. The ability of a program to deliver on time and meet
                  expectations is directly related to the stability of the
                  requirements.
Goals             No more than 1% change per month after the System Requirements
                  Review.
Data Items
Collected
                  The source data comes from the Systems Engineering requirements
                  database. Primary data elements are:
                  • Count of Requirement Changes to Date, qualified by type, level,
                    CSCI, discipline, subsystem, and release.
                  • Baselined Requirements Count, qualified by level, CSCI,
                    discipline, subsystem, and release.
Collection
Procedure
                  Requirements Volatility metrics are based upon the changes in
                  requirements (adds, deletes, or modifies). The baseline or original
                  count of testable requirements is established when the appropriate
                  requirements document goes under control. After establishment of
                  the baseline, the Requirements Volatility is plotted graphically to
                  show the amount of change over time in relationship to the
                  schedule. The formula is:
                  Requirements Volatility = 100 * (Total Added/Modified/Deleted
                  Requirements) / (Original Requirements Count)
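
Note: a minimal sketch of the formula (the counts in the example are
hypothetical):

    def requirements_volatility(original_count: int, added: int,
                                modified: int, deleted: int) -> float:
        """Percent of the baseline touched by adds, modifications, and deletions."""
        return 100.0 * (added + modified + deleted) / original_count

    # Hypothetical month: 1,200 baselined requirements; 4 added, 6 modified, 2 deleted
    print(requirements_volatility(1_200, 4, 6, 2))  # 1.0 -- at the monthly goal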
Thresholds       20% over goal.
Frequency        Updated monthly.
Baseline
Procedure         Baselined at the System IPT meeting.
Controlled
Location          Historical data is retained in the ERA Team Portal Measurements
                  Repository.
Assumptions      None.
                            ERA16 – Risk Containment Summary
Description       Risk metrics provide a useful summary for management to identify
                  risk trends and to assess the effectiveness of the risk management
                  program.
Rationale         Effective risk management is essential to a successful program.
                  Should these metrics indicate a program with excessive risk or an
                  ineffective risk program, immediate ERA PMO and/or ERA Program
                  Director action is necessary.
Goals             • Analyze risk trends by exposure level, impact, phase, and
                    functional area.
                  • Reduce the risk profile over time.
                  • Identify and analyze trends.
                  • Develop risk strategies to mitigate, reduce, or eliminate
                    potential risks.
                  • Assess the effectiveness of the risk management program.
Data Items
Collected
                  At a minimum, the following data items are recorded:
                  • Identification Date,
                  • Risk Description,
                  • Risk Exposure = f(Probability, Impact),
                  • Origin of Risk (Product, Phase, Release, Function, Process),
                  • Owner,
                  • Risk Mitigation History,
                  • Disposition History, and
                  • Closure Date.
                  At a minimum, the following risk management metric values are
                  reported:
                  • Open Risks by Exposure, Origin, and in total, per period and
                    cumulative.
                  • Closed Risks by Exposure, Origin, and in total, per period and
                    cumulative.
                  • Average Time to Close a Risk.
                  • Average Age of Open Risks.
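
Note: the two age-based metrics can be computed as in the following sketch.
The register layout (identification date, closure date or None) mirrors the
data items above; the function names and dates are hypothetical.

    from datetime import date

    def average_days_to_close(risks):
        """Mean age at closure; each risk is (identified, closed_or_None)."""
        ages = [(closed - ident).days for ident, closed in risks if closed is not None]
        return sum(ages) / len(ages) if ages else 0.0

    def average_age_of_open(risks, today):
        """Mean age, as of `today`, of risks not yet closed."""
        ages = [(today - ident).days for ident, closed in risks if closed is None]
        return sum(ages) / len(ages) if ages else 0.0

    # Hypothetical register: two closed risks and one still open
    register = [(date(2005, 11, 1), date(2005, 12, 1)),
                (date(2005, 11, 15), date(2006, 1, 4)),
                (date(2005, 12, 20), None)]
    print(average_days_to_close(register))                   # 40.0
    print(average_age_of_open(register, date(2006, 1, 26)))  # 37.0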
Collection
Procedure
                  Use of the ERA risk database is mandatory for all personnel on the
                  LM ERA team. Risks are entered into the tool along with assessment
                  and mitigation information. Risk impacts to schedule and cost are
                  recorded.
Thresholds        None.
Frequency         Monthly reporting, weekly assessments.
Baseline
Procedure         None. Risk management began during the proposal phase and continues
                  throughout the life of the program.
Controlled
Location          Historical data is retained in the ERA Team Portal Measurements
                  Repository.
Assumptions       None.
                                ERA17 – Schedule Performance
Description       The program schedules are used to validate the cost status
                  (schedule variance, earned value, etc.) and to alert management to
                  potential program problems, including poor estimation of work and
                  staffing concerns. The critical path analysis is of primary
                  interest and focuses management on the key activities that need to
                  be completed to achieve the overall program completion date.
Rationale         Schedule data, combined with staffing and cost data, are the major
                  tools used for managing the program.
Goals             • Meet all planned Program Event (PE) milestones.
                  • IPTs status their schedule items on time to ensure the schedule
                    is current and credible.
Data Items
Collected         All data items are drawn from the Integrated Schedule.
Collection
Procedure
                  The schedule management team within the program management office
                  controls and reports on schedule performance based on input from
                  the control
                  account managers, who report their schedule performance weekly. The
                  schedule management team analyzes the schedule input against the
                  baseline plan and the previous week's updates.
                  Presentation of this data is in a graphical format using a schedule
                  management tool and typically shows:
                  • Schedule plans vs. actual performance,
                  • Schedule item dependencies and critical path items,
                  • Task completions, and
                  • Schedule churn.
                  Other metrics related to schedule performance are also reported:
                  for example, the number of activities less than 2 weeks behind,
                  more than 2 weeks behind, and more than 4 weeks behind; the number
                  of items with schedule slips; the number of items with schedule
                  improvements; and the on-time statusing rate and trend (see the
                  sketch below).
                  This information is collected and presented throughout the life cycle of the
                  program at CTRs, and at the program, functional, and senior management
                  levels. The schedule information is typically formatted in such a way as to
                  summarize the data at various levels of detail as required by various users of
                  the information. If schedule items are identified as risk areas, recovery or
                  mitigation plans are required. Once identified as a risk area, the specific
                  schedule items may need to be presented to senior management in more detail.
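
Note: a sketch of the slip bucketing referenced above. The day-based cutoffs
(14 and 28 days for 2 and 4 weeks) and the sample slips are our assumptions.

    def bucket_slips(slip_days):
        """Count activities by how far behind baseline they are, in days."""
        buckets = {"behind < 2 weeks": 0, "behind >= 2 weeks": 0, "behind >= 4 weeks": 0}
        for days in slip_days:
            if days >= 28:
                buckets["behind >= 4 weeks"] += 1
            elif days >= 14:
                buckets["behind >= 2 weeks"] += 1
            elif days > 0:
                buckets["behind < 2 weeks"] += 1
        return buckets

    # Hypothetical slips (in days) drawn from the Integrated Schedule
    print(bucket_slips([3, 10, 15, 30, 0, 45]))
    # {'behind < 2 weeks': 2, 'behind >= 2 weeks': 1, 'behind >= 4 weeks': 2}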
Thresholds        The ability to meet a program event depends upon the satisfaction
                  of the supporting Significant Accomplishments and Accomplishment
                  Criteria defined in the Integrated Plan. Thus, the program has
                  established thresholds on schedule tasks and accomplishment
                  criteria. In addition, the program reports on schedule activities
                  that are 2 weeks or more behind schedule. Control limits such as 2
                  weeks overdue or 4 weeks overdue may be applied; once these limits
                  are exceeded, recovery or risk management plans may be required.
Frequency         Weekly.
Baseline
Procedure
                  The baseline schedule is established through the cost and schedule
                  management processes. The Control Account Managers update the
                  schedule weekly; those updates are baselined as their submissions
                  to the schedule management team. The analyses conducted by the
                  schedule management team are baselined as output reports from
                  WorkLenz.
Controlled
Location
                  Source data is extracted from the Integrated Schedule and stored in
                  WorkLenz. Data is retained within WorkLenz to support subsequent
                  analyses. Historical data is retained in the ERA Team Portal
                  Measurements Repository.
Assumptions       None.
                                   ERA19 – SLOC Growth
Description       This LMTSS Cross-Program metric measures the software size trend
                  over time. Developed SLOC (new and modified) and total SLOC are
                  analyzed for excessive or unexpected change.
Rationale         The effect of SLOC growth is risk: schedule risk, unplanned
                  overtime costs, cost growth, increased defect detection, and
                  greater system resource requirements. A high percentage of SLOC
                  growth can be an indicator of requirements misunderstanding,
                  out-of-scope work, poor productivity, or poor process management.
                  SLOC growth must be understood and its causes addressed.
Goals             • Understand how well LMTSS is estimating SLOC and how effectively
                    it is managing to SLOC estimates.
                  • Understand causes of SLOC growth.
                  • Understand impacts of SLOC growth.
                  • Improve SLOC estimation processes.
Data Items
Collected
                  • Source Lines of Code (SLOC) – Count of the actual number of lines
                    of code developed and unit tested, by major subsystem, product,
                    and CSCI. SLOC is further qualified as new, modified, or reused.
                    SLOC deleted since the previous Software Size metric is also
                    identified.
                  • Actual CM SLOC – Count of the actual number of lines of code
                    developed and unit tested. Automatically generated by the CM
                    tool.
                  • Estimated SLOC – The total SLOC planned to be placed under CM by
                    the end of the reporting period according to the program
                    schedule. This is collected by the ERB at specific phase points
                    in the program, including initial and subsequent proposal
                    submissions and requirement and design reviews.
                  • SLOC Under CM – Actual SLOC placed under CM during the reporting
                    period. Automatically generated by the configuration management
                    tools.
Collection
Procedure
                  Until SW code is managed by the CM tool, SW size is estimated and
                  reported manually to the ERA Team Portal Measurements Repository.
                  Once code exists, the CM tool automatically reports the metrics.
Thresholds        The cumulative variance is less than 20% from the initial estimate
                  and less than 5% from the previous estimate.
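
Note: a sketch of the threshold check (the estimate figures in the example
are hypothetical):

    def sloc_within_thresholds(initial_estimate, previous_estimate, current_sloc):
        """ERA19 thresholds: variance < 20% of the initial estimate and
        < 5% of the previous estimate."""
        cumulative = abs(current_sloc - initial_estimate) / initial_estimate
        incremental = abs(current_sloc - previous_estimate) / previous_estimate
        return cumulative < 0.20 and incremental < 0.05

    # Hypothetical: 100K SLOC initial estimate, 110K at the last review, 113K now
    print(sloc_within_thresholds(100_000, 110_000, 113_000))  # True (13% and ~2.7%)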
Frequency         SLOC Growth metrics are reported at the following milestones:
                  Proposal, Requirements Complete, Design Complete, Delivery of
                  Releases to CM, FAT, and FCA/PCA.
Baseline
Procedure         Through review by the ERB.
Controlled
Location          Historical data is retained in the ERA Team Portal Measurements
                  Repository.
                                     ERA20 – Software Size
Description       This LMTSS Cross-Program metric measures the breakout of the system
                  by type of Source Lines of Code (SLOC). By itself, it is of little
                  value; its real value is as an input to the Software Growth (ERA19)
                  and Engineering Productivity (ERA09) metrics.
Rationale         Software size is an effective predictor of program stability and of
                  growth of function beyond approved requirements. It is most
                  effectively shown as a graphical representation over time.
Goals             Accurate projection of the software size such that there is
                  confidence in the ability to meet cost and schedule commitments for
                  the current and future releases.
Data Items
Collected
                  • Source Lines of Code (SLOC) – Count of the actual number of lines
                    of code developed and unit tested, by major subsystem, product,
                    and CSCI. SLOC is further qualified as new, modified, or reused.
                    SLOC deleted since the previous Software Size metric is also
                    identified.
                  • SLOC CM Actual – Count of the actual number of lines of code
                    developed and unit tested. Automatically generated by the CM
                    tool.
                  • Estimated SLOC – The total SLOC planned to be placed under CM by
                    the end of the reporting period according to the program
                    schedule. This is collected by the ERB at specific phase points
                    in the program, including initial and subsequent proposal
                    submissions and requirement and design reviews. The estimates are
                    retained in the ERA Team Portal Measurements Repository.
                  • SLOC Under CM – Actual SLOC placed under CM during the reporting
                    period. Automatically generated by the configuration management
                    tools.
Collection
Procedure
                  Until SW code is managed by the CM tool, SW size is estimated and
                  reported manually to the ERA Team Portal Measurements Repository.
                  Once code exists, the CM tool automatically reports the metrics.
Thresholds        None.
Frequency         SW Size metrics are collected and reported at the following
                  milestones: Proposal, Requirements Complete, Design Complete,
                  Delivery of Releases to CM, FAT, and FCA/PCA.
Baseline
Procedure         Through review by the ERB.
Controlled
Location          Historical data is retained in the ERA Team Portal Measurements
                  Repository.
Assumptions       None.
                        ERA22 – Software Development Productivity
Description       This LMTSS Cross-Program metric is a measure of productivity given
                  in thousands of source lines of code (KSLOC) per labor-month:
                  • Planned (both original and current) vs. actual rate for the
                    overall program.
                  • Planned (both original and current) vs. actual rate by major
                    subsystem, product, or CSCI.
                  Note: the metric can also be expressed as its inverse, labor-months
                  (or hours) per KSLOC.
Rationale         Identify deviations from expected productivity.
                  Validate LMTSS productivity models.
Goals             Identify and explain deviations from the plan and from standard LMTSS rates.
Data Items
Collected
                  • Lines of code are drawn from the CM-managed code repository.
                  • Labor is drawn from the cost management system.
Collection
Procedure         At each reporting interval, the SE lead collects and reports the
                  data.
Thresholds        None.
Frequency         Estimated productivity rates are determined when SLOC estimates are
                  revised, based on labor-month data from the latest approved
                  Estimate At Completion (EAC). This happens at the first baseline
                  (proposal), at A-spec approval, at B-spec approval, at software
                  design complete (all planned and estimated SLOC counts), upon
                  delivery of the release to CM (actual SLOC counts), at factory
                  acceptance, and at program end.
Baseline
Procedure         None.
Controlled
Location          Due to the sensitivity of the data, access may be restricted.
Assumptions       None.
                                 ERA23 – Test Coverage
Description       Test coverage metrics indicate the completeness and the progress of
                  the test program.
Rationale         These metrics provide early insight into problems with the testing
                  program that could lead to schedule slips or cost overruns. For
                  example, when the number of test procedures completed falls
                  significantly below the number planned, it can be inferred that a
                  schedule slip is imminent.
Goals             • All requirements (A-Level and B-Level) are expressed in testable
                    constructs prior to the System Requirements Review (for A-Level
                    requirements this is the system-level SRR; for B-Level
                    requirements,
                    this is the Release-level SRR for the release in which the
                    requirement will be implemented).
                  • All requirements are verified at test levels prior to deployment
                    to the operational environment.
                  • Monitor test progress to determine the stability of the design.
                  • Monitor test progress with regard to the overall schedule.
Data Items
Collected
                  At a minimum, the following data items are extracted from the
                  requirements repository:
                  • Number of Tests/Test Procedures Scheduled by Test Phase (Unit
                    Test, Integration Test, or FAT).
                  • Number of Tests/Test Procedures Executed by Test Phase and Status
                    (Success, Fail).
                  • Number of Requirements Tested by Test Phase.
                  • Number of Defects Found During Testing by Test Phase and Defect
                    Severity.
                  • Total Time Spent Fixing Defects Found in Testing by Defect
                    Severity.
                  At a minimum, the following metrics are calculated and reported:
                  • Tests/Test Procedures Executed, expressed as a value and as a
                    percentage of plan.
                  • Tests/Test Procedures Passed, expressed as a value and as a
                    percentage of Tests Executed.
                  • Tests/Test Procedures Failed, expressed as a value and as a
                    percentage of Tests Executed.
                  • Requirements Passed per Test Phase and cumulatively, expressed as
                    a value and as a percentage of Requirements Tested.
                  • Average time spent fixing a defect in testing, by Phase and
                    Severity.
                  • Number and average age of Open Defects per Severity Level.
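
Note: a sketch of the percentage calculations above (the counts in the
example are hypothetical):

    def test_coverage(scheduled, executed, passed):
        """ERA23 percentages: executed vs. plan, and passed/failed vs. executed."""
        failed = executed - passed
        return {
            "executed, % of plan": 100.0 * executed / scheduled,
            "passed, % of executed": 100.0 * passed / executed,
            "failed, % of executed": 100.0 * failed / executed,
        }

    # Hypothetical FAT phase: 200 procedures scheduled, 180 executed, 171 passed
    print(test_coverage(scheduled=200, executed=180, passed=171))
    # {'executed, % of plan': 90.0, 'passed, % of executed': 95.0, 'failed, % of executed': 5.0}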
Collection
Procedure         Data is extracted automatically from the requirements database
                  (i.e., DOORS) and input to the ERA Team Portal Measurements
                  Repository.
Thresholds        On test procedure execution, if any failed requirements exist at
                  the end of the test period, the cause, risk, and deployment
                  recommendation will be presented at the System IPT and to the
                  Program Director.
Frequency         Information is presented on a weekly basis.
                  Note: many of these metrics are not collected during the A&D phase.
Baseline
Procedure
                  The plan is established at the respective System Requirements
                  Review. Collected data is presented to the System IPT and baselined
                  as a result of that review.
Controlled
Location
                  Source data is derived from the requirements database. The
                  resulting metrics are stored in the ERA Team Portal Measurements
                  Repository.
Assumptions       None.