
									Test and Independent Verification and Validation


              Business Process Guide




                     April 05, 2012
                      Version 0.6

    Department of Defense and Department of Veterans Affairs
                                                                       Table of Contents
1          INTRODUCTION .................................................................................................................................................. 5
     1.1          BACKGROUND ....................................................................................................................................................... 5
     1.2          PURPOSE .............................................................................................................................................................. 5
     1.3          SCOPE.................................................................................................................................................................. 5
     1.4          ASSUMPTIONS ....................................................................................................................................................... 6
     1.5          DOCUMENT MAINTENANCE AND DISTRIBUTION ........................................................................................................... 8
2          GOVERNANCE..................................................................................................................................................... 9
3          PRINCIPLES ....................................................................................................................................................... 11
4          GUIDELINES ...................................................................................................................................................... 14
5          SERVICES .......................................................................................................................................................... 16
6          ROLES AND RESPONSIBILITIES .......................................................................................................................... 19
     6.1          RESPONSIBILITY MATRIX ........................................................................................................................................ 20
7          METHODOLOGY, PROCESSES AND PROCEDURES .............................................................................................. 23
     7.1        METHODOLOGY ................................................................................................................................................... 23
     7.2        PROCESSES AND PROCEDURES ................................................................................................................................ 26
           7.2.1     Element Definition ..................................................................................................................................26
           7.2.2     T&IVV Process .......................................................................................................................................27
           7.2.3     Engagement of T&IVV for Product Teams ............................................................................................28
8          ENVIRONMENT AND TOOLS ............................................................................................................................. 28
9          RESOURCES ...................................................................................................................................................... 29
10         MEASUREMENTS AND METRICS ....................................................................................................................... 30
     10.1         PROCESS MEASUREMENTS AND METRICS ................................................................................................................. 30
     10.2         PRODUCT MEASUREMENTS AND METRICS ................................................................................................................ 30
11         RISK AND ISSUE MANAGEMENT ....................................................................................................................... 31
ANNEXES ..................................................................................................................................................................... 1
     ANNEX A. TERMS OF REFERENCE ........................................................................................................................................... 1
     ANNEX B. EXPANDED ROLES AND RESPONSIBILITIES .................................................................................................................. 1
     ANNEX C. STATUS REPORT TEMPLATE .................................................................................................................................... 1
     ANNEX D. PROCESS MEASUREMENTS AND METRICS CHECKLIST .................................................................................................. 1
     ANNEX E. PRODUCT MEASUREMENTS AND METRICS................................................................................................................. 1
     ANNEX F. T&IVV ENGAGEMENT ........................................................................................................................................... 1
     ANNEX G. INTAKE ASSESSMENT / RISK ASSESSMENT ................................................................................................................. 1
     ANNEX H. ACRONYMS......................................................................................................................................................... 1
     ANNEX I. REFERENCES ......................................................................................................................................................... 1




                                                                         Table of Figures
Figure 1: Management Oversight ....................................................................................................................................... 9
Figure 2: High-Level Process Diagram .............................................................................................................................28
Figure 3: High-Level CARA Process .................................................................................................................. 2

                                                                     List of Tables
Table 1: Supporting Relationships ....................................................................................................................................10
Table 2: Reporting Requirements ......................................................................................................................................11
Table 3: RACI - Planning Phase........................................................................................................................................20
Table 4: RACI - Development - Deployment Phases ........................................................................................................21
Table 5: RACI - Sustainment Phase ..................................................................................................................................22
Table 6: Element Definition ..............................................................................................................................................26




                                             Revision History
 Date             Revision         Description                                    Authors
 02 Feb 12              0.3        Strawman for review by iEHR IPO T&IVV WIPT     DoD-VA T&IVV WG
 23 Feb 12              0.4        Revised
 28 March 12            0.5        Intake Assessment annex added
 05 April 12            0.6        AF comments incorporated








1    INTRODUCTION

1.1 Background
The Department of Defense (DoD) and Department of Veterans Affairs (VA) have joined in a venture to deliver effective, suitable, and survivable
integrated Electronic Health Record (iEHR) products to their medical practitioners and beneficiaries. As part of this venture, the DoD Deputy
Chief Management Officer (DCMO) and the VA Chief Information Officer (CIO) sanctioned a DoD-VA Test and Independent Verification and
Validation (T&IVV) Work Group (WG) whose tasks included distilling and integrating DoD and VA T&IVV best practices into a singular
Business Process (BP). The term “T&IVV” refers to the combined work of Integrated Quality Assurance (IQA) personnel (testers and inspectors)
and Independent Verification and Validation (IV&V) agents (independent evaluators) within the collective BP on which they collaborate. The
DCMO and VA CIO directed that the resulting T&IVV BP facilitate delivering iEHR products to their Departments in an efficient, cost-effective
manner. They emphasized achieving faster times to market and greater suitability and reliability of Health Information Technology (HIT)
products for their DoD and VA end-users. This T&IVV Business Process Guide (BPG) responds to the expressed direction of the DCMO and VA
CIO. It specifically applies to the T&IVV that will be planned, executed, analyzed, and reported by the iEHR Interagency Program Office (IPO)
along with external T&IVV organizations supporting the IPO’s mission of a joint iEHR Family of Systems (FoS).

1.2 Purpose
This BPG describes how the IPO, IPO product teams, and their supporting T&IVV elements will implement T&IVV in coordination with the
stakeholder organizations. The T&IVV BP will exercise a program-level T&IVV Working-level Integrated Product Team (WIPT) and product-
level T&IVV Integration Work Groups (TIWGs), one for each functionally distinct and deployable IPO product. The T&IVV BP focuses on
empirical IQA metrics regarding both the process for obtaining HIT products and the methods for qualifying those products to be deployed to their
users. The IQA process and product metrics address the needs for objective and actionable information by iEHR decision-makers, IPO product
teams, the IPO staff, and stakeholder organizations.

1.3 Scope
This BPG describes a measurable, standardized, repeatable, rigorous, tailorable, and operationally relevant T&IVV BP to be followed by all
organizations responsible for and participating in providing iEHR IT products in support of the medical providers and beneficiaries of the DoD
and VA.
    a. As it pertains to the teamwork across organizations that is needed to execute T&IVV, this BPG shall be followed by all organizations
       responsible for the procurement, receipt, infrastructure resourcing, development, integration, testing, inspection, review, certification,
       accreditation, deployment, and sustainment of iEHR IT products, regardless of original sponsorship and consistent with approved
       investment criteria and thresholds.

    b. This BPG describes a Total System Quality Assurance (TSQA) approach to help achieve timely delivery of high-quality iEHR products to
       DoD and VA users. The term “iEHR product” spans all delivered elements, including the required support, training, and infrastructure
       provisions. The TSQA approach comprises all organizations involved with IQA of an iEHR product, their coordinated IQA activities, and
       the interworking parts of an iEHR system: software, hardware, networking, business process, and users.
    c. This BPG covers the System Development Life Cycle (SDLC) for each distinct and deployable iEHR product from concept identification
       until its cancellation or retirement.
    d. It identifies generic and scalable roles, responsibilities, services provided by T&IVV, measures, and other guidance for executing efficient,
       cost-effective T&IVV of iEHR products.
    e. It serves for orientation of both T&IVV personnel and staff with roles external to T&IVV with whom the T&IVV personnel must
       collaborate as part of delivering iEHR capabilities to the medical practitioners and beneficiaries.
    f.   It includes a comprehensive set of related and structured activities:
         1. Risk Assessments (RAs) from the initial T&IVV Scope Assessment to later RA updates.
         2. Provisioning of all Development and Test Environments (DTEs) needed by iEHR Products with particular focus on Distributed
            Development (DD) and Common Development and Test Environments (CDTEs), which may include the Pacific Joint Information
            Technology Center (JITC), Richmond Development and Test Center (DTC), and VA DTEs.
         3. Product-Team Verification and Validation (V&V), i.e., functional, integration, and performance testing and inspections. This V&V
            spans the spectrum of essential empirical IQA including, but not limited to, the areas of usability, Information Assurance (IA),
            Interoperability (IOP), capacity, compatibility, and Continuity of Operations (COOP).
         4. Software Code Quality Checking (SCQC).
         5. Field Test and Evaluation (FT&E).
         6. IV&V including Independent Testing for iEHR products with high-risk assessment ratings.

1.4 Assumptions
The development of this BPG reflects the following assumptions:
    a. The T&IVV BP must address scalable elemental IQA activities common to all IT systems. The DoD and VA will assign personnel to the
       generic roles and responsibilities based on considerations for each iEHR product.
    b. The procedures selected to resolve process and product quality concerns for each iEHR product will be tailored based on formal RAs.
    c. The DoD and VA will identify a common set of categories of automated tools needed for cost-effective T&IVV. Remote testers and other
       concerned personnel must be able to access and operate such automated tools. Examples of tool categories include, but are not limited to,
       test case management, software analysis, collaboration, automated script execution, task tracking, problem tracking, vulnerability
         assessment, network analysis, and capacity testing. The DoD and VA will partner in selecting tools for each category balancing expense
         and performance.
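The balancing of expense and performance mentioned above can be illustrated with a simple weighted score per tool category. This is a hypothetical sketch only: the tool names, ratings, costs, and weights are invented for illustration and do not represent actual DoD/VA candidates or selection criteria.

```python
# Illustrative sketch: ranking candidate tools within one T&IVV tool
# category by a weighted balance of expense and performance. All names
# and figures below are hypothetical examples, not actual selections.

def balance_score(performance, annual_cost, perf_weight=0.7, cost_weight=0.3):
    """Higher performance and lower cost yield a higher score.
    performance: 0-10 rating; annual_cost: dollars, normalized to a 0-10 penalty."""
    cost_penalty = min(annual_cost / 10_000, 10)  # cap the normalized penalty at 10
    return perf_weight * performance - cost_weight * cost_penalty

candidates = {
    "Tool A": {"performance": 8.5, "annual_cost": 60_000},
    "Tool B": {"performance": 7.0, "annual_cost": 20_000},
}

# Rank candidates from best to worst balance of performance and expense.
ranked = sorted(
    candidates.items(),
    key=lambda kv: balance_score(kv[1]["performance"], kv[1]["annual_cost"]),
    reverse=True,
)
for name, attrs in ranked:
    print(name, round(balance_score(attrs["performance"], attrs["annual_cost"]), 2))
```

In practice the weights themselves would be one of the items the DoD and VA agree on jointly for each category before scoring candidates.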
    d. The T&IVV BP depends on teamwork for its optimal execution.
         1. The IQA personnel make up a team of people who execute the various IQA activities in test planning, execution, analysis, and
            reporting. They handle the testing side of the T&IVV BP.
         2. Independent agents will address the IV&V side of the T&IVV BP. These agents examine and present metrics that address both the
            process to coordinate, plan, and execute IQA and the IQA results pertaining to specific products. The IV&V agents analyze
            and report not only whether the “product was built right,” but whether “the right product was built.” The right product has not been
            “built” (or otherwise procured) unless it can be evaluated as effective, suitable, and survivable.
         3. The IQA personnel and the IV&V agents will work together in product-oriented T&IVV TIWGs who keep iEHR decision-makers and
            the IPO staff informed about T&IVV matters for their respective products. For that purpose, the IQA personnel and IV&V agents
            routinely exchange critical information with a variety of personnel, such as requirements authorities, funds managers, infrastructure
            providers, product team personnel, etc.
         4. An ideal T&IVV BP establishes a strong working relationship with the providers of HIT products to incorporate substantive IQA
            measures as early as practical, which enables correction of problems when such correction can be accomplished most economically
            and efficiently.
    e. The T&IVV BP will depend upon a combination of (1) a centralized budget managed by the IPO; (2) by-product T&IVV budgets; and (3)
       specifically allocated funding from other sources.
         1. The T&IVV Steering Committee (TSC) in collaboration with the IPO T&IVV WIPT and the System Engineering/Architecture WIPT
            will develop the centralized budget for approval and identify charge-backs for support of individual product teams.
         2. The budget cannot be created by simply adding the cost estimates for each iEHR product, because a simple sum does not account for
            the potential savings from matrixed support.
         3. The iEHR products comprise an FoS that must support health operations in a holistic, fully interoperable manner. To plan, execute,
            and report IQA as needed for such an endeavor, the iEHR IPO must allocate funding for enough core T&IVV capability to ensure
            smooth continuity in development and to enable ramping up with specialists when needed for a specific product.
         4. Each iEHR product must have funding commensurate with the T&IVV Scope Assessment and subsequent RA updates, or an
            executive with the appropriate authority must officially approve proceeding under conditions of explicit test limitations.
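The point in item e.2 above, that the program budget is not the naive sum of per-product estimates, can be sketched numerically. All product names and dollar figures below are hypothetical; the sketch only illustrates how a centrally funded, matrixed core removes cost that would otherwise be counted once per product.

```python
# Illustrative sketch of the budget composition described above: a naive
# sum of standalone per-product T&IVV estimates double-counts capability
# that a centralized, matrixed core provides once. Figures are hypothetical.

product_estimates = {          # standalone T&IVV cost estimate per product
    "Product 1": 1_200_000,
    "Product 2": 900_000,
    "Product 3": 700_000,
}
shared_in_each_estimate = {    # portion of each estimate the core would cover
    "Product 1": 300_000,
    "Product 2": 250_000,
    "Product 3": 200_000,
}
core_capability_cost = 500_000  # centralized core capability funded once by the IPO

naive_total = sum(product_estimates.values())
matrixed_total = core_capability_cost + sum(
    product_estimates[p] - shared_in_each_estimate[p] for p in product_estimates
)

print(f"Naive sum:      ${naive_total:,}")
print(f"Matrixed total: ${matrixed_total:,}")
print(f"Savings:        ${naive_total - matrixed_total:,}")
```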
    f.   The term “T&IVV team” refers to the combination of IQA personnel and one or more IV&V agents who collectively perform the T&IVV
         BP for a particular product. The T&IVV teams assigned to specific iEHR products cannot be composed solely of Government personnel
         due to inadequate resources. Therefore, contractors must be authorized to participate in the detailed planning, execution, analysis, and
         reporting of IQA. However, Government T&IVV personnel must be available to handle inherently governmental tasks such as providing
         recommendations to decision-makers, approving the expenditure of funds, selecting specific automated tools, etc.
    g. Government-Off-The-Shelf (GOTS) and Commercial Off-The-Shelf (COTS) products undergo TSQA. Open source products must
       undergo sufficient T&IVV to ensure that their expected capabilities are present as listed and that they inflict no significant negative
       impacts upon the already deployed iEHR enterprise systems.
    h. The most important issue for T&IVV to resolve is how well an iEHR product supports the healthcare BP as intended. Measuring how
       well an iEHR product supports a health BP area depends on Essential Business Functions (EBFs) (also known as Critical Mission
       Functions (CMFs) within the military Services) to be performed by the HIT product. The EBFs trace back to business Use Cases, and
       both have been validated by authorized user representatives. The Business Use Cases explain the high-level vision for the system,
       such as its end-users, environment, and intended functionality. Business Use Cases describe the Concept of Operations (CONOPS) for
       an iEHR product, i.e., the high-level who, what, when, where, why, how, and how much. The EBFs must describe the result expected
       from the system, the circumstances under which the result must be provided, and scoring rules for each Use Case. Scoring rules
       will address accuracy, timeliness, completeness, and correct data sources. When a function of an IT product is essential for
       accomplishing a critical task in support of the mission, it is identified as an EBF.
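As a hedged illustration of the scoring rules described above, the following sketch encodes the four criteria (accuracy, timeliness, completeness, and correct data sources) as a pass/fail check on one EBF. The field names and the timeliness threshold are hypothetical, invented for this sketch; they are not an actual iEHR scoring schema.

```python
# Illustrative sketch of an EBF scoring rule. Field names and the
# max_seconds threshold are hypothetical assumptions; the four checks
# mirror the criteria named above: accuracy, timeliness, completeness,
# and correct data sources.

from dataclasses import dataclass

@dataclass
class EBFResult:
    """Observed outcome of exercising one Essential Business Function."""
    accurate: bool              # result matched the validated expected value
    response_seconds: float     # elapsed time to deliver the result
    fields_returned: int        # data elements present in the result
    fields_required: int        # data elements the Use Case requires
    authoritative_source: bool  # data came from the correct source

def score_ebf(result, max_seconds=5.0):
    """Apply the four scoring criteria; the EBF passes only if all hold."""
    checks = {
        "accuracy": result.accurate,
        "timeliness": result.response_seconds <= max_seconds,
        "completeness": result.fields_returned >= result.fields_required,
        "correct data sources": result.authoritative_source,
    }
    return all(checks.values()), checks

passed, detail = score_ebf(
    EBFResult(accurate=True, response_seconds=2.1,
              fields_returned=12, fields_required=12,
              authoritative_source=True)
)
print("EBF passed:", passed)
```

A real scoring rule would likely grade each criterion rather than treat it as strictly pass/fail, but the all-criteria structure is the point of the sketch.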
    i.   The DoD and VA intend to use one or more common DTCs for the following objectives: save cost of establishing multiple test
         environments; facilitate achieving interoperability among HIT products; more efficiently provide T&IVV services, tools, environments,
         and support; and enhance Configuration Management (CM).
    j.   Knowledge transfer in the area of T&IVV is an important operational need to facilitate cost-effective and efficient delivery of HIT
         products to DoD and VA users and beneficiaries. This guide explicitly serves as one mechanism for such knowledge transfer. The guide
         will be supplemented by online knowledge resources, which must be accessible by DoD, VA, and military-Service personnel. Some
         information will be restricted based on assigned roles.
    k. The iEHR TSC is a joint team formally chartered by the iEHR Executive. The TSC includes representatives of the DoD, VA, and military
       Services who prepared this BPG for the DoD DCMO and VA CIO. The TSC will continue reviewing and refining this BPG to improve
       how it satisfies enterprise TSQA needs and adapts to emerging Federal HIT circumstances. The TSC will maintain separately a
       coordinated lexicon for all Joint DoD and VA T&IVV team members.
    l.   The iEHR IPO will contain an internal IQA team, which may coordinate with DoD and VA personnel as needed to perform TSQA. The
         Government IQA Managers within the IPO are sanctioned to coordinate, plan, monitor, guide, and report to the Program Managers (PMs)
         and iEHR decision-makers regarding TSQA activities and results.
    m. The T&IVV Director within the IPO will coordinate IV&V support for each iEHR product. The TSC will facilitate obtaining Subject
       Matter Experts (SMEs) from their parent organizations when needed to address specific iEHR products.
    n. None of the DoD Business Capabilities Lifecycle (BCL), the DoD 5000-series policy, or the VA Program Management and
       Accountability System (PMAS) will exclusively constrain the conduct of T&IVV supporting the iEHR program. The iEHR T&IVV will
       align with a unique governance structure developed for managing the iEHR program of development, testing, and deployment. Further, an
       executive group will preside over all iEHR products and control their release. Therefore, the T&IVV team will directly provide T&IVV
       results and identify risks to that group, as well as to the program/product managers, thereby informing decision-makers about defects and
       risks identified during the T&IVV.
    o. As a prototype of its kind, this T&IVV BPG has potential applicability to other Defense Business Systems (DBSs) and VA IT systems,
       including business areas external to HIT.
    p. All Components must participate in the T&IVV BP in order to minimize the amount of separate testing they must accomplish for unique
       considerations before deployment.

1.5 Document Maintenance and Distribution
This guide will evolve over the course of the iEHR Program as new information and insights are gained and as events occur that stimulate revising
the guide. As a minimum, the TSC will annually coordinate its review among the iEHR community, updating as needed. All revisions will be
noted in the beginning of this document in the Revision History section.
The iEHR Program teams will receive access to this guide online at the TKBS. It may be provided to stakeholders and customers upon request.
Unrestricted access may be provided for additional authorized personnel on request.


2    GOVERNANCE
The following diagram illustrates the lines of authority within the Governance construct for the iEHR program. The diagram uses generic roles
that will be assigned to specific individuals and organizations as the iEHR program commences.




[Figure 1 is an organization diagram. From top to bottom it depicts: DoD and VA; the Requirements and Budget authorities; the IPO Executive
Committee; the IPO Program Executive; the DoD/VA IPO Program Managers, the DoD/VA TSC, and the DTC IPT; the IQA Managers, Engineering
Managers, Resource Managers, Other Staff, and Product Managers; the IV&V Agents; the Product Teams; and the Product T&IVV Teams.]

                                                                  Figure 1: Management Oversight




The table below annotates how the varieties of organizations within the iEHR community support each other. Support can be in the form of
informing, advising, or providing resources.




                                          Table 1: Supporting Relationships

[Table 1 is a matrix in which an X marks each supporting relationship between a providing and a receiving organization. Organizations
providing support (rows): DoD; VA; Military Services; Requirements; Budget; IPO Executive Committee; IPO Program Executive; DoD/VA IPO
Program Managers; IQA Manager; Engineering Manager. Organizations receiving support (columns): DoD; VA; Military Services; Requirements;
Budget; IPO Executive Committee; IPO Program Executive; DoD/VA IPO Program Managers; IQA Manager; Engineering Manager; Resource Manager;
Other Staff; Product Manager; Product Team; Product T&IVV; IV&V Agent; DoD/VA TSC; DoD/VA T&IVV CoP; DTC IPT.]

Test and Independent Verification and Validation                                                                                                                                                                       Version 0.6
Business Process Guide                                                          11                                                                                                                                   April 05, 2012
                                                                                                                                                                 Organization Receiving Support




                                                                                                                                                   DoD/VA IPO Program Managers
                                                                                                 IPO Executive Committee


                                                                                                                           IPO Program Executive




                                                                                                                                                                                                 Engineering Manager




                                                                                                                                                                                                                                                                                                                                                               DoD/VA T&IVV CoP
                                                                                                                                                                                                                           Resource Manager




                                                                                                                                                                                                                                                                Product Manager
                                             Military Services




                                                                                                                                                                                                                                                                                                         Product T&IVV
                                                                                                                                                                                                                                                                                      Product Team
                                                                                                                                                                                 IQA Manager
                                                                     Requirements




                                                                                                                                                                                                                                                                                                                                              DoD/VA TSC
                                                                                                                                                                                                                                                                                                                             IV&V Agent
                                                                                                                                                                                                                                              Other Staff




                                                                                                                                                                                                                                                                                                                                                                                      DTC IPT
                                                                                    Budget
                             DoD


                                    VA
    Organization
    Providing
    Support

    Resource Manager
    Other Staff
    Product Manager                                                                                                                                                                                                                                                                                  X
    Product Team                                                 X                                                                                                                                                                                                                                   X
    Product T&IVV                                                X                                                                                                                                                     X                                    X                     X                                      X                                                        X
    IV&V Agent                                                                                                                                                                                                                                                                                       X
    DoD/VA TSC                           X                                                                                                                                                                                                                                                                                                                 X                      X


    DoD/VA T&IVV CoP                                                                                                                                                                                                                                                                                 X*                                   X
    DTC IPT                                                                                  X                                                                                                                                                                                                       X                                    X

  * - Indicates unidirectional relationship

The table below records reporting responsibilities. For the IQA team or an IV&V agent, reporting can take the form of formal test reports or
metrics. For the PMs, product managers, and product teams, reporting could be progress status updates, risk analyses, issue identification,
mitigation plans, or other required programmatic information.
                                             Table 2: Reporting Requirements

Reporting matrix (Organization, by row, Reports To, by column) among: DoD, VA, Military Services, Requirements, Budget,
IPO Executive Committee, IPO Program Executive, DoD/VA IPO Program Managers, IQA Manager, Engineering Manager, Resource
Manager, Other Staff, Product Manager, Product Team, Product T&IVV Team, IV&V Agent, DoD/VA TSC, DoD/VA T&IVV CoP, and
DTC IPT.
3    PRINCIPLES
The T&IVV BP follows these principles as guides to the performance and behavior of IQA teams, IV&V agents, and program/product team
members:
    a. Independence. The T&IVV plans and reports will provide leadership with unbiased, fact-based information about the products
       undergoing IQA. Legitimate inputs to the test plans and reports from outside the T&IVV community will be considered; however, such
       inputs will not be allowed to bias the objectivity of the plans and reports.
    b. Continuous Evaluation (CE). CE refers to using all appropriate sources of data to determine how well a system satisfies its Critical
       Operational Issues and Criteria (COIC). CE and Early Involvement are mutually supportive and interrelated activities. Information
       collected at any stage in the system development and testing processes will be shared, subject to contract constraints, as part of an overall
       Continuous Process Improvement (CPI) program to ensure future improvements both in code quality and in streamlined, more
       cost-effective, properly focused testing. CE supports cost-effective follow-on testing by eliminating the need to test system/product
       aspects that have demonstrated both success on prior tests and traceability back to customer needs (except when necessary as part of
       regression testing). No T&IVV organization will be allowed a monopoly on providing test data to address, or on reporting, whether a
       product satisfies its requirements. For example, IA testing in a System Acceptance Test (SAT) will not arbitrarily repeat all IA testing
       previously done to obtain the Interim Authority to Operate (IATO) required for the SAT deployment of the product under test.
    c. Early Involvement (EI). T&IVV personnel will engage in every phase of the acquisition life cycle from requirements analysis and
       systems design through formal testing to sustainment and retirement of all iEHR IT products. In requirements analysis and design
       reviews, they will concentrate on the testability and traceability of requirements to help ensure that the quality of the delivered product,
       when demonstrated through testing, will satisfy the customers’ business needs as intended. In design reviews, they will focus on planning
       for the earliest possible detection and resolution of defects, as well as on developing plans for test cases, needed expertise in test
       methodology, automated test tools, and production-representative environments. Defects discovered during any phase of development or
       testing will be identified, reported, and corrected. EI of T&IVV personnel in the acquisition life cycle stands on the premise that early
       detection and correction of defects is much more cost-effective than discovering and correcting a defect later, such as after the system has
       been deployed. EI of T&IVV personnel may help Project Management Office (PMO) and contracting authorities in incorporating
       language into developmental and integration contracts that incentivize contractors to meet explicit quality standards on a defined schedule.
       Based on the guidance for EI, a T&IVV organization that has been duly afforded an opportunity to participate in the TIWG/T&IVV WIPT
       for developing the Test and Evaluation Master Plan (TEMP) or TEMP Annex for a product may not arbitrarily inject new test data
       collection requirements. This does not imply that data collection requirements may not change based on unforeseen, but compelling,
       circumstances.
    d. Automated Data Collection and Analysis (ADCA). Each TIWG will plan to exploit ADCA techniques to systematically collect the
       necessary test data to address the objectives for the testing, to help isolate faults and diagnose defects, and to produce metrics. The
       objective of ADCA for any iEHR IT product undergoing IQA is automated management of test cases and their associated results in order
       to produce a product dashboard for the test. Each IQA team will strive to create automated scripts for each use case identified by the
       Functional Proponents (FPs) for an iEHR IT product.
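The kind of automated roll-up of test-case results that ADCA describes can be sketched as follows. This is an illustrative sketch only; the use-case names, statuses, and dashboard fields below are hypothetical and are not drawn from any iEHR tool.

```python
from collections import Counter

def dashboard(test_results):
    """Summarize automated test-case results into simple dashboard metrics.

    test_results: list of (use_case, status) tuples, where status is
    'pass' or 'fail'. Returns counts and a pass rate for the product.
    """
    counts = Counter(status for _, status in test_results)
    total = sum(counts.values())
    pass_rate = counts["pass"] / total if total else 0.0
    return {"total": total, "passed": counts["pass"],
            "failed": counts["fail"], "pass_rate": pass_rate}

# Hypothetical results for three scripted use cases
results = [("register_patient", "pass"),
           ("update_record", "pass"),
           ("retrieve_history", "fail")]
print(dashboard(results))
```

In practice the tuples would come from the automated test-case-management tool rather than being typed in by hand.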

    e. Total System Quality Assurance (TSQA). TSQA is the means of monitoring the engineering processes and methods used to ensure the
       delivery of quality iEHR IT products with the necessary performance, features, characteristics, and attributes to make them effective,
       suitable, and survivable. The term “total system” collectively includes hardware, software, infrastructure, environment, support,
       personnel, and business process aspects. For iEHR IT products, TSQA spans the entire development process including, but not limited to,
       software design, Enterprise Architecture (EA) compliance, coding, source code control, code reviews, code testing, change management,
       configuration management, and release management for applications, infrastructure, and support items. It includes procedures to cover
       all key processes in the business, such as:

            Monitoring processes to ensure they are effective;
            Keeping adequate records;
             Checking output for defects, with appropriate corrective action where necessary;
            Regularly reviewing individual processes and the quality system itself for effectiveness; and
            Facilitating continual improvement.
    f.   Discretion (Respect). Participants in the T&IVV BP exercise discretion in two different ways. First, they are expected to exercise the
         power or right to decide or act according to their own judgment based on the empowerment given to them by higher authorities. As an
         example, even the best-planned test events may not go according to plan, but an obvious minor modification would enable the event to
         proceed without an extensive delay or waste of testing resources. When such a minor adjustment becomes necessary, the modification
          should be made on the spot, documented, and reported through the chain of command without undue formal re-staffing. Second, T&IVV
         BP participants are prudent in their conduct or speech with regard to respecting privacy or maintaining silence about something of a
         delicate nature. For example, test results should only be communicated with those requiring immediate knowledge of the results to either
         initiate a corrective action or risk mitigation strategy. Divulging “adverse” test results to a wide community before the program office has
         the chance to lay out a course of action to correct the identified defect adds an unnecessary level of complication to an already complicated
         process and should be avoided at all reasonable costs.
    g. TEMP / Planning. Early involvement of the testing community has been proven through many acquisition programs to constitute a very
       significant risk mitigation strategy. The testing community can assist a program to be successful by applying lessons learned in other
       programs. Those lessons learned should be reflected in the test and evaluation planning document, i.e., the TEMP. The TEMP describes
       how component technologies under development will be evaluated through a process of continuous evaluation, using product and progress
       metrics to determine system effectiveness, suitability, and survivability.
    h. Risk Control (Mitigation). As T&IVV professionals have long realized, Test & Evaluation (T&E) cannot concentrate solely on finding
       defects and verifying their correction. More recent T&E doctrine explicitly recognizes that T&E must now also focus on
       “providing knowledge to assist in the managing of risks involved in developing, producing, operating, and sustaining systems and
       capabilities” (reference the Defense Acquisition Guidebook). Risk mitigation includes risk-based testing. Within the Office of the Chief
       Information Officer (OCIO), and for any external organizations developing applications for iEHR-wide use, the T&IVV BP must facilitate
       rapid deployment of cost-effective solutions for our customers’ needs.
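Risk-based testing is commonly operationalized by ordering test cases by risk exposure (likelihood of failure times impact of a failure). The sketch below is a generic illustration under that assumption, not a method prescribed by the Defense Acquisition Guidebook or this BP; the case names and scores are hypothetical.

```python
def prioritize_by_risk(test_cases):
    """Order test cases by risk exposure = likelihood x impact.

    test_cases: list of dicts with 'name', 'likelihood' (0-1 probability
    of failure), and 'impact' (relative cost of a failure in the field).
    Higher-exposure cases run first, so limited test time buys down the
    most risk.
    """
    return sorted(test_cases,
                  key=lambda tc: tc["likelihood"] * tc["impact"],
                  reverse=True)

cases = [{"name": "login", "likelihood": 0.2, "impact": 9},
         {"name": "audit_log", "likelihood": 0.5, "impact": 2},
         {"name": "med_order", "likelihood": 0.4, "impact": 10}]
print([tc["name"] for tc in prioritize_by_risk(cases)])
# ['med_order', 'login', 'audit_log']
```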


    i.   Design-to-Test. The principle of “design-to-test” refers to describing how a design will be tested as one of the methods of constraining the
          design. In describing how a design will be tested, the testers ensure the requirements are testable (clear, with criteria and context,
          complete, consistent), check that the factors and conditions align with the CONOPS and EA, and confirm that requirements have been
          prioritized, with thresholds and objective values where appropriate. The concept includes the test team describing their test cases for
          each requirement before detailed engineering or software-code programming commences. The design-to-test concept depends on EI of
          the T&IVV community during the planning phase, especially requirements analyses.
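A minimal traceability check in the spirit of design-to-test: before detailed engineering begins, every requirement should have at least one described test case. The requirement IDs and test-case names below are hypothetical.

```python
def untraced_requirements(requirements, test_cases):
    """List requirements with no described test case.

    requirements: iterable of requirement IDs; test_cases: dict mapping
    test-case name -> requirement ID it verifies. Under design-to-test,
    this list should be empty before coding commences.
    """
    covered = set(test_cases.values())
    return [r for r in requirements if r not in covered]

reqs = ["REQ-001", "REQ-002", "REQ-003"]
tests = {"tc_login_lockout": "REQ-001", "tc_audit_entry": "REQ-003"}
print(untraced_requirements(reqs, tests))  # ['REQ-002']
```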
    j.   Fair Quality Assurance. This principle refers to informing a product team in advance how the IQA team intends to execute T&IVV for a
         product, seeking feedback from the product team in regards to refining testing methodology, and always giving the product team ample
          opportunity to review and comment upon T&IVV plans, metrics, and reports before releasing them to a broader audience. This does
          not imply that a product team may censor the artifacts that the IQA team produces. It means that the product team is given ample
          opportunity to check for, and provide feedback about, factual accuracy and logical analysis conducted from the perspective of the user
          community.
    k. Accountability. The T&IVV Team expects to fulfill the tasks delegated by the supported community in a timely, accurate, and complete
       manner. To achieve this expectation, the T&IVV Team will identify, monitor, execute, and report specific Action Items (AIs). Each AI
       will identify the responsible party (internal or external to the T&IVV team), the suspense, the product, and coordinating instructions.
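The AI fields named in this principle (responsible party, suspense, product, coordinating instructions) can be captured in a simple tracked record. This is an illustrative sketch only; the record shape and example values are hypothetical, not a mandated format.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class ActionItem:
    """One tracked AI: who owes it, by when (the suspense), and for what."""
    description: str
    responsible: str          # internal or external to the T&IVV team
    suspense: date            # due date for the AI
    product: str
    instructions: str = ""    # coordinating instructions
    closed: bool = False

    def overdue(self, today):
        # An AI is overdue only while open and past its suspense date.
        return not self.closed and today > self.suspense

ai = ActionItem("Deliver SAT report", "IQA team",
                date(2012, 5, 1), "iEHR lab module")
print(ai.overdue(date(2012, 5, 2)))  # True
```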
    l.   Accountable Risk-Based Testing. The IQA plans will apply the assets entrusted to the IQA organization where they will do the most
          good. The IQA organization cannot test simply for the sake of testing. All testing will follow a TEMP tailored to the system under
          evaluation. The T&IVV Team will constantly balance the concerns of cost, schedule, and performance, striving to ensure that customers
          receive the value expected for their procurement and sustainment funds.
    m. T&IVV Labor-Cost-Time Tracking. The IQA teams must record how much labor they expend on their various tasks, the expenses they
       accumulate for each product, and statistics on how long tasks take given the number and labor categories of the personnel working on
       them. This information provides valuable data both for cost estimation and for Return On Investment (ROI) analysis of T&IVV support.
       To facilitate collecting such statistics, the TSC will develop appropriate automated techniques.
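One way such automated collection could roll up labor records is sketched below, with hypothetical products and labor categories; this is not a prescribed TSC technique.

```python
from collections import defaultdict

def summarize_labor(entries):
    """Roll up labor records into hours per (product, labor category).

    entries: iterable of (product, category, hours) tuples. Returns a
    dict keyed by product, each value a dict of category -> total hours:
    the raw material for cost-estimation and ROI statistics.
    """
    totals = defaultdict(lambda: defaultdict(float))
    for product, category, hours in entries:
        totals[product][category] += hours
    return {p: dict(c) for p, c in totals.items()}

log = [("lab_module", "test engineer", 6.0),
       ("lab_module", "test engineer", 2.0),
       ("lab_module", "analyst", 3.5)]
print(summarize_labor(log))
```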
    n. Promotion Entry/Exit Criteria. For each phase or step of an acquisition strategy for a HIT product, the product team must have a clear
       understanding of the conditions to be met for completing that phase or step (the “exit” criteria). In addition, they must understand what
       conditions will be sufficient to be “promoted” into the next phase or step (the “entry” criteria).
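In practice a promotion gate reduces to checking a list of named conditions; the sketch below illustrates the idea with hypothetical exit criteria, not criteria defined by this BP.

```python
def unmet_criteria(criteria, status):
    """Return the criteria not yet met for a promotion gate.

    criteria: list of named conditions that must all hold (the exit
    criteria of one phase, or the entry criteria of the next);
    status: dict mapping each condition name to True/False.
    An empty result means the product may be promoted.
    """
    return [c for c in criteria if not status.get(c, False)]

exit_criteria = ["all_sev1_defects_closed", "regression_suite_green",
                 "temp_annex_signed"]
status = {"all_sev1_defects_closed": True,
          "regression_suite_green": True,
          "temp_annex_signed": False}
print(unmet_criteria(exit_criteria, status))  # ['temp_annex_signed']
```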
    o. T&IVV Knowledge Base System (TKBS). Due to the complexity and volume of information addressed by the T&IVV BP, the T&IVV
       Division will act as proponent for an online, topically searchable TKBS. Access to portions of the TKBS will be controlled by the
       Assistant Division Chief, T&IVV, who will also oversee its maintenance. Currently, the TKBS has been placed on Army Knowledge
       Online (AKO). The TKBS will contain product documentation, programmatic documentation, IQA and IV&V artifacts, policy,
       procedures, white papers, professional publications, Frequently Asked Question (FAQs), best-of-breed examples of various types, product
       metrics, test data, templates, checklists, and useful Uniform Resource Locators (URLs). One future capability intended for the TKBS is
       the ability to produce IQA plans, reports, and mandatory briefing slides from TKBS content input by TIWGs that relate to a product, with
       minimal additional manual creation and technical editing. Another future capability is semantic searching.


4    GUIDELINES
    a. Coordination. The IQA team will coordinate their plans, activities, and reporting with the product team and T&IVV collaborators. The
       team provides inputs regarding key events, essential activities, and major decisions into the iEHR IPO’s Integrated Master Schedule
       (IMS). At the beginning of a product’s life cycle, all necessary coordination to exercise T&IVV at the level of effort indicated by the
       independent RA will be documented in the TEMP, including the resources, and the source of those resources, needed to execute the strategy.
    b. Metrics. The IV&V Agents will prepare metrics on both process and product maturity. The process metrics will address the Checkpoints
       as described in this guide. If a Checkpoint or one of that Checkpoint’s supporting checklists does not apply, an Agent must explain
       the rationale. Process metrics will address the measures identified in Annex D - Process Measurements and Metrics Checklist. In some
       cases, additional product metrics may be specified within the product TEMP annex upon the consensus of the product TIWG. For example,
       some artificial intelligence might need extraordinary measures distinct from transaction processing, rudimentary database operations, or
       analysis reporting.
    c. IQA documentation. The IQA team will develop a product TEMP annex that provides specific approaches for implementing the program-
       level TEMP. The program-level TEMP contains content guidance for product TEMP annexes. The TEMP developed by the product team
       will tailor the program-level TEMP based on the RA of the IQA team. The team will add to the TEMP any needed supplemental test
       planning (e.g., planning that had to be deferred for information that was not ready at the original creation of the TEMP). The team will
       record all test cases in an automated-test-case-management tool along with their results when executed, and then provide any analyses
       regarding defects or problems observed, and product testing reports as called for in the TEMP. The IQA team will ensure that TIWG or
       Agile team planning sessions are documented. The IQA team will ensure that documentation for Systems Under Test (SUTs) has been
       provided for necessary IQA planning, quality reviews, and future reference. The IQA team and IV&V Agents will ensure that the
       document artifacts that they produce reside in the automated documentation system designated for the iEHR program.
    d. TSC authority. The TSC will coordinate common approaches to the recurring tasks within the BP described in this guide. The tasks may
       be accomplished using techniques tailored to the product and its developmental situation, as documented in the TEMP. Issues with the
       guidance should be resolved through coordination at the working level; if that is not possible, the Government IQA Manager for a product
       may elevate the issue to the TSC. In turn, if the issue is not resolved with consensus by the TSC and the product team, it may be raised
       to higher iEHR authorities.
    e. Regression Testing. Whenever a defect is corrected or any other change is made in software, the IQA team must ensure that the change
       works as expected and does not negatively impact the performance of other portions of the software. The scope of regression testing
       depends on analysis to identify the parts of the software that could be impacted.
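    The impact analysis above can be sketched as a small dependency walk that scopes regression testing to affected test cases. The module names, test tags, and `USED_BY` map below are purely illustrative, not drawn from any iEHR product:

    ```python
    # Illustrative dependency map: each key module is *used by* the listed
    # modules. All names here are hypothetical.
    USED_BY = {
        "billing": ["reports", "scheduling"],
        "scheduling": ["reports"],
        "reports": [],
    }

    # Hypothetical tagging of recorded test cases to the modules they exercise.
    TEST_TAGS = {
        "test_billing_totals": {"billing"},
        "test_monthly_report": {"reports"},
        "test_appointment_flow": {"scheduling"},
    }

    def impacted_modules(changed):
        """Return the changed module plus everything that transitively uses it."""
        seen, stack = set(), [changed]
        while stack:
            mod = stack.pop()
            if mod not in seen:
                seen.add(mod)
                stack.extend(USED_BY.get(mod, []))
        return seen

    def regression_scope(changed):
        """Select every test case tagged to an impacted module."""
        impacted = impacted_modules(changed)
        return {name for name, tags in TEST_TAGS.items() if tags & impacted}
    ```

    A change to the hypothetical "billing" module would pull in every test above, while a change to "reports" would select only the report test, which is the efficiency the analysis is meant to buy.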
    f.   Configuration control. All items undergoing IQA and IV&V must be under configuration control.
    g. Evaluation Framework. All TEMPs will address how the product measures identified in Section 10.2 will be accomplished.
    h. Competency. Members of the IQA team and IV&V agents must have experience and demonstrated skill sets appropriate to their roles.
       When such members will serve in Government positions they must satisfy the full set of requirements for the job category and level of
       their respective assignments, as well as fulfill the continuing education requirements applicable to their respective Components. Each
       Component will evaluate the competency of its personnel.
Test and Independent Verification and Validation                                      Version 0.6
Business Process Guide                             17                               April 05, 2012
         The TSC will prepare recommendations to the appropriate senior executives when augmentation of Component position and training
         requirements seems appropriate. For example, IV&V agents need training on the doctrine contained in this BPG.
    i.   Resource Planning. Every product will require some level of T&IVV resources to gauge the degree to which the product is effective,
         suitable, and survivable, and therefore warrants full deployment across the DoD and VA enterprise. At a minimum, every product TEMP
         annex will discuss the resources required by the categories identified in the program-level TEMP. In cases where the T&IVV team
         determines that extraordinary resources (not mentioned in the program-level TEMP) will be required, those resources will also be
         identified in the product TEMP annex. Resources required for a product must be identified to a level of precision such that any necessary
         procurement actions can be executed within the defined product schedule. In addition, the IPO will plan resources for a sufficient central
         T&IVV workforce to ensure continuity of T&IVV work across the enterprise.
    j.   DTC Governance and Product Promotion Information. The Integrated Development Environment (IDE) approach, at the time of this
         version of the T&IVV BP's publication, largely depends upon exploiting the DTC initiatives for the Military Health Service (MHS). The
         T&IVV community will participate in the Integrated Product Team (IPT) that plans and implements the DTC. The DTC IPT will develop
         a CONOPS that includes annexes for specific services and facility support to be provided at the DTC. The T&IVV community will
         provide input and coordinate regarding the contents of the annexes dealing with automated tools that support T&IVV directly and
         indirectly, data for SUTs, necessary systems and interfaces for testing the integration of products in the enterprise, access to supporting
         information required by T&IVV (e.g., configuration management, collaboration, task tracking, risk management, quality assurance,
         scheduling), and implementation of virtual environments and graduation from zone to zone. In addition, the T&IVV community will
         provide the information necessary for procurement, and will coordinate with DTC administrators to install and maintain all T&IVV
         automated tools identified as required for availability to product and T&IVV teams.


5    SERVICES
The TSC will maintain a catalog of T&IVV services that will be offered in support of iEHR products. Such services supplement the levels of
independent T&IVV determined by the RA approved by the TSC for each iEHR product. The following services should be available, but the TSC
may assist in finding or establishing services not on the list, if sufficient need comes to light. The services will require commensurate funding.
The TSC will provide cost estimation information upon request.
    a. Data for SUT. The T&IVV community will coordinate with the program and product engineers and the DTC management to identify and
       prepare the necessary data for SUT in order to exercise all test cases as required. Section 7.1 provides more detail on data for SUT.
    b. Interfaces or Interface Emulators. This service consists of non-operational, but fully realistic, representations of any IT system providing
       authoritative data necessary for the IT product under test. Examples include, but are not limited to, interfaces providing information such
       as demographics, medical history, business codes, facilities, and scheduling.
    c. Automated Test Tools with Enterprise Licenses. If an automated tool has been identified in a category that will be made available at one
       of the Integrated Development Centers (IDCs, e.g., the Richmond DTC), that tool will be kept in the catalog of services. The licenses for

         the tool will be enterprise licenses used by DoD/VA development and T&IVV teams; the quantity of licenses will be based on the
         projected peak requirements for tool usage.
    d. Risk Assessment Consultation. Risk Assessment Consultation comes in two general varieties. First, the T&IVV team explains to the
       product team its methodology for conducting the intake assessment that determines the appropriate levels of T&IVV activities. Second,
       the T&IVV community provides information gained through the conduct of T&IVV on products, lessons learned from enterprise-wide
       T&IVV, and emerging guidance from Department or higher authoritative sources that is disseminated through T&IVV channels.
    e. Metrics Consultation. Metrics Consultation includes general training on the topic of product and program metrics, interpretation of
       metrics developed by the T&IVV team for specific products, and collaboration in transitioning risk elements into resulting concerns,
       documented issues, or ongoing analysis as appropriate.
    f.   Test Planning. The T&IVV community will assist each product team with test planning. Test planning will determine a cost-effective
         and acceptably efficient approach for answering the applicable product metrics. Test planning will be accomplished for each product at a
         strategic level (what, why, and generally how) using a product TEMP annex. The product TEMP annex will be amplified with additional
         information required for execution as that information becomes available, but there will be no requirement for distinct supplemental test
         plans. The T&IVV community will ensure that the product team has access to detailed information in test management automated tools
         and other data collection automation.
    g. IV&V Assessment. Within the T&IVV community, the assigned IV&V agent will provide status reports on process metrics and product
       metrics as described in this BP guide. The status reports will be accessible to authorized individuals in TKBS. An interested party with
       the necessary permissions will be able to review a single summary and click for more detailed information based on a color-coded system.
       At a minimum, an IV&V agent must provide the rationale for rating with a certain color code and attach the artifact(s) considered in
       making the rating. The IV&V agent should always provide the intended rating for review by the product team to help ensure the accuracy,
       consistency, and completeness of the information used in each assessment. The product team may dialog with the IV&V agent regarding
       updating the information to be analyzed. Absence of information about any product or process metric results in it being rated “red”.
    h. Test Environment Instantiation. This service amounts to coordinating with the product team and the DTC regarding the minimal elements
       that must be present for specific testing. The minimal elements include, but are not limited to, hardware, networking, software, data, and
       tools.
    i.   Scripts. In T&IVV jargon, the term “script” commonly carries one of two connotations. The first is a step-by-step description of a
         procedure to be followed in testing software. The second is an automated recording of such step-by-step procedures so that they may be
         executed using an automated tool with minimal human intervention. The service provided by the T&IVV community regarding scripts
         covers both connotations. For example, the human interaction with a new software product needed to manifest the intended IT capability
         must be included in the test cases for that product. A script may enable execution of such a test case without user personnel, who may be
         operationally engaged. In any case, automated scripts facilitate regression testing, capacity testing, and dynamic SCQC.
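    As a rough illustration of the second connotation, a recorded procedure can be replayed against a system without human intervention. The `sut_login` action, credentials, and expected results below are entirely hypothetical stand-ins, not part of any real product:

    ```python
    def sut_login(user, password):
        """Stand-in System Under Test action; a real script would drive the SUT."""
        return "welcome" if password == "s3cret" else "denied"

    # Recorded steps: (action name, arguments, expected result).
    SCRIPT = [
        ("login", ("nurse01", "s3cret"), "welcome"),
        ("login", ("nurse01", "wrong"), "denied"),
    ]

    ACTIONS = {"login": sut_login}

    def run_script(script):
        """Execute each recorded step and collect (action, args, passed) results."""
        results = []
        for action, args, expected in script:
            actual = ACTIONS[action](*args)
            results.append((action, args, actual == expected))
        return results
    ```

    Because the recorded steps are data rather than manual instructions, the same script can be rerun unattended for regression or capacity testing.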


    j.   Software Code Quality Checking (SCQC). SCQC is a systematic inspection of software code to determine the degree to which it complies
         with applicable standards and follows best software coding practices. Applicable standards consist of those approved as Enterprise
         Architecture (EA) for the specific product. SCQC most often involves using automated tools to scan the source code and executables,
         plus systematic review of related documentation artifacts, to ensure that the System Under Review (SUR) warrants continuation of
         development, demonstration, or test, and to project whether the SUR can meet the stated performance, maintainability, and usability
         requirements within cost (program budget), schedule (program schedule), risk, and other system constraints. SCQC encompasses the use
         of the following:
               Static code analysis;
               Static security analysis;
               Dynamic code analysis;
               Dynamic security analysis; and
               Architectural analysis.
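    A minimal sketch of the first element, static code analysis, scans source without executing it and flags a coding-practice violation. The rule (a bare `except:` clause) and the sample source are illustrative only, not a standard drawn from this guide:

    ```python
    import ast

    def find_bare_excepts(source):
        """Return line numbers of bare 'except:' clauses found by parsing,
        not running, the given source code."""
        tree = ast.parse(source)
        return [node.lineno
                for node in ast.walk(tree)
                if isinstance(node, ast.ExceptHandler) and node.type is None]

    # Hypothetical source under review, containing one violation on line 4.
    SAMPLE = """\
    def load(path):
        try:
            return open(path).read()
        except:
            return None
    """
    ```

    Production SCQC tools apply hundreds of such rules plus security and architectural checks, but the principle is the same: findings come from inspecting the code itself, before any test execution.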
    k. Penetration Testing. The service of penetration testing does not directly include the conduct of penetration testing. Rather, this service
       amounts to collaborating with a product team regarding the appropriate penetration testing for their product, then identifying the
       appropriate personnel to perform such testing. The types of penetration testing appropriate for an IT product depend upon the product
       architecture and Information Assurance Vulnerability Assessment (IAVA) controls. The T&IVV community will attempt to arrange
       penetration testing in a non-operational environment in order to minimize risk to ongoing real operations. When the enterprise demands
       that penetration testing be done in the field environment, it is important to ensure complete recoverability from the impact of such testing.
       Recovery from such testing may be used as evidence of the viability of COOP planning.
    l.   Capacity Testing. Capacity Testing should be done as early as possible in the SDLC. Since the first opportunity for capacity testing
         arises during integration testing, the general approach is to accomplish it in the developmental test environment. Capacity testing must
         rely on automated tools for execution, because the workload associated with a large number of users across a geographically distributed
         network can seldom be emulated before the system is deployed in real operations. In addition, automated tools measure the capacity of
         products in the areas of simultaneous data access, network load, central processing unit load, and simultaneous transactions, which are
         not amenable to manual observation.
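    A hedged sketch of tool-driven capacity measurement might simulate simultaneous transactions with threads and report throughput. The `transaction` function is a placeholder that sleeps briefly; a real tool would drive actual SUT workload across the network:

    ```python
    import threading
    import time

    def transaction():
        """Placeholder for a real SUT call; sleeps to simulate work."""
        time.sleep(0.001)

    def measure_throughput(users, per_user):
        """Run users * per_user transactions concurrently and return
        transactions per second."""
        def worker():
            for _ in range(per_user):
                transaction()
        start = time.perf_counter()
        threads = [threading.Thread(target=worker) for _ in range(users)]
        for t in threads:
            t.start()
        for t in threads:
            t.join()
        elapsed = time.perf_counter() - start
        return (users * per_user) / elapsed
    ```

    Scaling the `users` parameter toward the projected peak is what distinguishes capacity testing from functional testing, and is why automation is indispensable.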
    m. Agile Team Tester Support. A common denominator of agile development is the involvement of a testing community from the start.
       Often this is exercised by a tester participating in requirements discussions with a user community, defining how requirements will be
       tested before design starts. It is a widely recognized best practice of agile development that the individual(s) serving as tester(s) be
       distinct from the IT product builders, while working very closely with the product development team. This allows the builders and the
       testers to concentrate on their respective responsibilities. Agile testing also relies heavily on assessment by actual users, so the testers
       assist such users in systematically planning, executing, analyzing, and documenting their assessments. Testers may also help document
       requirements during their reviews of such requirements for testability.
    n. T&IVV Knowledge Base System. Participants in the T&IVV BP will be able to access a searchable online TKBS. The TKBS will
       contain or provide links to the following supplemental information:
            Policy guidance;
                T&IVV Lessons Learned;
                Procedural instructions and explanations;
                Best-of-breed examples;
                Answers to FAQs;
                Selected briefing slides;
                T&IVV lexicon;
                Program/Product-level TEMP and IQA documents;
                Other product documents;
                Product-level T&IVV issues and concerns;
                AIs assigned to T&IVV personnel along with their expected deliverables;
                T&IVV BP metrics for each product; and
                References of interest to T&IVV practitioners.

    Note: Product-specific information will be limited based on individual roles and permissions. The TKBS can be accessed via the internet.
    o. Regression Test Planning, Execution, or Both. Regression test planning focuses on logically analyzing the potential for one component
       of an IT product's design to negatively impact the performance of other components when it fails to function properly. The intent is to
       organize the potential results so that, when defects are fixed or upgrades/modifications are added, regression testing can most efficiently
       verify the fix or upgrade/modification and the absence of unanticipated new defects stemming from the change. Regression test
       execution ideally should rely heavily on automated tools that dynamically exercise interrelated EBF in their end-to-end data processing
       flow. Normally, the T&IVV community will collaborate closely with the product team engineers in preparing to perform regression test
       planning and execution.
    p. Upgrade/Modification Testing. Upgrade/Modification Testing usually includes a subset of the testing that would be performed for a new
       product. The T&IVV community must rely on information about proposed upgrades/modifications in order to assess the risks associated
       with them, and thereby identify the appropriate regimen and level of testing. At a minimum, upgrade/modification testing must always
       demonstrate that the upgrade/modification works as intended, integrates with the legacy IT environment, and avoids introducing any IA
       vulnerability. All upgrades/modifications are subject to regression testing as indicated above.
    q. Training Evaluation. Every IT product has an associated universal requirement that its end users can perform the tasks in which they
       interact with the IT product as expected by the user community. Those interactions can be described as standards for performance of a
       task and the conditions under which the task must be performed. Training evaluation focuses on the training strategy and training support
       package developed by the product team to prepare end users to operate the product when it is first introduced, when new end users must
       use the product during sustainment, and when new functionality, features, or characteristics are introduced to the product. Ideally,
       training evaluation should be integrated with other testing activities. For example, training evaluation should begin by having user SMEs
       critique the adequacy of training material during functional qualification testing. As another example, the testers may use the training
       materials as inputs for their field test planning and to familiarize the test team with the product. As a last example, the sufficiency of the training

         approach for typical end users must be measured during field testing, so that both the effectiveness and suitability of the product can be
         evaluated before a full deployment decision.
    r.   Vulnerability Assessment. Vulnerability Assessment begins during design, in the context of the IT product complying with IA standards
         and computer security policy. Vulnerability assessment should also consider the enterprise's expectations for COOP as they apply to the
         particular product. In the case of IA standards, early SCQC constitutes a best practice for identifying vulnerabilities as soon as possible,
         when correcting them is easiest and least expensive. The T&IVV community can consult and provide SCQC services that help prevent IA
         vulnerabilities. In regard to computer security, the main elements include roles and privileges, user identification, and passwords. With
         respect to COOP, the logical assessment concerns span failure rollover, backup, restore, planned interruptions, and archiving of data per
         applicable laws and regulations. The vulnerability assessment must always be accomplished before the IA C&A process. The T&IVV
         community will coordinate closely with the IA community and product engineers to facilitate accomplishing the vulnerability assessment
         as early and as completely as possible.
    s. Compatibility Testing. Compatibility spans concerns about whether a new product will interoperate fully with the items of hardware,
       network, data, and other IT products already deployed in the enterprise or planned for it. The compatibility testing service amounts to
       including such production-representative items in an environment for use in final integration testing of a new IT product. In addition,
       compatibility testing relies on automated tools to assist in fault isolation and detailed analysis of the integration testing.


6    ROLES and RESPONSIBILITIES
T&IVV relies heavily on two broad responsibilities: (1) Each T&IVV team for an iEHR product shall report on a regular basis its progress,
findings, risks, issues, and metrics to the supported product team, program office, and the TSC. (2) Program/Project/Product Managers shall
engage with the T&IVV community during the conceptual stage of a development effort.
A clear understanding of the more detailed roles and responsibilities among the T&IVV team members facilitates their effective teamwork. Annex
B, Expanded Roles and Responsibilities, identifies the responsibilities in detail. At a summary level, the T&IVV BP involves four varieties of
participants whose work needs to be accomplished in a complementary manner:
    a. Product Team. The product team includes the product manager, product support staff, and the IQA team. The product team acts upon
       guidance received from higher authorities to deliver new or updated HIT products to their DoD and VA users and beneficiaries.
    b. T&IVV Collaborators. The T&IVV collaborators come from outside the iEHR IPO to fulfill such roles as requirements analysis and
       management, user representative, budget authorities, EA management, senior oversight, etc. The T&IVV collaborators provide
       information and clarifications to the IQA teams and IV&V agents to plan and analyze TSQA. They receive advice about testability and
       T&IVV resource requirements from the IQA teams and IV&V agents.
    c. Integrated Quality Assurance Teams. The IQA teams conduct formal testing and inspection, analyze and report the results, make
       recommendations based on sound engineering judgment, and collaborate with the IV&V agents. Such recommendations should help
       programs succeed in terms of high-quality products and a shortened “time to market.” The IQA teams can also help shorten the “time

        to market” by finding defects that need correction as early as possible in the SDLC, when such defects are easiest and most economically
        correctable.
    d. Independent Verification and Validation Agents. IV&V agents inspect, review, observe, analyze, and report evidence throughout the
       software development life cycle. They concern themselves with how well the process for delivering new or updating existing HIT
       products has been followed and works. They also look at how well HIT products perform as expected and satisfy approved requirements.
       IV&V agents make recommendations to help programs succeed in terms of high-quality products and a shortened “time to market.”
       IV&V agents may collaborate intimately with IQA teams, but in their IV&V role, their responsibility includes objectively critiquing the
       adequacy of IQA.

6.1 Responsibility Matrix
The RACI chart will be used to identify roles and responsibilities for the four work groups identified above.
RACI stands for:
    R – Responsible – Owns the problem/project;
    A – Accountable – Must sign off on or otherwise formally approve work before it is considered officially complete;
    C – Consulted – Has information and/or capability necessary to complete the work; and
    I – Informed – Must be notified of results, but need not be consulted.
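    A RACI chart can also be encoded as data so that assignments can be checked automatically. The sketch below uses two sample rows patterned after the tables in this guide, and applies the common RACI convention of exactly one Accountable party per activity; note that some rows in the tables below assign combined ratings such as “R, A”, which this simple check does not model:

    ```python
    # Sample RACI rows (illustrative; see the tables in this section for
    # the authoritative assignments).
    RACI = {
        "Requirements Traceability Matrix":
            {"Product Team": "A", "T&IVV Collaborators": "R",
             "IQA Team": "C", "IV&V Agent": "I"},
        "Test and Evaluation Strategy":
            {"Product Team": "C", "T&IVV Collaborators": "A",
             "IQA Team": "R", "IV&V Agent": "I"},
    }

    def check_single_accountable(raci):
        """Return activities that do not have exactly one 'A' assignment,
        per the common single-Accountable RACI convention."""
        return [activity for activity, roles in raci.items()
                if list(roles.values()).count("A") != 1]
    ```

    An empty result means every activity has one clearly accountable party, which is the property the chart is designed to make visible.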




                                                                                  Table 3: RACI - Planning Phase

                                                                                         Program / Product           T&IVV
                                                                                               Team                Collaborators   IQA Team   IV&V Agent

                                          Requirements Traceability Matrix                       A                      R             C           I
                Planning / Requirements




                                          Systems Engineering Plan                               A                      R             C           I

                                          Information Support Plan                               A                      R             C           I

                                          Enterprise Architecture                                 R                     A             I           I

                                          Test and Evaluation Strategy                            C                     A             R           I

                                          System CONOPS/ Business Case                            R                     A             I           I



                               Acquisition Decision Memorandum                                        R                     A             I           I

                               Use Case                                                               R                     A             C           I

                               Configuration Control Board                                           A                      R             I           I

                               Acquisition Strategy                                                   R                     A             I           I

                               Quality Surveillance Plan / Quality Assurance Surveillances
                               Plan                                                                  A                      R             C           I

                               Evaluation Framework                                                   R                     C             A           R

                               Intake Assessment / CARA                                               C                     C             C          R/A

                               IV&V REPORTING                                                         I                      I            I          R/A




                                                                            Table 4: RACI - Development - Deployment Phases

                                                                                              Program / Product          T&IVV
                                                                                                    Team               Collaborators   IQA Team   IV&V Agent

                               Developer Integration Testing                                          R                      C            A               I

                               User Assessments                                                       R                      A             C              I
                 Development




                               Software Code Quality Checking                                         A                          I         R              I

                               User and System Documentation                                          A                      R             R              I

                               Scripts (for capacity and regression)                                  R                      R           R, A             I

                               Test Data                                                              A                      C             R              I

                               Automated Data Collection and Analysis                                 R                          I        A               I


                                  Installation Demonstration                                        A                     C            R           I

                                  Functional Requirements Test Cases                                I                     C           R, A         I

                                  Technical Requirements Test Cases                                 C                     I           R, A         I

                                  Capacity Requirements Testing                                     C                     I           R, A         I

                                  Compatibility Requirements Testing                                C                     I           R, A         I
                  Integration




                                  Documentation Verification                                        A                     R            R           I

                                  DIACAP Assessment                                                 R                     A            I           I

                                  Computer Security                                                 A                     R            R           I

                                  Information Assurance                                             A                     R            R           R

                                  Continuity of Operation                                           A                     R            R           I

                                  Usability Requirements Testing                                    C                     A            R           I


                                  Interoperability Requirements Testing                             C                     C           R, A         R

                                  SME Validation                                                    R                     A            I           I
                  Qualification




                                  SME Free play                                                     R                     A            R           I

                                  Training Support Package                                          A                     R            R           I

                                  Supportability Assessment                                         R                     C            A           I

                                  Technical Analysis Supporting User Assessment                     R                     C            A           I
               Acceptance
                                   Business Process Support                                          R                     A            C           I
                                  Interoperability                                                  R                     R            A           I


                              Help Provisions                                                               A                       R                  R               I

                              Usability                                                                     A                       R                  R               I

                              Training Evaluations                                                          A                       C                  R               I
                 Deployment




                              OT&E (at the level required by risk analysis                                  R                       C                 R, A             I

                              IV&V Reporting                                                                I                        I                 I              A, R




                                          Table 5: RACI - Sustainment Phase

Phase         Activity                    Program /      T&IVV           IQA Team   IV&V Agent
                                          Product Team   Collaborators
Sustainment   User Satisfaction Surveys   A              R               C          I
              Trouble Ticket Analysis     R              I               I          A
              IV&V Reporting              I              I               I          A
7    METHODOLOGY, PROCESSES AND PROCEDURES
The methodology, processes, and procedures will encompass T&IVV engagement and involvement from project inception through retirement.
    a. The T&IVV high-level approach is delineated by three categories of products anticipated for iEHR:

        (1) Developmental products – where the code must be developed as custom code and has not been tested or evaluated in an operational
            setting and reliability and maintainability history cannot be ascertained. This category carries with it the highest risk to the iEHR
            Program and will receive the most scrutiny by the T&IVV team.
        (2) Commercial Off-The-Shelf products – where the product has been tested, has a history in a field environment, or both, so that the
            T&IVV team can determine its behavior, sustainability, and deployment impact. This category might carry the least amount of risk to
            the iEHR Program. The T&IVV team will focus on the capability of the COTS product to integrate and perform as required in an
            operational environment.
        (3) Government Off-the-Shelf and Open-Source products – where the product reuses existing Government-developed software or uses
            Open-Source software that does not have completely documented testing or a reliable usage history. This category of products shall
            undergo evaluation prior to the final selection to ensure that the code is scalable, sustainable, and maintainable. This category carries
            with it an unknown level of risk to the iEHR Program.
    b. System Development Life Cycle. T&IVV methodology, processes and procedures address (1) measures of SDLC process; (2) measures
       of iEHR product performance, features, and characteristics in order to provide insightful information needed by iEHR decision-makers
       during the phases of the SDLC; and (3) services that the T&IVV community may extend, upon request and availability, to help the iEHR
       product teams. This guide addresses how the methodology, processes and procedures of T&IVV apply to the following SDLC activity
       areas:
        (1) Planning.
        (2) Development, Integration, Testing, and Deployment.
        (3) Sustainment.

7.1 Methodology
The DoD-VA joint T&IVV methodology will lean heavily upon the practices listed below. The product team will address, in their product TEMP
annexes, how each of the practices below will be applied to their specific needs.
    a. Agile Team. An agile team includes closely collaborating representatives of the software development/integration, user/customer, and
       T&IVV communities. From the perspective of TSQA, the key practice of an agile team during development is determining how to measure
       successful development of requirements before they are developed / integrated. The T&IVV members should examine
       the Test Data Management (TDM) process as it applies to every requirement. The TDM process spans preparation for data collection, the
       data collection itself, quality assurance, analysis of results, defect isolation, and display of analyzed test results.
    b. Risk Doctrine. From the perspective of T&IVV, the risk doctrine most heavily concerns how T&IVV procedures can control process and
       product risks using empirical techniques. This motivation leads to tailoring the product metrics based on analysis of each product
       situation. The risk assessment should address a combination of the following:
              1) The degree of technical challenge faced in meeting the requirements
             2) The potential negative impacts of not meeting requirements
             3) The rigor and expected resolution of the measurement for each requirement
             4) The reasonable and affordable availability of T&IVV resources, including available time for T&IVV activities
             5) The urgency of the capabilities that should be provided by a new product
              6) The degree to which it is feasible to defer requirements that are not achieved while still fielding a release with sufficient utility for
                 users
             7) The track record of the developer/integrator team
    Note: Situational considerations might cause the T&IVV team to update their RA multiple times over the SDLC.
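
The seven factors above could be combined into a single weighted risk score for tailoring product metrics. The sketch below is illustrative only: the factor names, weights, and 1 (low) to 5 (high) rating scale are assumptions for demonstration, not program-defined values.

```python
# Illustrative sketch: combine the seven risk factors into one weighted
# score. Factor names, weights, and the 1-5 scale are assumptions, not
# program definitions.

FACTOR_WEIGHTS = {
    "technical_challenge": 0.20,     # factor 1
    "impact_of_shortfall": 0.20,     # factor 2
    "measurement_rigor": 0.10,       # factor 3
    "resource_availability": 0.15,   # factor 4
    "capability_urgency": 0.15,      # factor 5
    "deferral_feasibility": 0.10,    # factor 6
    "developer_track_record": 0.10,  # factor 7
}

def risk_score(ratings: dict) -> float:
    """Return a weighted risk score; each rating is 1 (low) to 5 (high)."""
    if set(ratings) != set(FACTOR_WEIGHTS):
        raise ValueError("ratings must cover exactly the defined factors")
    return sum(FACTOR_WEIGHTS[f] * r for f, r in ratings.items())
```

Because situational considerations can change over the SDLC, the same ratings would simply be re-scored each time the risk assessment is updated.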
    c. Modeling and Simulation (M&S). M&S should be considered to facilitate planning and analysis at both the enterprise and product levels.
       Examples of enterprise planning and analysis would include EA, planning and allocation in the dimensions of people, process, and
       technology plus technical architectural considerations related to the global information grid, processing capacity, and continuity of
       operations. Most often, product simulations would be used to analyze the impact of the product upon the enterprise; however, product testing
       usually requires emulation of certain data for the SUT, and of interfaces with which tampering would be forbidden due to live operational
       constraints.
    d. Scripts. The T&IVV team, working with user representatives and the product engineers, will develop detailed descriptions of the steps
       expected in each logical sequence of functional steps comprising an EBF that a product is expected to perform. Such descriptions must
       span all EBFs expected of a medical IT product. These descriptions are referred to as “Scripts”. Scripts include identification of the
       necessary interfaces, components, and data for execution. Scripts can be utilized for manual test case execution. They also form the basis
       for automating functional test cases.
    e. Test cases. Fundamentally, a test case is a discretely identified, step-by-step procedure for making one observation of the system's
       performance related to a single requirement or a predetermined set of requirements. Test cases whose procedures directly map to the
       expected human interactions with the SUT may be categorized as functional test cases. On the other hand, technical test cases relate to
       parameters, attributes, and features that contribute to the satisfactory performance of the product but are not directly observable by human
       beings. Technical test cases typically require test tools to execute, whereas functional test cases can be executed by human observers using
       no more than database queries.
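
A test case of the kind described above could be represented as a simple record: an identified procedure, tied to a requirement, with an expected result per step. The sketch below is a hypothetical illustration; the field names, requirement IDs, and verdict rules are assumptions, not program definitions.

```python
# Illustrative sketch of a test case record: a discretely identified,
# step-by-step procedure tied to a requirement. All names and fields
# here are hypothetical examples, not the program's actual schema.

from dataclasses import dataclass, field
from typing import Optional

@dataclass
class TestStep:
    action: str                         # what the tester does
    expected_result: str                # what should be observed
    passed: Optional[bool] = None       # None until the step is executed

@dataclass
class TestCase:
    case_id: str
    requirement_id: str
    category: str                       # "functional" or "technical"
    steps: list = field(default_factory=list)

    def verdict(self) -> str:
        """A case passes only if every step was executed and passed."""
        if any(s.passed is None for s in self.steps):
            return "NOT RUN"
        return "PASS" if all(s.passed for s in self.steps) else "FAIL"
```

Functional cases like this one could be executed manually from the script, while technical cases would attach tool-collected measurements to each step instead of human observations.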
    f.   Field Testing. All products must undergo field testing before full deployment across the enterprise. Field testing examines how typical
         end users can use the product to accomplish their business tasks as expected, given the intended training and support. Field testing will
         usually be conducted in two phases. First, one or two sites will conduct a period of functional operations to verify the most fundamental
         operational performance of the product. Next, additional sites will be added to expand the factors and conditions of observations to a
         degree such that a full deployment decision can be made with an acceptable level of confidence. When data are collected to resolve
         operational issues by an independent Operational Test Agency (OTA) or a designated independent VA test organization, field testing is
         known as operational test and evaluation (OT&E) in DoD and as Initial Operational Capability (IOC) in VA, respectively.

    g. Development Plan (Contract Language) Guidelines. Any contract, whether with the commercial vendor or with another non-commercial
       provider of a medical IT product, will include language that ensures the following:
             1) Government possession of documented source code under configuration control
             2) Full access to any quality assurance information available (including witnessing testing procedures if they have not already been
                conducted)
             3) Collaboration in the conduct of SCQC (spanning static, dynamic, and IA vulnerability checks) before acceptance of the product by
                the government
              4) The fact that all support items such as user instructions, technical guidelines, training aids, etc., must have quality assurance
                 evidence delivered with them and will be subject to government testing and inspection
             5) If the provider of an IT product has already completed the development, the provider will provide evidence of how well the
                product satisfies the government requirements. If the product is still under development/integration, the provider will provide the
                quality assurance plans for review and comments. In a procurement data package, the quality assurance plan is often referred to as
                a Quality Surveillance Plan (QSP), and the government plan complementing it is called a Quality Assurance Surveillance Plan
                (QASP).
    h. Metrics. Metrics refer to expressions of measures dealing with either process or product testing or inspection. Metrics combine the results
       from multiple observations in a meaningful manner for managers to provide guidance or make decisions. The T&IVV community has
       defined process and product metrics in this document so that each product team will know the measures intended.
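
As a concrete illustration of combining multiple observations into a management-level figure, a pass-rate metric could be mapped onto the color codes used in status reporting. The metric name and color thresholds below are assumptions for illustration, not the metrics defined by the T&IVV community.

```python
# Illustrative sketch: one product metric that combines many test
# observations into a single figure for managers. The thresholds here
# (95% green, 85% yellow) are assumed values, not program definitions.

def pass_rate(results: list) -> float:
    """Fraction of test observations (booleans) that passed."""
    if not results:
        raise ValueError("no observations to combine")
    return sum(results) / len(results)

def status_color(rate: float, green: float = 0.95, yellow: float = 0.85) -> str:
    """Map a pass rate onto a color code for one-page status summaries."""
    if rate >= green:
        return "GREEN"
    if rate >= yellow:
        return "YELLOW"
    return "RED"
```

For example, 9 passes out of 10 observations yields a 0.90 pass rate, which the assumed thresholds would report as YELLOW.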
    i.   Evaluation Framework (EF). Each product team in their product TEMP annex will explain their approach to accomplishing their
         measures in the standard EF included in this document. The approach should explain how satisfaction of the standard measures for the
         different requirements categories will be accomplished by testing, inspection, or review procedures.
    j.   Data for SUT. Each product team will explain how data will be created that enables the exercising of functional and technical
         requirements in appropriate test cases. Such data must be depersonalized except in field testing. Any test team member or test player
         who sees Personally Identifiable Information (PII) or Personal Health Information (PHI) must have the appropriate Automated Data
         Processing (ADP) clearance.
    k. Configuration Control Board - Severity and Priority Codes. Each product team will identify a Configuration Control Board (CCB) to
       validate the assignment of severity and priority code to reported defects (in accordance with the program definitions). CCBs require both
       empowered user representatives and knowledgeable program staff members, particularly the product engineers.
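
One part of the CCB's validation task could be a mechanical check that each reported defect carries severity and priority codes drawn from the program definitions. The code scales below are hypothetical stand-ins, not the actual program definitions.

```python
# Illustrative sketch of CCB code validation. The severity and priority
# scales shown are hypothetical examples, not the program's definitions.

from enum import IntEnum

class Severity(IntEnum):
    CRITICAL = 1   # system unusable, no workaround
    MAJOR = 2      # major function impaired, workaround exists
    MINOR = 3      # cosmetic or low-impact defect

class Priority(IntEnum):
    URGENT = 1     # fix before next release
    HIGH = 2       # fix in next scheduled release
    LOW = 3        # fix as resources permit

def validate_defect(severity: int, priority: int) -> bool:
    """Return True if both codes fall within the defined scales."""
    try:
        Severity(severity)
        Priority(priority)
    except ValueError:
        return False
    return True
```

The substantive judgment, whether the codes are appropriate for the defect, remains with the empowered user representatives and product engineers on the board.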
    l.   Distributed Development. DD refers to an enterprise IT product development approach that recognizes that the IT products making up the
         enterprise can potentially come from a variety of sources. Such sources include traditional software developers and integrators, vendors
         with mature COTS products, government agencies with transferrable GOTS products, open source listings and experimental prototyping
         products that have the demonstrated potential to transition into full deployment. The DD process coordinated between the DoD and VA


        provides standards and techniques such that a product can reasonably be expected to perform as required in the enterprise System of
        Systems (SoS).
    m. Rapid Initiatives. When the executive level management for a program accepts a need for a new software project as an urgent priority, the
       requirements may be initially only high-level. Development and integration of the product may depend highly upon prototyping. The
       T&IVV must assist the product team with achieving timely deployment of an effective, suitable, and survivable product. In the absence of
       completely documented requirements, the T&IVV team will simply report their independent conclusions about the capabilities and
       limitations regarding the product. A capabilities and limitations report from the T&IVV community does not revoke the responsibility of
       the product provider to complete and deliver all needed user, administrator, and technical documentation to the government.
    n. Software Code Quality Checking. SCQC exploits automated analysis tools, combined with expert code inspection, to determine the
       degree to which a proposed iEHR product complies with the applicable standards and coding best practices. Such compliance leads to
       more efficient processing (faster system performance), greater reliability, and less expensive maintenance. SCQC includes static analysis
       of the code architecture, dynamic analysis of the efficiency of the code in exercising EBFs, both static and dynamic checks of code
       vulnerabilities, and some degree of manual analysis.
              1) Benefits. Benefits of SCQC include isolating problems to the exact line of code that causes them and identifying how each
                 problem should be corrected.
             2) Corrective Labor Estimation. Some SCQC tools provide the estimated hours of labor to correct problems that have been
                identified. However, such estimated hours of labor currently reflect additive estimation, rather than considering a learning curve.
              3) Information Assurance Certification Facilitation. SCQC dramatically reduces the number of vulnerabilities detected in IA and
                 Certification & Accreditation (C&A) scans. Current empirical results indicate SCQC catches at least 95 percent of the
                 vulnerabilities that a “Gold Disk” scan would catch, but identifies specifically the lines of code where vulnerabilities are located
                 and what needs to be done to fix them. Gold Disk scans provide much less actionable information.
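
To make the static-analysis part of SCQC concrete, the sketch below walks a parsed syntax tree and reports the exact line of each finding. It is a minimal illustration: the single rule (flagging use of `eval`) stands in for the far broader rule sets of real SCQC and vulnerability-scanning tools.

```python
# Minimal sketch in the spirit of SCQC static analysis: parse source
# code, walk the syntax tree, and report findings pinned to exact lines.
# The one rule here is a stand-in for a real tool's full rule set.

import ast

def scan_source(source: str, filename: str = "<sut>") -> list:
    """Return findings as 'file:line: message' strings."""
    findings = []
    tree = ast.parse(source, filename=filename)
    for node in ast.walk(tree):
        # Rule: flag calls to the built-in eval(), a common injection risk.
        if (isinstance(node, ast.Call)
                and isinstance(node.func, ast.Name)
                and node.func.id == "eval"):
            findings.append(f"{filename}:{node.lineno}: avoid eval(); "
                            "use a safe parser instead")
    return findings
```

Pinning each finding to a specific line is what makes this style of analysis more actionable than a scan that only reports the presence of a vulnerability class.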
    o. IV&V Status Reports. IV&V Status Reports use color codes in one-page summaries for each of the process and product metrics
       described in this document. Each assessment will be linked to the documented evidence supporting it and to the IV&V agent's rationale.
    p. Failure Definition/Scoring Criteria (FDSC). The measurable level of performance for each product requirement must be defined in a
       manner that explains for an observation of that requirement whether or not it has been fulfilled. The FDSC are developed by the
       authoritative user representative in coordination with the T&IVV and engineering communities. The T&IVV community uses the FDSC
       as one element of their determination whether requirements are testable. The T&IVV community relies upon the user and engineering
       representative to validate how the threshold for each requirement is expressed.
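
Applying FDSC amounts to scoring each observation of a requirement against its user-validated threshold. The sketch below illustrates the idea; the requirement, its threshold value, and the direction of comparison are hypothetical examples, not actual FDSC entries.

```python
# Illustrative sketch of FDSC scoring: each requirement carries a
# threshold, and every observation is scored met / not met against it.
# The example threshold and direction are hypothetical.

def score_observation(observed: float, threshold: float,
                      higher_is_better: bool = True) -> str:
    """Score one observation against the requirement's threshold."""
    met = observed >= threshold if higher_is_better else observed <= threshold
    return "MET" if met else "NOT MET"
```

For instance, under a hypothetical response-time requirement where lower is better with a 2.0-second threshold, an observed 1.4 seconds scores as MET, while an observed 0.92 availability against a 0.95 floor scores as NOT MET.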
    q. TEMP. The iEHR T&IVV WIPT will develop one high-level document to coordinate and obtain the commitment of DoD and VA
       stakeholders to how IQA and IV&V will be accomplished. Each Product Team will establish an appropriate TIWG to tailor the program-
       level TEMP to the specific product TEMP annex.



7.2 Processes and Procedures

7.2.1 Element Definition
Process workflow diagrams are used to describe complex activities in this document. The elements used in the process workflow diagrams are
syntactically consistent with Business Process Modeling Notation (BPMN). The table below provides the definitions of the elements used in the
process workflow diagrams.

                                                          Table 6: Element Definition
                                                      (BPMN icons not reproduced in text)

Name             Description
Start Events     A Start Event indicates where a process will begin. A start event has arrow(s) going out of the icon.
Activities       An activity is work that is performed within a business process. An activity can be atomic or
                 non-atomic (compound).
Gateways         Exclusive Gateway (Exclusive OR) – used to create alternative paths based on defined conditions.
Sequence Flows   A Sequence Flow is used to show the order in which activities will be performed in a process.
Lanes            Lanes represent organizational roles or any desired process characteristics.
7.2.2 T&IVV Process
When distilled to its elemental arcs of authority, at a high-level, the T&IVV process can be portrayed as shown in Figure 2. When viewing Figure
2, the reader must keep in mind that the T&IVV process below actually includes the iterative and recursive part of Governance discussed in
Section 2. A TEMP shall contain coordinated provisions to address the arcs and boxes shown below.



    [Figure 2 (diagram not reproduced in text): a BPMN swim-lane diagram spanning the Planning, Development -
    Deployment, and Sustainment phases. Lanes and activities: Program/Product Team (Initiate Product Launch;
    Develop Product Management Documents; Charter TIWG (Agile Team); Support Testing Activity; Conduct Survey &
    Sustain Product); Integrated Quality Assurance (Conduct Risk Assessment; Build Coordination Strategy;
    Develop Product TEMP Annex; Identify Product Type (C(G)OTS / OS vs. Development); Develop Detailed Test
    Plan; Conduct Test; Assess Product; Determine Level of Testing; Conduct Regression Test if required; Report
    Test Results; Perform Production Support); T&IVV Collaborators (Receive Capability Start Approval; Develop
    Supporting Documents; Provide Resource Commitment; Support Testing Activity; Make Deployment Decision;
    Deploy Product); IV&V Agent (Monitor Planning Activity & Report; Develop IV&V Plan; Conduct Test; Monitor &
    Report; Develop Deployment Report; Monitor & Develop Survey Report). Note: Some arcs are represented in a
    different color to make the diagram more readable.]

                                        Figure 2: High-Level Process Diagram



7.2.3 Engagement of T&IVV for Product Teams




Annex F. T&IVV Engagement, provides a matrix that identifies the primary interactions of the T&IVV team
with the product team and other involved organizations during the SDLC. The matrix explains the purpose,
objectives, expected inputs, expected outputs, entry criteria and exit criteria of such interactions from the
perspective of T&IVV business process.


8     Environment and Tools
    These paragraphs address the Environment and Tools at a high level.
     1) Testing iEHR products requires operationally realistic test environments that include representations
        of the software, hardware, protocols, data, networking, processing load, communications load, and
        security provisions that will perform in a manner that has predictive validity for the iEHR products
        when they are put into live operations. The T&IVV community will collaborate with iEHR
        engineers to clearly understand the requirements for DTEs to be realistic. To reduce the expense of
        such environments, the DoD and VA will partner to establish physical testing facilities that can be
        commonly shared by developers and testers. The East Coast DTC (Richmond) serves as one of the
        initial implementations of this concept. The DoD and VA T&IVV community will provide their
        requirements to the DoD and VA team assigned to implement such DTCs.
     2) T&IVV of iEHR products requires the following categories of tools. For these categories, the TSC,
        together with the product teams, will oversee the selection of standard products, coordinate resources
        for enterprise implementation, and update the tools planning no less than yearly. Tools will be funded
        centrally, unless a product team determines that a unique need for such a tool exists.
            - Automated script exercise
            - Capacity Testing
            - Computer Security/Information Assurance
            - Configuration Version Control*
            - CPU Utilization
            - Data Collection and Analysis (Developmental Test and Evaluation (DT&E), Operational Test
              and Evaluation (OT&E))
            - Defect Management
            - Depersonalized data creation
            - Functional Testing
            - Memory Capacity
            - Modeling and Simulation*
            - Network Emulation and Analysis*
            - Penetration Testing
            - Performance Testing
            - Requirements Management*
            - Risk Management*
            - Section 508 Compliance
            - SCQC
            - Test Case Management
            - Usability Testing
    * - Indicates business areas not specific to T&IVV. T&IVV will rely on information from this tool
    category and will collaborate with the respective business area.


9      RESOURCES
Ultimately, resources refer to funding to meet the constraints of the expected SDLC schedule. The T&IVV
community must recommend to iEHR decision-makers adequate T&IVV budget provisions that collectively
and appropriately address the following considerations for all iEHR products.
      a. Personnel. The T&IVV Team will be composed of dedicated personnel for the sole purpose of
         conducting T&IVV activities in support of an iEHR product. The quantity, knowledge, skills, and
         abilities of the T&IVV Team will be coordinated by the IPO test manager based on the RA for each
         product. This coordination includes sizing the personnel requirements of any independent testing
         organization sanctioned or mandated to support a particular product, such as personnel from
         Operational Test and Evaluation (OT&E), IOP, and IA organizations. The TSC will regularly
         analyze the adequacy of T&IVV staffing plans and advise decision-makers as needed.
      b. Facilities. The T&IVV Team will use a CDTE approach such as the DTC – Richmond and the JTIC
         - Maui. The DTC will focus on development, integration, and testing. The JTIC will focus on
         concept exploration and prototyping. The T&IVV BP will always include checks of products in
         operationally realistic settings for the user organizations involved.
      c. Tools. The T&IVV Team will develop and maintain a list of all tools required to conduct all T&IVV
         activities.
Note: Since the methodology to estimate costs for the resources listed above is contractually sensitive, only
Government personnel with the need to know may request the guidance from the TSC.


10 MEASUREMENTS AND METRICS
Measurements and Metrics are related terms essential for TSQA. Measurements summarize the results of a
set of observations regarding the degree of satisfaction of some criteria or requirements under predetermined
conditions. Metrics report selected measurement results over a period of time in order to analyze trends and
progress towards program goals. Both measurements and metrics may be used to address two categories of
information: 1) indicators about the process being implemented by organizations in order to develop,
integrate, deploy, and maintain IT products; and 2) indicators about the IT product itself in terms of how well
it works, how suitable it is for business process and enterprise constraints, and how dependably it will serve
its users' needs while protecting the enterprise IT with which it interfaces.
Both the product status and the EF use a color coding scheme to indicate the degree of progress measured for
the concern that the metric addresses.
    Green – Satisfactorily completed
    Amber – Completed with shortfall(s) to be corrected, or unable to be planned based on currently available
    information, yet not constituting so serious a constraint that it will likely cause the schedule to slip
    Red – Showstopper detected or deficiencies documented to be corrected
    Aqua – Planned for resolution, but premature to be addressed as part of the current status
    Yellow – Assessed as not essential
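As a minimal illustration of the scheme above, a status-to-color lookup can be sketched as follows. The status labels and metric names are hypothetical examples chosen for this sketch; they are not prescribed by the T&IVV BP.

```python
# Illustrative sketch of the T&IVV metric color-coding scheme described above.
# The status keys and sample metric names are hypothetical, not part of the guide.

COLOR_CODES = {
    "completed": "Green",                  # satisfactorily completed
    "completed_with_shortfall": "Amber",   # shortfall(s) to correct; schedule not yet at risk
    "showstopper": "Red",                  # showstopper or documented deficiency to correct
    "planned": "Aqua",                     # planned for resolution; premature for current status
    "not_essential": "Yellow",             # assessed as not essential
}

def color_for(status: str) -> str:
    """Return the color code for a metric status snapshot."""
    try:
        return COLOR_CODES[status]
    except KeyError:
        raise ValueError(f"Unknown metric status: {status!r}")

if __name__ == "__main__":
    sample_metrics = {"requirements_traced": "completed",
                      "defect_backlog": "showstopper"}
    for name, status in sample_metrics.items():
        print(f"{name}: {color_for(status)}")
```

Because each color is a snapshot along the SDLC, such a lookup would typically be applied to each metric at every reporting checkpoint rather than once.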

The assignment of a color code to a metric should be considered a snapshot along the continuum of the
SDLC. However, lessons learned have shown that indicators or checkpoints that are red represent serious
issues that must be resolved to preserve the health of the IT product program.

10.1 Process Measurements and Metrics
Process Measurements deal with selected indicators that address the progress of organizational elements in
fulfilling assigned responsibilities. Those indicators have been selected because they have demonstrated
predictive value in terms of delivering IT products that meet users' needs on time and with high quality.
Annex C. Status Report Template includes the product status report that will be used as metrics. Annex D -
Process Measurements and Metrics Checklist also explains the questions that will be examined to determine
those metrics.

10.2 Product Measurements and Metrics
Product Measures address specific criteria and requirements levied upon the suppliers of IT products.
Product Measures vary in importance, with the more critical and essential criteria and requirements rating the
highest priority for measurements. High priority measures must be supplemented and complemented by
additional lower level measurements, which do the following:
    1) Provide analytic detail
    2) Help isolate fault for corrective actions
    3) Address explicit lower level requirements, which are not considered absolutely critical or essential.
Annex E. Product Measurements and Metrics includes the EF applying to all iEHR products. The EF
display shows the degree to which the product demonstrates performance, characteristics, or features that
satisfy each type of criterion (requirement). Metrics regarding the essential or critical criteria and
requirements must be satisfied for the system to be deployed.


11 RISK AND ISSUE MANAGEMENT
The T&IVV BP is highly focused on risk management from the perspective of providing objective empirical
and analyzed information needed by each product team and the decision-makers involved with each product.
The approach of the T&IVV community is intended to complement the overarching program management policy
and guidance in the area of risk and issue management. The T&IVV community will focus on providing
information regarding areas that have historically delayed or prevented products from deploying to serve the
needs of users as intended in terms of quality, timeliness, maintainability, sustainment, and affordability.
The T&IVV BP will lead to metrics dealing with process risk and product risk.
       - Process risk concerns points that, if not well defined and managed, can lead to less than satisfactory
         IT implementation for the enterprise. Examples include clarity in the fundamental CONOPS for a
         product, feasibility of the product to integrate into the EA in an effective manner, and planning
         sufficient resources for T&IVV (e.g., labor, facilities, tools, time, and planning).
       - Product risk concerns the degree to which the system can be demonstrated as effective, suitable, and
         survivable for the intended implementation. Examples of questions that must be answered about
         each product include EBFs, enterprise or program-level criteria that must be achieved by the product,
         and engineering constraints that must be satisfied by all products.
In the majority of cases in which the T&IVV community itself manages issues, it will focus on how well the
product is engineered to satisfy requirements. When a product does not

satisfy a requirement, each defect will be tracked by the T&IVV community until it is successfully regression
tested, provided that the defect is deemed of sufficient priority by the user community and product staff to
warrant correction before deployment. In other cases, the T&IVV community will communicate issues that
they discover to the program and product issue manager(s). The T&IVV community may share issues with
the product team such as scheduling, training, and preparation.
Members of the T&IVV community will tailor the scope of T&IVV activities for a particular HIT product
based on their RA of that product. They will conduct an initial T&IVV Scope Assessment to determine the
level and varieties of independent testing and other inspections that they will perform. The intake assessment will
consider the state of preparation for delivering the product as scheduled, the complexity and urgency
associated with the product, the availability of experienced personnel with a record of success working on the product,
and other matters. The expected independent testing and inspections for an HIT product will be documented
in its TEMP annex. The T&IVV community may update the intake assessment as a new RA when
significant new information arises during the SDLC or decision-makers need an updated RA as input for a
decision. Annex G. Intake Assessment / Risk Assessment provides a generic intake assessment form.


ISSUED BY:




______________________________________                               ____________
GREGORY G. GUERNSEY, PH.D.                                       Date
Director, Test & Independent Verification and Validation (T&IVV)
DoD/VA Interagency Program Office
Military Health System (MHS)




______________________________________                               ____________
MARILYN HODGE                                                        Date
Director, Testing Service
Enterprise Systems Engineering (ESE)
Office of Information and Technology (OIT)




Annexes

Annex A. Terms of Reference
Computer Security (COMPSEC) – COMPSEC addresses the protection of computer systems and
information from harm, theft, and unauthorized use.
Constraints - Restrictions or boundaries impacting overall capability, priority, and resources. Constraints are
factors that limit the team's options, such as limits on resources, budget, schedule, scope, or system design.
Continuity of Operations Plan (COOP) - A predetermined set of instructions or procedures that describe how
an organization’s essential functions will be sustained for a predetermined time as a result of a disaster event
before returning to normal operations. The goal of a COOP is to ensure that required IT services and
facilities can be recovered, in the event of a significant or catastrophic loss, within required and agreed
business time-scales.
Database (DB) - A collection of interrelated data stored together in one or more computerized files.
Deployment Decision Authority (DDA) - The DoD Deputy Chief Management Officer serves as DDA in
coordination with the VA CIO. The DDA exercises oversight for iEHR IT products.
Derived Functional Requirements (DFR) - A lower-order requirement regarding considerations such as
inputs, outputs, calculations (derivative), external interfaces, etc., that is analytically identified based on
either what the system is expected to do (irrespective of nonfunctional requirements) or a functional-level
capability or business rule identified by an organization that is necessary to solve a problem or achieve an
objective.
Derived Technical Requirements (DTR) - A requirement that is derived from functional requirements or
other higher-order technical requirements, which constrains the design and often becomes specifications for
an IT system.
Effectiveness - The extent to which the goals of the system are attained, or the degree to which a system can
be expected to achieve a set of specific mission requirements.
Empirical - Derived from or guided by experience or experiment. Also, provable or verifiable by experience
or experiment.
Essential Business Function (also known as Critical Mission Function (CMF)) - Software functionality that
begins with input from an external system or a human operator that produces an information product, either
stored data or some form of output, which is considered essential for one or more users to perform a task that
has been identified as critical to the business process for accomplishing the mission.
Functional Requirements - Functional requirements capture the intended behavior of the system. This
behavior may be expressed as services, tasks or functions the system is required to perform.
Gold Disk - An application developed by Defense Information System Agency (DISA) Field Security
Operations (FSO) to assist system administrators in securing systems and applications in accordance with the
guidance documented in the DISA Security Technical Implementation Guides, checklists and applicable
Center for Internet Security (CIS) benchmarks. This application provides the ability to detect installed
products, identify and remediate applicable vulnerabilities and generate a report that can be used for asset
registration and findings.
Human System Integration (HSI) – Techniques that promote usability, especially for achieving satisfactory
user-friendliness and sufficient mission utility.


Information Assurance (IA) - Information operations that protect and defend information and information
systems by ensuring their availability, integrity, authentication, confidentiality, and non-repudiation. This
includes providing for the restoration of information systems by incorporating protection, detection, and
reaction capabilities.
Intake Assessment – A form that captures key data necessary to perform the Criticality Analysis and Risk
Assessment (CARA). The IV&V agent, in collaboration with the product team and the FP community,
completes the Intake Assessment form early in the product development life cycle, once the CONOPS is
developed and the following information is documented:
       - CONOPS
       - Essential Business Functions
       - Criteria
       - Key Performance Parameters
       - Information Exchange Requirements
       - Standards (EA, Data)
Interoperability (IOP) - The ability of systems, units or forces to provide data, information, materiel and
services to and accept the same from other systems, units or forces and to use the data, information, materiel
and services so exchanged to enable them to operate effectively together. Information Security and National
Security Systems (NSS) interoperability includes both the technical exchange of information and the end-to-
end operational effectiveness of that exchanged information as required for mission accomplishment.
Interoperability is more than just information exchange. It includes systems, processes, procedures,
organizations, and missions over the life cycle and must be balanced with information assurance.
Issue – 1) A major concern in terms of seriousness of the consequences if the matter is not resolved about
which stakeholder organizations disagree. 2) A challenge or obstacle that must be overcome to ensure that
the intended progress within or results of a process can be achieved. 3) In the context of COI, a question,
which when answered favorably, indicates that a system will be effective, suitable, or survivable in an
essential manner. If a COI is answered negatively, there is a significant likelihood that the system will not be
deployed.
Independent Verification and Validation (IV&V) - Systematic evaluation of a software product as part of a
total system, its associated provisions for its supportability, and the related Software Development Life Cycle
activities. 1) The process of determining whether the requirements for a system or component are complete
and correct, the products of each development phase fulfill the requirements or conditions imposed by the
previous phase, and the final system or component complies with specified requirements. 2) For Product
Development, the service that evaluates applications by performing independent and impartial product
reviews, as well as functional testing. IV&V ensures the functional integrity and technical correctness of the
software and documentation.
Key Performance Parameter (KPP) - Attributes or characteristics of a system that are considered critical or
essential to the development of an effective iEHR product capability and those attributes that make a
significant contribution to the key characteristics.
Key System Attribute (KSA) - System attributes considered most critical or essential for an effective iEHR
product capability but not selected as KPPs. KSAs provide decision-makers with an additional level of
capability prioritization below the KPP but with senior sponsor leadership control. KSAs do not apply to the
NR-KPP.
Methodology - A set or system of methods, principles, and rules for regulating a given discipline, as in the
arts or sciences.
Measure of Effectiveness (MOE) - Metrics used to assess the overall degree of mission accomplishment of a
system when used by representative personnel in the environment planned or expected for operational
employment of the system considering organization, training, doctrine, tactics, survivability, vulnerability,
and threat.
Measure of Survivability (MOSurv) - Metrics used to assess the susceptibility, vulnerability, and
recoverability of a system; these are tightly coupled to both operational effectiveness and suitability.
Measure of Suitability (MOSut) - Measures used to assess the degree to which a system can be satisfactorily
placed in field use, with consideration given to reliability, availability, compatibility, transportability,
interoperability, wartime usage rates, maintainability, safety, human factors, manpower supportability,
logistics supportability, documentation, environmental effects, and training requirements.
Network System Management (NSM) – A combination of hardware and software used to monitor and
administer a computer network.
Critical Operational Issue (COI) – 1) A key Operational Effectiveness (OE) and/or Operational Suitability
(OS) issue (not a parameter, objective, or threshold) that must be examined in Operational Test and
Evaluation (OT&E) to determine the system’s capability to perform its mission. A COI is normally phrased
as a question that must be answered in order to properly evaluate OE (e.g., Will the system detect the threat
in a combat environment at adequate range to allow successful engagement?) or OS (e.g., Will the system
be safe to operate in a combat environment?). A COI may be decomposed into a set of Measures of
Effectiveness and/or Measures of Performance, and Measures of Suitability. 2) A question, which when
answered favorably, indicates that a system will be effective, suitable, or survivable in a very important
manner. (If a critical operational issue is answered negatively, there is a significant likelihood that the system
will not be acquired for fielding. Critical operational issues are defined by functional proponents who may be
called combat developers in the Army. In the Army, the functional proponents will explain the scope,
rationale, and criteria for each critical operational issue.) 3) The key operational effectiveness, suitability, or
survivability issues that must be examined in OT&E to determine the system's capability to perform its
mission.
Principle - 1) A fundamental truth or proposition that serves as the foundation for a system of belief or
behavior or for a chain of reasoning. 2) A rule or belief governing one's personal behavior.
Process - A set of interrelated activities, which transform inputs into outputs.
Product - The result of Research, Development, Test, and Evaluation (RDT&E) in terms of hardware or
software being produced (manufactured). Also known as an end item. The item stipulated in a contract to be
delivered under the contract (i.e., service, study, or hardware).
Program - A group of related projects and activities that are planned, managed, and coordinated together to
achieve related objectives and to maximize benefits that would not be available from managing the projects
individually. Programs usually include an element of ongoing work and may include overarching capabilities
and services that are necessary but not within the scope of the individual projects.
Risk - A measure of the inability to achieve program objectives within defined cost and schedule constraints.
Risk is associated with all aspects of the program, e.g., threat, opportunity, technology, design processes, or
Work Breakdown Structure (WBS) elements. It has two components: the probability of failing to achieve a
particular outcome, and the consequences of failing to achieve that outcome.
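One common way to operationalize this two-component definition of risk is a probability-consequence scoring sketch such as the one below. The 1-5 scales, the multiplicative combination, and the High/Moderate/Low thresholds are illustrative assumptions only; they are not prescribed by this guide or the T&IVV BP.

```python
# Illustrative risk-exposure sketch based on the two-component definition above:
# the probability of failing to achieve a particular outcome, and the
# consequences of failing to achieve that outcome.
# The 1-5 scales and level thresholds are assumptions for illustration only.

def risk_exposure(probability: int, consequence: int) -> int:
    """Combine probability (1-5) and consequence (1-5) into an exposure score."""
    if not (1 <= probability <= 5 and 1 <= consequence <= 5):
        raise ValueError("probability and consequence must each be in 1..5")
    return probability * consequence

def risk_level(exposure: int) -> str:
    """Bucket an exposure score into a qualitative level (assumed thresholds)."""
    if exposure >= 15:
        return "High"
    if exposure >= 6:
        return "Moderate"
    return "Low"

if __name__ == "__main__":
    # e.g., a schedule risk judged probable (4) with serious consequences (5)
    exposure = risk_exposure(4, 5)
    print(exposure, risk_level(exposure))  # 20 High
```

Keeping the two components separate until the final scoring step preserves the distinction the definition draws between likelihood and consequence, which matters when deciding whether mitigation should reduce one, the other, or both.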
Standards - Mandatory requirements employed and enforced to prescribe a disciplined uniform approach to
software development, that is, mandatory conventions and practices are in fact standards.
Suitable - The capability of a product to provide an appropriate set of functions for specified tasks and user
objectives.


Survivable - The ability of a system and its crew to avoid or withstand a man-made hostile environment
without suffering an abortive impairment of its ability to accomplish its designated mission.
Sustainment - The provision of personnel, training, logistics, and other support required to maintain and
prolong operations or combat until successful accomplishment or revision of the mission or of the national
objective.
Technical Requirement (TR) - The standards, supportability, quality and reliability which must be adhered to
in the design (how the system will be constructed).
Test - 1) A formally organized observation of an Automated Information System (AIS), AIS subsystem, AIS
component, or AIS-related equipment item under predetermined conditions that makes an empirical
comparison of the results observed to the expected or hypothetical results delineated within a pre-established,
formal document (example: Test Case). The proof or disproof of the expected or hypothetical results
provides a subjective basis for the test’s passage or failure. 2) Any program or procedure which is designed
to obtain, verify, or provide data for the evaluation of any of the following: a) progress in accomplishing
developmental objectives; b) the performance, operational capability and suitability of systems, subsystems,
components, and equipment items; and c) the vulnerability and lethality of systems, subsystems, components,
and equipment items.
Training - The acquisition of knowledge, skills, and competencies as a result of the teaching of practical
skills and knowledge that relate to the T&IVV competencies.




Annex B. Expanded Roles and Responsibilities

Note: The following roles and responsibilities have been prepared in a generic manner, because
organizations constantly reorganize. A TIWG should identify the personnel performing the following roles
and responsibilities for their particular iEHR IT product.
B.1.  iEHR IT Providers. iEHR IT providers include the following:
 - COTS vendors of IT products and licenses;
 - Common Service(s) IT developers;
 - Contractor IT developers;
 - Contractor IT integrators;
 - Distributed IT developers;
 - Open Source providers of IT products and services;
 - GOTS providers of IT products and services; and
 - Rapid Initiative (RI) IT developers.

B.1.1. All iEHR IT providers must share, demonstrate, or present documentation showing that their product
    satisfies the requirements as expected by the iEHR enterprise. How they fulfill this responsibility must be
    tailored based on the risks associated with each iEHR IT product.
B.1.2. They must partner with the relevant TIWG to complete any necessary testing and inspection, which
    involves collaboration in planning testing and sharing technical expertise about the SUT. This means
    that if the provider is a contractor developer or integrator, they must execute the Government-approved
    Quality Surveillance Plan (QSP) and satisfy the Government’s Quality Assurance Surveillance Plan
    (QASP). The QSP and QASP should include operationally realistic testing, careful product inspections,
    verification and validation (V&V) by IPO representatives and IV&V. Depending on the iEHR IT
    product, they might be required to provide technical advice and information regarding observations
    during OT&E.
B.1.3. If an iEHR IT product provider intends to supply a COTS or GOTS product, the iEHR IT product must
    be accompanied by credible evidence that it satisfies the relevant EF, as well as by its warranties,
    certificates, licensing arrangements, and support provisions. These stipulations will be required in
    relevant agreements, e.g., contract, Service Level Agreement (SLA), etc.
B.1.4. All iEHR IT providers should assist in fault isolation, correction, and regression testing for reported
    problems, according to the priorities assigned for correcting those problems.
B.1.5. The iEHR enterprise must ensure that a potential iEHR IT product performs as required, integrates with
    legacy systems without decreasing the performance of those legacy systems, introduces no unacceptable
    vulnerabilities, and can be supported in a cost-effective manner. All iEHR IT product providers must be
    prepared to facilitate this ultimate responsibility of iEHR enterprise regarding its iEHR IT products by
    sharing technical insight, assisting in fault isolation, and participating in detailed system performance
    analysis.
B.2.    T&IVV Collaborators. Many collaborators within iEHR enterprise support the T&IVV BP in
    essential ways:
B.2.1. Contracting Officers and Contracting Specialists:
a. They ensure that appropriate QSP and QASP provisions are incorporated into contracts.1
b. They support the procurement of contractor services needed for IQA.2

1
    A QSP and a QASP may be combined into a single coordinated vend
B.2.2. Cost Estimation personnel:
          a. They ensure that cost estimates for iEHR IT products to be provided by commercial vendors
             include the following:
                     1) Test environments;
                     2) Any development needed of data to be processed by the system under test;
                     3) Existing interfaces or interface emulators that support the effort;
                     4) Automated test tools and SCQC tools;
                     5) Testing personnel;
                     6) When needed, collaboration with the TIWG that may be chartered for the particular
                        product.
          b. They support the development of T&IVV cost estimates as requested.
B.2.3. Enterprise Architecture (EA) personnel3:
          a. They provide appropriate views of an iEHR IT product in terms of:
                    1) Data processing for the intended business support;
                    2) Physical laydown of the system including major infrastructure components; and
                    3) Connectivity of the system in its network environment.
          b. They verify that all Information Exchange Requirements (IERs) have been identified as needed
             for authoritative data to be processed by the product or by other interoperating products.
          c. They document all data to be processed by the system and each data element’s authoritative
             source based on detailed designs by the product team.
B.2.4. Chief Technology Officers (CTO) Advisory Board* members
          a. They establish the technical standards to which all iEHR IT products must adhere.
          b. They provide the vision of current and emerging iEHR IT that shapes the configuration of the
             DTC(s).
               * The final name for this organization has not been identified.
B.2.5. Functional Proponents (FPs):
          a. They develop the CONOPS or Business Case that the IQA community uses to identify factors
             and conditions for DOE.
          b. They identify needed capabilities and use cases for them.
          c. They validate the COIC developed by T&IVV personnel in coordination with the TIWG.4
          d. They work with TIWGs to ensure that EBFs with scoring criteria have been clearly defined and
             validated. For example, appropriate EBFs have been identified to measure the functional
             capabilities described in the relevant Use Cases.
          e. They validate and ensure all derived functional or technical requirements are placed into an
             automated requirements management tool for configuration control, traceability, and
             accountability.
B.2.6. Process Improvement (PI) personnel:



2
  The specific language for the Performance Work Statement (PWS) and the conduct of the Independent Government Cost Estimate
(IGCE) remain the responsibility of iEHR enterprise T&IVV Government personnel in the Program Offices and the T&IVV
Division.
3
  Currently, EA personnel would include the CTO and other personnel to be identified.
4
  FPs do this by ensuring the criteria for the COIs focus on essential requirements for the iEHR IT product such as essential business
requirements and that the COIC are consistent with approved requirements.
        a. They assist the T&IVV community by providing current management constructs for
           consideration.
         b. They coordinate with the T&IVV community to identify and implement substantive metrics for
            Return on Investment (ROI) regarding T&IVV BP components.
B.2.7. Program Managers (PMs):
        a. They provide overall guidance for their programs, balancing across multiple projects and
           products the risks and resource allocation related to cost, schedule, and performance.
        b. They charter WIPTs needed to assist the PMs in accomplishing high-level program tasks, e.g.,
           systems integration, cost and requirements, T&IVV, etc.
        c. They assist in resolving T&IVV issues that may arise as required.
        d. They sanction TIWGs as needed to coordinate and execute TEMPs.
        e. They or their appointed representative chair T&IVV WIPTs and TIWGs.
        f. They ensure that sufficient resources have been allocated for their TSQA, specifically
           T&IVV.
B.2.8. Interagency Program Office (IPO) staff:
        a. Assist the PM in planning, coordinating, executing, monitoring, and reporting with regard to
           the program and its subordinate projects and products.
        b. Assist all WIPTs and TIWGs as needed pertaining to their respective specialties.
B.2.8.1. Configuration Management (CM) personnel:
        a. Provide the components and documentation of SUTs under configuration control for IQA.
        b. Maintain the IPO’s library of iEHR IT product documentation, including T&IVV documentation.
B.2.8.2 Deployment personnel:
        a. Coordinate for installation and training of new products at user test and field sites.
        b. Facilitate site surveys in preparation for T&IVV
B.2.8.3 Engineers:
        a. Develop the DTRs, especially the CTPs
        b. Perform V&V of contractor WBS
        c. Coordinate information exchange control documentation
        d. Conduct Preliminary Design Reviews (PDRs), Critical Design Reviews (CDRs), and Test
           Readiness Reviews (TRRs)
        e. Facilitate fault isolation and corrective action analysis.
        f. Identify the compatibility requirements.
        g. Ensure that product interfaces work as designed prior to product acceptance for Government
           IQA
B.2.8.4 Human Resources personnel:
        a. Recruit qualified T&E, QA, and Engineering personnel in sufficient quantity and quality to
           manage a program’s TSQA.
        b. Track and ensure that T&E, QA, and Engineering personnel maintain the currency of their
           professional development training.
B.2.8.5 Information Assurance (IA) personnel:
        a. Ensure the SUT can satisfy its applicable IA Certification and Accreditation (C&A) standards.
        b. Plan for penetration and COOP testing
B.2.8.6 Logistics personnel:

        a. Keep Government Furnished Equipment (GFE) on hand receipt or amended to contracts
        b. Order ancillary equipment and supplies needed for TIWG operations, e.g., white boards,
           markers, paper, etc.
B.2.8.7 Product Coordinators:
        a. Participate in TIWGs for individual products focusing on schedule, coordinating with the
           provider, and required programmatic documentation.
        b. Request the appropriate support of IPO personnel as needed.
B.2.8.8 Project Managers:
        a. Participate in TIWGs for individual products focusing on schedule, coordinating with the
           provider, and required programmatic documentation.
        b. Help to resolve issues above the authority of Product Coordinators.
B.2.8.9 Quality Assurance (QA) personnel:
        a. Assist in documenting and inspecting processes pertaining to IPO operations, particularly the
           T&IVV BP as documented in the T&IVV high-level CONOPS and this process guide.
        b. Assist in implementing the QASP for vendor products.
B.2.8.10 Requirements Managers and Analysts:
        a. Ensure that complete Requirements Traceability Matrices (RTMs) exist for all products.
        b. Work with the TIWGs to ensure that testable requirements have been validated and maintained
           with accountability
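The RTM completeness check described above can be thought of as a simple mapping from requirements to the test cases that verify them. The following is a minimal illustrative sketch; all identifiers and the function name are hypothetical, not actual iEHR data or a mandated tool interface:

```python
# Minimal illustrative sketch of a Requirements Traceability Matrix (RTM)
# completeness check. All identifiers are hypothetical, not actual iEHR data.

def find_untraced(requirements, rtm):
    """Return requirement IDs that have no verifying test case in the RTM."""
    return sorted(r for r in requirements if not rtm.get(r))

# Hypothetical example: REQ-003 has no test case traced to it.
requirements = ["REQ-001", "REQ-002", "REQ-003"]
rtm = {
    "REQ-001": ["TC-101", "TC-102"],
    "REQ-002": ["TC-201"],
}
print(find_untraced(requirements, rtm))  # ['REQ-003']
```

In practice the RTM lives in the automated requirements management tool rather than a dictionary, but the completeness criterion is the same: every requirement maps to at least one test case.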
B.2.8.11 Risk Management personnel:
        a. Analyze risks across multiple projects to make recommendations for risk mitigation.
        b. Provide input to T&IVV WIPTs and TIWGs for risk control.
B.2.8.12 Resource Managers and RM staff:
        a. Input into the Program Objective Memoranda (POMs) for resources to enable a cost-effective
           T&IVV BP for the IPO.
        b. Directly participate in planning and coordinating annual budgets for the T&IVV BP, including
           the involvement of external agencies.
B.2.8.13 Sustainment (SUST) personnel:
        a. Coordinate the support needed to sustain a product in live operations, e.g., licenses, continuous
           training, help desks, etc.
        b. Plan modernization and refresh of deployed products.
        c. Ensure IQA Manager receives notification about changes that should be tested or regression
           tested before deployment.
B.2.8.14 Training (TNG) personnel:
        a. Ensure the training of user personnel and user test players as required;
        b. Ensure the training of T&IVV personnel as required;
        c. Develop the Training Support Package (TSP)
B.2.9. Senior Leaders and support staff from the DoD, the military Services, VA, other Government
    agencies, and selected commercial organizations such as the following:
B.2.9.1 iEHR Advisory Board. Disseminate guidance regarding process, procedures, funding, and other
matters related to the planning, development, integration, procurement, testing, deployment and sustainment


of the iEHR family of products. The Advisory Board may provide guidance to the IPO related to such areas as
manpower, personnel qualifications, enterprise standards, scheduling, and others as needed.
B.2.9.2 Chief Information Officers (CIOs). For the Component, command, or other organization, a CIO
serves as the most senior executive in an enterprise responsible for the processes and practices to support IT
enterprise goals with increased information flow and accessibility, plus integrated systems management.
B.2.9.3 Combatant Commanders (COCOMS). Among their overall responsibilities for their assigned
theater, COCOMS (through their staffs) identify Joint Urgent Operational Needs (JUONs), which may on
occasion result in a RI project for the iEHR. They also decide whether new products may be deployed
within their theater and control the access of IPO and T&IVV personnel into the theater for forward
evaluation of prototype products.
B.2.9.4 Deployment Decision Authority (DDA*). The DoD Deputy Chief Management Officer serves as
DDA in coordination with the VA CIO. The DDA exercises oversight for iEHR IT products.
* The final name for this role has not been identified.
B.2.9.5 Department Secretaries. They sanction and direct the coordination and collaboration of IT programs
affecting more than one Department of the Executive Branch.
B.2.9.6 IPO Executive Director. The Executive Director from the DoD, along with the Deputy Executive
Director from VA, exercises direct management oversight of the planning, budgeting, prioritizing, and
policies executed by the IPO overall.
B.2.9.7 Surgeons General (SGs). The Surgeons General of the military Services provide advice and
assistance to their respective Chiefs of Staff on all health care matters pertaining to their Service and its
military health care system. He or she is responsible for development, policy direction, organization and
overall management of an integrated Service-wide health care system and medical materiel developers for
the Service. These duties include formulating policy regulations on health service support, health hazard
assessment and the establishment of health standards. The SG is assisted by a Deputy Surgeon General who
typically coordinates regarding iEHR IT at the Advisory Board level.
B.2.9.8 DoD Test and Evaluation Authorities. DoD has two T&E authorities, each of which will serve on the
iEHR Advisory Board in order to help facilitate a joint DoD-VA iEHR solution set.
        a. The Director, Operational Test and Evaluation (DOT&E) reports directly to the Secretary and
           Deputy Secretary of Defense, with the following responsibilities:
              Development and dissemination of DoD OT&E policy and procedures
              Review and analysis of OT&E results for DoD acquisition programs
              Independent assessments of OT&E activities as they pertain to budgetary and financial issues
                for the Secretary of Defense (SecDef); the Under Secretary of Defense for Acquisition,
                Technology and Logistics (USD(AT&L)); and Congress
            Oversight to ensure the adequacy of OT&E for major DoD acquisition programs and to
               confirm the operational effectiveness and suitability of the defense systems in combat use.
        b. The DASD DT&E reports up the chain to the USD(AT&L) regarding how well a product
           performs in terms of its requirements.
B.2.10. User Representatives (User Reps):
        a. User representatives consult for their Component as required for the FPs to understand current
           CONOPS and how a proposed iEHR IT product would be expected to interact with the business
           process of their mission.
        b. They provide access to user-controlled resources including, but not limited to, test players,
           facilities, network connections, training exercises, platforms (ambulances, ships, etc.).

B.3.    Integrated Quality Assurance Teams. The following types of personnel plan, coordinate, execute,
    analyze, and report regarding testability and empirical results about an iEHR IT product’s performance,
    features, and characteristics:
B.3.1. Developmental Test and Evaluation (DT&E) Services Task Order (T.O.) contractor team.
This team should also contribute in the following ways:
        a. Assess validity of assumptions and conclusions from the Analysis of Alternatives (AOA);
        b. Support the identification and description of design technical risks;
        c. Measure or assess how a SUT performs in terms of the EF below the level of OT&E MOEs and
           MOSs.
        d. Assess progress toward meeting COIC;
        e. Provide data and analysis in support of the decision to certify the system ready for OT&E (also
           known as IOC in VA);
        f. Support an information systems security certification prior to processing classified or sensitive
           data;
        g. For competitive prototyping, measure Capabilities and Limitations (C&L) of alternative
           concepts and design options;
        h. Ensure a standards conformance certification.
B.3.2. Government IQA Managers within IPO. These Government managers oversee, facilitate, and
    control the day-to-day T&E BP for their Program Offices. They accomplish the following:
        a. Guide the development of and implement their Program Managers’ (PMs’) strategies for
           satisfying the statutory and regulatory requirements related to the IT&E of iEHR IT products.
        b. Balance the needs for empirically based facts as input to decisions with the needs of cost-
           effective TSQA procedures and other programmatic needs.
        c. In consultation with the T&IVV community and the organizations assigned to CDTE (a.k.a.
           Integrated Development Environment in VA), plan, coordinate, and help in procuring,
           implementing, and sustaining production-representative T&E infrastructure in environments
           enabling sound T&IVV of their supported iEHR IT products.
        d. Serve as primary participants in developing budget inputs for T&IVV, including licenses and
           other expenses for automated test tools as well as modeling and simulation tools and support.
        e. When delegated by the PM, chair T&IVV WIPTs and TIWGs, facilitating and coaching T&IVV
           WIPT and TIWG participants as required.
        f. Participate in design reviews, technical reviews, and other technology assessments
        g. Serve as Task Managers for the DT&E Services Task Order.
        h. Monitor and provide supplemental guidance for the IQA of their PMO’s products in regards to
           specific T&IVV CONOPS procedures and to emerging authoritative and applicable practices
           and sub-practices for areas such as, but not limited to, planning, metrics, coordination, test
           conduct, data analysis, reporting, T&E budgeting, briefing, Risk Management Plan (RMP)
           development, test site coordination, etc.
        i. Ensure the conduct of TRRs to graduate a SUT from one phase to another.
        j. Develop the implementation guidance within their IPO for the T&IVV CONOPS and prepare
           their TIWGs for the IV&V Checkpoints identified within this guide.
B.3.3. Information Assurance (IA) personnel from within the IPO and from IA Directorate:
        a. Advise TIWGs in preparation for IA C&A standards and required penetration testing.
        b. Facilitate resolution of security defects found by SCQC or later IA scans.
B.3.4. IOP Certification (Cert) support personnel from the Joint Interoperability Test Command (JITC):
        a. Conduct independent analysis and provide reports for achieving IOP Certifications of iEHR IT
           products.
        b. Collaborate in TIWGs regarding the empirical demonstration of the Net-Ready Key Performance
           Parameter / Essential Business Requirements (NR-KPP).
B.3.5. Operational Test and Evaluation (OT&E) personnel from organizations sanctioned by the military
    Services for OT&E:
        a. Provide independent OT&E of iEHR IT products, reporting to the PM and Milestone Decision
           Authority (MDA), plus to the Director, OT&E (DOT&E) for programs under that office's
           oversight.
        b. Participate in TIWGs.
        c. Conduct IV&V as CE.
B.3.6. Quality Assurance Surveillance Plan (QASP) executing personnel from among IPO staff.
        a. Implement the QASP as required, particularly V&V by the Engineering staff of designs and
           products.
        b. Incorporate T&IVV into the IPO’s overall Quality Assurance (QA), Risk Management, and CM
           strategies, plans, and execution.
B.4.    Independent Verification and Validation (IV&V) Agents:
When assigned as an IV&V Agent for an iEHR IT product, that Agent automatically becomes a member of
the relevant TIWG. An Agent is empowered to report progress, issues, and concerns in the T&IVV Status
reports. An Agent will identify issues and concerns within the supported TIWG, but may also elevate those
issues and concerns to the senior Government IQA Manager for a program, the Functional Lead of the
T&IVV Division, the T&IVV Director, or the parent organization. The IV&V Agent in collaboration with
the product team and the FP community completes the testing intake assessment form to determine the risk
level and the scope of IV&V testing required for the product.
B.4.1. Technical Agents.
        a. Conduct SCQC.
        b. Review the technical clarity, accuracy, and consistency of iEHR IT product documentation.
        c. Validate that valid relationships are defined between the test plans, designs, test cases, and
           procedures for test types and documents subject to IV&V test analysis.
        d. Validate that the planned regression testing is sufficient to identify any unintended side effects or
           impacts of changes on other aspects of the system.
        e. Verify that a sufficient number and type of case scenarios are used to ensure comprehensive but
           manageable testing and that tests are run in a realistic, real-time environment.
        f. Verify that test scripts are complete, with step-by-step procedures, required preexisting events or
           triggers, and expected results.
        g. Review all defined processes and product standards associated with the system development,
           identifying any gaps.
        h. Verify that the test process achieves an appropriate level of test coverage, test results are
           verified, the correct code configuration has been tested, and that the tests are appropriately
           documented, including formal logging of errors found in testing.
        i. Analyze past project performance as an input to identifying and making recommendations, as
           well as to lessons learned for the project.
        j. Provide assessment reports related to the technical aspects of the project
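The test-script completeness criterion in item f above (step-by-step procedures, required preexisting events or triggers, and expected results) can be made concrete as a record with required fields. This is a hedged sketch; the field names and class are illustrative assumptions, not a mandated schema:

```python
# Sketch of a test-script record capturing the completeness elements a
# Technical Agent verifies: step-by-step procedures, required preexisting
# events or triggers, and expected results. Names are illustrative only.
from dataclasses import dataclass

@dataclass
class TestScript:
    script_id: str
    preconditions: list  # required preexisting events or triggers
    steps: list          # ordered step-by-step procedures
    expected: list       # expected result for each step

    def is_complete(self) -> bool:
        """Complete when steps exist, each step has an expected result,
        and the preconditions are stated."""
        return (bool(self.steps) and bool(self.preconditions)
                and len(self.expected) == len(self.steps))

ts = TestScript("TS-01", ["user logged in"],
                ["open patient record", "save record"],
                ["record displayed", "save confirmed"])
print(ts.is_complete())  # True
```

A script missing any of the three elements would fail this check, which is the condition the IV&V review is looking for.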
B.4.2. Functional Agents.
        a. Review T&IVV plans, reports, and online test case results for coverage of the EF.
        b. Review the TEMP for completeness and consistency.
        c. Ensure issues and concerns are reported in the System Status Report.

        d. Evaluate the plans, requirements, environment, tools, and procedures used for unit testing system
           modules.
        e. Evaluate the level of test automation, interactive testing, and interactive debugging available in
           the test environment.
        f. Verify that the test process achieves an appropriate level of test coverage, test results are
           verified, the correct code configuration has been tested, and that the tests are appropriately
           documented.
        g. Review all defined processes and product standards associated with the system development,
           identifying any gaps.
        h. Provide assessment reports related to the functional aspects of the project

B.4.3. T&IVV WIPT. A T&IVV WIPT constitutes an empowered organization chartered by an iEHR
Program Manager (PM) to develop and document the program-level TEMP for each program or FoS/SoS that
together constitute a program. A T&IVV WIPT is comprised of Action Officer (AO) representatives
designated by stakeholder organizations. A T&IVV WIPT typically monitors, guides, and works with
multiple TIWGs addressing specific products in an FoS of the program or SoS within a single business area.
The PM or the PM’s designated representative chairs, establishes the agenda for, schedules, facilitates, and
ensures minutes are produced regarding T&IVV WIPT sessions. A T&IVV WIPT does not vote to settle
issues. The Chair makes decisions with consideration of timely input received from T&IVV WIPT
participants. Ideally, the decisions represent consensus. If a T&IVV WIPT member non-concurs, the
member will submit written rationale to be included with the T&IVV WIPT minutes and will formally raise
the issue to their parent organization’s leadership (with courtesy copy to the T&IVV WIPT Chair). Without
clearly convincing need or significant new information, absence or lack of participation of a T&IVV WIPT
member does not constitute a rationale to revisit old business.
A T&IVV WIPT:
            Assists the PM to plan, facilitate, identify needed resources, monitor, and keep on-track T&IVV
             for their product line of interest
             Coordinates the overarching TEMP for a FoS or SoS with their parent organizations, monitors
              the execution of the TEMP, and, when needed, assists TIWGs to resolve issues regarding T&IVV
              beyond the empowered authority of a TIWG
            Represents the T&IVV community and other stakeholders, in regard to T&IVV matters, to any
             other higher-level acquisition decision bodies.
            Calls upon SMEs from their parent organizations when needed to substantively contribute to the
             deliberations of the T&IVV WIPT
             Performs the roles and responsibilities identified in the T&IVV WIPT Charter and the relevant
              TEMP.
B.4.4. TIWG. A TIWG constitutes an empowered organization, chartered by the PM and comprised of
officially designated representatives of stakeholder organizations. The organizations that participate in a
T&IVV WIPT also often provide participants to a TIWG, but usually at a lower-ranking, action-officer level.
A TIWG documents the TEMP, product TEMP annex, or other equivalent document for its assigned iEHR
product. A TIWG coordinates that TEMP with their parent organizations and manages its execution. A
TIWG is formed to handle the details specific to an iEHR IT product or related set of iEHR IT products under
a T&IVV WIPT. A TIWG is chaired by the PM’s Senior Test and Evaluation (T&E) Manager, Senior
Quality Assurance Manager, or other designated representative. A TIWG does not vote to settle issues. The
TIWG Chair makes decisions, as empowered by the PM, with consideration of timely input received from
TIWG participants. Ideally, the decisions represent consensus. If a TIWG member non-concurs, the
member will submit written rationale to be included with the TIWG minutes and will formally raise the issue
to their organization’s representative in the T&IVV WIPT (with courtesy copy to the TIWG Chair). Without

clearly convincing need or significant new information, absence or lack of participation of a TIWG member
does not constitute a rationale to revisit old business.
A TIWG:
            Assists the PM to plan, facilitate, identify needed resources, and oversee T&IVV for its assigned
             iEHR IT product
            Makes explicit their roles and responsibilities in a formally approved TEMP for each assigned
             iEHR IT product that might be deployed
            As individuals and teams, perform the roles and responsibilities identified in the TIWG Charter
             and the relevant TEMP.
B.5. C&A Authority. This individual reviews the results of computer security, IA, and COOP testing, along
with system documentation, in order to accredit the system as sufficiently safeguarded for implementation
within the enterprise.
B.6. Other Certification Authorities. In any case in which a new law or federal directive stipulates that an IT
product falls under the jurisdiction of certification authorities outside of DoD and VA, the product and
program teams must comply as directed. The authorities must provide their standards and criteria for
advance planning by program and product teams in order to achieve the mandatory certifications.




Annex C. Status Report Template
  Table D-1 and Table D-2 illustrate snapshots of summary reports for process and product measurements respectively. Each cell in the table
  represents a specific checkpoint, which is linked to multiple indicators or measurements. These status reports use the following color codes (see
  sample color codes in the tables below):
  Green – Satisfactorily completed
  Amber – Completed with shortfall(s) to be corrected, or unable to be planned based on currently available information, yet not
  constituting so serious a constraint that it will likely cause the schedule to slip
  Red – Showstopper detected or deficiencies documented to be corrected
  Aqua – Planned for resolution, but premature to be addressed as part of the current status
  Yellow – Assessed as not essential
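  The five color codes can be treated as an ordered severity scale so that a summary cell rolls up to the worst of its underlying indicators. The sketch below illustrates this; the severity ordering is an illustrative assumption, since the guide defines what each color means but does not prescribe a rollup rule:

```python
# Sketch: roll the colors of a checkpoint's indicators up to a single summary
# status. The severity ordering below is an illustrative assumption; the guide
# defines the meaning of each color but not a rollup rule.
SEVERITY = {"Green": 0, "Yellow": 1, "Aqua": 2, "Amber": 3, "Red": 4}

def rollup(colors):
    """Return the most severe color among an iterable of indicator colors."""
    return max(colors, key=SEVERITY.__getitem__)

print(rollup(["Green", "Amber", "Green"]))  # Amber
```

  Under this assumption, one Red indicator is enough to turn its checkpoint's summary cell Red.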
  The online version of the summary reports supports a drill-down functionality, which allows a user to move from summary level information to
  more detailed data. The table below briefly describes, at a high-level, the actions that an online user may take to get detailed data:


                   User’s Action                                System Response

                      Mouse Rollover or Point                     Displays a brief explanation of the checkpoint

                      Mouse click                                 Displays the rationale for assigning the specified color for the
                                                                  checkpoint

                      Mouse click on Checklists Used              Opens the checklist(s) used to evaluate the artifact(s)
                                                                  associated with the checkpoint and determine the factors that
                                                                  resulted in the assigned status color

                     Mouse click on Supporting Artifacts          Opens the artifact(s) evaluated to determine the status of the
                                                                  checkpoint




Table D - 1: T&IVV Process Measurement Status Report Template
System Summary (from CONOPS)                                                                         System Illustration
T&IVV Schedule                                                                                       T&IVV POCs

Issues:                                                                                              Risks: (always include the mitigation approach)
Concerns:

Start ----------------------------------------------+---------< Schedule Scale >------------------------------------------------------------ End
      Planning – Requirements Phases                            Development - Deployment Phases                                Sustainment Phase
               Checkpoints                                                 Checkpoints                                            Checkpoints

           <planned Date Range>                                       <planned Date Range>                                    <planned Date Range>

CONOPS                                             Developer Product Test Planning                                       User satisfaction surveys

EA                                                 Installation Documentation
Use Cases                                          Test Cases
SEP                                                C&A Data Collection and Analysis
ISP                                                Supportability Assessment
QSP/QASP                                           SCQC
TEMP
RA
ADCA
User and System Documentation Review


*Note: See Annex D for definition of acronyms


Table D - 2: T&IVV Product Measurement Status Report Template
System Summary (from CONOPS)                                                                         System Illustration

T&IVV Schedule:                                                                    T&IVV POCs

Issues:                                                                            Risks: (always include the mitigation approach)
Concerns:


         COT            Criteria         KPP              KSA    CTP    DFR             DTR            IER         Constraints        CMF
1. BP                   MOE            MOE                      MOP    MOE                                         MOP               MOE
2. IOP                  MOE            MOE               MOP    MOP                                 MOP
3. DBM                  MOE                              MOP    MOP    MOP                                         MOP
4. Training             MOE            MOE                             MOP
5. Usability            MOE            MOE                      MOP    MOP                                         MOP
6. Supportability       MOE            MOE               MOP
7. NSM                  MOE                                     MOP                   MOP                          MOP
8. CS                   MOE                                            MOP                                         MOP
9. IA                   MOE            MOE
10. COOP                MOE            MOE                             MOP




Annex D - Process Measurements and Metrics Checklist
                         Possible Metric Values {Green, Amber, Red, Aqua, Yellow}


          Checkpoint/Summary                                 Indicators/Measurement                      Metric


CONOPS (Business Case)
A verbal or graphic statement, in broad
outline, of a commander's assumptions or
intent in regard to an operation or series of
operations.
                                                Business Needs
                                                What information technology capabilities must the
                                                system provide in terms of specific EBFs?

                                                What assumed levels of IT system performance must
                                                be achieved?

                                                What inter-organizational relationships must exist to
                                                satisfy the business needs?

                                                What are the requirements for the business process
                                                the system supports (current process gaps the IT
                                                solution will be addressing), including the categories
                                                of personnel who will interact with the system and
                                                how?
                                                User Community
                                                What are the categories of end users, administrators,
                                                maintainers, trainers, and customers for the
                                                information from the intended system, and how do
                                                they interact?
                                                Which organizations will have which operational and
                                                maintenance support responsibilities?

                                                Has an enterprise-level approach been defined for
                                                both near-term deployment needs and long-term
                                                sustainment needs (e.g., Configuration Management,
                                                upgrades, documentation, etc.)?
                                                What operational assumptions apply for the system
                                                to achieve its utility and user-friendliness
                                                expectations?
                                                Implementation Plan
                                                What are the anticipated coordination requirements
                                                for deployment and initial training?

                                                Has the enterprise infrastructure and network
                                                connectivity scheme been outlined?

                                                Will there be any data migration, transition, and
                                                standardization requirements?




Test and Independent Verification and Validation                                                  Version 0.6
Business Process Guide                               D-1                                        April 05, 2012
                        Possible Metric Values {Green, Amber, Red, Aqua, Yellow}


          Checkpoint/Summary                                Indicators/Measurement                      Metric


                                              Is there a funding strategy sufficient to cover the
                                              lifecycle of the solution?

                                              Support Approach
                                              Will there be any specific functional and technical
                                              sustainment-training requirements?

                                              Have aspects of the system concept of operations
                                              depending upon business process re-engineering
                                              been documented?
                                              Are enterprise-wide service and maintenance
                                              provisions defined?

                                              Are enterprise occupational specialties available to
                                              accomplish the new CONOPS?

Enterprise Architecture
Enterprise Architecture (EA) is a
conceptual blueprint that defines the
structure and operation of an organization.
The intent of EA is to determine how an
organization can most effectively achieve
its current and future objectives.
                                              System Connectivity
                                              Have agreed-to operational security, communication
                                              security, computer security, IA and COOP criteria
                                              been defined for this project?
                                                Has the physical laydown of the system, its
                                                interoperating systems, and its network connections
                                                been defined?
                                                Have all required information exchange
                                                requirements been defined, and have the necessary
                                                interface control documents been coordinated?
                                                Have the critical technical parameters and other
                                                technical performance measures that must be
                                                achieved to satisfy the response-time requirements
                                                been defined?
                                              Business Process
                                              Have the essential elements of the required
                                              information both internally and externally been
                                              defined?
                                              Are system requirements for continuity of operations
                                              and data archiving defined and in place?

                                                What is the business process diagram in terms of
                                                tasks, and has a RACI chart been developed for those
                                                tasks?
                                              Are the business rules identified and in place for all
                                              affected organizations?
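A RACI chart for the business process tasks above can be kept in a machine-checkable form. The sketch below assumes a simple dictionary representation; the task and role names are hypothetical placeholders, not the program's actual tasks.

```python
# Minimal sketch of a RACI chart: each task maps roles to exactly one of
# R (Responsible), A (Accountable), C (Consulted), I (Informed).
# Task and role names are hypothetical examples.
raci = {
    "Register patient": {"Clerk": "R", "Clinic Manager": "A", "IT Support": "I"},
    "Verify eligibility": {"Clerk": "R", "Benefits Officer": "A", "Clinic Manager": "C"},
}

def validate_raci(chart):
    """Each task must have exactly one Accountable role and at least one Responsible."""
    problems = []
    for task, assignments in chart.items():
        codes = list(assignments.values())
        if codes.count("A") != 1:
            problems.append(f"{task}: needs exactly one 'A', found {codes.count('A')}")
        if "R" not in codes:
            problems.append(f"{task}: no 'R' assigned")
    return problems

print(validate_raci(raci))  # [] means every task passes both checks
```

A check like this can run whenever the business rules or task list change, so the RACI chart stays consistent across affected organizations.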


                                              Data Layer
                                              Have all authoritative data sources been identified
                                              and defined (has a data dictionary been developed)?

                                              Have data standards and criteria been established as
                                              needed to implement automated data quality
                                              controls?
                                              Are data use agreements established, coordinated,
                                              and in effect?

                                              Have all data elements needed for the system been
                                              identified in the Enterprise Architecture
                                              documentation tool (including restricted, sensitive,
                                              classified)?
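Automated data-quality controls of the kind asked about above can be driven directly by the declared data standards. This is a minimal sketch under assumed rules; the field names and formats are hypothetical examples, not the project's actual data dictionary.

```python
import re

# Sketch of automated data-quality controls driven by declared standards.
# Field names and validation rules are hypothetical examples.
STANDARDS = {
    "patient_id": lambda v: bool(re.fullmatch(r"\d{10}", v)),              # 10-digit identifier
    "visit_date": lambda v: bool(re.fullmatch(r"\d{4}-\d{2}-\d{2}", v)),   # ISO 8601 date
    "blood_type": lambda v: v in {"A+", "A-", "B+", "B-", "AB+", "AB-", "O+", "O-"},
}

def check_record(record):
    """Return the list of fields that violate their declared standard."""
    return [field for field, rule in STANDARDS.items()
            if field in record and not rule(record[field])]

good = {"patient_id": "0123456789", "visit_date": "2012-04-05", "blood_type": "O+"}
bad  = {"patient_id": "12345", "visit_date": "04/05/2012"}
print(check_record(good))  # []
print(check_record(bad))   # ['patient_id', 'visit_date']
```

Keeping the rules in one table mirrors the data dictionary: when a standard changes, only the rule changes, not the checking code.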
                                              Data Processing
                                              What informational products must be provided as
                                              input to critical business tasks?

                                              Have the necessary assumptions for data
                                              compatibility (forward, backward) been defined?

                                              Have any necessary steps to achieve data
                                              duplication, data normalization, extraction,
                                              transformation and load been identified?
                                              Have all enterprise level reporting requirements been
                                              defined?

Quality Surveillance Plan (QSP) /
Quality Assurance Surveillance Plan
(QASP)
The QSP refers to the documented process
and procedures that the provider of an
IT product commits to execute in order to
ensure that the customer's requirements are
satisfied before delivering the product to
the customer. The QASP refers to the
documented process and procedures that
the customer intends to apply to each
deliverable during development and before
accepting the deliverable.
                                              a. Developer Test and SCQC Planning
                                              Has each requirement been mapped to at least one
                                              test case?
                                              Have the EA standards and coding compliance been
                                              identified and applied as applicable to the project?


                                              Has user validation process been documented?




                                             Have all design and system artifacts been reviewed?
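The requirement-to-test-case mapping asked about above is a traceability check that can be automated. The sketch below assumes requirements and test cases are tracked by ID; the IDs shown are hypothetical examples.

```python
# Sketch of a traceability check: every requirement must map to at least
# one test case. Requirement and test-case IDs are hypothetical examples.
requirements = {"REQ-001", "REQ-002", "REQ-003"}
test_cases = {
    "TC-01": {"REQ-001"},
    "TC-02": {"REQ-001", "REQ-002"},
}

def untested(reqs, cases):
    """Return requirements not covered by any test case, sorted for reporting."""
    covered = set().union(*cases.values()) if cases else set()
    return sorted(reqs - covered)

print(untested(requirements, test_cases))  # ['REQ-003']
```

Run against the test case management tool's export, a report like this flags coverage gaps before SCQC review rather than during it.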
                                             b. List of Deliverables
                                             Are the deliverables called out in the contractor
                                             agreement?
                                             Are there quality (including schedule) criteria for
                                             each deliverable?
                                             Is each deliverable explicitly addressed by QSP and
                                             QASP?
                                             Are procedures for configuration management being
                                             applied to each deliverable?
                                             c. Inspection Procedures
                                             Are the procedures for review and corrections before
                                             acceptance explained?
                                             Are specific organizations and individuals detailed to
                                             accomplish the necessary inspections?
                                             Have predetermined acceptance criteria been
                                             established for each inspectable item?
                                             Is policy defined for informing those who need to
                                             know?
Developer Product Test Planning
This checkpoint addresses all planning by
the provider of an IT product to test and
inspect how well deliverables satisfy the
customer's requirements. This planning
spans component items, modules, and
integrated systems. It also should address
how early user feedback will be obtained
and acted upon.
                                             a.    Review of Plan(s)
                                             Does the plan explain the contribution to the overall
                                             evaluation framework?
                                             Does the plan have empowered consensus from all
                                             involved organizations?
                                             Does the plan explain the intended analysis and
                                             reporting?
                                             Does the plan effectively explain its linkage with
                                             product metrics (e.g. specific increment
                                             considerations)?

                                             b. Review of Results
                                             Has the design-of-experiments or sampling guidance
                                             been followed?



                                               Do the results meet or exceed the expected threshold
                                               or objectives?
                                               Does the analysis isolate faults to a satisfactory level?


                                               Do the results match the expected measures and their
                                               criteria from the plan?
                                               c.   Scripts
                                               Have scripts been written for functional
                                               requirements?
                                               Have appropriate scripts been automated for use in
                                               regression and capacity testing?
                                               Do the scripts support sufficient dynamic SCQC?


                                               Do the scripts cover the sampling metrics with
                                               regard to factors and conditions?
                                               d.   Data for System Under Test (SUT)
                                               Have all data with which the application interacts
                                               directly been identified, and mapped to the EA data
                                               layer?

                                               Have the depersonalized data been developed to test
                                               the SUT?
                                               Are sufficient data available for capacity and
                                               interface testing?
                                               Does the breadth of the data for SUT cover the
                                               business rules?
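Developing depersonalized data for the SUT, as asked above, typically means replacing direct identifiers with stable pseudonyms so records still join consistently across interface and capacity tests. This is a sketch of one common approach (salted hashing); the field names and salt are hypothetical examples, not a prescribed method.

```python
import hashlib

# Sketch of depersonalizing production-like records for the SUT: direct
# identifiers are replaced with stable, irreversible pseudonyms so the
# depersonalized data still joins consistently across interfaces.
# Field names and the salt value are hypothetical examples.
IDENTIFYING_FIELDS = {"name", "ssn"}

def pseudonym(value, salt="test-env-salt"):
    """Stable token derived from the original value; same input, same output."""
    return hashlib.sha256((salt + value).encode()).hexdigest()[:12]

def depersonalize(record):
    return {k: (pseudonym(v) if k in IDENTIFYING_FIELDS else v)
            for k, v in record.items()}

rec = {"name": "Jane Doe", "ssn": "123-45-6789", "visit_date": "2012-04-05"}
out = depersonalize(rec)
print(out["visit_date"])           # unchanged: 2012-04-05
print(out["name"] != rec["name"])  # True: identifier replaced
```

Because the pseudonym is deterministic, the same person maps to the same token in every extract, which preserves the business-rule relationships the last question asks about.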
Systems Engineering Planning                   a.   Provision for Software Reliability Growth
This planning is the interdisciplinary
application of engineering to design a total
system that satisfies the user's needs,
intelligently incorporates technology, and
addresses all functional and technical
considerations across the lifecycle of the product.
                                               Does the SEP address defect tracking and removal
                                               including recidivism?
                                               Does the SEP address CCB procedures for
                                               assignment of severities and priorities?
                                               Does the SEP clearly identify and describe the
                                               products that will be provided to T&IVV personnel
                                               for test planning?



                                            Does the SEP explain how the software design will
                                            be linked to the supported business process and
                                            CONOPS?

                                            b.     Risk Planning
                                            Does the program have a documented procedure for
                                            risk management?
                                            Does the programmatic documentation clearly
                                            express the assumptions that must be verified by
                                            testing?

                                            Does the program have adequate assigned personnel
                                            for risk management?
                                            Does risk planning address both process and product
                                            performance risks?
                                            c.     Technical Review Procedures
                                            Does the program have documented technical review
                                            procedures?
                                             Do the technical review procedures cover the
                                             perspectives of data processing, information
                                             exchange, business process, and infrastructure?


                                            Does the technical review consider usability,
                                            maintainability, and vulnerability?
                                            Does the technical review address both compliance
                                            with EA standards and ensuring that the appropriate
                                            design information will be recorded in the EA
                                            database?



                                            d. Development Process
                                            Does the development process include automated
                                            techniques to ensure compliance with standards and
                                            coding best practices?


                                            Does the development process include safeguards
                                            such that the software accurately reflects the
                                            intended capabilities?

                                            Does the development process include T&IVV
                                            through its continuum and does the IMS reflect this?





                                            Does the development process reflect the planned
                                            QASP?
                                            e. Standards
                                            Have all data been defined in terms of metadata,
                                            format, and authoritative sources?


                                            Have the distributed development standards been
                                            applied within contract and design documentation?


                                            Have standards related to human system integration
                                            and roles and privileges been established?


                                            Have the infrastructure and coding architecture
                                            standards to facilitate maintainability been
                                            identified?

                                            f. Regression Test Planning
                                            Has analysis for potential interaction effects been
                                            completed to help limit the scope of the necessary
                                            regression testing?

                                             Does the engineering plan call for adequate time for
                                             regression testing for each product?


                                            Is there a process for closing defects when regression
                                            testing has been successful?


                                             Does the plan for regression testing address the
                                             degree of variation in the conditions expected during
                                             regression testing?
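The interaction-effects analysis above can be used to limit regression scope mechanically: re-run only the test cases that touch changed modules or their dependents. The sketch below assumes a module dependency map and test coverage map; all module and test names are hypothetical examples.

```python
# Sketch of limiting regression scope from an interaction-effects analysis:
# only test cases exercising changed modules, or modules that interact with
# them, are re-run. Module and test-case names are hypothetical examples.
depends_on = {            # module -> modules it interacts with
    "billing":    {"eligibility"},
    "reports":    {"billing"},
    "scheduling": set(),
}
covers = {                # test case -> modules it exercises
    "TC-bill":  {"billing"},
    "TC-rep":   {"reports"},
    "TC-sched": {"scheduling"},
}

def impacted(changed, deps):
    """Transitive closure of modules that interact with the changed set."""
    result = set(changed)
    grew = True
    while grew:
        grew = False
        for mod, uses in deps.items():
            if mod not in result and uses & result:
                result.add(mod)
                grew = True
    return result

def regression_set(changed):
    hit = impacted(changed, depends_on)
    return sorted(t for t, mods in covers.items() if mods & hit)

print(regression_set({"eligibility"}))  # ['TC-bill', 'TC-rep']
```

A change to an isolated module yields a small regression set, while a change to a widely used module propagates through the closure, which is exactly the scoping trade-off the planning questions probe.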

                                             g.     Problem Tracking and Resolution Planning
                                            Is there an automated system to track problems and
                                            resolutions?
                                             Is there a process for assigning access rights to all
                                             those who need to know?
                                             Does the problem tracking plan include mechanisms
                                             to identify trends for risk management planning and
                                             to appropriately share lessons learned?
                                            Does the problem tracking system address how to
                                            aggregate essentially equivalent problems expressed
                                            with different descriptions?
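Aggregating essentially equivalent problems expressed with different descriptions usually starts with normalizing the text before grouping. This is a minimal sketch of that idea; the report IDs and descriptions are hypothetical examples, and real tracking systems often add fuzzier matching on top.

```python
import re
from collections import defaultdict

# Sketch of aggregating essentially equivalent problem reports: descriptions
# are normalized (case, punctuation, volatile numbers) before grouping, so
# differently worded duplicates land in the same bucket.
def normalize(description):
    text = description.lower()
    text = re.sub(r"\d+", "#", text)       # mask ticket/build/error numbers
    text = re.sub(r"[^a-z#]+", " ", text)  # drop punctuation
    return " ".join(text.split())

def aggregate(reports):
    buckets = defaultdict(list)
    for report_id, description in reports:
        buckets[normalize(description)].append(report_id)
    return buckets

reports = [
    ("PR-101", "Login fails with error 500"),
    ("PR-102", "login FAILS with error 503!"),
    ("PR-103", "Report export times out"),
]
groups = aggregate(reports)
print(len(groups))  # 2: the two login failures collapse into one bucket
```

Grouped buckets also feed the trend-identification question above: bucket sizes over time are a direct input to risk management planning.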



Information Support Planning (ISP)
This planning addresses the information
that will be handled by the system in terms
of its architecture, data sources, business
processes, and interfaces. It also addresses
how the data and information will be
maintained.
                                               a. Interfaces
                                               Does the ISP identify all incoming, outgoing, and bi-
                                               directional interfaces?
                                               Does the ISP address all architectural information
                                               needed to verify net readiness?
                                               Does the ISP address how the design will be
                                               managed to align with business process engineering
                                               and change management?

                                               Does the ISP include the required EA views?


                                               b. Network Protocols
                                               Does the ISP explain the expected firewall
                                               implementations?
                                               Does the ISP explain organizational responsibilities
                                               for network monitoring, maintenance, and upgrades?

                                               Does the ISP identify any mandatory enterprise wide
                                               communication protocols and data standards?
                                               Does the ISP explain an automated network
                                               management approach, including remote operations?
                                               c. Data Standards
                                               Do the data have metadata?

                                               Do the data reflect current national health data
                                               standards?
                                               Has design provision for encrypting and decrypting
                                               data been addressed?
                                               Does the data dictionary include format and content
                                               definition?
Risk Assessment
In the context of gauging the appropriate
level of testing and IV&V for a product, the
risk assessment is the determination of
quantitative and qualitative indicators that
correspond with a prudent level of total
system quality assurance.



                                            a.     Developer Track Record

                                            Have the applicable information sources been
                                            checked to substantiate the developer's track record?


                                            Does the developer have substantial experience in
                                            the intended development approach, with
                                            demonstrated functional and technical proficiency?


                                            Does the developer track record reflect strong
                                            internal management controls to ensure alignment
                                            with programmatic guidance?


                                            Does the developer track record reflect the stability
                                            of a well-qualified, highly motivated, and cohesive team?


                                            b.     Technical Challenge

                                            To what degree have all needed product technologies
                                            been successfully exercised in previous
                                            products?

                                            Does the development team have experienced
                                            personnel for implementing each critical technology
                                            for development?

                                            What is the projected availability of interoperable
                                            systems that are needed for end to end testing?


                                            Have any required technical readiness assessments or
                                            defense business technology certifications been completed?


                                            c.     Operational Impact

                                            For each capability or major functional area, how
                                            serious is the failure to deliver that part of the system
                                            to immediately mandated operations?






                                           Are there any parts of the system that would
                                           negatively impact patient, personnel, or
                                           infrastructure safety if they do not work as required?


                                           Have any high-level directives (Congress, the
                                           President, the Joint Chiefs of Staff, etc.) identified
                                           capabilities of the system as critical to national
                                           defense, security, or readiness?



                                           Has the negative impact of failing to deliver each
                                           EBF needed to satisfy each use case been identified?




                                           d.   Requirements Coverage and Testability

                                           Have all the requirements been addressed in an
                                           evaluation framework to the level of requirements
                                           category appropriate to the time frame of the risk
                                           assessment and the developmental approach?



                                           Does each requirement satisfy the testability
                                           considerations of clearly stated criteria and
                                           conditions/context, completeness, and consistency?


                                           Are there any requirements that require development
                                           of new testing technology or data collection
                                           instruments, and if so, with what degree of surety will
                                           these new items be ready on schedule?



                                           Have the appropriate T&IVV personnel been
                                           identified and are they projected to be available for
                                           all foreseeable T&IVV activities?

                                           e.   Resources

                                           Has the appropriate testing environment been
                                           earmarked to meet the required testing schedule,
                                           including all data needed to test the SUT and
                                           interface emulations?




                                                  Are any contracted T&IVV support activities on
                                                  schedule to comply with overall IMS?


                                                  Is funding available in the proper appropriations and
                                                  expirations for all needed engineering and T&IVV
                                                  activities?


                                                  Have arrangements been completed to make
                                                  available all Application Lifecycle Management
                                                  (ALM) tools and software testing/analysis tools as
                                                  needed by the plan, including sufficient licenses?



Test and Evaluation Strategy
This strategy addresses the management,
roles and responsibilities, testing activities,
IV&V, schedule, and resources expected to
be implemented as total system quality
assurance throughout the system lifecycle
from concept approval through
sustainment.
                                                  a.   Integrated Master Schedule

                                                  Have the key decisions requiring T&IVV input been
                                                  identified?
                                                  Have all distinct T&IVV activities been included?


                                                  Does the schedule include any relevant certification
                                                  activities (e.g., meaningful use, information
                                                  assurance, interoperability assessment, etc.)?


                                                  Does the schedule reflect clearly defined delivery of
                                                  increments?
                                                  b.   Test Tools

                                                  Is there a test case management tool that is readily
                                                  available for use that has appropriate interfaces with
                                                  other ALM tools?


                                                  Has the set of SCQC tools been earmarked with the
                                                  necessary licenses?


Test and Independent Verification and Validation                                                    Version 0.6
Business Process Guide                           D-11                                             April 05, 2012
                      Possible Metric Values {Green, Amber, Red, Aqua, Yellow}


         Checkpoint/Summary                              Indicators/Measurement                        Metric


                                           Are all tools needed to support interoperability
                                           testing available (data generation, interface
                                           simulation, capacity testing, network analysis,
                                           etc…)?

                                            Do T&IVV personnel have the appropriate access to
                                            collaboration tools to work with other business
                                            areas?

                                           c.   Test Environments

                                            Has each distinct test environment required by the
                                            test strategy been identified and reserved?


                                           Do procedures for installation and for configuration
                                           control exist within each test environment?


                                            Do all stakeholder organizations understand and
                                            have the necessary access rights for each test
                                            environment in which they need to operate?


                                            Does each test environment have at least an initial
                                            authority to test sufficient for the intended testing in
                                            that environment?

                                           d.   Evaluation Framework Coverage

                                           Does the evaluation framework cover the
                                           information technology evaluation model
                                           adequately?

                                           If product increments are intended under the
                                           acquisition strategy, has the evaluation framework
                                           been allocated to each incremental product?


                                           Has the testing activity or event been identified for
                                           each measure in the evaluation framework?






                                             Did the appropriate functional proponent or user
                                             representative concur with the evaluation framework,
                                             especially with the criteria for operational issues, key
                                             performance parameters, and business rule guidance?




                                             e.   Roles and Responsibilities

                                             Has each role and responsibility described in the
                                             DoD/VA joint iEHR T&IVV CONOPS and BPG
                                             been adequately addressed?


                                             Are the reporting chains of command (management
                                             structure) adequately addressed?


                                             Is the issue resolution procedure explained clearly?


                                             Have appropriate organizations been called out for
                                             each T&IVV activity?
                                             f.   Funding

                                             Do all T&IVV activities have appropriate funding
                                             for the expected labor and travel?
                                             Have provisions been made for any expected user
                                             participation in requirements definition and T&IVV
                                             activities, including SCQC?


                                             Is there funding for the test environment, including
                                             tools and data?
                                             Is each essential certification activity provided for?


Use Cases
These vignettes describe how specific
categories of users will interact with the
system to accomplish mission tasks and to
maintain the system. They explain task
conditions, standards, and the precise
expected inputs and outputs.
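A use case of this kind can be captured as a small structured record; the sketch below is illustrative only, and the field names and example values are assumptions, not drawn from the iEHR program:

```python
from dataclasses import dataclass

@dataclass
class UseCase:
    """One vignette: who does what, under which conditions, to what standard."""
    user_category: str      # e.g. a clinician or a system administrator
    task: str               # mission task the user accomplishes
    conditions: list        # task conditions under which the task is performed
    standards: list         # measurable standards the task must meet
    expected_inputs: dict   # precise inputs the user provides
    expected_outputs: dict  # precise outputs the system must produce

# Hypothetical example, for illustration only:
uc = UseCase(
    user_category="clinician",
    task="retrieve patient medication history",
    conditions=["peak workload", "degraded network"],
    standards=["response within 5 seconds", "complete record returned"],
    expected_inputs={"patient_id": "12345"},
    expected_outputs={"medications": ["drug A", "drug B"]},
)
```

Making the inputs and outputs explicit in this way lets a later test case assert the system's actual outputs against `expected_outputs`.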
                                             a. Essential Business Function with Accuracy,
                                             Timeliness, and Completeness




                                           Are the EBFs clearly supported by validated use
                                           cases?
                                            Have failure definitions and scoring criteria been
                                            established for each EBF?
                                            Has at least one testing activity been identified for
                                            exercising the EBF by typical users in an
                                            operationally realistic environment?


                                           Do the use cases cover explicit sustainment
                                           expectations?
                                           b.   Operational Conditions for Testing

                                           Is there a plan to train typical users and have them
                                           participate in the final operational validation before
                                           deployment?

                                            Are a production-representative set of articles, slices
                                            of the operational environment, and typical users
                                            available for testing?

                                           Do provisions exist to verify that data exchanged by
                                           the SUT and other systems can be processed as
                                           intended?

                                            Have mechanisms been planned to exercise the SUT
                                            with all expected critical tasks (which the SUT is
                                            expected to support) and the peak expected
                                            workload?
                                           c. Prioritization for Development

                                            Have the use cases been prioritized, and has a
                                            process been identified to update the prioritization
                                            during the SDLC?

                                           Do all stakeholder organizations concur with a
                                           mutually understood categorization of defect
                                           severities, which categorizes defects into groups
                                           from most important to correct to less important to
                                           correct?

                                            Does scheduling of software development consider
                                            the availability of any developmental resources
                                            required to identify, design, and develop a working
                                            implementation of a functional use case?





                                             If the budget will not support developing all
                                             expected capabilities and requirements, is there a
                                             timely method identified to reduce the scope of the
                                             developmental effort?

Automated Data Collection and Analysis
ADCA techniques will systematically
collect the necessary test data to address
the objectives of the testing, help isolate
faults and diagnose defects, and produce
metrics. The objective of ADCA for any
SUT undergoing IQA is the automated
management of test cases and their
associated results in order to produce a
product dashboard for the test.
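The dashboard roll-up described above can be sketched as a small aggregation function; the pass-rate thresholds and the use of Aqua for "not yet executed" are assumptions for illustration, not values defined by this guide:

```python
def dashboard_metric(results):
    """Map a list of test-case outcomes ('pass'/'fail') to a stoplight metric.

    The thresholds below are illustrative placeholders, not program-defined
    criteria.
    """
    if not results:
        return "Aqua"       # assumed meaning: no test cases executed yet
    rate = sum(1 for r in results if r == "pass") / len(results)
    if rate >= 0.95:
        return "Green"
    if rate >= 0.80:
        return "Amber"
    return "Red"

print(dashboard_metric(["pass"] * 19 + ["fail"]))   # prints Green
```

A test management tool that records pass/fail per test case can drive such a roll-up automatically, which is what makes the product dashboard feasible without manual counting.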
                                             a.   Requirements Management Tool

                                             Does the tool have a mechanism to assign attributes
                                             to each requirement?


                                             Does the tool integrate with the test management
                                             tool?
                                             Does the tool support incremental allocation of
                                             requirements to releases and baselining functionality?


                                             Does the tool have a user-friendly mechanism to
                                             produce graphical metrics displays (e.g., dashboards,
                                             results aggregated using specified criteria, etc.)?


                                             b.   Test Case Management Tool

                                             Does the tool have a user-friendly mechanism to
                                             produce graphical metrics displays (e.g., dashboards,
                                             results aggregated using specified criteria, etc.)?


                                             Does the tool have the capability to report problems
                                             against individual steps of a script?


                                             Does the tool have levels of review and management
                                             functionality for test quality assurance?






                                           Can the tool be operated by the required numbers of
                                           T&IVV and stakeholder personnel?


                                           c.   Capacity Testing Tool(s)

                                           Have the capacity testing tools been identified for
                                           processing, communications, database, and
                                           workload?

                                            Have all datasets required for capacity testing been
                                            clearly planned?
                                           If capacity testing is assigned to one or more
                                           organizations, has the necessary coordination been
                                           finalized?

                                            Do the scripts planned for capacity testing cover a
                                            credible range of conditions?
                                           d.   Technical Inspection Tool(s)

                                           Do T&IVV personnel have access to the same
                                           graphic tools used by systems engineers?


                                           Has the appropriate technical review checklist of
                                           standards been approved?
                                           Is there a documented final adjudication of the
                                           design (or scrum goals) to proceed with
                                           development?

                                           Have all components called for by EA doctrine been
                                           clearly identified?
                                           e.   SCQC Tool(s)

                                           Have the static and dynamic SCQC tools and
                                           qualified personnel to operate, analyze, and report
                                           from them been identified and made available?


                                            Have reservations been made for the needed test
                                            environment, data, and scripts to conduct SCQC?


                                            Have any planned SCQC activities been coordinated
                                            with the IA certifying organization(s)?





                                           Whenever possible, will the SCQC tools be obtained
                                           using the most economical enterprise level licenses?


Installation Documentation
Installation documentation is a formally
approved document that provides
step-by-step guidance on installing a new
application/system and upgrading from a
previous version to a new one. The
document contains instructions on how to
conduct the installation onsite or through
remote access.
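The platform-requirement items below (hard disk, memory, operating system) lend themselves to an automated pre-installation check; a minimal sketch, assuming placeholder thresholds and supported platforms rather than requirements taken from this guide:

```python
import platform
import shutil

# Placeholder requirement; a real installation guide would state the value.
MIN_FREE_DISK_GB = 10

def preinstall_check(install_path="/"):
    """Return named checks, each True when the host satisfies it."""
    free_bytes = shutil.disk_usage(install_path).free
    return {
        "platform_supported": platform.system() in ("Linux", "Windows"),
        "disk_space": free_bytes >= MIN_FREE_DISK_GB * 2**30,
    }

# An installer would abort (or warn) when any check fails:
ready = all(preinstall_check().values())
```

Scripting the prerequisites this way lets the same checks be run identically during installation testing and during the actual onsite or remote installation.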
                                           a.   System Administrator Instructions

                                           Have the minimum requirements (training and
                                           applicable certifications) to conduct the system
                                           administrator activities been clearly documented?


                                            Do the instructions address clear procedures (clearly
                                            and completely documented) for diagnosing system
                                            problems, remote and local monitoring of the SUT,
                                            and monitoring security administration?


                                           Do the instructions address clear procedures for
                                           database consistency, restoration of system
                                           databases, initialization of database services,
                                           management of free space with thresholds,
                                           configuration of memory, and data caches?



                                           Do the instructions address the hard disk, memory,
                                           operating system, and platform requirements for the
                                           installation of the SUT?


                                           b.   Deployment Instructions

                                            Do the instructions have user-targeted sections that
                                            address the business Operational View derived from
                                            business requirements and the functional Service
                                            View derived from technical requirements?






                                            Do the instructions contain the system configuration
                                            and, where applicable, network and deployment
                                            diagrams?

                                           Do the instructions address the required skills and
                                           knowledge?
                                            Are the instructions consistent with the product
                                            release planning?
                                           c.   Production-Representative Environment

                                           Has the production-representative environment been
                                           identified and thoroughly assessed?


                                            Has the production-representative environment been
                                            set up for a dedicated phase of initial operational test
                                            and evaluation?


                                            Is there a demonstrated ability to collect and analyze
                                            quality data in the production-representative
                                            environment?

                                           Have procedures for deploying the IT product in the
                                           production-representative environment been
                                           documented?

                                           d.   Installation Instruction Testing



                                           Have the instructions for software installation been
                                           clearly and completely documented?


                                            Have the instructions been tested in a production-
                                            representative environment?

                                           Have the necessary corrections been made to the
                                           installation documentation and other applicable
                                           artifacts following the test run(s)?

                                            Have the instructions been re-tested after the
                                            installation documentation was updated?






Test Cases
A test case is an explicit explanation of
procedures to determine the degree to
which a requirement has been met by a
product. Each requirement requires at least
one test case. Most requirements require
iterations of the test case under varied
conditions. Some requirements require
multiple test cases to fully resolve. A test
case specifies any script, data,
infrastructure, personnel, etc. required for
its execution.
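The rule that every requirement needs at least one test case can be checked mechanically from the test-management data; a minimal sketch, with hypothetical identifiers and fields:

```python
from dataclasses import dataclass, field

@dataclass
class TestCase:
    """Explicit procedure for resolving (part of) one requirement."""
    requirement_id: str
    script: str             # reference to the test script or procedure
    data: str               # dataset the execution requires
    infrastructure: str     # reserved test environment
    iterations: list = field(default_factory=list)  # varied conditions to sample

def coverage_gaps(requirement_ids, test_cases):
    """Return requirements lacking the mandated minimum of one test case."""
    covered = {tc.requirement_id for tc in test_cases}
    return [r for r in requirement_ids if r not in covered]

cases = [TestCase("REQ-001", "scripts/login.py", "users.csv", "TestEnv-A",
                  ["nominal", "peak load"])]
print(coverage_gaps(["REQ-001", "REQ-002"], cases))   # prints ['REQ-002']
```

When the requirements management tool and test case management tool are integrated, this traceability check can run continuously rather than as a manual audit.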

                                                a.   Functional Requirements Test Cases

                                                Are appropriate test cases planned for each variety of
                                                testing needed to completely resolve each functional
                                                requirement?


                                                Have the failure definition and scoring criteria been
                                                predefined for each functional test case?


                                                Are suitable user SMEs available for clarifying
                                                functional requirements and consulting on the
                                                results?

                                                Have sufficient iterations of each functional test
                                                case been planned for sampling the factors and
                                                conditions?

                                                b.   Technical Requirements Test Cases

                                                Have sufficient iterations of each technical test case
                                                been planned for sampling the factors and
                                                conditions?

                                                Have the failure definition and scoring criteria been
                                                predefined for each technical test case?


                                                Are suitable engineers and developers available for
                                                clarifying technical requirements and consulting on
                                                the results?






                                           Are appropriate test cases planned for each variety of
                                           testing needed to completely resolve each technical
                                           requirement?


                                           c.   Compatibility Test Cases

                                           Are all mandatory infrastructure requirements
                                           adequately represented by the infrastructure within
                                           the test environment, which enables infrastructure
                                           compatibility testing?



                                           If infrastructure is virtualized, has it been validated
                                           as sufficient for testing by the engineers?


                                           Will sufficient user test players from Service
                                           Components participate to ensure that the technical
                                           solution represented by the SUT can work in the
                                           home organizational environment?



                                           Does the plan for testing include backward and
                                           forward application and data compatibility with
                                           legacy systems that must remain operational?


                                           d.   Data for System Under Test (SUT)

                                           Has specific data been identified for executing the
                                           different iterations of the test cases when data
                                           processing is required?

                                           When data exchange between different Component
                                           organizations is part of the test case, has the
                                           necessary coordination among those organizations
                                           been completed to determine if the particular test
                                           case observation represents successful performance
                                           of the requirement?



                                           Do all T&IVV personnel have the mandatory
                                           clearances for the data they might observe during
                                           testing?





                                               Has sufficient data been planned for or developed to
                                               enable the needed capacity testing?


Certification and Accreditation Data
Collection and Analysis
This checkpoint comprises the collective
procedures that must be accomplished to
satisfy any applicable constraints regarding
the security and dependability of the
system, which must be authoritatively
signed off before deployment.
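The Computer Security items that follow (roles, privileges, access controls, and access logging) can be illustrated with a minimal access-control sketch; the roles and privileges named here are hypothetical, not taken from the iEHR security design:

```python
# Minimal role-based access-control sketch; roles and privileges are
# illustrative placeholders.
ROLE_PRIVILEGES = {
    "clinician": {"read_record", "write_note"},
    "administrator": {"read_record", "manage_accounts"},
    "auditor": {"read_audit_log"},
}

ACCESS_LOG = []   # the checklist asks whether the system logs all accesses

def check_access(user, role, privilege):
    """Grant a privilege only if the role carries it; log every attempt."""
    granted = privilege in ROLE_PRIVILEGES.get(role, set())
    ACCESS_LOG.append((user, role, privilege, granted))
    return granted

check_access("alice", "clinician", "write_note")    # granted
check_access("bob", "auditor", "write_note")        # denied, still logged
```

Logging denied attempts as well as granted ones is what makes the audit trail useful for the COMPSEC review of privilege cancellation.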

                                               a.   Computer Security

                                               Have roles and privileges been defined along with
                                               appropriate access controls?
                                               Has any necessary in-depth security been planned
                                               and developed?
                                                Do the computer security provisions include any
                                                high-level directives that apply (e.g., single sign-on,
                                                two-level authentication, etc.)?


                                                Do the COMPSEC procedures address cancellation
                                                of access privileges when personnel move on or lose
                                                their clearance?

                                               b.   Information Assurance

                                                Does the SUT remediate all mandatory IAVAs?


                                               Does the SUT provide for confidentiality,
                                               availability, authentication, non-repudiation, and
                                               integrity?

                                                Does the test strategy include penetration testing,
                                                and has the necessary coordination been completed?


                                               Does the system log all accesses?

                                               c.   Continuity of Operation

                                                Does the system automatically back up critical data?






                                              Has the continuity of operation plan been developed?


                                               Were the plans for rollover and restoration tested?


                                              Does the SUT comply with legal constraints in
                                              regard to archiving data?
Supportability Assessment
This assessment deals with aspects of
logistics planning needed to sustain the
system after its acceptance by the customer
until its decommissioning.

                                              a.   Configuration Management Planning

                                              Has the configuration management plan with clear
                                              roles and responsibilities been developed?


                                              Are procedures in place and being exercised to
                                              ensure each article of the SUT is under configuration
                                              control?

                                              Have appropriate personnel from each organization
                                              involved in the configuration management activity
                                              been assigned?

                                              Are adequate records of configuration management
                                              activities and filed artifacts being maintained?


                                              b.   Problem Resolution Planning

                                              Does the test strategy explain how system problems
                                              will be tracked from observation through successful
                                              regression testing or cancellation of the
                                              requirements?

                                              Have promotion criteria been assigned for the SUT
                                              to pass from any distinct phase of software
                                              development to another?

                                              Have appropriate higher authorities been identified
                                              to whom problems of organizational or personnel
                                              performance can be escalated?






                                           Have appropriate representatives of stakeholder
                                           organizations been identified to participate in
                                           problem severity assignment?

                                           c.   Deployment Planning

                                           Is there a coordinated schedule for all sites requiring
                                           deployment coordination?
                                           Is there a plan to approve both limited and full
                                           deployment?
                                           Have sufficient resources been identified in terms of
                                           user expectations, the deployment team, training, and
                                           infrastructure?


                                            If deployment is to be accomplished remotely, is
                                            there a plan for doing so without disturbing
                                            ongoing mission operations?


                                           d.   Help Desk

                                           Have the users been informed regarding how to
                                           contact the appropriate helpdesk?
                                            If a user's request for assistance is not immediately
                                            responded to, does the help desk have procedures to
                                            follow up in a satisfactory manner?


                                           Does the front line helpdesk have diagnostic
                                           procedures to quickly resolve frequently asked
                                           questions and to disseminate lessons learned in a
                                           useful manner to all helpdesk personnel?



                                           Does the helpdesk have procedures to analyze trends
                                           and provide feedback for consideration in upgrading
                                           systems?

                                           e.   Infrastructure Planning

                                           Have the infrastructure items required at each type of
                                           organization and level of the enterprise been
                                           identified in engineering plans?





                                           Has the funding and schedule for implementing all
                                           infrastructure items been addressed?


                                            If a common development and test environment is
                                            envisioned for the SUT and its deployment, have the
                                            necessary organizational agreements been completed?


                                           If the infrastructure plan says that applications will
                                           be distributed to user carried devices, has the plan for
                                           remotely installing the applications and keeping
                                           them updated been completed?




                                           f.   Maintenance Concept

                                           Is there a plan for long term sustainment of the
                                           system?
                                           Is there an approach to detect performance anomalies
                                           indicating required maintenance is imminent?


                                           Do non-enterprise organizations connected with
                                           implementation of the SUT acknowledge their
                                           network infrastructure and training expectations?


                                           Is there a centralized online resource for users and
                                           administrators to learn about successful corrective
                                           actions for certain system problems?


                                           g.   Modernization Planning

                                            When the SUT will result in retirement of a legacy
                                            system, has the transition plan been developed?





                                           When data processed by the SUT are subject to
                                           evolving standards, does the engineering planning
                                           address how to ensure that new data elements can be
                                            incorporated and that legacy data elements can
                                            continue to be supported?


                                           Are there provisions for reviewing, selecting, and
                                           resourcing approved changes in the system?


                                            Does the program office address user training as part
                                            of change management?
                                           h.   Training Planning

                                           Does the training plan address deployment and
                                           sustainment training?
                                            Does the training plan address end users,
                                            administrators, maintainers, customers, and the
                                            trainers themselves?

                                            Is there an adequate training support package to
                                            include system documentation for the user, a
                                            documented program of instruction, and any
                                            training support materials needed?



                                           For each category of the target audience, does the
                                           training address the tasks, conditions, and standards
                                           with performance criteria?


                                           i.   Transition Planning

                                            Is there a plan to migrate the SUT into the legacy
                                            environment and terminate legacy systems on an
                                            organized schedule?

                                            Does the engineering plan allow for and define steps
                                            to evolve from stovepipe IT solutions to enterprise
                                            solutions?

                                            Have all expected user community actions been
                                            coordinated and approved by appropriate authorities?





                                             Does the transition plan address any necessary
                                             backward or forward data transition, and also, does it
                                             specify safeguards to cleanse legacy data before
                                             migrating into the new system?



User and System Documentation Review
User documentation refers to written
guidance for operating, administering, and
maintaining the system. System
documentation refers to recorded technical
information needed by systems engineers
and programmers. Documentation may be
online or on a storage device.
                                             a.   Deliverables Review

                                             Have all deliverables called for by contract or
                                             agreement been delivered or remain scheduled for
                                             delivery?

                                             Is there an approved inspection checklist
                                             predetermined for content and quality of the contents
                                             in each deliverable document?

                                             Are software deliveries supported by appropriate
                                             system documentation (working installation
                                             instructions), source code, and compiled code?


                                             Are defined procedures available for each
                                             anticipated deliverable review?
                                             b.   User Clarity Rating

                                             Is the language at the required reading level?


                                             Are appropriate and sufficient illustrations provided?


                                             Does the documentation include a table of contents,
                                             an index, and a glossary?
                                             Does the documentation provide the essential
                                             information for each category of user?


                                             c.   Technical Accuracy and Completeness




                                                Does each set of instructions for operating and
                                                maintaining the system read and work as
                                                described?

                                                Is each use case, along with any required
                                                administrator actions, addressed by the
                                                documentation?


                                                If documentation will be provided online, are there
                                                means of efficiently navigating online sources and
                                                finding the needed information?


                                                Is the technical documentation rated as satisfactory
                                                by the engineers for fault isolation, maintenance
                                                planning, and modular upgrades?


User Satisfaction Survey after
Deployment
User satisfaction surveys seek feedback
from the different categories of users in all
user organizations to gauge how much they
approve of the system.

                                                Do the end users respond that the system supports
                                                their missions in a timely, accurate, and
                                                dependable manner?


                                                Do end users and customers rate the helpdesk and
                                                problem resolution procedures as satisfactory?


                                                Does the system save users time in accomplishing
                                                their duties (i.e. does it make their job easier)?


                                                Has a plan to conduct, analyze, and report a user
                                                survey been developed?
Software Code Quality Checking
(SCQC)
SCQC is the analysis of scans (source code
and executables) and related artifacts
(documentation) to ensure that the System
Under Review (SUR) satisfies specific
performance, maintainability, and usability
requirements and selected best practices.
…..


                                           a.   Static Code Analysis
                                           Is there a provision in effect for the Government to
                                           obtain source code from the IT product provider, and
                                           if so, is there an appropriate lab environment
                                           available in which static code analysis can be
                                           conducted?
                                           Is/are an appropriate SCQC tool(s) available for the
                                           IQA team and the development team to exercise in
                                           static code analysis of the coding language(s) used,
                                           and does/do the tools have a level of capability such
                                           that it/they should catch most potential instances of
                                           standards and best-practices non-compliance?
                                           Has the IT product provider planned to incorporate
                                           static code analysis during development and
                                           integration, inform the customer about the results
                                           and analysis, and to take appropriate corrective
                                           actions, if any are indicated?
                                           Have qualified and experienced personnel in static
                                           code analysis been earmarked to support the IT
                                           product?
                                           b. Dynamic Code Analysis
                                            Is there an operationally realistic DTE with data for
                                           the SUT to process, scripts (or at least scenarios) to
                                           support dynamic code analysis, interface
                                           representations, and upon which the SUT may be
                                           installed?
                                           Is/are an appropriate SCQC tool(s) available for the
                                           IQA team and the development team to exercise in
                                           dynamic code analysis of the coding language(s)
                                           used, and does/do the tools have a level of capability
                                           such that it/they should catch most potential
                                           instances of standards and best-practices non-
                                           compliance?
                                            Has the IT product provider allowed for slack in the
                                            development/integration schedule to accommodate
                                            any adjustments needed based on findings disclosed
                                            during dynamic code analysis?
                                           Have qualified and experienced personnel in
                                           dynamic code analysis been earmarked to support the
                                           IT product?
                                           c. Vulnerabilities
                                           Has the list of IAVAs for which the Federal
                                           Government has active waivers been obtained and
                                           considered?
                                            Has coordination been completed between the
                                            SCQC agents and the IA authorities so that SCQC
                                            can potentially reduce the time needed for
                                            certification?
                                           Do the SCQC tools selected for the product address
                                           the appropriate set of IA safeguards?
                                           Does the IT product provider have a clear plan to
                                           reduce vulnerabilities by using SCQC, rather than
                                           relying solely on manual inspection?
                                           d. Architectural View
                                            Does the design keep the layers of functional code
                                            implementation to fewer than four?
                                           Does the design rely upon standard data when such
                                           data have been prescribed?
                                            Is the design modularized so as to facilitate future
                                            maintenance?
                                            Does the design avoid unused code, make calls to
                                            appropriately identified components, and include
                                            sufficient comments for future coders to understand
                                            the processing?




Annex E. Product Measurements and Metrics
Evaluation Framework. An EF provides a high-level, but clear, representation of the IQA planned for an
iEHR IT product by its TIWG. It serves as an elementary tool to facilitate checking the scope planned in
a TEMP. An EF for an iEHR IT product is multi-dimensional. Recording an EF requires a many-to-
many relational database, which is often referred to as a Requirements Management (RM) automated tool.
Due to its multi-dimensional complexity, the reader might find it convenient to begin considering the
Requirements Structure (RS) by looking at a two-dimensional piece of the EF.
Understanding an EF relies on two central concepts:
    a. Standard requirements categories for an iEHR IT product (see the columns in Table below)
    b. Requirements coverage of the standard Critical Operational Issues (COIs) for an iEHR IT product
       (see the rows in Table below).
The matrix formed by the rows and columns in the Table represents a “Requirements Structure (RS).” The RS
portrayed by the Table combines with information in other dimensions to form the EF. For example, the RS
in the Table pertains to a distinct deployable increment, not the increments that might have preceded it or the
ones that might follow. Thus, time contributes a third dimension.
Each cell with an entry in the Table will have at least one requirement linked to it. (Additional cells might
have entries for some systems.) Each requirement linked to a cell in the Table will have at least one measure
linked to the requirement. (Often there may be several measures, and some measures might not need to
be executed except for fault isolation or analytical detail.) Each measure will have at least one data
source (testing or inspection) linked to it. Some measures will be repeated across multiple data sources.
In nearly all cases, the complexity of an EF for an iEHR IT product constrains it to a simplified recording
within the RM tool, e.g., some links in the RM tool will not be recorded unless needed for fault isolation
or analytical detail. For example, this is often the case for CTPs. CTPs often relate to multiple rows,
despite the simplification in the Table, which shows any requirement only one time. The EF relates
to a widely used conceptual tool in software development called a Requirements Traceability Matrix
(RTM). Traceability concerns two distinct concepts:
    a. First, any change in requirements or the increment to which they are assigned must be approved
       by the appropriate authority. When a requirement is changed from association with one
       increment to another, the change must also be approved by the appropriate authority. The RTM
       maintains versions in archive, with records of who approved which requirements changes,
       when the changes were approved, when the changes were made in the RM tool, and who
       made the changes in the RM tool.
    b. Second, an RTM maintains the logical connections between requirements. This information
       helps support analyzing the scope of any regression testing that might be needed. However, once
       again, the logical connections are usually simplified to a superficial level.
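The many-to-many links described above (requirement to measures, measure to data sources) can be sketched in a few lines of code. This is an illustrative sketch only, not part of any prescribed RM tool; all identifiers and data-source names are hypothetical.

```python
# Hypothetical sketch of the EF links an RM tool records:
# requirement -> measures -> data sources (many-to-many).
requirements = {
    "REQ-001": ["MOE-01", "MOP-03"],  # measures linked to the requirement
    "REQ-002": ["MOE-02"],
}

measures = {
    "MOE-01": ["DT event", "Inspection"],  # data sources linked to the measure
    "MOP-03": ["DT event"],
    "MOE-02": ["Interface test"],
}

def data_sources_for(req_id):
    """Trace one requirement through its measures to all of its data sources,
    e.g., to scope the regression testing needed after a requirements change."""
    sources = set()
    for measure in requirements[req_id]:
        sources.update(measures[measure])
    return sorted(sources)

print(data_sources_for("REQ-001"))  # ['DT event', 'Inspection']
```

In a real RM tool these links carry approval records and version history as well; the sketch shows only the traceability structure itself.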

EF Terms of Reference and Definitions
1. Essential Business Functions (EBFs). EBFs will typically be measured using MOEs (successes
   divided by attempts) and analyzed using a variety of MOPs calculated using EBF observations sorted
   into sets based on factors and conditions. (The term “EBF” equates to the term Critical Mission
   Function (CMF) within the DoD.)
2. EBF Performance Criterion(ia). The Business Process Support (BPS) COI criterion(ia) will typically
   be measured by MOEs calculated by EBF successes divided by EBF attempts for relevant sets of EBF
   observations.
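The MOE arithmetic above (successes divided by attempts) can be shown in a minimal sketch; the counts are hypothetical.

```python
def moe(successes, attempts):
    """Measure of Effectiveness: EBF successes divided by EBF attempts."""
    if attempts == 0:
        raise ValueError("no EBF attempts observed")
    return successes / attempts

# Hypothetical set of EBF observations: 47 successes in 50 attempts.
print(moe(47, 50))  # 0.94
```

The same ratio applies to the interface, query, and task criteria below, with the relevant success and attempt counts substituted.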

3. Interface Performance Criterion(ia). The IOP COI criterion(ia) will typically be measured by MOEs
   calculated by interface transmission successes divided by interface transmission attempts for relevant
   sets of interface transmission observations. An interface transmission success means that the data
   received can be processed as expected where received.
4. Data Retrieval Performance Criterion(ia). The Data Base Management (DBM) COI criterion(ia) will
   typically be measured by MOEs calculated by query successes divided by query attempts for relevant
   sets of query observations. A query success means that the data received is accurate,
   complete, and timely. FPs and engineers should design the system to achieve data validity.
5. Task Performance Criterion(ia). The Training COI criterion(ia) will typically be measured by MOEs
   calculated by task successes divided by task attempts for relevant sets of task observations. A
   task success relates to achieving a standard for a specified category of user under prescribed
   conditions.
6. Usability Ratings Criterion(ia). The Usability COI criterion(ia) will typically be measured by overall
   user ratings of selected categories of users in regards to the combination of user-friendliness and job
   utility. The users must be trained and have experience enough to be proficient in the new-system
   training tasks.
7. System Reliability Criterion(ia). The ultimate Supportability COI criterion(ia) will typically be
   measured by the percent of attempts in which the system performs EBFs as expected for selected sets
   of end users.
8. NSM Criterion(ia). The NSM COI criterion(ia) will typically be measured by the percent of time
   that network connectivity is available and supports EBF processing within timeliness scoring
   guidelines.
9. Access Control Criterion(ia). The COMPSEC COI criterion(ia) will typically be measured by the
   percent of times that verifications of role and privileges work as intended to both allow and prevent
   access.
10. Penetration Control Criterion(ia). The IA COI criterion(ia) will typically be measured by the percent
    of times that penetration defenses work to stop hacker penetrations.
11. Rollover Performance Criterion(ia). The COOP COI criterion(ia) will typically be resolved by
    verification that rollover achieves its planned performance in the event of a catastrophic failure.
12. System Performance: Response Time KPP. This KPP will typically be measured (characterized)
    using the distributions of EBF response times.
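Characterizing a response-time distribution typically means reporting order statistics such as percentiles. A minimal nearest-rank sketch follows; the observed times (in seconds) are hypothetical.

```python
import math

def nearest_rank_percentile(samples, p):
    """Return the p-th percentile of samples by the nearest-rank method:
    the smallest value with at least p percent of samples at or below it."""
    ordered = sorted(samples)
    k = max(1, math.ceil(p / 100 * len(ordered)))
    return ordered[k - 1]

# Hypothetical EBF response times observed during a test event (seconds).
times = [0.8, 1.1, 0.9, 2.4, 1.0, 1.3, 0.7, 1.9, 1.2, 1.0]
print(nearest_rank_percentile(times, 50))  # 1.0 (median)
print(nearest_rank_percentile(times, 90))  # 1.9
```

Reporting a high percentile (e.g., the 90th) alongside the median guards against a few slow outliers being hidden by an average.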
13. System Performance: Simultaneous Processing KPP. This KPP will typically be measured by both
    the number of simultaneous EBF executions achieved and the number of simultaneous application
    logons achieved.




14. Net Ready - KPP. Ultimately, this KPP is resolved by whether the SUT achieves its Authority to
    Connect (ATC) or Interim ATC (IATC) and its Authority to Operate (ATO) or Interim ATO (IATO). 5
15. System Training KPP. See subparagraph Task Performance Criteria
16. Usability KPP.      This KPP will typically be resolved by assessment of individual usability
    requirements.
17. System Availability KPP. This KPP is usually resolved by being used as the Supportability COI
    criterion(ia). However, supporting measures should address verifications of ILS-Sustainment
    Planning and the percentages of attempts in which the system performs EBFs as expected for each
    end-user category.
18. Continuity of Operations (COOP) KPP. This KPP is typically measured with the Mean Time to
    Restore (MTTR) from a catastrophic failure.
19. Required Features KSA. This KSA is typically measured by an inventory of features achieved and
    not achieved.
20. Defense Information Standards Registry (DISR) Compliance KSA. This KSA is typically measured
    by certification or not of DISR compliance and by certification of Global Information Grid (GIG) net-
    worthiness.
21. Reliability KSA. This KSA is often measured using Mean Time Between Operational Mission
    Failure (MTBOMF), Mean Time Between specific Operational Mission Failures
    (MTBspecificOMFs), and Mean Time Between Failure (MTBF).
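Each of these mean-time metrics is the same arithmetic applied to a different failure category: total operating time divided by the count of failures of that type. A sketch with hypothetical numbers:

```python
def mean_time_between(total_operating_hours, failure_count):
    """MTBF-style metric: operating time divided by failures of a given type."""
    if failure_count == 0:
        return float("inf")  # no failures of this type observed in the period
    return total_operating_hours / failure_count

# Hypothetical month of operation: 720 hours, 3 failures total, 1 of them an
# operational mission failure (OMF).
print(mean_time_between(720, 3))  # MTBF: 240.0 hours
print(mean_time_between(720, 1))  # MTBOMF: 720.0 hours
```

Counting only OMFs yields MTBOMF, counting one specific OMF category yields MTBspecificOMF, and counting all failures yields MTBF.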
22. Maintainability KSA. This KSA is typically measured using MTTR, but SCQC assessment of code
    maintainability is recommended.
23. Total Ownership Cost KSA. This KSA is typically resolved using an assessment of demonstrated
    system performance and features to satisfy assumption in the Integrated Logistics Support
    (ILS)/Sustainability Plan.
24. Processing Segment Response Time CTP(s). This/these CTP(s) can be measured using response time
    distribution(s) by processing segment.
25. Output Capacity CTP(s). This/these CTP(s) can be measured using maximum size EBF output
    successfully achieved.
26. Paging CTP(s). This/these CTP(s) can be measured using page-ins per second demonstrated.
27. Network Capacity CTP(s). This/these CTP(s) can be measured using network retransmits per
    transmission and achievement of the minimal sizes of planned network interconnects.
28. Memory Capacity CTP(s). This/these CTP(s) can be measured using achievement of the minimum
    volatile memory capacity the system requires to meet the demands of functional requirements.
29. Simultaneous Memory Transactions CTP(s). This/these CTP(s) can be measured using a tool for
    processing capacity testing.
30. File System Capacity CTP(s). This/these CTP(s) can be measured using fixed storage capacity
    adequate for projected needs.




31. Disk Utilization CTP(s). This/these CTP(s) can be measured using achievement of application
    requirements within allotted disk allocation.
32. Graphical User Interface (GUI) CTP(s). This/these CTP(s) can be measured using achievement of
    GUI implementation guidelines (e.g., Application Program Interface (API)) and Section 508
    compliance.
33. Input/Output (I/O) Time CTP(s). This/these CTP(s) can be measured using the I/O wait percentage.
34. System Load Average CTP(s). This/these CTP(s) can be measured using verification of achieving the
    “light” SLA using a commercial tool.
35. Central Processing Unit (CPU) Utilization: User-State CTP(s). This/these CTP(s) can be measured
    using a commercial tool to verify achieving uptime criterion(ia) for user state.
36. CPU Utilization: System-State CTP(s). This/these CTP(s) can be measured using a commercial tool
    to verify achieving uptime criterion(ia) for system state.
37. Interfaces. See subparagraph Interface Performance Criteria
38. Protocols. See subparagraph Interface Performance Criteria
39. Starting Number of Users DFR. This DFR can be measured by achievement of required starting
    number of users.
40. Maximum Number of Users DFR. This DFR can be measured by achievement of required maximum
    number of users.
41. Number of Concurrent Users DFR. This DFR can be measured by achievement of required
    maximum number of concurrent users.
42. Health Insurance Portability and Accountability Act (HIPAA) compliance. This DFR can be
    measured by assessing roles and privileges complying with HIPAA (preventing accidental
    PII/PHI disclosures).
43. Data Archiving DFR(s). This DFR can be measured by assessing achievement of and compliance
    with retention criteria.
44. Administration DFR(s). This DFR can be measured by assessment of a Database Administrator’s
    (DBA's) ability to implement the administrators' plan to maintain tables, journals, and rollback logs.
45. Help DFR. This DFR can be measured by assessing the quality and timeliness of help responses.
46. Documentation DFR. This DFR can be measured by reviewing the clarity, consistency, accuracy,
    and completeness of user documentation (end-user, administrators, trainers, maintainers, and
    customers).
47. Training Support Package DFR(s). This DFR can be measured by assessing the Training Strategy and
    by conducting Training Evaluations.
48. User Friendliness DFR(s). See subparagraph Usability Rating Criteria.
49. Utility DFR(s). See subparagraph Usability Rating Criteria.
50. Roles and Privileges DFR(s). This DFR can be measured by assessing capabilities and
    implementation of roles and privileges.
51. Restoration DFR(s). This DFR can be measured by observation of successful rollover(s).
52. Backup DFR(s). This DFR can be measured by observation of successful rollover.


Test and Independent Verification and Validation                                     Version 0.6
Business Process Guide                             E-4                             April 05, 2012
53. Maintenance Shutdown DFR(s). This DFR can be measured by verification of shutdown and restore
    procedures.
54. Firewalls DTR(s). This/these DTR(s) can be measured by assessing achievement of specific
    guidelines for interfaces, ports, and protocols.
55. Health Industry Standards Constraints. These constraints can be measured by assessing compliance
    with Health Industry Standards.
56. Data Standards Constraints. See subparagraph Standards.
57. Section 508 Constraints. These constraints are measured by compliance with the law.
58. Standards. These constraints can be measured by assessing compliance with applicable technical
    architectural standards (data) approved for the EA. (Note: This includes compatibility verification
    using "gold standard" infrastructure and network architecture from the Components.)
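Several of the CTPs above (items 33–36) reduce to standard operating-system counters. As a minimal sketch, assuming a Linux host (a commercial monitoring tool reports the same figures, with more polish), the raw values behind I/O wait percentage, system load average, and user-/system-state CPU utilization can be sampled directly from /proc; the function names here are illustrative, not from this guide:

```python
import time

def _cpu_jiffies():
    """Return (user+nice, system, iowait, total) jiffies from /proc/stat."""
    with open("/proc/stat") as f:
        fields = [int(x) for x in f.readline().split()[1:]]
    user, nice, system, idle, iowait = fields[:5]
    return user + nice, system, iowait, sum(fields)

def cpu_utilization(interval=1.0):
    """Sample CPU time over `interval` seconds and return the percentage
    spent in user state, system state, and I/O wait."""
    u0, s0, w0, t0 = _cpu_jiffies()
    time.sleep(interval)
    u1, s1, w1, t1 = _cpu_jiffies()
    total = (t1 - t0) or 1  # guard against a zero-length sample
    return {
        "user_pct": 100.0 * (u1 - u0) / total,
        "system_pct": 100.0 * (s1 - s0) / total,
        "iowait_pct": 100.0 * (w1 - w0) / total,
    }

def load_average():
    """Return the 1-, 5-, and 15-minute system load averages."""
    with open("/proc/loadavg") as f:
        return tuple(float(x) for x in f.read().split()[:3])
```

A pass/fail check of a CPU Utilization CTP would then compare the sampled `user_pct` or `system_pct` against the criterion value while the specified workload is running.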




Table X: Requirements Structure Concepts

Legend: COI = Critical Operational Issue; EBF = Essential Business Function; KPP = Key Performance Parameter; KSA = Key System Attribute; CTP = Critical Technical Parameter; IER = Information Exchange Requirement; DFR = Derived Functional Requirement; DTR = Derived Technical Requirement; MOE = Measure of Effectiveness; MOP = Measure of Performance

1. Business Process Support
   EBF: MOE: EBF
   Criterion(ia): MOE: EBF
   KPP: MOE: System Performance: Simultaneous Processing KPP
   CTP: MOP: Processing Segment Response Time CTP(s); MOP: Output Capacity CTP(s); MOP: Paging CTP(s)
   DFR: MOP: Starting Number of Users DFR; MOP: Maximum Number of Users DFR; MOP: Number of Concurrent Users DFR; MOP: Free play
   Constraints: MOP: Health Industry Standards

2. Interoperability
   Criterion(ia): MOE: EBF, IOP
   KPP: MOE: Net Ready – KPP
   KSA: MOP: Required Features
   CTP: MOP: Network Capacity CTP(s)
   IER: MOP: Interfaces; MOP: Protocols

3. Database Management
   Criterion(ia): MOE: EBF, Datasource
   KSA: MOP: Defense Information Standards Registry (DISR) Compliance
   CTP: MOP: Memory Capacity CTP(s); MOP: Simultaneous Memory Transactions CTP(s); MOP: File System Capacity CTP(s); MOP: Disk Utilization CTP(s)
   DFR: MOP: Data Archiving DFR(s); MOP: Administration DFR(s)
   Constraints: MOP: Data Standards Constraints

4. Training
   Criterion(ia): MOE: Task Performance, User Categories
   KPP: MOE: System Training KPP (Task Performance, User Categories)
   KSA: Addressed by KPP and Criterion(ia)
   DFR: MOP: Help DFR(s); MOP: Documentation DFR(s); MOP: Training Support Package DFR(s)

5. Usability
   Criterion(ia): MOE: User Satisfaction, User Friendliness, Utility
   KPP: MOE: Usability KPP (User Satisfaction, User Friendliness, Utility)
   CTP: MOP: Graphical User Interface (GUI) CTP(s)
   DFR: MOP: User Friendliness DFR(s); MOP: Utility DFR(s)
   Constraints: MOP: Section 508 Constraints

6. Supportability
   Criterion(ia): MOE: System Availability; MOP: System Reliability
   KPP: MOE: System Availability KPP
   KSA: MOP: Reliability; MOP: Maintainability; MOP: Total Ownership Cost (TOC)

7. Network Systems Management (NSM)
   Criterion(ia): MOE: Network Availability, Reliability
   CTP: MOP: Input/Output (I/O) Time CTP(s); MOP: System Load Average (SLA) CTP(s); MOP: Central Processing Unit (CPU) Utilization: User-State CTP(s); MOP: CPU Utilization: System-State CTP(s)
   DTR: MOP: Firewalls DTR(s)
   Constraints: MOP: Standards

8. Computer Security
   Criterion(ia): MOE: Access Control
   DFR: MOP: Roles and Privileges DFR(s)
   Constraints: MOP: Health Insurance Portability and Accountability Act (HIPAA) compliance

9. Information Assurance
   Criterion(ia): MOE: Penetration Control
   KPP: MOE: Net Ready – KPP

10. Continuity of Operations
   Criterion(ia): MOE: Rollover
   KPP: MOE: Continuity of Operations (COOP) KPP (Rollover)
   DFR: MOP: Restoration; MOP: Backup; MOP: Maintenance Shutdown




Annex F. T&IVV Engagement
Initial Notification
   Engagement Model Intersection: Initiation
   Key T&IVV Purpose: Notification to the iEHR T&IVV Org about a product requiring T&IVV support from an authoritative source
   Activities:
    Receive service request with initial project document set
    Assess the scope of the product
    Identify key internal and external stakeholders
   Expected Inputs (from Whom):
    T&IVV Support Request from an authoritative source
    Business Case from the “Business Requirements Work Group (BRWG) / Program Sponsor”
   Expected Outputs (to Whom):
    Acknowledgement to the authoritative source
    Notification to key organizations of the initiation of the TIWG
   Minimal Entry Criteria: High-level project startup documents
   Minimal Exit Criteria: Completed T&IVV Support Request

Requirements Analysis
   Engagement Model Intersection: Planning, Requirements Development
   Key T&IVV Purpose: Requirements that are testable and conform to the attributes of well-written requirements
   Activities:
    Provide comments about testability of requirements
    Provide comments about the completeness of requirements
   Expected Inputs (from Whom):
    Requirements specification and supporting documents from BRWG and the Product Engineering team (for review)
    Baselined RTM from the “Requirements Manager (RM)” (for review)
    Initial Enterprise Architecture Views from the Product Engineering team
    Analysis of Alternatives from BRWG
   Expected Outputs (to Whom):
    Recommendations about testability, traceability, completeness, and performance characteristics to BRWG
   Minimal Entry Criteria:
    Business case
    Requirements specification and supporting documents
   Minimal Exit Criteria: Analysis of:
    Evaluation Framework
       o Use cases (with essential business requirements)
       o Threshold and objective values
       o Standards
    Support items
       o Hardware requirements
       o Training requirements

Risk Assessment (for T&IVV levels of effort (LOEs))
   Engagement Model Intersection: Planning, Requirements Development
   Key T&IVV Purpose:
    Determination of appropriate levels of T&IVV for the product*
    When there are changes in the scope of the product and more requirements details are available, the risk assessment will be revisited
   Activities:
    Identify the services to be delivered
    Tailor the T&IVV level of effort
    Coordinate appropriate resources when the risk assessment indicates the need
   Expected Inputs (from Whom):
    CONOPS
    Requirements specification and supporting documentation from BRWG
    Acquisition Strategy from Program staff
   Expected Outputs (to Whom):
    Risk Assessment
    T&IVV Scope Report (risks, issues), including qualitative and quantitative measures, to Release Authority(s), Risk Manager, and Product team
   Minimal Entry Criteria:
    Acquisition Strategy
    Requirements specification
   Minimal Exit Criteria: Minimal required T&IVV effort identified

Test Integration Work Group Initiation
   Engagement Model Intersection: Planning
   Key T&IVV Purpose: Establishment of the TIWG
   Activities:
    Develop TIWG Charter
    Finalize list of membership with participating organizations
   Expected Inputs (from Whom):
    Product Description from the Product team
    All relevant programmatic artifacts from the product team
   Expected Outputs (to Whom):
    TIWG Charter to coordinate, plan, execute, analyze, monitor, and report T&IVV activities and results to the stakeholder organizations
   Minimal Entry Criteria:
    Approved product budget
    Acquisition Strategy
   Minimal Exit Criteria:
    TIWG Charter signed
    TIWG schedule established


Test and Independent Verification and Validation                                  Version 0.6
Business Process Guide                               F-2                        April 05, 2012

Table columns: T&IVV Business Process (BP) Node | Engagement Model Intersection | Key T&IVV Purpose | Activities | Expected Inputs (from Whom) | Expected Outputs (to Whom) | Minimal Entry Criteria | Minimal Exit Criteria

BP Node: Design
Engagement Model Intersection: Architecture, Design
Key T&IVV Purpose: Product documentation review to ensure traceability, testability, and design constraints
Activities:
  - Review/inspect* product artifacts
    * Verify the artifacts for consistency, standards, accuracy, and completeness
Expected Inputs (from Whom):
  - Design documents, including conceptual, logical, and physical design documents, from the Product team
  - Requirements Traceability Matrix from the RM
  - Product defect matrix from the Quality Assurance team
  - Information Assurance Plan from the assigned Information Assurance (IA) Subject Matter Expert (SME)
  - Information Support Plan (Interoperability Plan) and System Engineering Plan
Expected Outputs (to Whom):
  - Testability analysis report to the Product Manager and Program Manager
  - Inspection/Review report (product metrics, process metrics) to the Product Manager and Program Manager
Minimal Entry Criteria: Views that cover the business process, the data processing and flow, the physical laydown, the integration of system components, and the network topology
Minimal Exit Criteria: Sufficient information to refine the test cases in terms of factors and conditions for the Design of Experiment (DoE)
BP Node: T&IVV Strategy Planning
Engagement Model Intersection: Planning
Key T&IVV Purpose: Tailor the overarching T&IVV Strategy so that it satisfies all applicable laws, regulations, and policies, has been approved, and has the commitment of each stakeholder organization
Activities:
  - Document the methodology, processes, and procedures for the conduct of T&IVV
  - Identify and document process and product metrics
Expected Inputs (from Whom):
  - Product description from the Product team
  - Acquisition Strategy from Program staff
  - Risk Register from the Risk Manager
Expected Outputs (to Whom):
  - T&IVV Process Guide to the Product team
  - Test and IV&V Strategy (TIS) to the Product team
  - Risks/Issues to the Product Manager and Risk Manager
  - Approved T&IVV Strategy Annex to the program staff
Minimal Entry Criteria: TIWG is running with empowered representatives of stakeholder organizations
Minimal Exit Criteria: Formally staffed and approved T&IVV strategy document

BP Node: T&IVV Resourcing
Engagement Model Intersection: Planning
Key T&IVV Purpose: Identification and obligation of required resources* by their respective authorities
  * Resources include funding, personnel, facilities, tools, and schedule.
Activities:
  - Identify minimal resources for effective and efficient execution of the T&IVV activities
Expected Inputs (from Whom):
  - Product description from the Product team
  - Risk Register from the Risk Manager
  - RTM from the RM
Expected Outputs (to Whom):
  - Resources for T&IVV to the Product Manager and Program Manager
Minimal Entry Criteria: Approved T&IVV Strategy
Minimal Exit Criteria: Budget obligation, stakeholder commitment for personnel, program management approval of schedule, environment reservation, and tools procurement covered

BP Node: Detailed Test Planning and Preparation
Engagement Model Intersection: Requirements, Development
Key T&IVV Purpose: Specific measures and their associated conditions allocated to each requirement
Activities:
  - Review test documentation to ensure adequate test coverage, including essential business requirements
Expected Inputs (from Whom):
  - RTM from the RM
  - Design artifacts from the Product team
  - Schedule from the Product Manager
  - Test Cases from the Product team
Expected Outputs (to Whom):
  - Review report to the Product Manager
  - Risks/Issues to the Product Manager
  - Informal feedback to the Product team
Minimal Entry Criteria:
  - Approved T&IVV Strategy
  - Draft test cases
  - Information Support Plan (ISP) / Interoperability Plan
Minimal Exit Criteria: Data instrument design completed, test cases fully populated, necessary data for the System Under Test (SUT) identified and assigned for production, and coordination for interface and test environment completed
BP Node: Development
Engagement Model Intersection: Development
Key T&IVV Purpose: Inspection of all design artifacts and reaffirmation of measures
Activities:
  - Inspect the development approach, coding standard, and traceability
  - Review source code
  - Gather Software Code Quality Checking (SCQC) data
Expected Inputs (from Whom):
  - RTM from the RM
  - Design artifacts from the Product team
  - Software code from the Product Development team
Expected Outputs (to Whom):
  - Inspection/Review report to the Product Manager
  - Risks/Issues to the Product Manager
  - Informal feedback to the Product team
  - SCQC report to the Product Manager
Minimal Entry Criteria:
  - Established development and test environment with automated tools
  - Assigned T&IVV personnel
Minimal Exit Criteria: Product/Process metrics collected


BP Node: Developmental Testing
Engagement Model Intersection: Developmental Test, User Acceptance Test, Independent Test
Key T&IVV Purpose: Product testing with the necessary analysis, reporting, and regression testing
Activities:
  - Generate independent reports regarding how well the system has achieved each and every requirement
  - Gather Software Code Quality Checking (SCQC) data
Expected Inputs (from Whom):
  - A testable product/component from the Product team
  - Design artifacts from the Product team
  - Test Cases (SIT, SAT, UAT) and scripts from the Product team
  - Software code from the Product Development team
Expected Outputs (to Whom):
  - Product Metrics (test reports) to the Product Manager
  - Risks/Issues to the Product Manager
  - SCQC report to the Product Manager
  - Informal feedback to the Product team
Minimal Entry Criteria: Accepted product and support items as deliverables
Minimal Exit Criteria:
  - Product/Process metrics collected
  - Information Assurance Certification as applicable
BP Node: Field Testing
Engagement Model Intersection: Release Management, Oversight Review
Key T&IVV Purpose:
  - Verification that the production-representative product works as expected when operated in the Field environment, by typical users, with realistic workload, and for mission tasks
  - Validation that the product satisfies the business needs as intended
Activities:
  - Ensure that the solution meets its mission requirements in the intended field environment
Expected Inputs (from Whom):
  - User documentation from the Product team
  - Operational Test and Evaluation Plan
  - Limited deployment authorization
Expected Outputs (to Whom):
  - Limited deployment Operational Test and Evaluation report to the Product Manager, Program Manager, and DDA
  - User Satisfaction survey result to the Program Manager
Minimal Entry Criteria:
  - Successful operational test readiness review
  - Approval of limited deployment by the deployment decision authority
Minimal Exit Criteria:
  - Authenticated field test database, field test report, and necessary regression testing
  - No unresolved shortfalls from Operational Test and Evaluation

BP Node: Deployment
Engagement Model Intersection: Production, Oversight Review
Key T&IVV Purpose: Analysis and reporting to the deployment decision authority of the degree to which defects have been resolved and the product is ready for deployment
Activities:
  - Verify that any needed regression testing has been accomplished and the solution is ready for deployment
Expected Inputs (from Whom):
  - Final user documentation from the Product team
  - Operational Test and Evaluation Report
  - Full deployment authorization
Expected Outputs (to Whom):
  - Full deployment Operational Test and Evaluation report to the Product Manager, Program Manager, and DDA
  - Lessons learned report to the Product Manager and Program Manager
Minimal Entry Criteria:
  - Regression test completed for any requirement that has not been deferred
  - Release evaluated as effective, suitable, and survivable
Minimal Exit Criteria:
  - Approval of full deployment by the deployment decision authority
  - Agreement in place for monitoring of the release by the product team and the T&IVV community


BP Node: Sustainment**
  ** "Sustainment" in the context of T&IVV refers to ensuring the continuing operation of a release, not the funding source to do so.
Key T&IVV Purpose: Reporting of analyzed information gathered from the field environment regarding how well the product continues to support the business need
Activities:
  - Execute continuous monitoring to ensure that the solution continues to meet its operational requirements and to work as intended
  - Ensure that regression testing after the implementation of applicable change requests (patches) has been successfully completed
Expected Inputs (from Whom):
  - Updated user documentation (if applicable) from the Product team
  - Full deployment authorization (Production Readiness Review)
Expected Outputs (to Whom):
  - User Satisfaction survey result to the Program Manager
  - Continuous Evaluation Report (as applicable) to the Product Manager and Program Manager
Minimal Entry Criteria:
  - A full deployment decision
  - Agreement in place for monitoring of the release by the product team and the T&IVV community
  - Plan to conduct customer surveys
Minimal Exit Criteria: A retirement or evolutionary upgrade decision

Notes:
   1.  Update metrics displays at least upon each completion of the exit criteria for a T&IVV BP node.
   2.  Nodes may be executed in parallel when appropriate.
   3.  All sprints must include an integration phase with testing.
   4.  Deviations from the T&IVV BP described here must be explicitly approved by the PM.
   5.  IV&V occurs continuously throughout the T&IVV BP.
   6.  The scope of “Developmental Testing” includes the following varieties of testing:
       - Functional
       - Technical
       - IOP
       - IA
       - Training package
       - Capacity
       - Compatibility
       - User (subject-matter expert, SME) acceptance testing
       - Integration testing.
    7. The T&IVV BP initiations may occur simultaneously for multiple products in a system family and for upgrades/enhancements/fixes of a
       particular product.
    8. Expected Inputs – Refers to what the T&IVV community expects to receive from external sources (external in the context of T&IVV). The
       T&IVV community will receive all “Expected Inputs” unless stated differently.

    9. Expected Outputs – Refers to what the T&IVV community provides to external teams. The T&IVV community will provide all “Expected Outputs”
        unless stated differently.
    10. Entry Criteria – Refers to the minimum conditions that must exist before the T&IVV community begins to address the stated objective(s).
    11. Exit Criteria – Refers to the minimum results that must be obtained at the completion of addressing the stated objective(s).
    12. Key T&IVV Purpose – Refers to why the T&IVV community cares about what occurs in this business node/circle.
    13. Activities – Refers to what the T&IVV community accomplishes during this node/circle.




Annex G. Intake Assessment / Risk Assessment
Background
In the DoD/VA T&IVV Business Process, risk assessment includes two distinct types of analyses. First, an
Intake Assessment determines the degree to which information about a product is available for the first
iteration of Criticality Analysis and Risk Assessment (CARA). The Intake Assessment gauges what
information shortfalls exist such that the following CARA would have an unacceptably large uncertainty.
The Intake Assessment begins during the development of the business case and continues through
requirements analysis, culminating in a report to the product team and Program Manager. Second, CARA
begins by addressing information collected during the Intake Assessment in order to determine the
appropriate levels of analysis and testing for a product. CARA is a risk-based methodology that identifies the
functions, system design elements, and software tasks that pose the greatest risk to the success of software
development initiatives. The key feature of CARA is that it considers risk and criticality in combination. CARA
is applied to individual measures rather than to the entire product collectively.
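The guide does not publish CARA's actual scoring rules, so the combination of risk and criticality can only be illustrated. The sketch below assumes a simple three-by-three rating product and hypothetical measure names; it is not the official CARA calculation.

```python
# Illustrative sketch only -- the guide does not define CARA's scoring
# formula. Assumes a 3x3 risk-by-criticality product with hypothetical
# measure names.

RATING = {"low": 1, "medium": 2, "high": 3}

def cara_score(criticality: str, risk: str) -> int:
    """Combine a criticality rating and a risk rating into one score (1-9)."""
    return RATING[criticality] * RATING[risk]

def assess_measures(measures):
    """Score each measure independently, since CARA is applied to
    individual measures rather than to the product as a whole."""
    return {name: cara_score(c, r) for name, (c, r) in measures.items()}

# Hypothetical measures for a product under assessment
scores = assess_measures({
    "patient-lookup": ("high", "medium"),
    "report-export": ("low", "low"),
})
```

Scoring per measure rather than per product lets high-risk functions drive additional testing without inflating the level of effort for low-risk ones.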
Purpose
The purposes of CARA are to:
          1   Provide a mechanism to identify and mitigate potential developmental and operational risks;
          2   Provide inputs to the Product Manager to adjust the product schedule by identifying the expected
              level of effort (LOE) of T&IVV testing required for the product;
          3   Assist the product development team to develop and deliver a quality software product;
          4   Facilitate early involvement of the T&IVV community in the product development life cycle;
          5   Through early involvement and joint exercise, reduce reliance on Operational Test and
              Evaluation (OT&E) to find deficiencies and shortfalls in system effectiveness, suitability, and
              survivability; and
          6   Support the identification of product risks throughout the product development life cycle.
Process
Risk-Based Testing depends on the T&IVV risk assessment process. That process starts with Intake
Assessment and transitions to CARA. The risk assessment process will be performed for each iEHR
product.
The evaluation criteria for CARA analysis depend on the product type (COTS, GOTS, Open Source, or
Development). Each product will undergo a set of CARA analysis iterations. Factors that cause the CARA
analysis to be repeated include, but are not limited to, the following:
         Change in product scope
         Change in requirements
         Change in design approach
         Change in resources
         Evolution of the Evaluation Framework
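The guide lists these triggers but defines no formal check for them. As an illustration only, the decision to repeat CARA might be sketched as a simple membership test; the trigger names below are hypothetical labels for the factors listed above.

```python
# Illustrative only: the guide lists triggers for repeating CARA but
# specifies no formal check. Trigger names are hypothetical labels.

REPEAT_TRIGGERS = {
    "product_scope_change",
    "requirements_change",
    "design_approach_change",
    "resources_change",
    "evaluation_framework_evolution",
}

def should_repeat_cara(observed_changes) -> bool:
    """Repeat CARA when any observed change matches a known trigger.
    The guide's list is explicitly non-exhaustive, so real use would
    also allow ad hoc triggers."""
    return bool(REPEAT_TRIGGERS & set(observed_changes))
```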

The IV&V Agent assigned to conduct IV&V activities will closely collaborate with the product team
engineers, Integrated Quality Assurance (IQA) team, and the OT&E team in preparation to perform CARA
analysis. The iEHR testing methodology will help minimize the level of independent testing that would
normally be conducted by the OT&E team, by involving the OT&E team as part of the testing before full
product deployment.


                                 Figure 3. High-Level Criticality Analysis and Risk Assessment (CARA) Process

The flowchart contains two swimlanes. T&IVV Leadership: Receive Request for T&IVV Support -> Assign T&IVV
Agent. IV&V Agent: Receive Relevant Project Documents -> Evaluate Completeness of Documents for CARA ->
Complete? If No, Contact Product Manager and Get Updated Documents, then re-evaluate; if Yes, Prepare Inputs
for CARA -> Perform CARA Iteration -> Perform CARA Analysis and Determine LOE -> Generate & Send Report ->
Repeat CARA? If Yes, the assessment cycle repeats; if No, the process ends.
The following table describes the activities shown in the figure above.

 Activity                                                                   Description
 Receive Request for T&IVV Support                                                   The Deputy Chief Management Officer (DCMO), Director,
                                                                                      Operational Test and Evaluation (DOT&E), Component
                                                                                      Acquisition Executive (CAE), Program Executive Officer
                                                                                      (PEO), Program Manager (PM), or Product Manager
                                                                                      notifies the T&IVV leadership about the acquisition of a
                                                                                      new product or product upgrade and requests T&IVV
                                                                                      support.
 Assign IV&V Agent                                                                   Upon receipt of the T&IVV support request, the T&IVV
                                                                                      Director assigns an IV&V agent to support the product
                                                                                      regarding T&IVV following the procedures documented in
                                                                                      the T&IVV Business Process Guide.
 Receive Relevant Project Documents                                                  The IV&V Agent identifies key project documents to
                                                                                      support the CARA assessment and acquires the documents
                                                                                      from the Product Manager.
                                                                                      * See below for specific parameters to each CARA iteration




 Evaluate Completeness of Documents               The IV&V Agent reviews the documents to gain an
 for CARA                                          understanding of project background and key attributes as
                                                   specified for the current CARA iteration.
                                                  The IV&V Agent ensures that the process metrics
                                                   documented in the T&IVV Business Process Guide are
                                                   addressed.
                                                  The IV&V Agent determines whether the documentation
                                                   provided by the Product Manager is sufficient, in
                                                   accordance with the stated parameters, to perform the
                                                   CARA analysis.
                                                  During the evaluation of the documents provided, the IV&V
                                                   Agent focuses on CARA specific requirements, as well as
                                                   process and product specific metrics as defined in the
                                                   T&IVV Business Process Guide.
 Contact Product Manager and Get                  If the documents provided by the Product Manager do not
 Updated Documents                                 meet the CARA requirements, the IV&V Agent contacts the
                                                   Product Manager to request clarification or updated
                                                   documents.
 Prepare Inputs for CARA                           The IV&V Agent completes the testing intake assessment
                                                    form for the product.
 Perform CARA Iteration                           The IV&V Agent performs CARA calculations to determine
                                                   the level of risks associated with the product. The iterations
                                                   /phases of CARA are based on the Evaluation Framework.
                                                   Some of the artifacts that will be used for the first iteration
                                                   of CARA assessment may not be ready at the time of the
                                                   assessment. Therefore, the parameters used during the first
                                                   iteration of CARA and the later iterations may differ
                                                   slightly.
 Perform CARA Analysis and                        The IV&V Agent evaluates the CARA calculations results
 Determine LOE                                     and determines the level of required testing.

 Generate and Send Report                          The IV&V Agent develops a report based on the CARA
                                                    calculation results and sends the report to the Product
                                                    Manager and the T&IVV leadership.
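The "Perform CARA Analysis and Determine LOE" activity maps calculation results to a required level of testing, but the guide does not publish the thresholds. A hedged sketch, with entirely hypothetical cutoffs and tier names:

```python
# Hypothetical thresholds -- the guide does not specify how CARA
# calculation results map to a testing level of effort (LOE).

def determine_loe(score: int) -> str:
    """Map a combined CARA score (1-9) to an illustrative LOE tier."""
    if score >= 6:
        return "full"      # full developmental and integration testing
    if score >= 3:
        return "targeted"  # focused testing on higher-risk measures
    return "minimal"       # regression/smoke testing only
```

In practice the IV&V Agent's judgment, not a fixed lookup, drives the final LOE; the point of the sketch is only that the mapping is monotone in the combined risk-criticality result.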

       Parameters for Initial CARA assessment
            o Concept of Operation (user types, locations, business objectives, future business process)
            o Enterprise Architecture /SOA
            o Requirements Documentation
            o Critical Operational Issues
            o Criteria
            o Constraints
            o Essential Business Functions
            o Information Exchange Requirements
            o Key Performance Parameters

       Parameters for Subsequent CARA assessments
            o Concept of Operation (user types, locations, business objectives, future business process)
            o Enterprise Architecture
            o Requirements Documentation
            o Design Artifacts
            o Interface Control
             o Key System Attributes
            o Derived Functional Requirements
            o Derived Technical Requirements
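The two parameter sets above differ between the initial and subsequent CARA iterations. Purely as an illustration, they can be captured as per-iteration checklists supporting the "Evaluate Completeness of Documents" activity; the dictionary layout and function name are conventions of this sketch, not part of the guide.

```python
# Checklists transcribed from the parameter lists above; the data
# layout itself is illustrative, not prescribed by the guide.

CARA_PARAMETERS = {
    "initial": [
        "Concept of Operation",
        "Enterprise Architecture / SOA",
        "Requirements Documentation",
        "Critical Operational Issues",
        "Criteria",
        "Constraints",
        "Essential Business Functions",
        "Information Exchange Requirements",
        "Key Performance Parameters",
    ],
    "subsequent": [
        "Concept of Operation",
        "Enterprise Architecture",
        "Requirements Documentation",
        "Design Artifacts",
        "Interface Control",
        "Key System Attributes",
        "Derived Functional Requirements",
        "Derived Technical Requirements",
    ],
}

def missing_documents(iteration: str, received) -> list:
    """Return checklist items not yet received for the given iteration,
    mirroring the 'Evaluate Completeness of Documents' activity."""
    return [p for p in CARA_PARAMETERS[iteration] if p not in set(received)]
```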




Attachments: Intake Assessment Form; CARA Workbook-Template-0.xls




Annex H. Acronyms
Acronym            Definition
ADCA               Automated Data Collection and Analysis
ADP                Automated Data Processing
AI                 Action Item
AIS                Automated Information System
AKO                Army Knowledge Online
AO                 Action Officer
AoA                Analysis of Alternatives
API                Application Program Interface
APL                Application Lifecycle Management
ATC                Authority to Connect
ATO                Authority to Operate
BCL                Business Capabilities Lifecycle
BP                 Business Process
BPG                Business Process Guide
BPMN               Business Process Modeling Notation
BPS                Business Process Support
C&A                Certification and Accreditation

C&L                Capabilities and Limitations

CARA               Criticality Analysis and Risk Assessment
CCB                Configuration Control Board
CDR                Critical Design Review
CDTE               Common Development and Test Environment
CE                 Continuous Evaluation
CIO                Chief Information Officer
CIS                Center for Internet Security
CM                 Configuration Management
CMF                Critical Mission Function
COI                Critical Operational Issue
COIC               Critical Operational Issues and Criteria
COMPSEC            Computer Security
CONOPS             Concept of Operations
COOP               Continuity of Operation
COTS               Commercial Off-The-Shelf
CPI                Continuous Process Improvement
CPU                Central Processing Unit
CTO                Chief Technology Officer
CTP                Critical Technical Parameters
DASD
DB                 Database
DBA                Database Administrator
DBM                Data Base Management
DBS                Defense Business System
DCMO               Deputy Chief Management Officer
DD                 Distributed Development
DDA                Deployment Decision Authority
DDT&E              Director, Developmental Test and Evaluation
DFR                Derived Functional Requirement
DISA               Defense Information Systems Agency
DISR               Defense Information Standards Registry
DoD                Department of Defense
DoE                Design of Experiment
DOT&E              Director Operational Test and Evaluation
DT&E               Developmental Test and Evaluation
DTC                Development and Test Center
DTE                Development and Test Environment
DTR                Derived Technical Requirement
EA                 Enterprise Architecture
EBF                Essential Business Function
EF                 Evaluation Framework
EMD                Engineering and Manufacturing Development
ESE                Enterprise Systems Engineering
FAQ                Frequently Asked Question
FDSC               Failure Definition/Scoring Criteria

FoS                Family of Systems
FP                 Functional Proponent
FSO                Field Security Operations
FT&E               Field Test and Evaluation
GFE                Government Furnished Equipment
GIG                Global Information Grid
GOTS               Government Off-the-Shelf
GUI                Graphical User Interface
HIPAA              Health Insurance Portability and Accountability Act
HIT                Health Information Technology
HSI                Human System Integration
IA                 Information Assurance
IATC               Interim ATC
IATO               Interim ATO
IAVA               Information Assurance Vulnerability Assessment
IDE                Integrated Development Environment
iEHR               integrated Electronic Health Record
IER                Information Exchange Requirement
IGCE               Independent Government Cost Estimate
ILS                Integrated Logistics Support
IMS                Integrated Master Schedule
IOC                Initial Operating Capability
IOP                Interoperability
IPO                Interagency Program Office
IPT                Integrated Product Team
IQA                Integrated Quality Assurance
ISP                Information Support Planning
IT                 Information Technology
IV&V               Independent Verification and Validation
JITC               Joint Interoperability Test Command
JMIS               Joint Medical Information Systems
JUON               Joint Urgent Operational Need

KPP                Key Performance Parameter
KSA                Key System Attribute
M&S                Modeling and Simulation
MDA                Milestone Decision Authority
MHS                Military Health System
MOE                Measure of Effectiveness
MTBF               Mean Time Between Failures
MTBOMF             Mean Time Between Operational Mission Failure
MTTR               Mean Time to Restore
NSM                Network System Management
NSS                National Security Systems
OCIO               Office of the Chief Information Officer
OE                 Operational Effectiveness
OIT                Office of Information and Technology
OS                 Operational Suitability
OT&E               Operational Test and Evaluation
PDR                Preliminary Design Review
PHI                Personal Health Information
PI                 Process Improvement
PII                Personally Identifiable Information
PM                 Program Manager
PMAS               Program Management and Accountability System
PMO                Project Management Office
POM                Program Objective Memorandum
PWS                Performance Work Statement
QA                 Quality Assurance
QASP               Quality Assurance Surveillance Plan
QSP                Quality Surveillance Plan
RA                 Risk Assessment
RDT&E              Research, Development, Test, and Evaluation
RI                 Rapid Initiative
RM                 Requirements Management

RMP                Risk Management Plan
ROI                Return-On-Investment
RS                 Requirements Structure
RSD                Requirements Specification Document
RTM                Requirements Traceability Matrix
SAT                System Acceptance Test
SCQC               Software Code Quality Checking
SDD                System Design Document
SDLC               System Development Life Cycle
SG                 Surgeons General
SLA                Service Level Agreement
SME                Subject Matter Expert
SoS                System of Systems
SUR                System Under Review
SUT                System Under Test
T&E                Test and Evaluation
T&IVV              Test and Independent Verification and Validation
TDM                Test Data Management
TEMP               Test and Evaluation Master Plan
TIWG               Test Integration Work Group
TKBS               T&IVV Knowledge Base System
TR                 Technical Requirement
TRR                Test Readiness Review
TSC                T&IVV Steering Committee
TSP                Training Support Package
TSQA               Total System Quality Assurance
URL                Uniform Resource Locator
USD                Under Secretary of Defense
USD AT&L           Under Secretary of Defense, Acquisition, Technology and Logistics
V&V                Verification and Validation
VA                 Department of Veterans Affairs
WBS                Work Breakdown Structure

WIPT               Working-level Integrated Product Team




Annex I. References
The following references provide background information for T&IVV.
    1. Test and Independent Verification and Validation (T&IVV) Steering Committee (TSC) Charter
    2. Joint DoD and VA T&IVV VA Lexicon
    3. T&IVV Knowledge Base System (TKBS)
    4. Department of Defense (DoD) Directive (DoDD) 5000.01, “The Defense Acquisition System,”
       November 20, 2007
    5. DoD Instruction (DoDI) 5000.02, “Operation of the Defense Acquisition System,” December 8,
       2008
    6. Defense Acquisition Guidebook, December 17, 2009.



