
Appendix D

US Marine Corps
Integrated Logistics Capability (ILC)

System Realignment and Categorization/Consolidation (SRAC)

SRAC Guide
Draft Version 2.9
March 2001


                                            Table of Contents

1.0   Introduction
2.0   Definitions
3.0   Objectives
4.0   Scope
5.0   Principles and Assumptions
6.0   Organizational Responsibilities
7.0   The SRAC Process
  7.1   Phase 1 – No Value AISs
  7.2   Phase 2 – Low Value AISs
  7.3   Phase 3 – High-Value AISs
    7.3.1   Phase 3, Part 1 – AIS Categorization
    7.3.2   Phase 3, Part 2 – Application Evaluation
    7.3.3   Phase 3, Part 3 – Domain Solutions
8.0   SRAC Methods and Tools
  8.1   SRAC AIS Nomination
  8.2   Functional Evaluation
  8.3   AIS Usage
  8.4   AIS Retirement Impact
  8.5   Total Ownership Cost (TOC)
  8.6   Technical Evaluation
    8.6.1   DII COE Compliance
    8.6.2   Application Technology Evaluation
    8.6.3   Technical Architecture Compliance LOE
    8.6.4   Documentation Evaluation
  8.7   Vendor/Developer Evaluation
    8.7.1   SRAC Vendor Worksheet
    8.7.2   SRAC Government Developer Worksheet
  8.8   Support Evaluation
  8.9   Overall AIS/Application Evaluation
  8.10  Domain Solution Evaluation
APPENDIX A
APPENDIX B
APPENDIX C
APPENDIX D








1.0    Introduction

The US Marine Corps logistics community uses over 175 Automated Information
Systems (AISs) to support logistics. These systems utilize a combination of in-house
developed application software, Government Off-the-Shelf (GOTS) software developed
by other Services, and a few Commercial-Off-the-Shelf (COTS) products. These systems
have evolved over time but were never designed to work together as an integrated
network of systems. They were originally designed to support stove-piped logistics
functions and outdated logistics processes of the 1960s. As time passed, the lack of an
overall development plan produced multiple systems with overlapping capabilities.

The Marine Corps can no longer afford to maintain such a large number of AISs with
overlapping functionality. In response to the USMC Integrated Logistics Capability
(ILC) Initiative, the Marine Corps has begun the System Realignment and
Categorization/Consolidation (SRAC) program to address problems with the current
logistics AIS portfolio. This document defines the SRAC process, methods/tools and
organizational responsibilities.

2.0    Definitions

SRAC is a complex, multi-phase, multidimensional decision process aided by standards
and common methods and tools. As such, it is essential that all key terms used to describe
the process, its constituent tasks, methods, tools and evaluation criteria be well defined.

Key words appearing in the text of this document are shown in red. In the Web version of
this document, placing your mouse cursor over red words will cause the definition to pop
up.

3.0    Objectives

As originally envisioned in the ILC engagements, SRAC was created as a means to
address the large number of legacy systems with redundant capabilities. By taking a
deliberate, methodical approach based on a common definition of logistics functions,
systems will be identified that have low value to the Marine Corps. Systems categorized
as low value are those that have few users, support few logistics functions relative to
other alternatives, and have costs that are not justified by perceived benefits. Higher-
value systems will also be evaluated based on their technology, interoperability, vendor
viability and support to determine their disposition.

The SRAC will use a phased approach that ensures maximum participation by vested
owners/users, minimum disruption to regular schedules, and optimum value to the
information technology (IT) re-engineering process.






The objectives of SRAC can be summarized as:

      1. Recommend which legacy systems should be replaced or migrated
      2. Create a detailed Migration Plan for legacy systems
      3. Create a detailed Integration Plan for end-state Migration Systems

4.0      Scope

SRAC applies to Logistics functions for ground supply chain management (including
aviation ground support) across the strategic, operational and tactical levels. As such, it
will deal with IT investments which support the following major functional domains:

1.       Transportation
2.       Supply
3.       Maintenance
4.       Health Services
5.       Engineering
6.       Acquisition
7.       General Services

The functional domains are listed in the order of priority for SRAC execution.
Transportation, Supply and Maintenance will be addressed simultaneously, followed by
Health Services, Engineering and Acquisition. General Services applications will be
considered as they are encountered within each functional domain. More detailed
descriptions of the scope of each domain are contained in MCWP 4-1 and in Appendix B
of this document. Definitions of the functions within each domain are contained in
Appendix D.

Automated information system (AIS) lists from several references were examined to
determine which applications would be considered in SRAC, including the Logistics
Information Resource (LOG IR) Plan, Version 2, and the ILC Engagement 1 listing. The
SRAC AIS Master List will be used as the initial list of existing applications to be
considered for SRAC.

Additional applications can be added to the master list by completing the AIS
Nomination Form (see section 8.1) and emailing the form to the address supplied with
it. At the end of the SRAC process, any USMC Logistics AISs that have not been tested
by the SRAC process will be retired.

5.0      Principles and Assumptions

SRAC will be based on the following principles and assumptions:

1. IT investments which are not used and/or supported will be eliminated.
2. The remaining IT investments will be evaluated first on the basis of how well they
   satisfy user functions within domains as defined by the Operational Architecture.





3. Duplication of function will be a primary criterion for downselecting existing IT
   investments.
4. Filling gaps in functional coverage will be a primary criterion for prioritizing new
   investments.
5. SRAC will proceed by functional domains that have been prioritized according to the
   objectives above.
6. Functional breakdowns will be defined by the best operational architecture (OA)
   available at the time of the SRAC that describes activities in a functional domain.
7. Final SRAC recommendations for high value applications will consider technical and
   cost criteria as well as functional evaluation.
8. COTS, GOTS and USMC-managed AISs will be given equal treatment in all
   evaluations of high-value applications. No evaluations will be conducted without
   considering potential COTS applications.

6.0    Organizational Responsibilities

Organizing to execute a complex SRAC process against over 175 AISs is a substantial
challenge. Decisions to cancel programs and retire AISs can only be made at high levels
of the organization. Fair and accurate evaluation of AISs can only be accomplished
via involvement of end users, operational subject matter experts (SMEs) and system
SMEs. At the same time, the SRAC program must dovetail with other on-going USMC
programs such as operational architecture development and technical assessments.

SRAC will utilize a bottom-up cascading of information gathering, analysis and decision-
making involving interlocking teams. In order to evaluate applications and propose
integrated solutions for each of these functional domains, 6 domain teams consisting of a
mixture of functional experts, users and systems SMEs will be formed. After a kickoff
workshop, each of these teams will be assigned a Web portal where they will meet to
gather categorization data, analyze systems, execute the SRAC process for their domain
and formulate recommendations.

The domain team recommendations will be passed to the SRAC Core Team that will
score the applications, migration strategies and integration scenarios and make
consolidated SRAC recommendations to the ILC Integrated Product Team (IPT). The
ILC IPT will validate the SRAC Core Team recommendations relative to operational
architecture and other on-going initiatives under ILC sponsorship. The ILC IPT will
formulate SRAC decisions or pass final recommendations to the Combat Service Support
Element (CSSE) Advocacy Board for major decisions.

The Center Of Software Management In the Corps (COSMIC) and MAGTF C4I
Systems/Technical Architecture & Repository (MSTAR) data repositories are being
evaluated as both sources and repositories for data collected by domain teams during the
SRAC process.






7.0      The SRAC Process

The SRAC process has four phases:

•   Phase 0 – Establish SRAC Process and Criteria
•   Phase 1 – No-Value AISs
•   Phase 2 – Low-Value AISs
•   Phase 3 – High-Value Applications & Domain Solutions

This document describes the results of Phase 0, which was conducted from October
through December of 2000. It will act as a guidebook for executing SRAC Phases 1, 2,
and 3. Phase 1 occurred in December 2000. Phase 2 will occur from February until
April 2001. SRAC Phase 3 will commence in May with the transportation, supply and
maintenance logistics domains. It is expected that the Phase 3 work for each domain will
take approximately 3 months.

Figure 1 illustrates the SRAC process.


[Figure 1 – SRAC Process: Start SRAC → Phase 0 (Establish Process/Criteria) →
Phase 1 (No Value AISs) → Phase 2 (Low Value AISs) → Phase 3 (High Value AISs,
then Domain Solutions) → Implement SRAC Results. Process and criteria are reviewed
between phases; Phase 3 runs once per AIS and once per functional domain, informed
by the new Operational and Technical Architecture (OA & TA).]

At a more detailed level, the SRAC process is made up of 53 steps associated with the
SRAC Phases 1 through 3. The steps are either tasks or decisions. Tasks are represented
by rectangles and decisions by diamonds in the detailed process diagrams discussed
below. Arrows show the general flow of the process, although sequence of tasks and
decisions may vary in some instances. As each step is discussed the supporting methods
and tools and organizational responsibility is also discussed.

7.1      Phase 1 – No Value AISs

The USMC can no longer afford to invest in Logistics applications that are not used,
not supported, or not supportable. The first pass of SRAC, Phase 1, was applied to 10
selected applications identified in the Combat Service Support Element Shared Data
Environment (CSSE SDE) initiative. AISs determined to be unused, unsupported or
unsupportable during SRAC Phase 1 were retired.

Figure 2 shows the process used in SRAC Phase 1.

[Figure 2 – SRAC Process, Phase 1 – No Value AISs: flowchart of steps 1
(Develop/Maintain SRAC Master List), 2 (Examine AIS), 3 (AIS Used?), 4 (AIS
Supported?), 5 (Draft Retirement Plan), 6 (Implement Plan) and 7 (Done List?); unused
or unsupported AISs pass to retirement planning, and the loop repeats until the list is
done, then moves to Phase 2.]


       Step 1 – Develop/Maintain SRAC AIS Master List

The scope of all SRAC actions is determined by the SRAC AIS Master List, which will
be maintained over time. The current version of the list is contained in Appendix C of
       this document. As USMC Logistics programs change status (e.g. retirement via the
       SRAC Phases 1 through 3), AISs will be deleted from the SRAC Master List. In some
       cases, (e.g. introduction of a new COTS or GOTS application), an application may be
       added to the list that replaces others that are being deleted.

       The SRAC AIS Master List is maintained by the SRAC Core Team in an Excel
       spreadsheet format to allow easy sorting and comparison of software tools used to
       support USMC logistics.
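To make the list-maintenance mechanics concrete, the sketch below shows the add and
remove operations in Python against a comma-separated export of the master list. The
file name and column headings are illustrative assumptions, not the actual SRAC
spreadsheet layout.

    import csv

    MASTER_LIST = "srac_ais_master_list.csv"   # hypothetical file name

    def load_master_list(path=MASTER_LIST):
        """Read the master list export into a list of row dictionaries."""
        with open(path, newline="") as f:
            return list(csv.DictReader(f))

    def remove_ais(rows, ais_name):
        """Drop an AIS retired via SRAC Phases 1 through 3."""
        return [row for row in rows if row["AIS"] != ais_name]

    def add_ais(rows, ais_name, ais_type, owner):
        """Add a new COTS/GOTS application that replaces deleted AISs."""
        return rows + [{"AIS": ais_name, "Type": ais_type, "Owner": owner}]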

       Step 2 – Examine an AIS

       Ten AISs were identified by ILC as part of the CSSE SDE initiative as requiring further
       investigation as potential no-value applications. SRAC Phase 1 applied the criteria to
       these ten applications and moved quickly into Phase 2.

The 10 inactive USMC Logistics AISs considered as potential no-value AISs were:

       1. Amphibious Assault Planner (AAP) – HQMC (LPO-3)
       2. Ammunition Logistics System (AMMOLOGS) – SYSCOM (PMAM)
       3. Knowledge Based Logistics Planning System (KBLPS) –




4. Logistics Information System (LIS) – MCLB (760)
5. Marine Corps Automated Readiness Evaluation System (MCARES) – HQMC (LPO-4)
6. Marine Corps Ammunition Requirements Management System (MCARMS) –
    MCCDC
7. Marine Corps Level of Repair Analysis (MCLORA) – SYSCOM
8. Principal End Item Stratification (PEI-STRAT) – MCLB
9. Prepositioning Planning and Execution AIS (PREPO AIS) – MCLB
10. Real Property Management /Family Housing System (RPM/FHS) – HQMC (LFF)

Step 3 – AIS Used?

The licensing, distribution and support records for each AIS are examined to determine if
the software is being used.

If there is no reason to believe that the program is being used, communication with the
POC is initiated to confirm. If no usage is encountered, or if plans are in place to cease all
usage of a program, the AIS is passed to Step 5, retirement planning.

If the small number of users does not justify the investment being expended, a user
impact statement is developed to be included in the retirement plan. In cases where the
operation of the AIS is critical, or it is the only system that performs an important
function, this situation should be reflected in the impact statement. Impact statements
should also capture migration recommendations for important functions not supported by
other AISs.

If there is an important reason for keeping the AIS, or a decision cannot be reached, the
AIS is retained on the SRAC Master List and passed forward into Step 4.

Step 4 – AIS Supported?

For each AIS on the list of 10 inactive logistics applications, the support resources are
determined. A supported program is one for which an organization that owns the support
of the AIS can be identified, that organization has developed or is developing a support
plan for the AIS, and the funding source for the support has been identified or committed.
If the AIS is found to be unsupported, plans are either put in place to correct the lack of
support or the AIS is passed to Step 5, retirement planning.

If the AIS is judged to have a support plan and/or active support, the technical
architecture of the AIS is examined by IT subject matter experts to determine whether the
program will continue to be supportable over time. If a finding of unsupportability is
reached, and no plans to re-engineer the AIS have been developed, the AIS is passed to
Step 5, retirement planning.
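Read as decision logic, steps 3 and 4 form a three-gate filter ahead of retirement
planning. A minimal sketch follows, assuming the SME findings have already been
reduced to booleans; the field names are invented for illustration.

    def phase1_disposition(ais):
        """Apply the SRAC Phase 1 gates (steps 3 and 4) to one AIS.

        `ais` is a dict of illustrative boolean findings; in practice these
        rest on licensing, distribution and support records, POC
        confirmation and SME review, not simple flags.
        """
        if not ais["used"] and not ais["reason_to_keep"]:
            return "retire"   # Step 3: unused, no plans for future usage
        if not ais["supported"] and not ais["support_fix_planned"]:
            return "retire"   # Step 4: no owning organization, plan or funding
        if not ais["supportable"] and not ais["reengineering_planned"]:
            return "retire"   # Step 4: architecture cannot be sustained
        return "retain"       # passes forward into SRAC Phase 2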






Step 5 – Draft Retirement Plan

Retirement plans for AISs found to be unused, unsupported or unsupportable are
developed by MARCORSYSCOM according to DoD 5000.1 requirements.


Step 6 – Implement Plan

The retirement plan will be executed by MARCORSYSCOM in conjunction with the AIS
POC and PM.

Step 7 – Done?

The examination of the AISs on the list of suspect logistics AISs continues until all
unused, unsupported and unsupportable programs have been identified and appropriate
retirement plans have been developed.

7.2    Phase 2 – Low Value AISs

After SRAC Phase 1 has been completed and all of the No Value AISs have been
eliminated, the SRAC Phase 2 process for Low Value AISs begins. In Phase 2 of SRAC,
low value AISs are identified, and those whose value is judged not to be cost-effective
are put into retirement planning. Phase 2 of SRAC consists of steps 8 through 21 of the
SRAC process, as shown in Figure 3.




                                        - 7-D-
        DRAFT Version 2.8, 02 FEB 2001                                                                Appendix D

                                            Figure 3
                                        SRAC Process
                                    Phase 2 - Low Value AISs




                                 8. Remove No
 9. Form Domain                    Value AISs
      Team                     fromSRAC Master
                                      List




                                   11. Calculate AIS                       12. Done            Yes           13. Determine Low
   10. Pick AIS
                                        Score                                List                                Value AISs



                                                                            No




                                                                                    15. Develop
                          17. Evaluate Low                                                                   14. Pick Low Value
                                                       16.Calculate TOC          Retirement Impact
                              Value AIS                                                                              AIS
                                                                                     Statement




                                                                  No


 19. Go To 5                                                                                                Go To
                     No       18. AIS         Yes          20. Done       Yes       21. Done         Yes
Execute 5 & 6                                                                                              Phase 3
                             Justified?                      List?                  Domains?
 Return Here                                                                                               Step 22


                                                                                         No



                                                                                      Go To
                                                                                      Step 9




        Step 8 – Remove No Value AISs from SRAC Master List

        In this step, the SRAC Core Team reduces the SRAC Master List by removing all the No
        Value AISs identified in SRAC Phase 1. The new shortened list is then available to be
        used in SRAC Phase 2.






Step 9 – Form Domain Teams

Domain teams are formed from functional, AIS user and AIS developer SMEs for the 6
logistics functional domains: transportation, supply, maintenance, health services,
engineering and acquisition. Each team is assigned a list of applications for their domain
by the SRAC Core Team and is set up on an on-line domain Web portal where the
categorization work will be performed. The domain teams will review the domain
application list and the domain functional breakdown at the domain team kickoff
workshop. Team leaders for each domain will also be determined, and training on the
on-line workspaces will be conducted at the domain team workshops.

Step 10 – Pick an AIS

The domain team picks an AIS from its list of domain applications.

Step 11 – Calculate AIS Score

Each AIS is mapped into the domain functions as determined in the domain Phase 2
workshop using the SRAC Functional Categorization Worksheet (see section 8.2), and
the number of functions supported by the AIS is recorded. The number of users of the
application is recorded using the SRAC AIS Usage Worksheet (see section 8.3). The
number of functions supported is then multiplied by the number of users and recorded as
the AIS score.

Step 12 – Done List?

Step 11 is repeated for every AIS on the domain list until the list is completed.

Step 13 – Determine Low Value AISs

The domain application list is sorted by ascending AIS score and potential Low Value
AISs are then selected from the top of the list. AIS scores and recommendations for Low
Value AISs are then passed by the domain team to the SRAC Core Team for evaluation.
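The arithmetic behind steps 11 and 13 is a single product followed by an ascending
sort. A minimal sketch in Python, assuming each record already carries the counts from
the worksheets in sections 8.2 and 8.3:

    def ais_score(functions_supported, user_count):
        """Step 11: AIS score = functions supported x number of users."""
        return functions_supported * user_count

    def low_value_candidates(ais_records, how_many):
        """Step 13: sort ascending by score and take candidates from the
        top of the sorted list. `how_many` is an illustrative cut, not a
        SRAC-defined threshold."""
        ranked = sorted(ais_records,
                        key=lambda a: ais_score(a["functions"], a["users"]))
        return ranked[:how_many]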

Step 14 – Pick Low Value AIS

An AIS is selected for further investigation from the Low Value AIS list determined in
Step 13.

Step 15 – Develop Retirement Impact Statement

A SRAC Retirement Impact Statement (see section 8.4) is developed for the AIS
selected in step 14. The statement is developed by the domain team and submitted to the
SRAC Core Team.








Step 16 – Calculate TOC

The total ownership cost (TOC) for the AIS selected in step 14 is calculated by collecting
cost data via the SRAC TOC Worksheet (see section 8.5). The TOC includes all
lifecycle costs of retaining the AIS in operation over a 5-year period, including:

•   Development/acquisition costs
•   Production costs
•   Operational costs
•   Support and maintenance costs
•   Retirement costs

TOCs are calculated in Phase 2 for all applications suspected of being Low Value AISs.
Optionally, the domain team may continue to collect TOC information for AISs which
will obviously pass on to Phase 3, Part 1 – High Value AISs since all of this information
will be required in Phase 3.
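As a sketch of the calculation, the TOC for one AIS is the sum of the five cost elements
over the 5-year window. The dictionary keys below mirror the bullet list above; they are
not the actual worksheet field names.

    TOC_ELEMENTS = (
        "development_acquisition",
        "production",
        "operational",
        "support_maintenance",
        "retirement",
    )

    def total_ownership_cost(costs):
        """Sum the five SRAC cost elements for the 5-year period.
        `costs` maps each element to its dollar figure."""
        return sum(costs[element] for element in TOC_ELEMENTS)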

Step 17 – Evaluate Low Value AISs

In this step, the SRAC Core Team balances the impact of AIS retirement against the
expected costs of continued operation and maintenance of the AIS. This step determines
whether the value of the AIS and the impact of retirement justify continued investment,
and whether potential low-value AISs from step 13 are truly low-value. The SRAC Core
Team completes the evaluation and recommends retirement of low-value AISs to the
ILC IPT.

Step 18 – AIS Justified?

The ILC IPT reviews recommendations of the SRAC Core Team. If it is determined that
the AIS investment is justified, the AIS is passed into SRAC Phase 3. If it is determined
that the investment is not justified, the AIS is moved into retirement planning.

Step 19 – Go To Step 5 (Execute Steps 5 and 6, Then Return)

If the ILC IPT determines that continued investment in the AIS is not justified, a
retirement plan is developed and implemented. In some cases, the retirement plan may
include recommendations for AIS functionality to be moved to another AIS. Such
recommendations will be passed on to Phase 3 of the SRAC.

Step 20 – Done List?

Steps 10 through 18 are executed as many times as necessary to process all of the AISs
for a domain team through the Phase 2 SRAC evaluation. When the last AIS on the
domain list has been processed and retirement plans have been started for unjustifiable
investments, the process moves on to Phase 2 for the next logistics domain.

Step 21 – Done Domains?

Steps 9 through 18 are executed for each of the 6 SRAC logistics domains. When all 6
are completed, SRAC moves on to Phase 3, which deals with High Value AISs and
integrated domain solutions.


7.3    Phase 3 – High-Value AISs

It is assumed that any AIS which survives into the Phase 3 process has sufficient value
that it cannot be eliminated without major impact to users and USMC missions and that
further SRACing will require functionality to be migrated to other applications.
Furthermore, the migration systems identified by the SRAC Phase 3 process will have to
be well integrated to support Focused Logistics and Operational Maneuver from the
Sea.

Figure 4 shows a high level summary of the SRAC Phase 3 process.


[Figure 4 – SRAC Process, Phase 3 – High Value AISs: Select Domain → Select AIS →
Functional Evaluation (driven by the OA) → Technical Evaluation (driven by the TA) →
Cost/Vendor Evaluation → Domain Solution Evaluation; the evaluations run once per
domain AIS and the full cycle once per functional domain.]
SRAC Phase 3 applies rigorous functional, technical and cost analysis to the remaining
AISs on the SRAC Master List to further focus the investment of the USMC on a smaller
number of integrated systems. In the process, migration and integration strategies for the
selected portfolio of AISs are developed to guide further implementation planning.






Because of its complexity, Phase 3 domain evaluation is broken into 3 parts discussed
separately:

Part 1 – AIS Categorization
Part 2 – Application Evaluation
Part 3 – Domain Solution Evaluation

7.3.1   Phase 3, Part 1 – AIS Categorization

In Phase 3, Part 1, data is collected by the domain team and High Value AISs are
categorized.

The High Value AIS SRAC List is created, a domain is selected and a reasonable
Operational Architecture (OA) is determined for the domain. The OA is used to
determine a standard set of functions that will be used to evaluate functional coverage of
all high value AISs and other potential applications that support the domain. Technical,
cost, vendor/developer and support data is collected and categorized.

Phase 3, Part 1 completes all the categorization work that is needed before proceeding
with detailed high value AIS evaluations for a logistics domain.

Figure 5 shows the process for AIS Categorization.




[Figure 5 – SRAC Process, Phase 3, Part 1 – High Value AIS Categorization: flowchart
of steps 22 (Remove Low Value AISs from SRAC Master List), 23 (Pick Functional
Domain), 24 (Validate Domain Team Membership), 25 (Validate Operational
Architecture), 26 (Determine Domain Functions), 27 (Update the Domain Application
List), 28 (Map Application Capability into Domain Functions), 29 (Collect Application
Technical Data) and 30 (Collect Application Cost and Vendor Data), then on to Phase 3,
Part 2, Step 31.]

Step 22 – Remove Low Value AISs from SRAC AIS Master List

After Phase 2 SRAC has been completed, the low value AISs selected for retirement are
removed by the SRAC Core Team from the SRAC AIS Master List. What remains on the
list are the AISs that will be processed in SRAC Phase 3.






Step 23 – Pick a Functional Domain

The SRAC Core Team selects a particular domain team (or teams) to start Phase 3 of the
SRAC process.

The logistics functional domain teams are listed here roughly in the order in which they
will navigate the SRAC process:

1.     Transportation
2.     Supply
3.     Maintenance
4.     Health Services
5.     Engineering
6.     Acquisition

There is no domain team for the general services functions (e.g. financial, personnel,
contracts, etc.). The general services AISs are considered by the 6 domain teams in areas
where they support each domain’s functions.

The domain team continues to use its assigned workspace on the Tiger Knowledge
Center to collaborate on its SRAC Phase 3 work. Phase 3 begins with a workshop where
the domain teams are given the high value AIS domain list and training on the additional
worksheets and tools that will be used for the SRAC Phase 3 process.

Step 24 – Validate Domain Team Membership

The domain team examines its membership’s skills relative to the expected work in
SRAC Phase 3 and adjusts the membership accordingly. At this point it may be necessary
to add members familiar with the development and usage of GOTS or COTS applications
not on the original SRAC Master List. It may also be advisable to bring in functional
experts who have a more detailed understanding of functional requirements in areas of
expected application overlaps or coverage gaps.

Step 25 – Validate Operational Architecture

The Domain Team will determine which Operational Architecture (OA) will be used to
generate the list of functions for AIS functional evaluation. The team may decide to use
new OA material developed by the ILC IPT, augment the functional breakdowns used in
its Phase 2 work, or select, generate or modify other domain functional models to be used
as a basis for the AIS evaluations.

Step 26 – Determine Domain Functions

The domain expert team will select a group of functions/major tasks from the operational
architecture for use in the functional evaluation of high value AISs and other potential
applications that might support the domain. It is expected that the functional breakdowns
used in Phase 2 of SRAC will have to be broken down to one or two more levels of detail
to support the Phase 3 work. This additional functional determination and definition is
accomplished during the Phase 3 workshop.

Step 27 – Update the Domain Application List

The domain team performs preliminary evaluations of potential GOTS and COTS
packages which do not appear on its domain application list and adds potential
applications to the list. The SRAC Core Team updates the SRAC Master List to reflect
applications added by the domain team.

Step 28 – Map Application Capability into Domain Functions

The domain team creates a matrix that shows which domain functions and tasks are
supported by each application in the domain application list. An example of a SRAC
Functional Mapping Matrix (see section 8.2) is provided as a guide to developing
consistent maps that can later be compared. The matrices developed by the domain team
will be displayed on the domain team’s Web workspace and saved as input to the
functional evaluation of domain applications in Phase 3, Part 2.
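One way to picture the mapping matrix: rows are applications on the domain list,
columns are domain functions, and each cell records whether the application supports
the function. The sketch below uses invented, transportation-flavored names purely for
illustration; the real matrix follows the section 8.2 worksheet.

    # Rows: applications on the domain list; columns: domain functions.
    domain_functions = ["plan_movement", "track_shipments", "manage_carriers"]

    mapping_matrix = {
        "AIS_Alpha": {"plan_movement": True, "track_shipments": True,
                      "manage_carriers": False},
        "AIS_Bravo": {"plan_movement": False, "track_shipments": True,
                      "manage_carriers": True},
    }

    def functions_covered(app):
        """List the domain functions one application supports."""
        return [f for f, supported in mapping_matrix[app].items() if supported]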

Step 29 – Collect Application Technical Data

Members of the domain team collect data on each of the technical criteria for each of the
applications on the domain application list. This data is recorded in worksheets available
on the domain team Web workspace.

The technical categorization for an application contains data for various criteria grouped
in the following categories:

•   DII COE Compliance Level
•   System Architecture Application Categorization
•   Technical Architecture Compliance Level of Effort/Cost
•   Documentation Evaluation

The application technology categorization is further broken down into the following
criteria:

•   Platform
•   Hardware Type
•   Operating System
•   Data Management
•   User Interface
•   Application Interfaces
•   Middleware
•   Network/Security





Two worksheets are provided for collecting application data: the SRAC General
Application Worksheet and the SRAC Technical Application Worksheet (see section
8.6.2). These worksheets, which are resident on the domain team Web workspace, allow
consistent categorization of SRAC high-value applications. Data collection for these
worksheets will be coordinated with data collection for the COSMIC system, which
requires much of the same data.

The documentation categorization is further broken down into categories of completeness
and quality for the documentation definitions listed in the SRAC Documentation
Worksheet (see section 8.6.4). The documentation categorization is organized by subject
matter rather than by document title. It is not important in which document the subject
discussion resides, only that it is documented and how well. All of the documentation for
a particular AIS should be scanned by a responsible member of the domain team in the
execution of this step.

Document categorization worksheets are completed by the domain teams and submitted
to the SRAC Core Team.

Step 30 – Collect Application Cost and Vendor Data

Business evaluation of individual domain applications includes evaluation of the total
cost of ownership for the application as well as the capability and stability of the
developing and supporting organizations. The domain team collects business data for
TOC, vendor/developer and support evaluation on electronic worksheets.

Total ownership cost (TOC) is categorized for all applications being analyzed for the
current domain. Cost elements in the TOC include:

•   Development/acquisition costs
•   Production costs
•   Operational costs
•   Support and maintenance costs
•   Retirement costs

The domain team categorizes the costs through data collection and calculation/estimation
(where data are not available) for all AISs and new applications that were not subject to
TOC determination in SRAC Phase 2. The costs are collected on a SRAC TOC
Worksheet (see section 8.5) and passed on to the SRAC Core Team.

The development and support sources of all applications are evaluated. For COTS
applications, vendors will be evaluated for viability and support capability on the
following criteria:

•   Length of time in the logistics/supply chain management applications market
•   Revenue, growth and market share history
•   Revenue per employee
•   R&D spending as a percentage of sales
•   Geographic coverage
•   Compatibility of future business and channel strategies with USMC missions
•   New product release cycle times

A SRAC Vendor Worksheet (see section 8.7.1) for rating COTS vendors in the
commercial market is supplied. A separate SRAC Government Developer Worksheet
(TBD) is supplied for rating USMC and other government agency software developers. A
separate evaluation of support for AISs/applications is accomplished with the aid of a
SRAC Support Worksheet (see section 8.8). The support criteria include availability,
capacity and quality for the following items:

•   Hotline
•   Tech Support
•   Maintenance
•   Training
•   Education
•   Customization
•   Professional Services

The domain team categorizes the vendor/developer and support for its domain
applications and passes this information to the SRAC Core Team.

Completion of Step 30 completes the AIS categorization needed before the application
evaluation that will be accomplished in Phase 3, Part 2 of SRAC.

7.3.2     Phase 3, Part 2 – Application Evaluation

This part of the SRAC process is dedicated to application evaluation within a domain.
Functional evaluations are developed for each application on the domain application list
by the domain team and converted into scores by the SRAC Core Team. Functional
evaluations are done for all Logistics functions within the domain, functional overlap
areas and gap areas.

The SRAC Core Team also receives application categorizations from the domain teams
and scores each application in technical, cost, and vendor/developer categories. The
ranked scores are recorded, analyzed and passed on to Phase 3, Part 3 where they are
used to evaluate integrated domain solution scenarios. Analysis includes the development
of an application scorecard for each AIS/application.

Figure 7 describes the process for SRAC high value application evaluation.




[Figure 7 – SRAC Process, Phase 3, Part 2 – Application Evaluation: flowchart of steps
31 (Calculate Overall Functional Scores), 32 (Determine and Name Overlap Areas), 33
(Pick Overlap Area), 34 (List Overlap Applications), 35 (Determine Overlap Scores), 36
(Last Overlap?), 37 (Determine and Name Gaps), 38 (Pick Gap), 39 (List Gap Filling
Applications), 40 (Determine Gap Filling Scores), 41 (Last Gap?), 42 (Calculate
Technical Scores), 43 (Calculate Cost and Vendor Scores) and 44 (Rank All Scores),
then on to Phase 3, Part 3, Step 45.]

Step 31 – Calculate Overall Functional Score

The domain functional score for an application is calculated in two steps. In the first
step, the domain team determines how well the tasks and functions defined in the SRAC
Functional Mapping Matrices developed in step 28 are supported by applications on the
domain application list. This is done with the aid of a SRAC Functional Evaluation
Matrix (see section 8.2) developed by the domain team.

Domain team experts evaluate each function of the applications across the domain using
a “consumer report” approach. The domain team may need to engage commercial users
to perform categorization for COTS applications that have not been used by the team.
The completed matrices are then submitted to the SRAC Core Team, which determines
the numerical scores for domain functions and the total functional scores for AISs
supporting more than one domain.
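The guide does not spell out the scoring arithmetic, so the following is only a plausible
sketch: each cell of the Functional Evaluation Matrix holds a consumer-report-style
rating (here 0 to 4, an assumed scale), and the Core Team totals the ratings per
application, summing across domains for multi-domain AISs.

    # Ratings per function: 0 = no support ... 4 = excellent (illustrative scale).
    def functional_score(evaluation_row):
        """Total one application's ratings across a domain's functions."""
        return sum(evaluation_row.values())

    def total_functional_score(rows_by_domain):
        """Roll up the per-domain functional scores for an AIS that
        supports more than one domain."""
        return sum(functional_score(row) for row in rows_by_domain.values())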






Step 32 – Determine and Name Overlap Areas

The matrices from steps 28 and 31 are examined to identify areas of overlap of
functionality for applications. Areas where there is substantial overlap are named.

Step 33 – Pick Overlap Area

A particular named overlap area is selected for further analysis.

Step 34 – List Overlap Applications

A list of applications that address functions in the overlap area is made. In general, this is
a sub-set of the total list of domain applications developed in step 28.

Step 35 – Determine Overlap Scores

A SRAC Functional Mapping Matrix and a SRAC Functional Evaluation Matrix (see
section 8.2) are constructed for the overlap area and used to determine the overlap score
of applications supporting the overlap area under consideration. In this case, only the
functions, sub-functions or major tasks describing the area of overlap are used for the
columns of the matrices, and only the applications supporting the overlap area define the
matrix rows. The domain team may decide to develop a finer breakdown of sub-functions
and tasks in the overlap area to perform the evaluation.

The domain team performs the overlap functional evaluation and passes it on to the
SRAC Core Team, which calculates the overlap scores.

Step 36 – Last Overlap?

If there are more overlaps to be examined, steps 33 through 35 are repeated for each
overlap area. When the last overlap has been analyzed, the functional evaluation proceeds
with gap evaluation starting with step 37.

Step 37 – Determine and Name Gaps

The matrices from steps 28 and 31 are examined to identify gaps in functional coverage
for the applications as a group. Areas where there are substantial gaps are named.
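Steps 32 and 37 are both column tests on the mapping matrices from steps 28 and 31: a
function supported by two or more applications is an overlap candidate, and a function
supported by none is a gap candidate. A sketch follows, reusing the matrix layout from
the Step 28 example; judging which candidates are substantial remains a team call.

    def overlaps_and_gaps(mapping_matrix, domain_functions):
        """Return (overlap candidates, gap candidates) for one domain.
        `mapping_matrix` maps application -> {function: bool}."""
        overlaps, gaps = [], []
        for func in domain_functions:
            supporters = [app for app, row in mapping_matrix.items()
                          if row.get(func)]
            if len(supporters) >= 2:
                overlaps.append(func)   # step 32: overlapping coverage
            elif not supporters:
                gaps.append(func)       # step 37: no coverage at all
        return overlaps, gaps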

Step 38 – Pick Gap Area

A particular functional gap area is selected for further evaluation.

Step 39 – List Gap-filling Applications





A list of applications that support the gap functions, sub-functions or major tasks is made.
In general, this is a sub-set of the total list of domain applications developed in step 28
plus other promising COTS or GOTS applications that are not currently used in the
domain.

If applications are added to the domain list at this time for potential evaluation as gap
fillers, they are also added to the SRAC AIS Master List and run through steps 31
through 36.

Step 40 – Determine Gap-filling Scores

A SRAC Functional Mapping Matrix and a SRAC Functional Evaluation Matrix (see
section 8.2) are constructed for the gap area under consideration and used to determine
the gap-filling score of applications supporting the gap functions. In this case, only the
functions, sub-functions or major tasks describing the gap area are used for the columns
of the matrix, and only the applications supporting the gap functions define the matrix
rows. The domain team may decide to develop a finer breakdown of sub-functions and
tasks in gap areas to perform the evaluation.

The domain team performs the functional evaluation for the gap areas and passes it on to
the SRAC Core Team, which calculates the gap-filling scores.

Step 41 – Last Gap?

If there are more gaps to be examined, steps 38 through 40 are repeated for each gap area.
When the last gap has been analyzed, the functional analysis is complete and the process
passes on to the technical application categorization and scoring in step 42.

Step 42 – Calculate Technical Scores

The technical score for an application is a combination of scores for categorization of
applications in four areas:

•   DII COE Compliance Level
•   System Architecture Application Technology Evaluation
•   Technical Architecture Compliance Level of Effort/Cost
•   Documentation Evaluation

In this step, categorization worksheets completed in step 29 by the domain team are
examined and scored by the SRAC Core Team. Individual AIS scores are calculated here
and rolled up in step 44 of the process.

Step 43 – Calculate Cost and Vendor Scores

In this step, the cost, vendor and support categorization worksheets completed in step 30
by the domain team are examined and scored by the SRAC Core Team. Individual scores
are calculated here and rolled up in step 44 of the process.





Step 44 – Rollup Scores and Rank Applications

After all of the functional, technical, cost, developer/vendor and support scores are
calculated, scores are rolled up by the SRAC Core Team in four categories to provide an
easy-to-read SRAC Application Scorecard (see section 8.9). The rollup includes
weighting of scores to reflect their relative importance at the individual criterion and
category levels.

Once the scores have been rolled up, the applications can be ranked at the metric category
and/or at the overall scorecard level.
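A sketch of the weighted rollup follows. The category weights here are invented for
illustration; the actual criterion and category weights belong to the SRAC scorecard and
are not given in this guide.

    # Illustrative category weights (sum to 1.0); not SRAC's actual values.
    CATEGORY_WEIGHTS = {
        "functional": 0.40,
        "technical": 0.25,
        "cost": 0.20,
        "vendor_support": 0.15,
    }

    def scorecard_rollup(category_scores):
        """Weighted overall score for one application's scorecard.
        `category_scores` maps category -> normalized score (0-100)."""
        return sum(CATEGORY_WEIGHTS[c] * s for c, s in category_scores.items())

    def rank_applications(scorecards):
        """Rank applications best-first at the overall scorecard level.
        Each scorecard is {"ais": name, "scores": category_scores}."""
        return sorted(scorecards,
                      key=lambda sc: scorecard_rollup(sc["scores"]),
                      reverse=True)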

At this point, all the application scoring information for a domain has been developed
and summarized, and we are ready to enter SRAC Phase 3, Part 3 – Domain Solutions.

7.3.3   Phase 3, Part 3 – Domain Solutions

Up until this point, all the scoring and ranking has been done on an individual application
basis. In SRAC Phase 3, Part 3, we build and test alternative solutions for integrated
system support of the functional domain. These solutions are called scenarios in SRAC
terminology. Two or three scenarios are a reasonable number of alternatives to examine
in detail. Each scenario defines which high-value AISs will be retired, which functions
will migrate to other applications, which AISs will have technology upgrades, which
COTS or GOTS applications will be acquired, and how the resulting migration
applications will be integrated among themselves, with applications outside the domain
and with data sources and sinks.

The process for Phase 3, Part 3 is illustrated in Figure 10.




[Figure 10 – SRAC Phase 3, Part 3 – Domain Solutions: flowchart of steps 45 (Develop
Alternative Domain Scenarios), 46 (Pick a Scenario), 47 (Develop a Migration Diagram),
48 (Develop an Integration Strategy), 49 (Develop an Integration Diagram), 50 (Cost the
Domain Solution), 51 (Last Scenario?), 52 (Select the Best Scenario) and 53 (Last
Domain?); remaining domains loop back to Step 22, and the final domain ends SRAC
and begins SRAC implementation.]

Step 45 – Develop Alternative Domain Solution Scenarios

The scoring information for the high-value applications and the overlap and gap
identification obtained in SRAC Phase 3, Part 2 will lead to the identification of
alternative strategies for domain-wide solutions, which we refer to as Domain Solution
Scenarios. These scenarios are documented in this step based on templates discussed in
section 8.

Considering all of the SRAC AIS analysis results, the domain team and/or SRAC Core
Team develops a concept description of each scenario for providing an integrated
solution for the domain. This concept should be a short (1 page maximum) description of
which applications would be selected as migration systems, which applications would be
retired and which major groups of functionality would be migrated from retired to
migration systems.

Domain solution scenarios proposed by the domain teams are evaluated on functional
coverage and total cost by the SRAC Core Team before recommendations are made to
the ILC IPT.

Step 46 – Pick a Scenario

One scenario from the list of alternatives is selected.

Step 47 – Create a Migration Strategy and Diagram

A migration strategy is a narrative description, for each scenario, of how functionality
will migrate between AISs/applications over a five-year period. The strategy is illustrated
by a SRAC Migration Diagram, which serves as a visual check on the migration strategy
for each scenario. It also adds the element of time necessary to create an acquisition
strategy.

Step 48 – Develop an Integration Strategy

The applications on the right side of the migration diagram represent the tools that will be
in use in the end state, five years after implementation of the current SRAC results
begins. The migration diagram does not, however, tell how the migration applications
will be integrated, either among themselves or with other domain applications and data
sources/sinks. The integration strategy provides a narrative of how the end state will be
integrated.

Step 49 – Create an Integration Diagram

Based on the domain integration strategy, a systems architecture SRAC Integration
Diagram is created, which acts as a visual check and summary of the domain integration
strategy.

Step 50 – Cost the Domain Solution

The migration and integration strategies of the domain, plus their associated diagrams,
may be used to create a phased set of project descriptions that will be needed to get from
the current to the future state. Typically, these projects will have descriptors such as
“Retire application A”, “Move functionality B from application C to D”, “Raise
application E to DII COE compliance level 5”, “Integrate application F with application
G”, etc. A rough time-phased cost estimate for the 5-year domain re-alignment period
can then be created for each phased project; these estimates are added up to give a cost
profile and total cost for the scenario.
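The scenario costing amounts to rolling time-phased project estimates up into a profile
and a total. A sketch with invented project names, years and figures:

    # Each project: (description, {fiscal year: estimated cost in dollars}).
    scenario_projects = [
        ("Retire application A", {2002: 200_000}),
        ("Move functionality B from application C to D",
         {2002: 500_000, 2003: 500_000}),
        ("Raise application E to DII COE compliance level 5",
         {2003: 800_000, 2004: 400_000}),
    ]

    def scenario_cost_profile(projects):
        """Roll the project estimates up into a year-by-year cost profile
        and a total for the 5-year re-alignment period."""
        profile = {}
        for _description, costs_by_year in projects:
            for year, cost in costs_by_year.items():
                profile[year] = profile.get(year, 0.0) + cost
        return profile, sum(profile.values())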

Step 51 – Last Scenario?

If there are more domain scenarios to be analyzed, steps 46 through 50 are repeated for
each remaining scenario. When the last scenario has been analyzed, the individual
analysis of alternative domain solutions is complete and the process passes on to
selecting the best scenario and implementing the results of the SRAC.

Step 52 – Select the Best Scenario

The benefits and costs of each scenario are analyzed using all of the summarized data
accumulated in the SRAC Phase 3 process as reference material. The domain teams
develop their scenarios and submit them, along with recommendations, to the SRAC Core
Team for review. The SRAC Core Team uses the accumulated evaluation data to validate
domain team recommendations, compares recommendations across domains, and submits
consistent multi-domain recommendations to the ILC IPT. The ILC IPT validates the
proposed integration solutions against the current state of its architectures and other ILC
initiatives and prepares final recommendations for the CSSE Advocacy Board on the best
overall solution. The CSSE Advocacy Board selects the best solution, which is then
published on the SRAC Website.

Step 53 – Last Domain?

If there are more domains to be analyzed, the process returns to Step 22 to select another
domain. When the last domain has been analyzed, the domain solutions are complete and
this round of the SRAC is complete.

This concludes the discussion of the SRAC process. As SRAC is completed for each
domain, the solution definition information is passed to the appropriate PMs to support
the development of an acquisition strategy and the design of projects for POM
submissions.






8.0    SRAC Methods and Tools

This section of the SRAC document contains detailed descriptions of methods and tools
used in SRAC. This includes templates for nominating, evaluating and scoring SRAC
AISs/Applications and Integrated Domain Solutions.

8.1    SRAC AIS Nomination

All AISs that do not survive the SRAC process will be slated for retirement. As
domain teams are formed and begin collecting data for AIS categorization within their
domain, team members may find AISs missing from the SRAC Master List that they
believe should be considered during SRAC. The AIS Nomination Form below should be
filled out for each such AIS and submitted to the SRAC Core Team before it is added to
the domain team application list. The SRAC Core Team will then add these AISs to the
SRAC Master List.

Note that most of the required data on the AIS Nomination form is the same data that is
required on the SRAC General AIS Information Worksheet.







                                    SRAC AIS Nomination Form

Domain Team = ___________                                           AIS = _________________

                Required Data                                            Data Input

AIS/Application Type                                       Select One: COTS, GOTS or Legacy
Owner Agency
Owner Agency POC                                           Name:
                                                           Tel:
                                                           email:
Vendor/Developer Organization
Vendor/Developer POC                                       Name:
                                                           Tel:
                                                           email:
Support Organization
Support Organization POC                                   Name:
                                                           Tel:
                                                           email:
USMC Program Manager                                       Name:
                                                           Tel:
                                                           email:
USMC Technical POC                                         Name:
                                                           Tel:
                                                           email:
Reason for adding this AIS to the
SRAC Master List.




----------editor’s note: the definitions for the items in the left column are shown in the right column,
which will be blank in the web version of the worksheet----------

-------------insert definitions here -------------------------
-------------






8.2    Functional Evaluation

Four types of functional evaluation of high-value AISs and other potential applications
are possible:

       1.      Overall evaluation across all domains
       2.      Overall evaluation across the current domain
       3.      Evaluation of overlap areas within a domain
       4.      Evaluation of gap areas within a domain

A complete type 1 functional evaluation depends on having operational architectures
and domain teams in place for all domains. The SRAC Core Team will perform the type 1
functional evaluation using the results of the overall functional evaluations from the six
domain teams.

Types 2, 3, and 4 functional evaluations will be performed and/or directed by each domain
team using the same tools. The domain teams first determine which functions and tasks are
supported by which applications. This is done using the SRAC Functional Mapping Matrix
shown below.






                        Sample SRAC Functional Mapping Matrix

-----------------------replace this with Tim’s supply chart--------------------------

[Matrix placeholder: the sample maps the domain applications (AMMOLOGS,
ATLASS II+, MAGTFII, SASSY, SCS, SS07(DSSC)) as rows against the Source
function as columns, broken into sub-functions (S0: Source Infrastructure;
S1: Source Stocked materiel; S2: Source Make-to-Order materiel; S3: Source
Engineer-to-Order materiel) and their tasks (e.g. Establish New materiel,
Establish New Source, Maintain Source Planning and Execution Data, Manage
Incoming Freight, Manage Vendor Agreements, Manage Vendor Calibration,
Manage Sourcing Business Rules, Authorize Vendor Payment, Determine Emerging
Supplier Technologies, Manage materiel Inventory, Schedule materiel
Deliveries, Receive and Verify materiel, Transfer materiel, Identify Sources
of Supply, Select Final Supplier(s) and Negotiate, Install Product). Matrix
cells mark which tasks each application supports.]


This example is taken from the ILC Engagement Final Report, which uses the SCOR
Model as the basis for its operational architecture definitions. The matrices in this report
define functions (e.g. Source), which are subsets of functional domains; sub-functions
(e.g. Source Infrastructure, Source Stocked materiel, etc.); and tasks (e.g. Establish New
materiel, Establish New Source, etc.). The functional mapping matrix therefore answers
the question, “Which tasks, sub-functions and functions of the functional domain are
supported by the applications on the domain list?”

After the functional mapping matrix is completed, the domain team creates a SRAC
Functional Evaluation Matrix, a consumer-report-style evaluation of the functional
coverage indicated by the functional mapping matrix. A sample functional evaluation
matrix is shown below:








                     Functional Evaluation Matrix Example

 APPLICATION            F1    F2    F4    F5    F6    F7    F8    UNWEIGHTED
                                                                  APPLICATION
                                                                     SCORE
 USMC AIS 1                                                           55
 USMC AIS 2                                                           22
 USMC AIS 3                                                           26
 GOTS 1                                                               27
 COTS 1                                                               52
 COTS 2                                                               18
 FUNCTIONAL COVERAGE    19    12    45    29    29    27    29

 LEGEND
     No Capability
     Little Capability
     Partial Capability
     Full Capability




The scoring of functional coverage and applications is performed by the SRAC Core
Team using the simple scoring equivalence:

No capability = zero points
Little capability = 2 points
Partial capability = 5 points
Full capability = 10 points

Functional mapping matrices and functional evaluation matrices and scoring are used to
determine scores for overall functional coverage and applications within a domain, within
functional overlap areas and within gap areas of functional coverage. Functional scores
determine how well the collection of candidate applications supports a specific function.
Application scores are used to rank the functional capabilities of an application against
other applications.
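
Purely for illustration, the sketch below applies this scoring equivalence to a toy
mapping matrix; the applications, functions and capability ratings are invented, not
SRAC evaluation data:

    # Scoring equivalence from the SRAC Guide; all other data is invented.
    POINTS = {"none": 0, "little": 2, "partial": 5, "full": 10}

    # Rows: applications on the domain list.  Columns: functions F1..F3.
    mapping = {
        "USMC AIS 1": {"F1": "full", "F2": "partial", "F3": "little"},
        "GOTS 1":     {"F1": "none", "F2": "full",    "F3": "partial"},
    }
    functions = ["F1", "F2", "F3"]

    # Application score: ranks one application's functional capability.
    app_scores = {a: sum(POINTS[r[f]] for f in functions) for a, r in mapping.items()}

    # Functional coverage: how well all candidates together support a function.
    coverage = {f: sum(POINTS[mapping[a][f]] for a in mapping) for f in functions}

    print(app_scores)  # {'USMC AIS 1': 17, 'GOTS 1': 15}
    print(coverage)    # {'F1': 10, 'F2': 15, 'F3': 7}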

8.3    AIS Usage

AIS usage will be defined as the number of individuals actively accessing the system or
the number of licensed users. Using this definition, the domain teams will categorize the
usage of each AIS via the following worksheet (a totaling sketch follows the worksheet):






                           SRAC AIS Usage Worksheet

Domain = ____________________                 Application = _____________

      Organization                 Location            Number of Users




                                     Total Users =
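
A minimal totaling sketch for the worksheet above (the organizations, locations and
user counts are invented):

    # Each row of the usage worksheet: (organization, location, number of users).
    rows = [
        ("Organization A", "Location 1", 120),
        ("Organization B", "Location 2", 85),
    ]
    total_users = sum(users for _org, _loc, users in rows)
    print(total_users)  # 205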







8.4    AIS Retirement Impact

Retirement impact statements for AISs will be developed in both Phase 2 and Phase 3 of
SRAC using the same worksheet, shown below:






               SRAC Retirement Impact Statement Worksheet
Domain = __________________           Application = __________________

Impact on Users:




Impact on Development Organization (USMC or GOTS AISs only):




Impact on Support Organization (USMC or GOTS AISs only):




Required functionality and integration capability to be migrated:




Other actions required for retirement:



Benefits of retirement



Retirement risks







8.5    Total Ownership Cost (TOC)

The Total Ownership Cost is calculated in SRAC Phases 2 and 3 for low- and high-
value AISs and applications. In Phase 2, TOC is a primary metric used to determine
whether continued investment in a low-value AIS is justified by the number of users and
functions supported. In Phase 3, the TOC is combined with a number of other metric
scores to produce an overall score for an application, and application scores are summed
to produce an overall score for a domain solution.

The TOC for an AIS is calculated using a simplified version of the program baseline
worksheet from the TOC-R Program at MARCORSYSCOM, as shown below; an
illustrative summing sketch follows the worksheet definitions.







                                         SRAC TOC WORKSHEET
                         Projected Cost without Initiatives (Cost profiles in constant FY-XX $K)

                                               PRE         FY-00        FY-01     FY-02            FY-03    FY-04
                                               FY-00
1. Development Category
    1.1. Hardware
    1.2 Software
      1.2.1 Organizational
      1.2.2 Acquisition
      1.2.3 Development

Total Development Costs

2. Production Category
     2.1. Hardware
     2.2 Software

Total Production Costs

3. Operations & Support Category
      3.1. Hardware
     3.2 Software
     3.3 Operation
    3.4 Maintenance
    3.5 Misc. Contractor Services
    3.6. Supplies/Consumables
    3.7 Formal Training
    3.8 Indirect/Infrastructure

Total Operational & Support Costs

4. Total Retirement Costs

TOTAL COSTS
TOTAL OBLIGATED COSTS

       --------------editor’s note- the following are definitions to be implemented as popup
       windows in the web version of the SRAC Guide. Numbers should be deleted from the
       TOC worksheet and definitions on the Web version------------------

1. DEVELOPMENT. Development has two subcategories, hardware and software. These categories
represent the costs associated with the research and development of AISs. These costs are generally
associated with Phases 0, I, and II of the DoD 5000 acquisition process.

                 1.1     Hardware. Cost of hardware purchased during the development phase of the system.

                 1.2     Software. All associated costs of developing software.







                 1.2.1 Organizational. Infrastructure needs.

                 1.2.2 Acquisition. Costs that include labor, printing, travel in association with RFP and
                 selection of supplier.

                 1.2.3 Development. Labor to develop system (programmers, analysts, etc.).

2. PRODUCTION. Production has two subcategories, hardware and software. These categories represent
the costs associated with the production of AISs the program has developed. If the program has only
developmental cost, then there will be no production costs. In that case, place N/A in the appropriate boxes.
Generally, these costs are incurred after milestone III.

               2.1 Hardware. Hardware upgrades in outyears.

               2.2 Software . Purchase of operating system software.

3. OPERATIONS AND SUPPORT . The total costs associated with maintaining the system throughout the
   life of the program.

         3.1 Hardware. Hardware maintenance to include LAN and peripherals.

          3.2 Software. O/S software maintenance, internet fees, PM labor and travel to User Conf, CCBs,
         etc.

         3.3 Operation. DISA run time, system administration labor, help desk labor.

          3.4 Maintenance. Analyst and programmer labor, software maintenance fees including COTS
         products.

         3.5 Miscellaneous Contractor Services. The cost of contractor services providing technical
         services to maintenance centers.

          3.6 Supplies/Consumables. A fixed rate (referenced in current LCCEs) times number of FTEs
         attributed to system. References supplies used in day-to-day business.

         3.7 Formal Training. Training throughout the life cycle of the system.

          3.8 Indirect/Infrastructure. A fixed rate (referenced in current LCCEs) times number of FTEs
         attributed to system. References space, furniture, utilities, etc. used in day-to-day business.

4. DISPOSAL. The costs associated with retirement of the AIS/application and any associated equipment
disposal. This information may be found in the LCCE.

5. OBLIGATED COSTS. Costs that will be incurred based on prior commitments even after the AIS has
been retired.
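
As a hedged sketch of how the worksheet's category totals combine into the TOC (the
category names follow the worksheet; the dollar figures are invented):

    # Sum the four TOC categories by fiscal year, then across years ($K).
    worksheet = {
        "Development":          {"PRE FY-00": 900, "FY-00": 300},
        "Production":           {"FY-00": 200, "FY-01": 150},
        "Operations & Support": {"FY-00": 400, "FY-01": 450, "FY-02": 450},
        "Retirement":           {"FY-04": 120},
    }
    years = ["PRE FY-00", "FY-00", "FY-01", "FY-02", "FY-03", "FY-04"]

    by_year = {fy: sum(cat.get(fy, 0) for cat in worksheet.values()) for fy in years}
    total_cost = sum(by_year.values())
    print(total_cost)  # 2970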


8.6      Technical Evaluation

The technical evaluation of an application is a combination of scores for categorization of
applications in four areas:

1. DII COE Compliance Level
2. Application Technology (System Architecture)
3. Technical Architecture Compliance Level of Effort/Cost
4. Documentation

Methods and tools for AIS/application technical evaluation are discussed below.

In cases where applications support multiple logistics domains, the SRAC Core Team
will determine the domain team responsible for providing the technical categorization of
the AIS/application.

8.6.1   DII COE Compliance

The Defense Information Infrastructure Common Operating Environment (DII COE) is a
set of standards and software infrastructure that ensures that Department of Defense AISs
can easily interoperate and share data. Compliance with the DII COE architecture is
measured in SRAC by the DII COE Runtime Environment compliance level, as follows:

Level 1: Standards Compliance Level
Level 2: Network Compliance Level
Level 3: Workstation Compliance Level
Level 4: Bootstrap Compliance Level
Level 5: Minimal DII Compliance Level
Level 6: Intermediate DII Compliance Level
Level 7: Interoperable Compliance Level
Level 8: Full DII Compliance Level

-------------editor’s note: the following definitions for the above terms are implemented as
popup, rollover text boxes in the Web version of this document--------------

Level 1: Two capabilities share only a common set of COTS standards. Sharing of data is
undisciplined and minimal software reuse exists beyond the COTS. Level 1 may, but is
not guaranteed to, allow simultaneous execution of the two systems.

Level 2: Two capabilities co-exist on the same LAN but on different CPUs. Limited data
sharing is possible. If common user interface standards are used, applications on the LAN
may have a common appearance to the user.

Level 3: Environmental conflicts have been resolved so that two applications operating
on the same LAN share data and co-exist on the same workstation as COE-based
software. The kernel COE, or its equivalent, must reside on the workstation. Segmenting
may not have been performed, but some COE components may be reused. Applications
do not use the COE services and are not necessarily interoperable.

Level 4: All applications are in segment format and share the bootstrap COE. Segment
formatting allows automatic checking for certain types of application conflicts. Use of
COE services is not achieved and users may require separate login accounts to switch
between applications.

Level 5: All segments share the same kernel COE and functionality is available via the
Executive Manager. Boot, background, session and local processes are specified through
the appropriate segment descriptor files. Segments adhere to the basic “look and feel” of
the native GUI as defined in the Style Guide. Segments are registered and available
through the on-line library. Applications appear integrated to the user, but there may be
duplication of functionality and full interoperability is not guaranteed. Segments may be
successively installed and removed through the COE installation tools. Database
segments are identified as unique or sharable according to their potential for sharing.

Level 6: Segments utilize existing account groups and reuse one or more COE-
component segments. Minor documented differences may exist between the Style Guide
and the segment’s GUI implementation. Use of non-standard SQL in database segments
is documented and, where applicable, packaged in a separate database segment.

Level 7: Segments reuse COE-component segments to ensure interoperability. These
include COE-provided communications interfaces, message parsers, database segments,
track data elements, and logistics services. All access is through published APIs with
documented use of few, if any, private APIs. Segments do not duplicate any
functionality obtained in COE-component segments. The data objects contained within a
database are standardized according to DoD 8320 guidance.

Level 8: Proposed new functionality is completely integrated into the system (e.g. makes
maximum possible use of COE services) and is available through the Executive
Manager. The segment is fully compliant with the Style Guide and uses only published
public APIs. The segment does not duplicate any functionality contained elsewhere in
the system, whether as part of the COE or as part of another mission application or
database segment.

The DII COE compliance level is recorded by the responsible domain team for each
application on the SRAC Application Worksheet – Technical Data.
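
Purely as an illustration of recording this categorization (the application name and
level shown are hypothetical), the eight runtime compliance levels could be kept as a
simple lookup for validating worksheet entries:

    # The eight DII COE runtime compliance levels, as listed above.
    DII_COE_LEVELS = {
        1: "Standards Compliance",
        2: "Network Compliance",
        3: "Workstation Compliance",
        4: "Bootstrap Compliance",
        5: "Minimal DII Compliance",
        6: "Intermediate DII Compliance",
        7: "Interoperable Compliance",
        8: "Full DII Compliance",
    }

    def record_compliance(application, level):
        """Validate and label a worksheet entry."""
        if level not in DII_COE_LEVELS:
            raise ValueError("DII COE runtime compliance level must be 1-8")
        return "%s: Level %d (%s)" % (application, level, DII_COE_LEVELS[level])

    print(record_compliance("Application X", 5))  # Application X: Level 5 (Minimal DII Compliance)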

8.6.2   Application Technology Evaluation

Application technology categorization is performed by the logistics domain teams using
data collection worksheets described below. The worksheets are filled out by the domain
team for each application on their domain application list. Each domain team is
responsible for validating AIS data with appropriate POCs and PMs. In cases where an
application supports functions in multiple domains, the SRAC Core Team will assign a
domain team to collect the categorization data. Additional help from AIS users or
customers of COTS vendors may be desirable where appropriate.

The technology evaluation of AISs/applications in SRAC is based on criteria that are a
subset of application data from the COSMIC data repository, augmented by a small
amount of additional data not currently required by COSMIC. Initially, COSMIC may act
as both a source and a repository for SRAC categorization data. The current plan is to
replace the COSMIC system with the MSTAR data repository while retaining the
COSMIC data definitions in MSTAR. MSTAR is based on a more scalable Oracle
database technology.

Two worksheets are used to capture the AIS/application technology categorization: a
general data worksheet and a technical data worksheet.




                           SRAC Application Worksheet
                                 General Data

AIS/Application = _______________                      Domain Team = ___________

           Required Data                                      Data Input

AIS/Application Type                         Select One: COTS, GOTS or Legacy
Owner Agency
Owner Agency POC                             Name:
                                             Tel:
                                             email:
Vendor/Developer Organization
Vendor/Developer POC                         Name:
                                             Tel:
                                             email:
Support Organization
Support Organization POC                     Name:
                                             Tel:
                                             email:
USMC POC                                     Name:
                                             Tel:
                                             email:
USMC Technical POC                           Name:
                                             Tel:
                                             email:

----------editor’s note: the definitions for the items in the left column are shown in the
right column, which will be blank in the web version of the worksheet-------------

The technology data worksheet is filled out for each of three AIS/application states,
where applicable:




       1.      Current Configuration
       2.      Planned and Funded Configuration
       3.      Planned and Non-Funded Configuration







                                  SRAC Application Worksheet
                                       Technical Data

       AIS/Application = ____________         Version = _____ Logistics Domain = __________

       Status: Current_______ Planned & Funded ______ Planned & Nonfunded ______

           Required Data                                         Data Definitions

Platform                                Standalone, client-server or web-enabled. Web enabled implies
                                        browser access from any hardware.
Hardware type                           PC, Mainframe or Mid-tier. If platform = client server above, enter
                                        two of the above (e.g. PC and mainframe)
Operating System                        O/S names for hardware above (e.g. HPUNIX, Linux, MS-DOS,
                                        MVS/DFP, MVS-XA, OS-390, SCO UNIX, Solaris, Windows 98,
                                        Windows NT, etc.) Insert one entry per hardware type above.
Data Management                         Database management system and version number (e.g.
                                        ADABASE 5.3.3, DB2, MS Access 2000, Oracle 8i, SAGE, etc.)
                                        If the application uses no DBMS, enter flat files or no data storage,
                                        etc.
User Interface                          User interface style (e.g. standard browser, standard MS Windows,
                                        MOTIF, Xwindows, 3270 emulation, DOS Command Line, etc.)
Programming Language(s)                 Application, data access and user interface programming
                                        languages (e.g. ADA, ALC, C, C++, CatMeow, CGI, COBOL,
                                        Intelligence Query, MySQL, NATURAL, NATURAL2, Oracle,
                                        Perl, PowerBuilder, Visual Basic for Applications, etc.)
Application and DB Interfaces           Names of external AISs and databases to which the
                                        AIS/application is interfaced.
Middleware                              Standard methods and integration products that the
                                        AIS/application uses to interface with other applications and
                                        databases (e.g. native format file translators, EDI, XML, published
                                        API, software development kit, RPC, message queuing, object
                                        request brokers, SQL, etc.)
Security                                Security classification of the AIS/application. (e.g. Confidential,
                                        Confidential – No Foreign, None, Not applicable, Official Use
                                        Only, Secret, Secret – No Foreign, Sensitive, TS, TS – No
                                        Foreign, TS – SIOP, TS – SIOP/ESI, Unclas – No-Foreign, Unclas
                                        – Sensitive, Unclassified, etc.)

       Does this AIS/application use standard DoD data definitions? _____________
       If the answer is no to the above question, does a data map exist between this
       AIS/application and the DoD DDDS? ________
       What is the DII COE runtime compliance level for this application? ____________






-------------editor’s note: The right column of the above table will be blank in the Web
version. The right column consists of definitions that will appear in popup text windows--
--------------

The following worksheet is used by the SRAC Core Team to combine the results from
the domain team technical evaluation worksheets; a hypothetical combination sketch
follows the worksheet.






                          SRAC Technology Evaluation Worksheet
                              (SRAC Core Team Use Only)

AIS/Application   Existing              Planned and Funded   Planned Non- funded   ATR
                  Configuration         Configuration        Configuration         Score
                  Score                 Score                Score
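
The guide does not specify how the three configuration scores combine into the ATR
score; the sketch below assumes, purely for illustration, a simple average over
whichever configurations were scored:

    # Hypothetical ATR combination: average the available configuration scores.
    def atr_score(existing=None, planned_funded=None, planned_nonfunded=None):
        scores = [s for s in (existing, planned_funded, planned_nonfunded)
                  if s is not None]
        return sum(scores) / len(scores) if scores else 0.0

    print(atr_score(existing=40, planned_funded=60))  # 50.0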







8.6.3   Technical Architecture Compliance LOE

TBD

8.6.4   Documentation Evaluation

The SRAC evaluation of AIS/application documentation is based on the IEEE 12207.1-
1997 standard. This standard, which has been adopted by the DoD, describes the
requirements for documentation content without regard for how the content is packaged.
It is based on the concept that documentation should record the planning, the execution
and its results, and an evaluation of how well the execution and results turned out.

IEEE 12207.1-1997 also specifies that the process for establishing each document should
be documented. Since this standard was not established in the DoD at the time that most
of the documentation was created, this aspect of documentation is not considered in
SRAC.






                        SRAC Documentation Worksheet

Domain = ________ Application = __________ Vendor/Developer = ____________

Support Org. = ____________


Document Type           Planning      Execution/          Evaluation     Score
                                      Result
Lifecycle Step
Concept of Oper.                      -------N/A-------
Needs Analysis                        -------N/A-------
Requirements
Acquisition/
Modification
Architecture
Change Management
Configuration Management
Quality Assurance
Detailed Design
Database Design
Coding
Testing/
Validation
Installation
Integration and
Customization
Support
Maintenance
Retirement

Legend*

Rating          Symbol    Points

None                -             0
Inadequate         ?              2
Adequate           ?              5
Excellent          ?             10

*Note: The IEEE 12207.1-1997 standard for software development states that
data/documentation for software development should be evaluated on the characteristics
defined below:




Data/documentation characteristics:

1. Unambiguous – Described in terms that allow a single interpretation, aided where
   necessary by definition.
2. Complete – Includes necessary, relevant requirements and/or descriptive material;
   responses are defined for the range of valid input data; and terms and units of measure
   are defined.
3. Verifiable – Can be checked for correctness by a person or tool.
4. Consistent – Contains no conflicting information.
5. Modifiable – Structured in a way or style that allows changes to be made completely,
   consistently and correctly while retaining the original structure.
6. Traceable – Origin of components can be determined.
7. Presentable – Can be retrieved and viewed easily.
8. Secure and private – There is an appropriate level of controlled access to the
   information.
9. Protected – There is persistence in data backup and protection for the information.
10. Accurate – Information is correct.
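
A minimal sketch of scoring a filled-in documentation worksheet with the legend's
point values (the lifecycle rows and ratings below are invented):

    # Legend point values from the worksheet above; ratings are invented.
    POINTS = {"none": 0, "inadequate": 2, "adequate": 5, "excellent": 10}

    ratings = {
        "Requirements":    {"planning": "adequate", "execution": "excellent",
                            "evaluation": "inadequate"},
        "Detailed Design": {"planning": "adequate", "execution": "adequate",
                            "evaluation": "none"},
    }

    row_scores = {step: sum(POINTS[v] for v in cols.values())
                  for step, cols in ratings.items()}
    print(row_scores)                # {'Requirements': 17, 'Detailed Design': 10}
    print(sum(row_scores.values())) # 27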


8.7     Vendor/Developer Evaluation

Categorization data for vendors/developers of applications on the domain application list
are collected using two separate worksheets that are resident on the domain team’s Web
portal.

8.7.1   SRAC Vendor Worksheet

The SRAC Vendor Worksheet is used to assess a COTS vendor’s business success,
stability and viability in its primary commercial markets. It is the responsibility of the
domain team to get this worksheet filled out for all high-value COTS applications
evaluated in the SRAC process. This may require surveying commercial users of COTS
packages as well as the vendor.

The worksheet used to categorize and evaluate COTS vendors is shown below:






                             SRAC Vendor Worksheet

Vendor = ______________               COTS Application(s) = _______________

Vendor Longevity = ______years                 Application Longevity = ______years

 Calendar      Software          Revenue Per       Revenue            Market          R&D          New
  Year           Revenues         Employee         Growth             Share         Intensity    Product
                                                                                                Cycle Time

     1996
     1997
     1998
     1999
     2000


                                   Geographic Coverage

US          Canada      Latin       Europe     Mid-East      Africa      Asia/       ROW
                        America                                          Pacific




Describe the vendor’s future business and channel strategies. On which applications
will the company focus? In what geographies? Through what business models, channels
and types of partners?






---------------editor’s note: below are the popup definitions for the above worksheet---------------------

Software revenues = The calendar year sum of revenues for application software licenses
and maintenance fees collected by the vendor and the vendor’s resellers and distributors

Revenues per employee = The vendor’s total revenues divided by the number of
employees at the end of the same calendar year

Revenue Growth = The difference between the current and last year’s software revenues
divided by last year’s revenues expressed as a percentage

Market share = The vendor’s software revenues for this application divided by the
software revenues of the software market in which the application participates.

R&D Intensity = The amount spent on software R&D divided by the software revenues
for the same year.

New Product Cycle Time = the number of months between new major releases of the
application.

Geographic Coverage = The geographic regions in which the vendor has existing sales
and support resources (may be supplied by VARS or distributors)

ROW = rest of world
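
The vendor metrics defined above are simple ratios; as a sketch (all figures invented):

    # Vendor metric arithmetic per the popup definitions above.
    def revenue_growth(current, previous):
        """Year-over-year software revenue growth, as a percentage."""
        return (current - previous) / previous * 100.0

    def rd_intensity(rd_spend, software_revenue):
        """R&D spend divided by software revenues for the same year."""
        return rd_spend / software_revenue

    def market_share(vendor_revenue, market_revenue):
        """Vendor's application revenues over the market's revenues."""
        return vendor_revenue / market_revenue

    print(revenue_growth(120.0, 100.0))  # 20.0 (%)
    print(rd_intensity(18.0, 120.0))     # 0.15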

8.7.2   SRAC Government Developer Worksheet

TBD

8.8     Support Evaluation

Categorization and evaluation data for organizations providing application support for
applications on the domain applications list are collected by the domain teams using a
SRAC Support Worksheet that is resident on the domain team’s Web portal.






                             SRAC Support Worksheet

Domain = ________ Application = __________ Vendor/Developer = ____________

Support Org. = ____________


Evaluation Criteria     Availability   Capacity         Quality           Score

Support Type

Hotline
Tech Support
Maintenance
Training
Education
Customization
Professional
Services

Legend*

Rating          Symbol    Points

None                         0
Inadequate        ?          2
Adequate          ?          5
Excellent         ?         10

----------editor’s note: Definitions for the red terms in the worksheet above are shown
below. They will be implemented through rollover, popup text boxes in the Web version
of this document ---------------

Availability: Geographic coverage, timeliness of response, and correct language in
communications and support documentation.

Capacity: Amount of trained resources delivering support and the number of
instructors/scheduled training events versus the requirements.

Quality: Helpfulness of support personnel and appropriateness, completeness, and
accuracy (i.e. usefulness) of the information provided.

Hotline: Dial-up or web-based support that produces an immediate response to user
questions.






Tech Support: Support for questions/problems that users or administrators may have that
are too complex to be resolved via hotline.

Maintenance: Identification and fixes for code bugs and improvements to application
capability through patches, modifications and new releases.

Training: On-line, CD or classroom courses for the use and administration of the
AIS/application.

Education: On-line, CD or classroom courses for users of applications in related
disciplines.

Customization: AIS customization services available from the support vendor.

Professional services: Consulting and system integration services beyond application
customization that are available from the support vendor.
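
A minimal sketch of scoring a filled-in support worksheet; the point scale mirrors the
legend above, while the ratings themselves are invented:

    POINTS = {"none": 0, "inadequate": 2, "adequate": 5, "excellent": 10}

    # Availability/capacity/quality ratings per support type (invented data).
    support = {
        "Hotline":     {"availability": "adequate", "capacity": "inadequate",
                        "quality": "excellent"},
        "Maintenance": {"availability": "excellent", "capacity": "adequate",
                        "quality": "adequate"},
    }

    row_scores = {t: sum(POINTS[v] for v in c.values()) for t, c in support.items()}
    print(row_scores)                # {'Hotline': 17, 'Maintenance': 20}
    print(sum(row_scores.values())) # 37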

8.9    Overall AIS/Application Evaluation

The cost, functional, technical, vendor and support scores for an AIS/application may be
rolled up for analysis and comparison with other AIS scores via an application
scorecard.



                         SRAC Application Scorecard

 Domain = ______   Application = _______   Score = _____

 Functional Coverage        __%     Technical Capability                 __%
   Overall Support           __       Technical Architecture Compliance  __
   Overlap Support*          __       Technical Architecture LOE         __
   Gap Filling Support*      __       Application Technology Rating*     __
                                      Technical Documentation*           __

 Cost                       __%     Vendor Viability/Support             __%
   Acquisition*              __       Longevity                          __
   Infrastructure*           __       Financials*                        __
   Support*                  __       R&D Intensity                      __
   Licenses                  __       Geographic Coverage                __
   Training                  __       Business/Channel Strategy          __
   Customization/SI          __       Support*                           __

 * Note – Indicates individual criteria within criteria groups shown

A decision support tool, Goal-Tender from LABBLEE Corporation, will be used to
organize the scoring, roll up AIS scorecards, and drill down into individual criteria scores
during comparative analysis of AISs/applications.
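
The guide leaves the roll-up and weighting mechanics to the Goal-Tender tool; the
sketch below assumes, for illustration only, equal weights across the four criteria
groups shown on the scorecard:

    # Hypothetical equal-weight roll-up of the four criteria-group scores.
    GROUPS = ("functional_coverage", "cost",
              "technical_capability", "vendor_viability_support")

    def scorecard_rollup(scores):
        """Average the four criteria-group scores (each assumed 0-100)."""
        return sum(scores[g] for g in GROUPS) / len(GROUPS)

    example = {"functional_coverage": 72.0, "cost": 55.0,
               "technical_capability": 64.0, "vendor_viability_support": 80.0}
    print(scorecard_rollup(example))  # 67.75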






8.10      Domain Solution Evaluation

The evaluation of scenarios is based on analysis of the migration and integration
strategies provided by the domain teams. The migration strategy is illustrated by the
example shown below.

                       Example Domain Migration Diagram

[Diagram placeholder: a timeline (2000–2006) showing when each application in
the SUPPLY, NON-SYSTEMS and STORAGE & DISTRIBUTION groups is retired,
continued or replaced. Applications shown include MCDSS/MCDSS II, CAV II,
MP&E, ATLASS II+, SASSY, MIMMS, MRP, DSSCS, WAR, SCS, PRF, DATA ELEMENTS,
DATABASE REFRESH, DESEX, LOIS, LBIV, ITEM APPLICATION ONLINE, SERVICE MART,
STRATIFICATION, REPLENISHMENT REVIEW, FORECASTING, GOTS/COTS, WSS, MOWASP,
SET ASSEMBLY, SERIAL # TRK and WEAPON # TRK.]




The picture of the integration strategy is illustrated by the example shown below.







                                 Integration Diagram Example







                                         APPENDIX A
                                        SRAC Definitions

------------editor’s note: These definitions are implemented as rollover, popup text boxes in the Web
version of this document------------


 Acronym          Translation                                   Description

 Acquisition                            A detailed plan for acquiring a Logistics domain solution.
  strategy                              SRAC recommends components for an acquisition strategy but
                                        the development and execution of these strategies is beyond the
                                        scope of SRAC.
    AIS          Automated              Application software and hardware used to support particular
                 Information System     functional work that has been selected by DoD Services.
Application                             Shorthand for application software. Software that is designed to
                                        support particular functional work. Consists of AIS software
                                        plus potential software from other sources (e.g. COTS)
Application                             The evaluation of data obtained in categorization of
Evaluation                              AISs/applications and subsequent scoring for comparison
                                        purposes. Domain teams perform functional evaluation and the
                                        SRAC Core Team performs technical evaluation of
                                        applications.
Application                             A numerical score calculated by examining the technology
Technology                              components of the application system.
  Rating
Applications                            A visual representation of the overall score for a SRAC high-
 Scorecard                              value application and all of the score’s components.
Categorizati                            Collection of data associated with SRAC AISs/Applications
     on                                 and domains according to pre-defined data fields contained in
                                        SRAC data collection worksheets. Categorization is performed
                                        by domain teams.
   COTS          Commercial off-the-    Applications that may be purchased from commercial software
                 shelf software         vendors that are offered to the marketplace as a standard,
                                        packaged product.
    CRM          Customer               Applications used to support customers including call center
                 relationship           management.
                 management
   CSSE                                 Combat Support Services Element------------
 Advocacy
   Board
 CSSE SDE        Combat Support         An initiative that assures sharable data to support logistics
                 Service Element/       functions.
                 Shared Data
                 Environment
  DII COE        Defense                A DoD standard and interoperability software initiative
                 Information            defining
                 Infrastructure/
                 Common Operating
                 Environment
  Domain                                The AIS overall functional score for a particular domain.
 Functional
   Score



                                                 -A 1 -
DRAFT Version 2.8, 02 FEB 2001                                                            Appendix D

  Domain                              Meeting places on the Web where domain teams will collect
  Portals                             data and do their SRAC work.
  Domain                              The integrated collection of application systems and reference
  solution                            databases that optimally supports the operation of a Logistics
                                      domain including required links to applications and data
                                      sources/sinks outside the domain.
  Domain                              A specific instantiation of an alternative domain solution that
  solution                            picks specific application systems as part of the solution.
  scenario
  Domain                              Teams of subject matter experts on the functional operation of,
   Teams                              and applications used in, individual logistics domains.
    DPM
 Evaluation                           SRAC evaluation consists of categorization plus scoring.
                                      Categorization is performed by the SRAC domain teams.
                                      Scoring is performed by the SRAC Core Team based on
                                      categorization data.
 Functional                           Collections of functions and constituent tasks within a
  domains                             prescribed boundary
 Gap-filling                          The score that measures an application’s ability to fill gaps in
   score                              coverage of Logistics functions.
    gaps                              Areas of functional domains that are poorly supported by AISs.
   GOTS        Government off-        Packages applications that are available from other
               the-shelf software     Government sources.
High-Value                            AISs that have been judged in SRAC to be essential for the
   AISs                               efficient performance of USMC Logistics.
    ILC        Integrated Logistics   An initiative of the USMC to improve logistic operations to
               Capability             support Operational Maneuver from the Sea
  ILC IPT      ILC Integrated         The team that has the responsibility for planning ILC programs
               Planning Team          including SRAC.
 Integration                          A narrative description of how migration systems for a domain
  Strategy                            will be integrated among themselves, with external systems and
                                      with data sources and sinks.
  Logistics                           Logistics with a ‘Big L”. Includes all supporting functions such
                                      as services, engineering and acquisition support as defined by
                                      MCWP 4-1.
 Low-Value                            AISs that have been judged in SRAC to have low usage and
   AISs                               functional coverage and whose functionality may be supplied
                                      by alternative AISs/applications..
MARCORS        Marine Corps
 YSCOM         System Command
 MCLB-A        Marine Corps
               Logistics Base –
               Albany, GA
 Migration                            A visual map showing the existing AISs in a Logistics domain
 diagram                              and how they are planned to migrate to a set of migration
                                      systems over a 5 year time frame.
 Migration                            A narrative description of how legacy AISs for a domain will
 Strategy                             migrate to migration systems.
 Migration                            Those high-value AISs that are planned for active use in 2004.
 systems                              Migration systems can be existing AISs that will continue to be
                                      supported/modified or new applications introduced before
                                      2004.
 No-Value                             AISs that have been judged in SRAC to have no-users, no
  AISs                                support, or are unsupportable.
   OA          Operational            A document that establishes the functional requirements for an



                                              -A 2 -
DRAFT Version 2.8, 02 FEB 2001                                                           Appendix D

              Architecture          integrated set of applications covering a functional domain.
                                    OA’s contain functional models and data flow diagrams.
  Overall                           The score that measures an application’s relative ability to
 functional                         support all the functions of USMC Logistics.
   score
  Overlap                           The score that measures an application’s ability to provide
   score                            functional support in identified areas of functional overlap
                                    between applications.
    POC       Point of contact
   retired
    score                           A numerical value given to a SRAC categorization criteria for a
                                    particular application or domain scenario that enables
                                    comparisons across potential applications and domain
                                    solutions..
  SMEs        Subject matter        Members of SRAC domain teams with functional, user and
              experts               development/support knowledge of an AIS/application.
  SRAC        Software              Created by the ILC, SRAC is a program to reduce the IT
              Realignment and       investment and overlap in legacy applications supporting
              Categorization/       USMC logistics.
              Consolidation
SRAC Core                           The team responsible for evaluating the categorization of AIS,
  Team                              migration ands integration strategies and making SRAC
                                    recommendations to the ILC IPT.
supportable                         AISs that have been judged to be capable of being supported
                                    now and in the foreseeable future. Unsupportable AISs contain
                                    obsolete and/or retired technologies and/or programming
                                    languages.
 supported                          AISs that have designated/funded support groups that are
                                    currently in operation and supplying adequate support.
    SA        System Architecture   The collection of preferred technology selections that satisfy
                                    the requirements of the technology architecture.
    TA        Technical             The USMC technical architecture for Logistics systems as
              Architecture          defined by ILC. TA is also used in SRAC to describe the
                                    technical assessment work that will be done by the ILC IPT.
   TOC        Total Ownership       The total cost of continued development, maintenance and
              Cost                  support for an AIS through its lifecycle including retirement.
   Total                            The AIS functional score across all USMC functional domains.
 Functional                         This score is calculated by the SRAC Core Team for AISs
   Score                            supporting multiple domains.
    used                            AISs that have registered users that are actively using the
                                    software in performing their work.
   users                            The number of individuals actively accessing an AIS or the
                                    number of licensed users.
 weighting                          Applying weighting factors to individual criteria or criteria
                                    category scores to indicate their relative importance in an
                                    overall application score.
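
To make the scoring mechanics concrete, the sketch below shows one way a weighted overall application score, as described in the "score" and "weighting" entries above, could be computed. This is a minimal illustration only, not the SRAC algorithm: the criteria categories, the 0-10 score scale, and the weights are all assumed for the example.

```python
# Minimal sketch (not the official SRAC scoring method): combining criteria
# category scores into one weighted overall application score. The category
# names, score scale (0-10) and weights below are illustrative assumptions.

def weighted_score(scores, weights):
    """Combine category scores into an overall score using normalized
    weighting factors."""
    total_weight = sum(weights[c] for c in scores)
    return sum(scores[c] * weights[c] for c in scores) / total_weight

# Hypothetical category scores for one high-value AIS
scores = {"functional": 8.0, "technical": 6.5, "toc": 7.0, "vendor": 5.5}

# Hypothetical weighting factors expressing relative importance
weights = {"functional": 0.40, "technical": 0.30, "toc": 0.20, "vendor": 0.10}

print(round(weighted_score(scores, weights), 2))  # 7.1
```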







                                 APPENDIX B
                             SRAC Functional Domains

SRAC Phases 2 and 3 will be applied to AISs by functional domain as listed below in the
priority order of SRAC execution:

1.     Transportation
2.     Supply
3.     Maintenance
4.     Health Services
5.     Engineering
6.     Acquisition

General services applications (e.g., finance, human resources, legal) will be
considered in terms of the support that they supply to these six logistics functional
domains.

The scope of the functional domains is defined in MCWP 4-1 and repeated here for
clarification.

Transportation

Transportation and distribution consists of moving containers, supply items, materials
and people from one location to another using highways, railroads, waterways, pipelines,
oceans or air. For a MAGTF, this function includes the support needed to put
sustainability assets (personnel and materiel) in the correct location at the proper time in
order to start and maintain operations.

The transportation and distribution system that supports an expeditionary MAGTF
includes not only the means of transportation but also the methods to control and manage
those transportation means.

The functions within the Transportation and Distribution functional domain include:

- Embarkation
- Landing support
- Motor transport
- Port and terminal operations
- Air delivery
- Material handling equipment
- Freight or passenger transportation

Supply





Supply is separated into ten general classes based on physical characteristics or purpose
of supply items as defined in Table B-1.

                            Table B-1: Classes of Supply

Class I – Subsistence, which includes gratuitous health and welfare items and rations.

Class II – Clothing, individual equipment, tentage, organizational tool sets and tool
kits, hand tools, administrative and housekeeping supplies, and equipment.

Class III – Petroleum, oils, and lubricants (POL), which consist of petroleum fuels,
lubricants, hydraulic and insulating oils, liquid and compressed gases, bulk chemical
products, coolants, de-icing and antifreeze compounds, preservatives together with
components and additives of such products, and coal.

Class IV – Construction, which includes all construction material, installed equipment,
and all fortification, barrier, and bridging materials.

Class V – Ammunition of all types, which includes, but is not limited to, chemical,
radiological, special weapons, bombs, explosives, mines, detonators, pyrotechnics,
missiles, rockets, propellants, and fuzes.

Class VI – Personal demand items or nonmilitary sales items.

Class VII – Major end items, which are the combination of end products assembled and
configured in their intended form and ready for use (e.g., launchers, tanks, mobile
machine shops, vehicles).

Class VIII – Medical/dental material, which includes medical-unique repair parts, blood
and blood products, and medical and dental material.

Class IX – Repair parts (less Class VIII), including components, kits, assemblies, and
subassemblies (reparable and nonreparable), required for maintenance support of all
equipment.

Class X – Material to support nonmilitary requirements and programs not included in
Classes I through IX, for example, materials needed for agricultural and economic
development.


In ILC, the classes of supply are mapped into four quadrants as shown in Table B-2.







                                   Table B-2
                            USMC ILC Quadrant Model

(Vertical axis: UNIQUENESS/RISK, low to high. Horizontal axis: VALUE, low to high.)

Bottleneck (high uniqueness/risk, low value): one or more restricted sources; few
options; low volume; low market capacity; low value.

Critical (high uniqueness/risk, high value): few selected sources; few options; low
volume; low market capacity; high value.

Routine (low uniqueness/risk, low value): many sources; many options; high volume;
large market capacity; low value.

Leveraged (low uniqueness/risk, high value): many sources; many options; high volume;
large market capacity; high value.




The quadrants determine the business rules and processes that are used to handle supply
items in the supply chain. Supply items are classified for each type of Marine Corps
mission by their characteristics as shown in Table B-2.
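
As an illustration of the quadrant logic only, the sketch below classifies a supply item from two inputs, uniqueness/risk and value. The 0-1 scales and the 0.5 threshold are assumptions made for the example; the actual ILC business rules are not expressed as code in this guide.

```python
# A minimal sketch of the Table B-2 quadrant model (illustrative only):
# map an item's uniqueness/risk and value, each scaled 0-1, onto a quadrant.

def classify_item(uniqueness_risk: float, value: float,
                  threshold: float = 0.5) -> str:
    """Return the ILC quadrant for an item's (uniqueness/risk, value) pair."""
    high_risk = uniqueness_risk >= threshold
    high_value = value >= threshold
    if high_risk and high_value:
        return "Critical"
    if high_risk:
        return "Bottleneck"
    if high_value:
        return "Leveraged"
    return "Routine"

# Example: a low-risk, high-value repair part is a Leveraged item.
print(classify_item(uniqueness_risk=0.2, value=0.8))  # Leveraged
```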

The functions of the supply domain are:

- Requirements determination (routine, pre-planned, or long range)
- Procurement
- Distribution
- Disposal
- Storage
- Salvage

Maintenance

Maintenance includes those actions taken to retain or restore materiel to serviceable
condition. The Marine Corps has developed distinct applications for the support of
ground-common and aviation-unique equipment.

The maintenance domain consists of the following functions:

- Inspection and classification
- Servicing, adjusting, and tuning
- Testing and calibration
- Repair




- Modification
- Rebuilding and overhaul
- Reclamation
- Recovery and evacuation

There are three levels of maintenance: Organizational, Intermediate, and Depot. Within
ground equipment maintenance, the maintenance levels are further divided into five
echelons. ILC may change the location of echelons of maintenance to meet the
objectives of industrial best practices for maintenance.

Health Services

Health Services involves a proactive and preventive medical program and a
phased/leveled health care system that extends from actions taken at the point of
wounding, injury or illness through evacuation to a medical treatment facility that
provides more definitive treatment.

The functions of the health services domain are:

- Health maintenance – routine sick call, physical examination, preventive medicine,
  dental maintenance, record maintenance and report submission.
- Casualty collection – selection and manning of locations where casualties are
  assembled, triaged, treated, protected from further injury and evacuated.
- Casualty treatment – triage and treatment (self-aid, buddy aid, and initial
  resuscitative care).
- Temporary casualty holding – facilities and services to hold sick, wounded and
  injured personnel for a limited time (usually not to exceed 72 hours).
- Casualty evacuation – movement and ongoing treatment of the sick, wounded or
  injured while in transit to medical treatment facilities by ground, sea or air.

Engineering

The engineering functional domain involves a wide range of tasks performed in the rear
area that serve to sustain forward combat operations. Engineering includes the following
functions:

- Engineer reconnaissance
- Horizontal and vertical construction
- Facility maintenance
- Demolition and obstacle removal
- Explosive ordnance removal

Acquisition






This functional domain includes actions necessary to introduce weapon systems,
equipment and AISs to the Marine Corps inventory. The acquisition domain contains
the following functions:

- Generate Marine Corps Program Decision Memorandum (Uses a Mission Needs
  Statement to assign a Program Manager, conduct an Analysis of Alternatives and
  establish an Integrated Product Team. These actions are documented in the MCPDM
  or an APB for a logistics AIS.)
- Demonstrate/Validate System (Prototypes, demonstrations, and early operational
  assessments are considered to manage risk. Technology, manufacturing, support,
  lifecycle cost, tradeoffs, interoperability and acquisition strategy are considered to
  select the best prototype, which becomes the engineering basic design.)
- Develop System (The product and manufacturing process are designed, logistics
  support is developed, and the engineering prototype is tested.)
- Deploy System (This includes producing the system, issuing the system, issuing initial
  spares for hardware, and issuing initial publications.)

General Services

This functional domain includes a variety of non-materiel and support activities. These
activities are executed in varying degrees by each of the military Services, the Marine
Corps supporting establishment, and the MAGTF.

For example, within the Marine Expeditionary Force (MEF), the FSSG provides the
following services:

- Disbursing
- Postal
- Legal
- Security support
- Exchange
- Civil affairs
- Graves registration






                                             APPENDIX C
                                          SRAC AIS Master List


      OFFICIAL                 SYSTEM NAME                        TRAN SUPP MAIN HLTH SERV ENGR ACQ     OWNER/ MAINTAINER
      ACRONYM
      (Domain columns: TRAN = Transportation, SUPP = Supply, MAIN = Maintenance,
      HLTH = Health Services, SERV = General Services, ENGR = Engineering,
      ACQ = Acquisition. An X marks each domain that the AIS supports.)
1    AALPS          Army Aircraft Load Planning System            X                           ARMY
2    AAP            Amphibious Assault Planner                    X                           LPO-3
     (Retired)
3    ABMS           Ammunition Budget Management System               X                       SYSCOM PMAM
4    ACE            Automated Compliance Evaluation                                   X       LFL-6
5    ADANS          AMC Deployment Analysis System                X                           USAF (MC-LPO-3)
6    AGTRS          Automated Government Transportation           X                           MCLB Albany
                    Request System
7    ALM            Air Load Module                               X                           ARMY
8    AMMOLOGS       Ammunition Logistics System                       X                       SYSCOM PMAM
     (Retired)
9    AMS            Asset Management System                       X   X                       Army MTMC    (MC-LFT)
10   AMS            Automated Manifesting System                  X   X                       SYSCOM (LFT)
11   AMSS           Ammunition Management Standard                                            ARMY
                    System
12   APPS           Ammunition Prepo Planning System              X   X                       SYSCOM PMAM
13   AQUIS          Air Quality Information System                                    X       NAVY (MC-LFL-6)
14   ARTEMIS        Artemis Project Management Software                   X                   MCLB-Albany
15   ASIFCS         Airlift Services Industrial Fund Integrated   X                           AMC
                    Computer System
16   ATLASS         Asset Tracking Logistics and Supply           X   X   X           X   X   SYSCOM PMIS
                    System (Phase I)
17   ATLASS II      Asset Tracking Logistics and Supply           X   X   X   X           X   SYSCOM PMIS
                    System (Phase II)
18   CAEMS          Computer Aided Embarkation                    X                           SYSCOM PMIS
                    Management System
19   CAIMS          Conventional Ammunition Integrated                X                       NAVY (MC-ASL 30)
                    Management System
20   CALM           Computer Aided Load Manifest System           X                           USAF (SYSCOM PMIS/HQMC LPO)
21   CAPS II        Consolidated Aerial Port System II            X                           USAF (MC-LFT)
22   CAV II PC      Commercial Asset Visibility 2 – Personal          X   X               X   NAVY (MCLB Albany)
                    Computer
23   CCS/EPOS       Clothing Cash Sales/ Electronic Point of          X                       MCLB Albany
                    Sale
24   CCSS           Commodity Command Standard System                 X                       MCLB Albany
25   CDDCS          Contracts Directorate Document Control            X                   X   MCLB Albany
                    System
26   CFM            CONUS Freight Management                      X                           ARMY (MC-LFT)
27   Citidirect     Citidirect                                                    X           NAVSUP (MC-LB)
28   CHCSII         Composite Healthcare System II                            X               JOINT (MC-LPC-4)
29   CMIS           Configuration Management Information              X   X               X   NAVY (MCLB Albany)
                    System
30   CMOS           Cargo Movement Operations System              X                           USAF (MC-LFT)
31   CMS/TOMS       Cargo Management System / Terminal            X                           NAVY (MC-LPO-3)
                    Offload Management System
32   COMPASS        Computerized Provisioning Allowance and           X                       MCLB Albany
                    Supply System
33   COMPTRAK       USMC Environmental Compliance                                     X       LFL-6
                    Tracking System
34   CORRS          Commanders Readiness Reporting                                    X       LFF
                    System
35   CPARS          Contract Performance Assessment                                   X   X   NAVSEALOGCENDET Portsmouth
                    Reporting System                                                          (MC-LB)
36   DAAS           Defense Automated Addressing System           X   X               X       DLA (MC-LPC)
37   DAMMS-R        Department of Army Movements                  X                           ARMY
                    Management System - Redesigned


38   DATA             DATA ELEMENTS                                  X
     ELEMENTS
39   DEPLOYABLE       Deployable MC Food Management Info             X                       LFS
     MCFMIS           System
40   DESEX            Defense Emergency Supply Expert                X                       DLA (MCLB-Albany)
                      System
41   DIFMS            Defense Industrial Financial Management            X                   NAVY (MCLB-Albany)
                      System
42   DISMS            Defense Integrated Management System               X                   JOINT
43   DM-HMMS          Hazardous Material Management System           X               X       USAF (MCLB Albany)
44   DMLSS            Defense Medical Logistics Standard             X       X               JOINT (MC- LPC-4)
                      System
45   DMRIS            Defense Medical Regulating Information                 X               JOINT (MC- LPC-4)
                      System
46   DOHRS            Defense Occupational Health Readiness                  X               JOINT (MC- LPC-4)
                      System
47   DPAS             Defense Property Accountability System         X                       JOINT DPAS (MC-LFS)
48   DRLOG            Deficiency Reporting Logistics                 X                       NAVY (MCLB-Albany)
49   DSERTS           Defense Site Environmental Restoration                         X       DLA (MCLB-Albany)
                      Tracking System
50   DSS              Distribution Standard System                                           DLA (MCLB-Albany)
51   DTSS             Defense Transportation Tracking System                                 NAVY (MC-LFT)
52   E&C              Expenditures and Collections               X                           DFAS (MCLB-Albany)
53   EIM              Environmental Information Management           X   X           X       DESCIM (MC-LFL-6)
54   ELIST            Enhanced Logistics Intra-Theater Support   X                           ARMY MTMC       (MC-LFT/LPO)
                      Tool
55   EPG              Electronic Procurement Generator               X                   X   EA-21 (MC-LB)
56   EPOS             Electronic Point of Sales                      X                       MCLB-Albany
57   ERP              Essex Replacement System                           X                   MCLB Albany
58   ERS              Enterprise Reporting System                    X           X           NAVSUP (MC-LB)
59   ESCRS            Environmental Security Corporate                               X       DESCIM (MC-LFL-6)
                      Reporting System
60   ETPS             Electronic Technical Publications System       X   X               X   MCLB Albany
61   FACTS            Financial Air Clearance Transportation     X                           NAVY (MC-LFT)
                      System
62   FEDLOG           Federal Logistics System                       X   X                   DLA (MC-LPC)
63   FEM              Facility and Equipment Maintenance                 X           X       NAVY (MCLB-Albany)
64   FIRS             Fire Incident Reporting System                                 X       LFF-1
65   FIS              Facilities Information System                                  X       NAVFAC (MC-LFF-2)
66   Fleet Anywhere   Fleet Anywhere Fleet Management                                X       LFL-6
                      System
67   FLIS             Federal Logistics Information System           X   X                   DLA (MCLB-Albany)
68   FOSAMS           Fleet Optical Scanning Ammunition              X                       NAVY (MC-SYSCOM PMAM)
                      Marking System
69   GATES            Global Air Transportation Execution        X                           USAF (MC-LFT)
                      System
70   GCCS             Global Command and Control System                                      JOINT DISA (MC- HQMC & SYSCOM)
71   GCSS             Global Combat Support System               X   X   X                   JOINT DISA (MC-LPC)
72   GDSS             Global Decision Support System                                         USAF (MC-LPO)
73   GIS              Geographic Information System                                  X       JOINT (MC-LFF)
74   GME- MS          Garrison Mobile Equipment Management           X                       LFS
     (Retired)        System
75   GOPAX            Group Operational Passenger System         X                           ARMY (MC-LFT)
76   GTN              Global Transportation Network              X   X   X                   JOINT USTC (MC-LFT)
77   HICS             Hazardous Material Information Control         X   X           X       NAVY (MC-LFL-6)
                      System
78   HMIS             Hazardous Material Information System      X                   X       DLA (MC-LFL-6)
79   HMSS             Headquarters Maintenance Subsystem                 X                   HQMC
80   HSMS             Hazardous Substance Management                 X   X           X       DESCIM (MC-LFL-6)
                      System




81    IBS             Integrated Freight Booking System                                      ARMY MTMC       (MC-LFT)
82    ICAPS           Interactive Computer Aided Provisioning        X                   X   NAVY (MCLB Albany)
                      System
83    ICODES          Integrated Computer Operated               X                           ARMY MTMC (MC- SYSCOM
                      Deployment Execution System                                            PMIS/HQMC LPO)
84    IMACS           Interservice Material Accounting and           X   X                   USAF (MCLB-Albany)
                      Control System
85    ITIMP           Integrated Technical Item Management &         X                   X   NAVY (MCLB-Albany)
                      Procurement
86    JAMSS           Joint Ammunition Management Standard           X                       Air Force (SYSCOM PMAM)
                      System
87    JCALS-JTMS      Joint Computer-Aided Acquisition and Log       X   X               X   JOINT ARMY (MCLB-Albany)
                      Support Sys - Joint Technical Manual
                      System
88    JEDMICS         Joint Engineering Data Management              X   X                   JOINT NAVY (MCLB-Albany)
                      Information Control System
89    JFAST           Joint Flow and Analysis System for         X                           JOINT (SYSCOM PMIS)
                      Transportation
90    JFRG II         Joint Force Requirements Generator II      X                           JOINT USMC        (MC-SYSCOM/
                                                                                             HQMC LPC-5)
91    JOPES           Joint Operation Planning and Execution     X                           JOINT DISA        (MC-SYSCOM
                      System.                                                                PMIS)
92    JTAV            Joint Logistics MIS                        X   X   X                   JOINT DLA (MCLB-Albany)
93    K21             Knowledge for Acquisition in the 21st                      X       X   LB
                      Century
94    KBLPS           Knowledge Based Logistics Planning                                     LPO
      (Retired)       System
95    LAKES HELPER    Lakes Helper                                   X   X                   MCLB Albany
96    LBIV II         Logistics Base Inventory View II               X                       MCLB Albany
97    LIS             Logistics Information System                   X                       MCLB Albany
      (Inactive –
      SRAC Phase 1)
98    LMIS            Logistics Management Information System    X   X   X               X   LPO-4
99    LOGPARS         Logistics Planning and Requirements            X                   X   ARMY (MC-SYSCOM)
                      Simplification System
100   MAARS II        Marine Ammunition Accounting and               X                       SYSCOM PMAM
                      Reporting System II
101   MAD             Medical Anchor Desk                                    X               JOINT (MC-LPC-4)
102   MAGTF II        Marine Air Ground Task Force II            X                           SYSCOM PMIS/ HQMC PP&O
103   MAISTER         Data Entry System                              X
104   MARES           Marine Corps Automated Readiness               X   X                   LPO-4
                      Evaluation System (aka MCGERR)
105   MAT             Medical Analysis Tool                                  X               JOINT (MC- LPC-4)
106   MAXIMO          MAXIMO Facilities Management Software              X           X       COTS (MC-LFF)
107   MC DODAAD       Marine Corps Department of Defense         X   X                       MCLB Albany
                      Activity Address Directory
108   MCDSS           Material Capability Decision Support           X   X                   MCLB Albany
                      System
109   MCARES          Marine Corps Automated Readiness                                       HQMC (LPO-4)
      (Inactive –     Evaluation System
      SRAC Phase 1)
110   MCARMS          Marine Corps Ammunition Requirements           X                   X   MCCDC
      (Retired)       Management System
111   MCDSS           Material Capability Decision Support           X                       MCLB Albany
                      System
112   MCFMIS          Marine Corps Food Management                   X                       LFS
                      Information System
113   MCFPS           Marine Corps Facilities Planning System                        X       NAVFAC (MC-LFF)
114   MCGERR          Marine Corps Ground Equipment                  X   X                   LPO-4
                      Resource Reporting (aka MARES)
115   MCHAS           Marine Corps Housing Automated System                      X   X       NFESC (LFF-3)
116   MCLORA          Marine Corps Level of Repair Analysis              X               X   SYSCOM



      (Inactive –
      SRAC Phase 1)
117   MCPDS           Marine Corps Publications Distribution         X                   X   ARD
                      System
118   MCTEEP          MC Training Exercise & Employment          X               X           PP&O
                      Planner
119   MDL / DDS       MAGTF Data Library / Data Dictionary       X                           SYSCOM PMIS
                      System
120   MDSS II         MAGTF Deployment Support System II         X                           SYSCOM PMIS
121   MEARS           Multi-user Engineering Change Proposal         X   X               X   ARMY (MCLB-Albany)
                      Automated Review System
122   MEDALS          Military Engineering Data Automated            X                       DLA
                      Locator System
123   MICAPS          Marine Corps Interactive Computer Aided        X                       MCLB Albany
                      Provisioning System
124   MIMMS           Marine Corps Integrated Maintenance            X   X                   MCLB Albany
                      Management System
125   MP&E            Maintenance Planning and Execution             X   X                   USAF (MCLB-Albany)
126   MRP             Materiel Returns Program                       X                       MCLB Albany
127   MRP II          Manufacturing Resource Planning II         X       X                   NAVAIR (MCLB-Albany)
128   MSLS            Military Shipping Label System             X   X                       MCLB Albany
129   NAETS           Naval Air Environmental Tracking System                        X       NAVY
130   NAFI            Navy Air Force Interface                                   X       X   EA-21 (MC-LB)
131   NAOMIS          Navy Material Transportation Office        X                           NAVY (MC-LFT)
                      Operations Management Information
                      System
132   NECO            Navy Electronic Commerce Online                            X       X   NAVSUP (MC-LB)
133   NEIMS           NALMEB Equipment Inventory                     X   X                   LPO
                      Management System
134   NFADB           Naval Facilities Assets Database                               X       NAVY (MC-LFL)
135   NIMMS           Naval Inventory Material Management            X   X                   MCLB Albany
                      System
136   PC TRANS        PC-based Transportation System             X                           NAVY (MC-LFT)
137   PDMSS           Programmed Depot Maintenance                       X                   USAF (MCLB-Albany)
                      Scheduling System
138   PDREP           Product Data Report and Evaluation             X   X                   MCLB Albany
                      System
139   PEI-STRAT       Principle End Item Stratification              X   X                   MCLB Albany
      (Inactive –
      SRAC Phase 1)
140   PIC             Personnel Information Carrier                          X               OSD HA (MC- LPO-1)
141   PLMS            Publications Library Management System         X   X                   ARD
142   PMRS            Procurement Management Reporting               X           X           DoN (MC-LB)
                      System
143   Powertrack      Powertrack                                 X                           OSD MRM 15 PMO (MC-LFT)
144   PRAMS           Passenger Reservation and Manifest         X                           USAF (MC-LFT)
                      System
145   PREPO AIS /     PREPO Planning and Execution AIS               X                       MCLB Albany
      FILE 85
      (Retired)
146   RIS             Revitalization Information System                              X       NAVFAC (MC-LFF)
147   ROLMS           Retail Ordnance Logistics Management           X                       NAVY (MC-SYSCOM PMAM)
                      System
148   RPM/FHS         Real Property Management/Family                                X       LFF
      (Retired)       Housing System
149   SABRS           Standard Accounting, Budgeting &           X   X   X       X           DFAS (MC- RFL)
                      Reporting System
150   SAMS            Shipboard Automated Medical System                     X               NAVY (LPO-1 )
151   Set Assembly    Set Assembly System                            X   X                   MCLB Albany
152   SASSY           Supported Activities Supply System             X   X           X   X   MCLB Albany
153   SCATT           Statement of Work CDRL and Tracking            X                   X   SYSCOM
                      Tool



154   SCS            Stock Control System                               X                       USAF (MCLB Albany)
155   SERIAL         Serial Number Tracking                             X   X                   MCLB Albany
      NUMBER
      TRACKING
156   SERVMART ON    SERVMART ON LINE                                   X                       MCLB Albany
      LINE
157   SL             Stock Lists                                        X                       MCLB Albany
158   SL 1-2         Stock Lists 1&2                                    X                       MCLB Albany
159   SLDCADA        Standard Labor Data Collection and                 X                   MCLB Albany
                     Distribution Application
160   SPS            Standard Procurement System                        X                   X   DLA (MCLB Albany/ LB)
161   SPVI           Subsistence Prime Vendor Interpreter               X               X       DLA (MC- LPC-4)
                     (LFS?)
162   SS03           Inventory Control Forecasting                      X                       MCLB Albany
      FORECAST
163   SS03 PRF       Inventory Control Project Requirements             X                       MCLB Albany
                     File Follow -up
164   SS03           Inventory Control Replenishment Review             X                       MCLB Albany
      REPREVIEW
165   SS04 STORES    Stores Accounting Subsystem                        X                       DFAS (MCLB Albany)
166   SS05           Automated Procurement Subsystem                    X                   X   MCLB Albany
167   SS06 MOWASP    Mechanization of Warehousing and                   X                   X   MCLB Albany
                     Shipment Processing
168   SS07 DSSC      Direct Support Stock Control Subsystem             X                       MCLB Albany
                     (DSSC)
169   SS09 ITEM      Item Applications                                  X   X                   MCLB Albany
      APPS
170   SS10           Provisioning                                       X                   X   MCLB Albany
171   SS17           Allotment Accounting Subsystem                     X                   X   DFAS (MCLB Albany)
172   STRAT RETAIL   Retail Stratification                              X                       MCLB Albany
173   STRATIS        Storage, Retrieval, Asset, Tracking                X
                     Information System
174   STRAT          Wholesale Stratification                           X                       MCLB Albany
      WHOLESALE
175   SUMMIT         Support Utility for Materiel Management                                    MCLB Albany
                     Information Technology
176   TALPS          T-AVB Automated Load Planning System           X                           USMC
177   TAMMIS         Theater Army Medical Material Information                  X               ARMY (MC-LPC-4)
                     System
178   TCAIMS         Transportation Coordinators' Automated         X                   X       SYSCOM PMIS
                     Information for Movement System
179   TC-AIMS II     Transportation Coordinators' Automated         X                   X       ARMY (MC-SYSCOM PMIS)
                     Information for Movement System II
180   TDMS           Technical Data Management System                   X   X                   MCLB Albany
181   TIMA           Tool & Inventory Management Application            X   X                   NAVY (MCLB-Albany)
182   TMIP           Theater Medical Information Program                        X               OSD HA (MC- LPO-1)
183   TMR            Table of Manpower Requirements                                             USMC
184   TMS            Transportation Management System               X                           MCLB Albany
185   TOPS           Transportation Operational Personal            X                           JOINT ARMY (MC-LFT)
                     Property Standard System
186   TRAC2ES        TRANSCOM Regulating and Command                            X               (MC- LPC-4)
                     and Control Evacuation System
187   UDAPS          Uniform Automated Data Processing                  X                       DFAS/NAVY
                     System
188   WEAPONS        Weapons Serial Tracking System                     X                       MCLB Albany
189   WAWF           Wide Area Workflow                                 X           X           JECPO/DFAS-KC (MC-LB)
190   WPS            Worldwide Port System                          X                           ARMY MTMC       (MC-LFT)
191   WRS            War Reserve System                                 X                       MCLB Albany
192   WSS            Warehouse Support System                           X                       MCLB Albany
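
The master list above can also be treated as structured data. The sketch below shows one possible representation and query, using three rows transcribed from the list; the field names and the set-based encoding of the domain columns are assumptions for the example, not a SRAC data standard.

```python
# Illustrative sketch only: representing a few Appendix C rows as records and
# querying them by domain column. Field names are assumed, not prescribed.

MASTER_LIST = [
    {"acronym": "MIMMS", "name": "Marine Corps Integrated Maintenance Management System",
     "domains": {"SUPP", "MAIN"}, "owner": "MCLB Albany"},
    {"acronym": "GTN", "name": "Global Transportation Network",
     "domains": {"TRAN", "SUPP", "MAIN"}, "owner": "JOINT USTC (MC-LFT)"},
    {"acronym": "WSS", "name": "Warehouse Support System",
     "domains": {"SUPP"}, "owner": "MCLB Albany"},
]

def systems_for_domain(domain: str) -> list[str]:
    """Return the acronyms of AISs marked with an X in the given domain column."""
    return [row["acronym"] for row in MASTER_LIST if domain in row["domains"]]

print(systems_for_domain("SUPP"))  # ['MIMMS', 'GTN', 'WSS']
```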







                                         APPENDIX D
                               SRAC Functional Definitions – Phase 2

Functional definitions in SRAC Phase 2 for each logistics domain were based, where
available, on definitions from Chapter 3.0 of the LOG IR Plan, Version 2.0, 1998. Each
domain team reviewed these definitions and modified them, as required, as a deliverable
of their breakout sessions at the SRAC Phase 2 Workshops in March and April. The
modified functional definitions are included in this appendix.

Supply Domain Functional Definitions




[Figure D-1 Context Diagram for Perform Materiel Management: the A0 activity,
Perform Materiel Management, transforms a Materiel Requirement into an Approved
Requirement.]



This function consists of three major activities, Analyze Requirement, Determine Support
Plan, and Manage Assets, that work together to provide wholesale and retail supplies,
equipment, fuels, and munitions for deployment and support operations, as shown in the
figure above.

3.3.1 Analyze Requirement determines if an incoming materiel requirement is
authorized and valid. This activity is made up of Perform Research, Validate Technical
Data, and Validate Supply Data; its output is passed to 3.3.2, Determine Support Plan,
along with a Weapon System Requirement that is passed to the Acquisition function.
Disapproved Requirements are returned to the Originator. Consumables do not process
through this activity.

3.3.1.1 Perform Research is controlled by Historical Data, the existing Table of
Equipment (T/E) and Table of Organization (T/O), and other regulations and directives in
producing an authorized requirement. In this activity, Technical Data Files are
researched to determine if the NSN is valid/active. Other data provides the needed
details to make a management decision to reject, accept, or fill the requirement.

3.3.1.2 Validate Technical Data acts on the Authorized Requirement, using various AISs
to determine whether the management codes are correct, publications are up to date, and
usage data and Technical Data Packages are accurate and reflect current technical data
elements. It produces an Approved Requirement and updated or new Technical Data
Packages.

3.3.1.3 Validate Supply Data acts on the Authorized Requirement using various AISs to
determine if the specific supply or materiel information is correct for generating
requirements.

3.3.2 Determine Support Plan encompasses five sub-activities: Establish Contract;
Generate Requirements; Request Maintenance; Generate RDO/MRP; and Issue from
Inventory. This activity uses appropriate Regulations and directives, as well as the
Budget, to determine how the Approved Requirement will be supported.

3.3.2.1 Establish Contract is a means to acquire support for the requirement that cannot
be found within the existing (in-house) infrastructure.

3.3.2.2 Generate Requirements initiates the appropriate transactions to buy the goods or
services that support the requirement from another source, either within the DoD
infrastructure or external to it.

3.3.2.3 Request Maintenance initiates the action necessary to send the requirement to
the Maintenance activity.

3.3.2.4 Generate Redistribution Order (RDO) and/or Materiel Return Program
(MRP) is the activity that initiates the actions that will alleviate an excess/deficiency
situation between activities and produces the appropriate Documents and Reports.






3.3.2.5 Issue from Inventory satisfies the requirement by processing a requisition to
issue assets from on hand inventory and produces the appropriate Documents and
Reports.

3.3.3 Manage Assets is made up of the following activities: Execute Method of Support;
Perform Receipt (of Materiel); Store and Track Assets; and Issue Materiel. This activity
takes the Supported Requirement and the Actual Materiel and, under the guidance of
Regulations, Directives, Budget, Technical Guides, and Urgency of Need, administers
those assets by producing Request for Inventory, Requisition or Purchase Documents,
Shipping Actions, Back Order Actions, or Maintenance Requests.

3.3.3.1 Execute Method of Support uses Documents, Reports, and Requests from 3.3.2,
Determine Support Plan, to complete the actions required to satisfy the Approved
Requirement.

3.3.3.2 Perform Receipt turns Procured Materiel into on-hand inventory to satisfy the
Approved Requirements.

3.3.3.3 Store and Track Assets places the Received Materiel into storage and tracks the
inventory by conducting periodic inventories, initiating care in store processes, and
validating shelf life.

3.3.3.4 Issue Materiel satisfies requirements by placing appropriate materiel in the hands
of the requesting unit, producing Documents and Reports, Request for Transportation and
Request for Maintenance.
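
Read end to end, sections 3.3.1 through 3.3.3 describe a pipeline from incoming requirement to executed support. The sketch below traces one requirement through that pipeline. It is a minimal illustration under assumed data fields and rules (the NSN shown, the validity check, and the on-hand test are invented for the example); it does not implement any USMC AIS.

```python
# Illustrative sketch of the 3.3 activity flow: Analyze Requirement (3.3.1),
# Determine Support Plan (3.3.2), Manage Assets (3.3.3). All fields, rules
# and the example NSN are assumptions, not real business logic.

from dataclasses import dataclass

@dataclass
class Requirement:
    nsn: str          # National Stock Number of the requested item
    quantity: int
    valid_nsn: bool   # result of Technical Data File research (3.3.1.1)

def analyze_requirement(req: Requirement) -> bool:
    """3.3.1: approve only requirements with a valid/active NSN."""
    return req.valid_nsn and req.quantity > 0

def determine_support_plan(req: Requirement, on_hand: int) -> str:
    """3.3.2: choose a method of support for an approved requirement."""
    if on_hand >= req.quantity:
        return "issue from inventory"      # 3.3.2.5
    return "generate requirements"         # 3.3.2.2: buy from another source

def manage_assets(req: Requirement, method: str) -> str:
    """3.3.3: execute the chosen method of support."""
    return f"{method} executed for {req.quantity} x NSN {req.nsn}"

req = Requirement(nsn="5306-00-151-1419", quantity=4, valid_nsn=True)  # example only
if analyze_requirement(req):
    print(manage_assets(req, determine_support_plan(req, on_hand=10)))
else:
    print("Disapproved Requirement returned to originator")
```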


Transportation Domain Functional Definitions

The Transportation Domain Team used the USMC LOG IR Plan, Version 2 definitions
for logistics functions in SRAC Phase 2 without change. These functional definitions are
shown below for reference purposes.

3.1.4 Move Personnel, Supplies, Equipment, and Personal Property achieves the
movement of personnel, government cargo, and personal property from origin to
destination via all modes of transport using military and commercial assets, services, and
systems organic to, contracted for, or controlled by DoD in peace and war.

The context diagram below defines the scope of the transportation functions used in
SRAC, Phase 2.








[Figure D-2 Context Diagram for Move Personnel, Supplies, Equipment and Personal
Property: the A0 activity transforms Pax, Cargo, Supplies and Transportation
Requirements into Capability (see definition); its controls are Time, Capacity, Budget,
and Defense Regulations & Federal Statutes; its mechanisms are People, Information,
and air, rail, truck and sea assets.]

This function acts upon a Requirement to move Personnel, Supplies, Equipment or
Personal Property from origin to destination via all modes of transportation through three
subordinate activities: Plan Move, Execute Move, and Receive Forces and Cargo. This
activity enables the right forces and materiel to be at the right place at the right time, in
the right condition, and at the right cost. The constraints on the activity are time, capacity,
throughput of ports, airfields, road and rail networks, budget, weather and geographic
conditions, defense regulations and federal statutes. The mechanisms for effecting
movement are people, transportation assets (i.e., air, rail, motor, ship), and information.

3.5.1 Plan Move begins with receipt of a movement requirement, or request, and ends
with a type of movement order, directive, or contract. Working under the constraints of
time, lift capacity, budget, environmental, weather and geographic factors, throughput
capacity of nodes and transportation networks (railways, roads, waterways), and
availability of information, the activity results in an order to move, which can be in many
forms including CINC Execution Order, Dispatch Form, Air Tasking Order, Government
Bill of Lading, “Run Rosters”.





This activity consists of five sub-activities: Analyze Requirements, Develop Offer (for
commercial moves), Allocate Assets, Select Carrier, and Schedule/Book the Move.

3.5.1.1 In Analyze Requirements, the requirement could be a Joint Operational Planning
and Execution System (JOPES) Time-Phased Force Deployment Data (TPFDD), a
Special Assignment Airlift Mission (SAAM) request, a Materiel Release Order (MRO), a
Government Transportation Request (GTR), DD Form 1149 (Requisition and
Invoice/Shipping Document), a contract, a Combat Service Support (CSS) Rapid
Request, a Daily S-4 Run Sheet, or a Request Form, among others. Each identifies a need
to move an asset. Planners must determine what is needed to move those requirements.
Generally, there are two ways to effect movement: organic or non-organic; that is, using
common-user, commercial, or general support capabilities.

The outputs from this activity are a list of tasks, time, and support required, and a route
and mode of transportation. The activity is controlled by the Defense Transportation
Regulation (DTR) for commercial and channel moves and by JOPES for strategic
deployments.

3.5.1.2 Develop Offer follows the determination that commercial assets are required.
The result of this activity is a request for services (Offer) submitted to carriers. The
activity is constrained by the Federal Acquisition Regulation (FAR), laws and statutes,
Guaranteed Traffic agreements, and Military Traffic Management Command (MTMC)
regulations.

3.5.1.3 Allocate Assets uses the mode and route and the list of tasks, time, and support
required from 3.5.1.1, Analyze Requirements, to allocate assets to support the move. The
result is a request or tasking for support. The assets allocated are controlled by asset
availability, time, and budget constraints.

3.5.1.4 Select Carrier examines the carriers' responses from 3.5.1.2, Develop Offer,
including the rates, routes, and modes of shipment offered. The result is the selection of a
specific carrier or unit to fulfill a specific requirement. This activity continues to be
controlled by the assets available, time, budget, FAR, laws, statutes, Service and DoD
regulations, the DTR, and local SOPs that govern the movement of personnel, supplies,
equipment, and personal property.

3.5.1.5 Schedule/Book (Dispatch) Move is the final step in the transportation planning
process. Using the selected carrier and available assets, the result is the Order to Move.
The constraints on this activity are public and private regulations, the Time-Phased Force
Deployment Data (TPFDD), and METT-T.
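
For illustration only, the sketch below (Python) models the Plan Move pipeline as a
sequence of function calls: commercial requirements flow through Develop Offer and
Select Carrier, organic requirements flow through Allocate Assets, and both terminate in
Schedule/Book the Move. All class, function, and field names are hypothetical and are
not drawn from any fielded AIS or from the SRAC toolset.

    # Illustrative sketch of the 3.5.1 Plan Move flow; all names are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class MovementRequirement:
        description: str          # e.g., a TPFDD line, an MRO, or a GTR
        commercial: bool = False  # True when non-organic (commercial) lift is needed

    @dataclass
    class OrderToMove:            # e.g., CINC Execution Order, Dispatch Form, GBL
        requirement: MovementRequirement
        carrier: str

    def analyze_requirements(req):
        # 3.5.1.1: determine tasks, time, support, and the route and mode.
        return "commercial" if req.commercial else "organic"

    def develop_offer(req):
        # 3.5.1.2: submit a request for services (Offer) to carriers.
        return ["carrier A response", "carrier B response"]

    def allocate_assets(req):
        # 3.5.1.3: task organic assets to support the move.
        return "organic motor transport section"

    def select_carrier(responses):
        # 3.5.1.4: choose the carrier or unit that fulfills the requirement.
        return responses[0]

    def plan_move(req):
        # 3.5.1.5: Schedule/Book the move, producing the Order to Move.
        mode = analyze_requirements(req)
        if mode == "commercial":
            carrier = select_carrier(develop_offer(req))
        else:
            carrier = allocate_assets(req)
        return OrderToMove(req, carrier)

    print(plan_move(MovementRequirement("move 40 pax to POE", commercial=True)))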

3.5.2 Execute Move includes five sub-activities: Create a Movement Plan, Prepare the
Load, Assemble/Marshal the Load, Load the Asset, and Move & Deliver the Load. The
order to move can be a CINC execution order that activates a TPFDD, or a dispatch form
from a dispatch NCO to a driver; either results in personnel, supplies, equipment, and
personal property delivered to the desired destination. The activity is controlled by
available assets, public and private laws regulating traffic, distance, time, route, range,
and METT-T.

3.5.2.1 Create Movement Plan is the activity that provides the requirements, the
resources available, and a schedule to accomplish the move. It can be a CINC-validated
TPFDD for force deployments, a daily "run sheet" for routine unit motor pool operations,
an MRO, a Letter of Instruction (LOI), or Permanent Change of Station (PCS) orders.
Transportation supports the entire inventory of the DoD (active and reserve forces, goods,
and services). It includes the capabilities and infrastructures needed for general or direct
support activities, readies force and support units, and provides routine and peacetime
support, including mobilization, deployment/redeployment, and sustainment of forces.

The result of this activity is the plan for movement of personnel, supplies, equipment and
personal property. The constraints are JOPES, Joint Federal Travel Regulation (JFTR),
Marine Corps Transportation Manual, and Military Standard Transportation and
Movement Procedures (MILSTAMP).

3.5.2.2 Prepare Load involves the preparation of the personnel, supplies, equipment, and
personal property that will be moved. During this activity cargo is packaged, labeled,
and prepared for the mode of shipment (i.e., air, ship, or motor) and, if hazardous, is
identified, labeled, and documented. This process may also involve inspecting,
inoculating, briefing, and issuing special gear to passengers. The controls on this activity
are the DTR, the Code of Federal Regulations (CFR), and International Air Transport
Association (IATA) regulations.

3.5.2.3 Assemble/Marshal Load consolidates the shipment for its onward move. The
activity includes palletizing, containerizing, or marshaling unit cargo and equipment for
inspection prior to movement. In some cases, cargo and passengers must be moved to an
assembly/marshaling area; i.e., moved from origin to a port of embarkation. Final
information is gathered, adjustments or corrections are made, and joint inspections are
conducted between Services or Agencies, resulting in the refined plan and the completed,
assembled load. This activity is controlled by the throughput capability of ports on the
route, the capacity of assets to carry or handle cargo, and the available space to assemble
the load, as well as the JFTR and DTR.

3.5.2.4 Load Asset is the physical loading of the assembled load onto or into the mode of
transportation. The assembled load, refined plan, shipping manifest, and shipping
documentation are required to perform the activity under the constraints of the trim,
stress, and stability requirements of ships, the allowable cabin loads of aircraft, and gross
vehicle weight and cube limits, to produce a completed load.
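
The weight and cube checks described above can be illustrated with a minimal sketch;
the limits, field names, and the two-constraint feasibility rule below are assumptions for
illustration only (ship trim, stress, and stability checks are omitted).

    # Illustrative sketch: check an assembled load against an asset's published
    # weight and cube limits. All values and field names are hypothetical.
    def load_is_feasible(load_weight_lbs, load_cube_ft3, asset):
        """Return True if the assembled load fits the asset's limits."""
        return (load_weight_lbs <= asset["max_weight_lbs"]
                and load_cube_ft3 <= asset["max_cube_ft3"])

    # Example: a notional truck with made-up limits.
    truck = {"max_weight_lbs": 10_000, "max_cube_ft3": 450}
    print(load_is_feasible(9_200, 400, truck))   # True: completed load
    print(load_is_feasible(11_500, 400, truck))  # False: replan or split the load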

3.5.2.5 Move and Deliver Load uses the completed load and/or manifested personnel
and results in personnel, supplies, equipment, and personal property delivered to the
point intended by the original requirements. Controls are the range, speed, and route
required to travel.


3.5.3 Receive Forces and Cargo is the final activity. It acts upon the forces and cargo
delivered and results in a combat capability or mission readiness. Supported by TCAIMS
II, DAMMS-R, and AMS/DSS, as well as AIT, this activity is controlled by port/terminal
capacity, availability of transportation assets, time, and the manpower of the individual
Services.

Maintenance Domain Functional Definitions

The scope of activities in the maintenance domain is defined in Figure D-3 by an IDEF0
context diagram entitled MAINTAIN EQUIPMENT. Inputs to the function "Maintain
Equipment" are the arrows entering the function from the left (i.e., equipment and
requests for maintenance). The function outputs exit to the right (i.e., maintained
equipment and closed work orders). The arrows at the top are constraints that control
how the function is executed (i.e., resources, the MC Master Plan, budget, and policy
directives). The arrows at the bottom are called mechanisms; they support the execution
of the function "Maintain Equipment" (i.e., people, tools, and facilities). The tools may
be hard (i.e., wrenches and repair manuals) or soft (i.e., AIS software, reference
databases, etc.).
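
For readers who prefer a data-structure view, the following minimal sketch (Python)
records the four IDEF0 arrow roles as fields and transcribes the A0 box from Figure D-3.
The class and field names are illustrative only and are not part of SRAC or any AIS.

    # Illustrative sketch of the IDEF0 convention described above: inputs enter
    # from the left, outputs exit right, controls from the top, mechanisms from
    # the bottom. Not part of any SRAC tool.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Idef0Activity:
        name: str
        inputs: List[str] = field(default_factory=list)      # left arrows
        outputs: List[str] = field(default_factory=list)     # right arrows
        controls: List[str] = field(default_factory=list)    # top arrows
        mechanisms: List[str] = field(default_factory=list)  # bottom arrows

    # The A0 context box, transcribed from Figure D-3:
    maintain_equipment = Idef0Activity(
        name="MAINTAIN EQUIPMENT",
        inputs=["Equipment", "Request for Maintenance"],
        outputs=["Maintained Equipment", "Closed Work Order"],
        controls=["Available Resources", "MC Master Plan", "Budget", "Directives"],
        mechanisms=["People", "Tools", "Facilities"],
    )
    print(maintain_equipment.controls)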

[Figure D-3 (IDEF0 context diagram). Inputs (left): Equipment; Request for
Maintenance. Controls (top): Available Resources; MC Master Plan; Budget;
Directives. Outputs (right): Maintained Equipment; Closed Work Order.
Mechanisms (bottom): People; Tools; Facilities. The single activity box is
MAINTAIN EQUIPMENT (A0).]

                  Figure D-3 - Context Diagram for Maintain Equipment

This function encompasses the activities that perform maintenance at all echelons within
the Marine Corps, including Plan for Maintenance, Perform Production Control, and
Execute Production. This activity provides supplies, equipment, fuels, and munitions for
Marine Corps deployment and support operations, and includes the evaluation and
reporting of the status of supplies and equipment against published standards.

3.4.1 Plan for Maintenance includes Identify Resources, Develop Maintenance Plan,
and Identify Maintenance Requirements.

3.4.1.1 Identify Resources requires the consideration of a number of factors: the T/O,
T/E, and facility characteristics; contingencies and the resultant availability of equipment
to support maintenance cycles; and available personnel and their training. This activity
uses Personnel Available (from Manpower activities), Supplies from 3.3.2.5 Issue from
Inventory, the Total Budget, and Equipment from 3.3.3.4 Issue Materiel to create an
Available Resource List and, if necessary, a Request to Change Resource Constraint.
Various systems may assist in identifying resources, including: ATLASS II, DIFMS,
ERP, FLIS/FEDLOG, HICS, HMMS, HSMS, ITEM APPS, JCALS-JTMS,
JEDMICS, JTAV, LAKES HELPER, LMIS, MARES/MCGERR, MCDSS, MEARS,
MIMMS, MP&E, COMPASS CONTRACT, NEIMS, NIMMS, PDMSS, PDREP, PLMS,
SABRS, SASSY, SLDCADA, TDMS, TIMA, ASSET TRACKER, MAP, PMS, MRMS,
MFMD, AIMS, RIDS, IBF, MCPDS, WOLPH, CALTECS, RIPR, LBIV, MCREM/R,
and JEMMS.

3.4.1.2 In Develop Maintenance Plan, the Available Resource List from 3.4.1.1
Identify Resources joins with the Real World Contingency, Equipment, Mission
Statement, Directives, Training Schedule, and Budget to control the activity. Input to this
activity includes the Equipment that the organization is responsible to maintain. The
result of applying the constraints against that Equipment is an Approved Maintenance
Plan that controls 3.4.1.3 Identify Maintenance Requirements. The activity is also
recursive in that the plan is constantly updated and revised. This activity also produces a
work schedule, one of the mechanisms that facilitates the accomplishment of the mission.
ARTEMIS, ATLASS II, DIFMS, JCALS-JTMS, JEDMICS, LMIS, MARES/MCGERR,
MCDSS, MEARS, MIMMS, MP&E, COMPASS CONTRACT, NEIMS, NIMMS,
PDMSS, SABRS, SASSY, TIMA, ASSET TRACKER, MAP, PMS, MFMD, AIMS,
SOWPEN, MRP, RIDS, IBF, WOLPH, CALTECS, RIPR, and MCREM/R may assist in
this activity.

3.4.1.3 Identify Maintenance Requirements uses the Approved Maintenance Plan
(3.4.1.2) to control its activity. The Equipment that needs maintenance consideration is
compared against the plan and, using the work schedule, results in Direction to Execute
Maintenance that is passed to 3.4.2 Perform Production Control. AISs that may support
this activity include: ARTEMIS, ATLASS II, CAV II, DIFMS, ERP, ITEM APPS,
JCALS-JTMS, JEDMICS, LAKES HELPER, LMIS, MARES/MCGERR, MCDSS,
MEARS, MIMMS, MP&E, COMPASS CONTRACT, NEIMS, NIMMS, PDMSS,
PDREP, SASSY, SLDCADA, TIMA, ASSET TRACKER, MAP, PMS, MRMS, MFMD,
AIMS, SOWPEN, MRP, WOLPH, CALTECS, RIPR, and MCREM/R.
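
As a minimal sketch of the comparison described above, the following Python fragment
filters equipment whose next scheduled service date under the plan has arrived. The
date-based due rule and all field names are assumptions for illustration only.

    # Illustrative sketch of 3.4.1.3: compare equipment against the Approved
    # Maintenance Plan to yield Direction to Execute Maintenance.
    from datetime import date

    def identify_maintenance_requirements(equipment, plan, today):
        """Return direction-to-execute items for equipment due under the plan."""
        direction = []
        for item in equipment:
            due = plan.get(item["id"])  # next scheduled service date, if planned
            if due is not None and due <= today:
                direction.append({"id": item["id"], "action": "execute maintenance"})
        return direction

    fleet = [{"id": "HMMWV-001"}, {"id": "MTVR-017"}]
    plan = {"HMMWV-001": date(2001, 2, 1), "MTVR-017": date(2001, 6, 1)}
    print(identify_maintenance_requirements(fleet, plan, today=date(2001, 3, 1)))
    # -> only HMMWV-001 is due, so it is passed to 3.4.2 Perform Production Control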

3.4.2 Perform Production Control encompasses Prioritize Maintenance Production,
Manage Resources, Monitor Production Throughput, and Direct Quality Control. This
activity determines the amount of planned (scheduled) maintenance and anticipates the
amount of unscheduled (corrective) maintenance required. It results in a Prioritized
Maintenance Schedule.


3.4.2.1 Prioritize Maintenance Production receives Unplanned Maintenance Requests
and the Approved Maintenance Plan from 3.4.1 Plan for Maintenance, and produces a
Prioritized Maintenance Schedule. AISs that may support this activity are: ARTEMIS,
ATLASS II, DIFMS, JCALS-JTMS, JEDMICS, MARES/MCGERR, MCDSS, MIMMS,
MP&E, COMPASS CONTRACT, NEIMS, NIMMS, PDMSS, TIMA, DERO, ASSET
TRACKER, MAP, PMS, AIMS, IBF, RIPR, and MCREM/R.
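
A simple way to picture the merge of planned and unplanned work is a priority queue.
The sketch below assumes a two-level priority scheme (corrective work ahead of
scheduled work); that scheme is an illustrative assumption, not SRAC or MIMMS
behavior.

    # Illustrative sketch of 3.4.2.1: merge planned tasks and Unplanned
    # Maintenance Requests into a single Prioritized Maintenance Schedule.
    import heapq

    def prioritize(planned, unplanned):
        """Lower number = higher priority; corrective (unplanned) work first."""
        heap = []
        for i, task in enumerate(unplanned):
            heapq.heappush(heap, (1, i, task))  # unplanned (corrective) maintenance
        for i, task in enumerate(planned):
            heapq.heappush(heap, (2, i, task))  # planned (scheduled) maintenance
        schedule = []
        while heap:
            schedule.append(heapq.heappop(heap)[2])
        return schedule

    print(prioritize(planned=["scheduled PMCS on MTVR-017"],
                     unplanned=["HMMWV-001 brake failure"]))
    # -> ['HMMWV-001 brake failure', 'scheduled PMCS on MTVR-017']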

3.4.2.2 Manage Resources aligns the Actual Resources to the Prioritized Maintenance
Schedule and produces the Management Direction necessary for Execute Production
(3.4.3). Additionally, this activity may submit a Request to Update Management Plan
back to 3.4.1.2, Develop Maintenance Plan. It also provides Resource Deficiency and
Available Resources, as well as Maintenance Production Reports, to 3.4.2.3, Monitor
Production Throughput. AISs that may support this activity are: ARTEMIS, ATLASS II,
CAV II, DIFMS, ERP, FLIS/FEDLOG, HICS, HMMS, HSMS, MCDSS, MIMMS,
MP&E, COMPASS CONTRACT, NEIMS, NIMMS, PDMSS, SABRS, SLDCADA,
TDMS, TIMA, ASSET TRACKER, MAP, PMS, AIMS, PC-MISCO, IBF, CALTECS,
MCREM/R, and JEMMS.

3.4.2.3 Monitor Production Throughput is the activity that provides supervisory
functions to smooth the flow of work, reduce repair cycle time, and obtain feedback on
problems encountered in the maintenance process. Using the output from 3.4.2.2,
Manage Resources, this activity produces Reports and Disposition Requirements. This
activity may use the following AISs: ARTEMIS, ATLASS II, CAV II, DIFMS, ERP,
LAKES HELPER, MARES/MCGERR, MCDSS, MIMMS, MP&E, COMPASS
CONTRACT, NEIMS, NIMMS, PDMSS, SLDCADA, TIMA, QIR, ASSET TRACKER,
MAP, PMS, MFMD, AIMS, PC-MISCO, CALTECS, and RIPR.

3.4.2.4 Direct Quality Control takes advantage of Beneficial Suggestions, NAVMC
10772 (recommended changes to technical manuals), Reports of Discrepancy (ROD),
Product Quality Deficiency Reports (PQDR), etc., to formulate or revise the Quality
Control Procedures, Standards, and Requirements used by 3.4.3.3, Conduct Quality
Control. ATLASS II, ITEM APPS, MARES/MCGERR, MIMMS, COMPASS
CONTRACT, NEIMS, PDMSS, PDREP, QIR, ASSET TRACKER, MAP, PMS, AIMS,
RIPR, and MCREM/R are AISs that may support this activity.

3.4.3 Execute Production is triggered by a Maintenance Request (or Direction to
Execute Maintenance). The major sub-activities are: Perform Preliminary Inspection,
Perform Maintenance Action, Conduct Quality Control, and Perform Final Inspection.

3.4.3.1 Perform Preliminary Inspection is the initial assessment of an item prior to
induction into the maintenance cycle. This activity results in either a Disapproved
Maintenance Request returned to the originator or an Approved Maintenance Task. If the
latter, the Inspected Equipment is passed to 3.4.3.2 Perform Maintenance Action. AISs
that may support this activity are: ATLASS II, CMIS, ITEM APPS, MIMMS, NEIMS,
QIR, ASSET TRACKER, MAP, PMS, and AIMS.

3.4.3.2 Perform Maintenance Action encompasses those activities that return or sustain
Materiel in an appropriate working condition. Acting upon the Inspected Equipment with
an Approved Maintenance Task and the required Supplies, the result is Maintained
Equipment that is passed to 3.4.3.3, Conduct Quality Control, or Unrepaired Equipment
that is passed to 3.2.4.5, Make Disposition, along with any Hazardous Waste generated
or accumulated in the process. AISs that may support this activity are: ARTEMIS,
ATLASS, ATLASS II, CAV II, CMIS, DIFMS, ERP, FLIS/FEDLOG, HICS, HMMS,
HSMS, ITEM APPS, JCALS-JTMS, JEDMICS, MIMMS, COMPASS
CONTRACT, NEIMS, NIMMS, PDMSS, SASSY, SLDCADA, TDMS, TIMA, QIR,
ASSET TRACKER, MAP, PMS, AIMS, WOLPH, CALTECS, LBIV and JEMMS.

3.4.3.2.1 Perform Repair is the activity that returns a broken or defective item of
equipment to an operational state/standard.

3.4.3.2.2 Perform Rebuild entails the tearing down and restoring of an item of equipment
to a specified configuration.

3.4.3.2.3 Perform Modification is directed by modification instructions and acts upon
Equipment as required.

3.4.3.2.4 Perform Calibration checks the range of performance for an item of equipment
against a specified standard. Using the list of items published by the Navy Metrology
Laboratory, this activity performs the appropriate actions on equipment to produce
properly Calibrated Equipment.

3.4.3.3 Conduct Quality Control ensures that all standards, directives, and procedures
are followed to return, restore, or maintain equipment in proper working condition. AISs
that may be used to support this activity are: ATLASS II, CAV II, CMIS, ITEM APPS,
JCALS-JTMS, JEDMICS, MIMMS, NEIMS, TIMA, QIR, ASSET TRACKER, MAP,
PMS, AIMS, and CALTECS.

3.4.3.4 Perform Final Inspection receives the Repaired, Rebuilt, Modified, or Calibrated
Equipment and provides Equipment Ready for Issue and the Closed Work Order. The
AISs that may support this activity include: ATLASS II, CAV II, DIFMS, ERP, ITEM
APPS, MIMMS, COMPASS CONTRACT, NEIMS, PDMSS, SLDCADA, TIMA, QIR,
ASSET TRACKER, MAP, PMS, AIMS, and CALTECS.
