ASW Symposium Two: Metrics Overview (03 May 07)

ASW METOC Metrics: Metrics Overview, Goals, and Tasks

                          Tom Murphree
                 Naval Postgraduate School (NPS)
                       murphree@nps.edu

                            Bruce Ford
                      Clear Science, Inc. (CSI)
                    bruce@clearscienceinc.com

       Paul Vodola, Matt McNamara, and Luke Piepkorn
            Systems Planning and Analysis (SPA)
                     pvodola@spa.com

                        CAPT(s) Mike Angove
                            OPNAV N84
                      michael.angove@navy.mil


        Brief for ASW METOC Metrics Symposium Two
                       02-04 May, 2007
Ford, B. and T. Murphree, Metrics Overview, May 07, bruce@clearscienceinc.com, murphree@nps.edu
ASW METOC Metrics: Key Concepts
Definition

  Measures of the performance and operational impacts of the
  CNMOC products provided to ASW decision makers.

Uses

1. Improve product generation and delivery processes, product
   quality, assessments of uncertainty and confidence in products,
   product usage, and outcomes of ASW operations.

2. Allow CNMOC to more effectively participate in ASW fleet synthetic
   training, reconstruction and analysis, campaign analysis, and other
   modeling, simulation, and assessment programs.

3. Evaluate new products, including new performance layer and
   decision layer products.



ASW METOC Metrics: Key Concepts

Metrics System

1. Capable of collecting and analyzing data on actual products,
   verifying observations, decisions made by users of the products,
   and outcomes of user operations.

2. Minimal manpower impacts through extensive use of automation.

3. Metrics delivered in near real time and in formats that allow CNMOC
   managers to effectively use the metrics in their decision making.

4. Includes operations analysis modeling capability to simulate
   operational impacts of products. Model metrics complement those
   derived from data. Modeling simulates events difficult to represent
   with actual data (e.g., rare or difficult to observe events), and can
   allow experimentation with different scenarios (e.g., different levels
   of product accuracy, different CONOPS for METOC support).
METOC Metrics and Battlespace on Demand Tiers

Tier 3 – Decision Layer
   [Figure: search-area map (SSN area, MPA stations #1 and #2, Provinces "A"
   and "B") with updated CONOPs, alongside a plot of the cumulative
   probability of detecting both threat subs versus hour (0-40). Curves
   shown: updated CONOPs / updated environment; original CONOPs / updated
   environment; original CONOPs / historical environment.]

Tier 2 – Performance Layer

Tier 1 – Environment Layer
   [Inputs: initial and boundary conditions, satellites, fleet data.]

       METOC metrics address all three tiers --- in particular, the
   performance and operational impacts of each of the three layers.
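A curve like the cumulative-detection plot above can be built from per-hour detection probabilities. The sketch below is illustrative only: the hourly probability values are invented placeholders, not data from the brief.

```python
# Hypothetical sketch: cumulative probability of at least one detection by
# hour t, given an assumed constant per-hour detection probability.

def cumulative_detection(hourly_p):
    """Return P(at least one detection by hour t) for each hour t."""
    curve = []
    p_no_detect = 1.0
    for p in hourly_p:
        p_no_detect *= (1.0 - p)        # probability of still no detection
        curve.append(1.0 - p_no_detect)
    return curve

# Invented hourly probabilities for two of the plotted cases.
updated = cumulative_detection([0.10] * 40)   # updated CONOPs / environment
original = cumulative_detection([0.03] * 40)  # original CONOPs / historical

print(f"hour 40, updated:  {updated[-1]:.1%}")
print(f"hour 40, original: {original[-1]:.1%}")
```

The compounding of small hourly gains is what separates the curves over a 40-hour search.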
Key Questions

1.   In what ways does the METOC community impact the ASW
     fight?

2.   How can we measure those impacts?

3.   How can we use those measures to improve our impacts?
Other Questions Metrics Can Help Answer

1. Where are the gaps in METOC support?
2. Are we good enough?
3. Is a product really worth generating?
4. Is there a more efficient way to produce our products?
5. What difference do we make to our customers?
6. How could we improve our impacts on customers?
7. How much confidence should we and our customers have in
   our products?
High Priority Metrics for ASW Directorate

1. Operational impact metrics: main metric

2. Product performance metrics: stepping stone to main metric
Process for Developing METOC Metrics


    METOC                 METOC                  Operational            Operational
   Forecasts *          Observations               Plans                Outcomes




                METOC                                      Operational
              Performance                                  Performance
                Metrics                                      Metrics




                                Metrics of METOC
                              Impacts on Operational
                                  Performance

                       Apply this process to both real world data and
                           output from military mission models.

 * or other products
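The process above can be sketched in miniature: pair each forecast with its verifying observation to get a METOC performance metric, then relate product quality to operational outcomes to estimate impact. The records, field layout, and the 100-ft SLD error tolerance below are illustrative assumptions, not data or thresholds from the brief.

```python
# Invented example records:
# (forecast SLD ft, observed SLD ft, operational outcome: detection?)
records = [
    (150, 160, True),
    (200, 320, False),
    (120, 130, True),
    (300, 180, False),
]

# METOC performance metric: mean absolute error of the forecasts.
mae = sum(abs(f - o) for f, o, _ in records) / len(records)

# Operational impact metric: detection rate when the forecast verified
# within tolerance versus when it did not (tolerance is hypothetical).
TOLERANCE_FT = 100
good = [hit for f, o, hit in records if abs(f - o) <= TOLERANCE_FT]
bad = [hit for f, o, hit in records if abs(f - o) > TOLERANCE_FT]
rate_good = sum(good) / len(good)
rate_bad = sum(bad) / len(bad)

print(f"forecast MAE: {mae:.0f} ft")
print(f"detection rate, verified forecasts:   {rate_good:.0%}")
print(f"detection rate, unverified forecasts: {rate_bad:.0%}")
```

The same pairing logic applies whether the records come from real-world data collection or from output of military mission models.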
Operational Modeling – Overall Intent
•   Operational model of ASW scenarios: “laboratory” for experiments
     • Similar to the strike / NSW application of WIAT
     • Investigate the effect of METOC products and services across a
       wide range of ASW scenarios
         • Effect of increased accuracy
         • Effect of enhanced timeliness
     • Develop METOC support benchmarks to evaluate real world
       performance and establish goals
   [Matrix: METOC products A-D crossed with scenarios (pre-hostilities
   area clearance, MPRA area search, SURTASS cuing of MPA pouncers, SLOC
   transit protection). Multiple combinations of METOC products and
   services support ASW.]
Operational Modeling – Development

•     Early stages in model development
    •    Identify the scope of the process to be modeled
    •    Identify the METOC data flow / mission planning for the
         desired scope
    •    Identify the end-user of the model and desired outputs

•     Later stages in model development
    •    Develop a simulation of the METOC data flow and mission
         planning
    •    Incorporate real-world metrics as available to improve
         model fidelity and accuracy
    •    Develop output to describe the impact of improved
         METOC accuracy and/or timeliness
METOC Support Chain for ASW

   [Diagram: METOC data, NAVO / RBC support, and the ASW RBC feed the
   NOAT / NOAD supporting operations, theater planning, and the ASW
   mission planning process. These drive operations and ASW operations,
   which produce ASW results and, ultimately, operational impacts.]

   As more components of the support chain are assessed/modeled, the
   scope of the project becomes greater.
METOC Support Chain for ASW – Small Scale Model

   [Same support chain diagram as above.]

   Smallest scope that could be modeled and produce useful results.
METOC Support Chain for ASW – Medium Scale Model

   [Same support chain diagram as above.]

   Model starting from the mission planning process through operational
   impacts (e.g., HVU losses; expected threats killed).
   Similar scope and LOE as WIAT for STW.

    NAVO / RBC                    Theater         Operations
     Support                      Planning

                                NOAT / NOAD
        METOC                                       ASW
                                 Supporting
         Data                                     Operations
                                 Operations



                                                    ASW
                                                   Results
                                ASW Mission
      ASW RBC                     Planning
                                  Process

                                                  Operational
                                                   Impacts
   Extension of the mission planning process
 backwards, to capture the RBC / NOAT influence
             on operational impacts
Operational Modeling – Notional Modeling Output

For each combination of METOC support product and scenario, display the
payoffs from increasing accuracy and/or timeliness to:
    • Determine the level of METOC support that meets ASW requirements
    • Enable decision-makers to balance cost vs. benefit of product
      improvement

   [Left: matrix of METOC products A-D crossed with the four scenarios.
   Right: resulting METOC metrics plotted as accuracy vs. timeliness,
   with three regions: no ASW payoff from additional accuracy /
   timeliness; ASW performance is improved; UNSAT - no impact above
   historical.]
Operational Modeling – Notional Metrics Assessment

Ex: SLD Prediction Accuracy

   [Scatter plot of actual vs. predicted SLD, with a perfect-prediction
   line. Actual data come from metrics collection processes instituted
   for VS07; predicted data come from METOC products distributed by RBC.]
Operational Modeling – Notional Metrics Assessment

Ex: SLD Prediction Accuracy

   [Same scatter plot, with acceptable tolerance lines (no significant
   impact to operations) around the perfect-prediction line. Between the
   tolerance lines and the outer thresholds: some impact to operations,
   which becomes a risk management decision. Beyond them: region of
   unacceptable performance. Performance thresholds are generated by
   modeling and simulation.]
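The tolerance-band idea above amounts to classifying each (predicted, actual) pair by its distance from the perfect-prediction line. The band widths below are hypothetical; in the brief they would come from modeling and simulation.

```python
# Hypothetical tolerance bands for SLD prediction error (ft).
ACCEPTABLE_FT = 50    # no significant impact to operations
RISK_FT = 150         # some impact; a risk management decision

def classify(predicted_ft, actual_ft):
    """Place one prediction/verification pair into a tolerance band."""
    error = abs(actual_ft - predicted_ft)
    if error <= ACCEPTABLE_FT:
        return "acceptable"
    if error <= RISK_FT:
        return "risk management"
    return "unacceptable"

print(classify(200, 230))   # acceptable
print(classify(200, 320))   # risk management
print(classify(200, 400))   # unacceptable
```

Counting the fraction of pairs in each band over an exercise gives a compact product performance metric.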
Capstone Metrics

Errors in environmental depiction lead to errors in tactical decisions.

   [Figure: environmental depiction with GDEM data only vs. with "glider"
   data included. Without in-situ sensing there is a 10-20X higher
   likelihood of missing the surface layer.]

Question: What level of environmental sensing is needed to "sufficiently"
enable decisions?

                                     Slide provided by CDR Mike Angove, N84
Capstone Metrics

ROI for Environmental Knowledge Study: Notional Output Curve

Deliverable: Analysis will fit POM-08 LBSF&I/PR-09 altimeter investment to
this curve.

   [Curve: decision certainty* (50%-100%, with 100% as the reference
   case) vs. oceanographic sensing resolution, showing super-invested,
   properly invested, and under-invested regions, and a notional target
   capability range between roughly 65% and 80% certainty.]

 * e.g., CZ or Sfc Layer presence        Slide provided by CDR Mike Angove, N84
Metrics Steps
1. Determine what we want to know and be able to do once we have a
     fully functioning metrics system.
2. Determine what metrics we need in order to know and do these
    things.
3. Determine what calculations need to be done in order to come up
    with the desired metrics.
4. Determine what data needs to be collected in order to do the
    desired calculations (i.e., data analyses).
5. Determine the process to use to collect and analyze the
    needed data.
6. Implement the data collection and analysis process.
         a. If data can be collected, go to step 7.
         b. If data can't be collected, repeat steps 1-5 until you can.
7. Use the metrics obtained from steps 1-6.
8. Assess the results of steps 1-7.
9. Make adjustments to steps 1-8.
10. Repeat steps 1-9 until satisfied with the process and the outcomes
   from the process.
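The ten steps form a loop with two back-edges: step 6b returns to step 1 when data can't be collected, and steps 8-10 repeat the whole process until it is satisfactory. A control-flow sketch, with every function an illustrative stub standing in for real work:

```python
def metrics_process(steps_1_to_5, collect, use_metrics, assess, max_rounds=10):
    """Run the iterative metrics process until the assessment is satisfied."""
    for _ in range(max_rounds):
        plan = steps_1_to_5()          # steps 1-5: goals -> metrics -> plan
        data = collect(plan)           # step 6
        if data is None:
            continue                   # step 6b: repeat steps 1-5
        results = use_metrics(data)    # step 7
        if assess(results):            # steps 8-9
            return results             # step 10: satisfied, stop
    return None

# Toy run: collection fails the first round, succeeds the second.
attempts = iter([None, ["metric data"]])
result = metrics_process(
    steps_1_to_5=lambda: "plan",
    collect=lambda plan: next(attempts),
    use_metrics=lambda data: data,
    assess=lambda results: True,
)
print(result)   # ['metric data']
```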
Metrics Steps and Symposium Two Tasks

   [Steps 1-10 repeated from the previous slide, with the note: Steps
   completed in approximate form in Symposium One and in committee
   reports.]
Metrics Steps and Symposium Two Tasks

   [Steps 1-10 repeated, with the note: Task 1 for Symposium Two:
   Review and revise results for these steps.]
Metrics Steps and Symposium Two Tasks

   [Steps 1-10 repeated, with the note: Task 2 for Symposium Two:
   Outline plan for steps 1-6 and the process for VS07.]
Metrics Steps and Symposium Two Tasks

   [Steps 1-10 repeated, with the note: Task 3 for Symposium Two:
   Outline plan for steps 1-10 for the next several years.]
Conceptual Helpers
•   Fence and Gates Analogy

•   Hierarchy of Metrics

•   Bricks and House Analogy
Fence and Gates Analogy: Overall Concept

   [Diagram: immediate goals at the center (the fenced area), with gates
   leading out to future capabilities, prioritized 1 through 3.]
Fence and Gates Analogy: An Example

   Immediate goals:
      1. VS07 capability and methods experiment
      2. RBC data collection system
      3. NOAT data collection system

   Gates to future capabilities:
      • Real-time metrics display (pri 1)
      • MPRA data collection system (pri 2)
      • Exercise level data collection (pri 3)
Hierarchy of Metrics
Metrics are most useful when they provide information to multiple
levels of the organization

•   Individual forecaster
•   SGOT/OA Chief/Officer
•   METOC activity CO/XO
•   Directorate
•   CNMOC

Fact-based metrics are best developed from data collected at the lowest
levels of the organization

•   Critical to collect data on the smallest “unit” of support (e.g.,
    forecast, recommendation)
•   Higher level metrics (directorate, CNMOC) rely on lower level
    data collection/metrics
•   Operational modeling is enhanced by quality real world
    information (e.g., significant numbers of mission data records)
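The bottom-up idea above can be sketched: metrics at each organizational level are aggregates of the level below, so higher-level (e.g., Navy-wide) numbers are only as good as the unit-level records feeding them. Team names and error values below are invented for illustration.

```python
# Smallest "unit" of support: individual forecasts, here reduced to their
# verified SLD absolute errors (ft), grouped by hypothetical NOAT.
forecast_errors = {
    "NOAT Alpha": [10, 25, 40],
    "NOAT Bravo": [60, 15],
    "NOAT Charlie": [30, 30, 90],
}

# NOAT-level metric: mean absolute error per team.
noat_mae = {team: sum(e) / len(e) for team, e in forecast_errors.items()}

# Higher-level metric: aggregated from all unit records (not from the team
# means, so teams with more forecasts carry proportionally more weight).
all_errors = [e for errs in forecast_errors.values() for e in errs]
navy_mae = sum(all_errors) / len(all_errors)

print(noat_mae)
print(f"Navy-wide SLD MAE: {navy_mae:.1f} ft")
```

Because the aggregate is computed from the raw records, the same data collection supports metrics at every level of the hierarchy.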
Hierarchy of Metrics

   [Diagram: the metric symposium focus space plotted on two axes.
   Vertical: lower level (a NOAT's SLD accuracy) to higher level
   (Navy-wide SLD accuracy), i.e., smaller spatial and/or temporal scale
   (point forecast location) to larger scale (exercise forecast
   location). Horizontal: performance (temperature and salinity accuracy)
   to impacts (number of positively identified submarines).]
Hierarchy of Metrics

   [Same diagram, with metric levels stacked from lower to higher:
   individual forecast, NOAT, exercise, NOAC, directorate, CNMOC/Fleet.]

   Bottom-up approach to developing higher level, larger scale metrics.
   Bottom metrics support development of fact-based top metrics.
Hierarchy of Metrics

   [Same diagram.]

   Lower-level metrics can be input into operational models (e.g., a
   mission model), which can provide higher-level metrics.
Hierarchy of Metrics

   [Same diagram.]

   Top-down approach: higher level, larger scale metrics can also provide
   useful feedback for improving lower level, smaller scale metrics.
Bricks and House Analogy

   [Diagram: a house built of bricks. The house is the impact on METOC
   customers (higher-level metrics); each brick is a support unit record
   containing forecast data, verification data, customer plans, customer
   outcomes, recommendations, and other data.]

   • Each brick represents a different warfare support area or subset of
     an area (e.g., MPRA, NOAT, RBC)
   • It takes many records to make good high-level metrics
   • Each record must be well constructed to make quality high-level
     metrics
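One way to structure the "brick" described above is a record type holding the smallest unit of support. The class and field names below are illustrative, not a schema from the brief.

```python
from dataclasses import dataclass, field

@dataclass
class SupportUnitRecord:
    """Hypothetical smallest 'unit' of support: one forecast and its context."""
    warfare_area: str                  # e.g., MPRA, NOAT, RBC
    forecast_data: dict                # the product as delivered
    verification_data: dict            # verifying observations
    customer_plans: str                # what the customer intended to do
    customer_outcomes: str             # what actually happened
    recommendations: list = field(default_factory=list)

record = SupportUnitRecord(
    warfare_area="MPRA",
    forecast_data={"sld_ft": 200},
    verification_data={"sld_ft": 230},
    customer_plans="area search, station #1",
    customer_outcomes="one contact gained",
    recommendations=["shift search depth below layer"],
)
print(record.warfare_area)
```

Well-constructed records of this kind are what the higher-level aggregation "mortar" is built from.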
Where Does the Effort Belong: Early Stages

   [Bar chart: in the early stages, the level of effort is concentrated
   in data collection capacity (capability, techniques, data entry,
   archival systems), with less effort in data analysis capacity
   (capability, algorithms, display, archival, modeling).]
Where Does the Effort Belong: Later Stages

   [Bar chart: in the later stages, the level of effort shifts to data
   analysis capacity (capability, algorithms, display, archival,
   modeling), with less effort in data collection capacity (capability,
   techniques, data entry, archival systems).]
								