Results of Usability Testing the U.S. Navy's
Performance Management System

Michael J. Schwerin, Ph.D.
Elizabeth Dean, M.A.
Kimberly M. Robbins, M.A.
RTI International
and
CDR Mark J. Bourne, MSC, USN
Navy Personnel Command

3040 Cornwallis Road  ■  P.O. Box 12194  ■  Research Triangle Park, North Carolina, USA 27709
Phone (919) 316-3878  ■  e-mail schwerin@rti.org
                            Overview

 Introduction
 Brief Literature Review
 The New Performance Management System
 Why Conduct Usability Testing?
 Usability Testing Principles & Objectives
 Results
 Summary of Findings
 Examples of Usability Errors
 Limitations
 Future Directions
                        Introduction

 In 2001, the Chief of Naval Operations chartered the Executive Review of
  Navy Training, leading to the development of the Task Force for
  Excellence through Commitment to Education and Learning (EXCEL)
 Identified the need for alignment among Fleet requirements,
  training, assignment, and performance management
 Recommended the development of a new performance management
  system
     Behaviorally based vs. trait-based
     Web-based system
     Deployed over the Navy/Marine Corps Intranet (NMCI) or other
      web-based networks
 Ultimate outcome – improve personnel system alignment and
  positively affect job satisfaction, organizational commitment, and
  Sailor retention
  Performance Appraisal Affects Workplace
       Satisfaction & Retention Intent

 Job satisfaction & organizational commitment
     Levy and Williams (1998) found that performance appraisal
      satisfaction and perceived system knowledge have strong, significant
      relationships with job satisfaction and organizational commitment.
    Olmsted and Farmer (2002) replicated the Staples and Higgins
     (1998) model of job satisfaction using a variety of workplace
     satisfaction measures, including items that reflect satisfaction with
     the Navy performance appraisal system and satisfaction with
     advancement and promotion.
 Retention intent
     Dailey and Kirk (1992) found a strong relationship between
      perceived workplace fairness and voluntary turnover intent.
     Jones (1998) found that the perceived fairness of procedures for pay
      determination, performance appraisals, and appeals was related
      to voluntary turnover.
    Navy-wide Personnel Survey (2003):
Satisfaction with the EVAL/FITREP System

Percent satisfied with aspects of the current EVAL/FITREP system:

                                                              Enlisted   Officer
 I understand the current EVAL/FITREP system                    78%       89%
 My last EVAL/FITREP was fair/accurate                          64%       80%
 My last EVAL/FITREP was conducted in a timely manner           69%       85%
 I was able to submit my own input at my last EVAL/FITREP       73%       93%
 My last advancement/promotion recommendation was fair/accurate 67%       82%
 I am satisfied with the current EVAL/FITREP system             42%       57%

"The most qualified and deserving Sailors score the highest on their
EVALs/FITREPs" (percent by paygrade):

                 Disagree   Neither   Agree
 E1-E3             46%        21%      33%
 E4-E6             58%        16%      26%
 E7-E9             47%        17%      35%
 WO                42%        17%      41%
 O1-O3             37%        22%      40%
 O4 & Above        31%        18%      51%
    The New Performance Management
                System

 Provides mechanisms for both
  supervisory and non-supervisory
  Navy personnel to provide
  performance input
   Human Performance Feedback
    and Development (HPFD): Used
    by all Sailors for self-evaluations
    and by supervisors to document
    and provide performance
    coaching
   ePerformance: Used by
    supervisors to conduct annual
    performance appraisals
      Why Conduct Usability Testing?

 DoD Instruction 5000.2 requires human systems integration in
  developing systems for military personnel
 What is usability testing? Testing that assesses the behavior of users
  in their own environment, often through the collection of video,
  audio, and behavioral data
 Factors in the “user” experience that can be addressed by testing:
    Time it takes to complete tasks
    Amount of self-editing required (i.e., erasures, re-entry of data,
     changing answers, etc.)
    Navigational problems
    Emotional responses
    Identifying “other” sources of burden to respondents
      Usability Testing: Key Principles

 Set of practices designed to assess and improve the “usability” of a
  product (Dumas and Redish, 1993). Key principles of usability
  testing include:
    Participants must represent real users doing real tasks
    Researchers observe and record what participants do and say
    Researchers analyze data, diagnose problems, and recommend
     solutions to problems
 Nielsen (1993) recommends an iterative (repetitive) approach to
  usability testing
 Bevan and Macleod (1994) indicate that for usability testing to be
  completely successful it must be “context aware”
    Usability Testing: Study Objectives


 Assess usability of HPFD
  for non-supervisory
  personnel
 Assess usability of HPFD
  and ePerformance for
  supervisory personnel
 Evaluate the relative impact
  of using systems in sea and
  shore settings
           Usability Testing: Approach

 Iterative testing design
     Continuous improvement approach (i.e., test, revise, and test
      again)
 Testing conducted in multiple "real-life" environments in the
  U.S. Navy
     Phase 1: Naval Air Station (NAS) Brunswick, June 21-25, 2004
     Phase 2: USS KITTY HAWK (CV 63), July 12-16, 2004
     Phase 3: Naval Base Kitsap – Bangor, August 9-13, 2004
            Testing Design: Procedures

 Pre-Test Survey - Collect information regarding prior experience with
  the Navy's performance appraisal system, familiarity with computers,
  and background characteristics
 Computer-based training (CBT) module and/or Quick Reference Guide
  (QRG)
 Usability scenarios and testing to simulate “real-life” situations to
  identify problems such as:
    Unclear navigational instructions
    Confusing help text
     Problems accessing/responding via NSIPS (Navy Standard
      Integrated Personnel System)
 Post-Test Survey - Collect feedback on ease of use,
  personal/professional value, and subjective satisfaction
  with the HPFD and ePerformance systems
                  Results: Participants

 Participants by site:
     Naval Air Station (NAS) Brunswick: 14 supervisors, 7 non-supervisors
     USS KITTY HAWK (CV 63): 14 supervisors, 6 non-supervisors
     Naval Base Kitsap – Bangor: 10 supervisors, 9 non-supervisors
 Group: Officers 23%, Enlisted 77%
 Gender: Male 89%, Female 11%
 Supervision status:
     Supervisors (63%): 25-44 years of age; average of >10 years in
      the Navy
     Non-supervisors (37%): 18-34 years of age; average of <5 years
      in the Navy
                  Results: Percentage of Common
                 Usability Errors by Usability Task*

Tasks: 1 = Complete CBT; 2 = Log into NSIPS; 3 = Open HPFD doc;
4 = Complete HPFD doc; 5 = Spell check; 6 = Find Target Behaviors;
7 = Edit ratings; 8 = Collapse sections; 9 = Submit HPFD;
10 = Enter a performance note

Common Error                                T1    T2    T3    T4    T5    T6    T7    T8    T9   T10
User does not follow screen instructions   82%
General button error                       16%
User not able to set password                     7%
Navigational error                                      8%                                        5%
User refers to the QRG for help                         6%    5%                                  2%
User asks for help                                3%    3%    4%
User is timed out                                                   4%    3%          4%    2%
System or server error                            3%    2%

*Percentage of common usability errors is the frequency of the error for
each task divided by the total number of errors (541) across tasks;
percentages do not total to 100%.
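The footnote's normalization can be sketched in a few lines of Python. The example count below (444) is a hypothetical back-calculation from the 82% cell, not a raw figure reported in the study:

```python
# Sketch of the table's normalization: each cell is the count of one
# error type on one task, divided by the 541 total errors observed
# across all tasks (figure from the slide's footnote).

TOTAL_ERRORS = 541  # total usability errors across all tasks

def error_pct(count: int, total: int = TOTAL_ERRORS) -> float:
    """Percentage of all observed errors contributed by one (error, task) cell."""
    return round(100.0 * count / total, 1)

# e.g., the 82% cell for "does not follow screen instructions" on Task 1
# implies roughly 0.82 * 541 ≈ 444 occurrences of that error
print(error_pct(444))  # prints 82.1
```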
              Results: Percentage of Common Usability
              Errors When Completing ePerformance*

Tasks: 11 = Log into NSIPS; 12 = Open perf doc; 13 = Complete perf doc;
14 = Check ratings; 15 = Spell check; 16 = Check language;
17 = Calculate ratings; 18 = Submit perf doc

Common Error              T11   T12   T13   T14   T15   T16   T17   T18
Navigational error              38%    8%          2%
User is timed out                      4%    2%    4%   10%    6%    6%
System or server error                       2%

*Percentage of common usability errors is the frequency of the error for
each task divided by the total number of errors (541) across tasks;
percentages do not total to 100%.
              Results:
Examples of Common Usability Errors

                            Task:
                            Use the spell check
                            feature.

                            Usability Error:
                            User has trouble
                            finding the button
                            but eventually is
                            able to use it.
Results: Examples of Common Usability
                Errors

                             Task:
                             Log in to NSIPS
                             (requires reset of
                             password).

                             Usability Error:
                             User has problems
                             entering initial
                             password and then
                             resetting the
                             password.
Results: Examples of Common Usability
                Errors

                             Tasks:
                             Complete various
                             functions within the
                             HPFD document.

                             Usability Error:
                             Users are timed out
                             of the NSIPS
                             system.
                 Results: Post-Test Survey Results

Satisfaction (percent satisfied) with various aspects of the
performance management system:

                                              Supervisors   Non-supervisors
 Comfortable with the testing tasks               66%             18%
 Certain that tasks were completed correctly      74%             32%
 Easy to use                                      63%             50%
 Easy to understand                               63%             55%
 Professional in appearance                       91%             73%
 System was efficient                             49%             64%
 System got easier to use across the
  testing tasks                                   83%             73%

How effective do you think the system will be as a career development
and planning tool?

                   Ineffective   Neither   Effective
 Non-supervisors       19%         14%        67%
 Supervisors           12%         15%        73%
                Summary of Findings

 Iterative study approach identified system and usability concerns
  whose resolution should improve the existing system by eliminating or
  mitigating sources of confusion for system users
    Not counting extreme system failures, usability problems
     accounted for an average of 30 additional minutes of Sailor time
     using the HPFD and ePerformance systems
    Recommendations included improvements to navigational aids,
     instructions, confirmation of actions, status messages, and the
     creation of the QRG help guide
    Negative opinions appear to be the result of NSIPS connectivity
     problems and learning to navigate/use the new performance
     management system
         Summary of Findings (cont.)

 Participants were concerned
  about the possible erosion
  of personal relationships in
  professional growth and
  mentoring that may be
  introduced by moving to an
  electronic performance
  management system
 Users raised a number of
  process concerns unique to
  the culture of the Navy
                        Limitations

 Testing of the vertical document flow process between Sailors
  and first- and second-level supervisors is needed to obtain a better
  understanding of the real performance appraisal environment.
 Minimal changes between iterations precluded analytic benefits
  of full iterative approach.
    Smaller changes were made than originally planned due to
     system limitations
    Additional improvement may have been possible if more
     change to the system had been allowed between each
     iteration
                   Limitations (cont.)

 The objective of usability testing is not to produce results that
  generalize to the Navy population
     However, if participants represent the majority of users, results
      can provide insight into how others might respond
    Sample design does not represent the diversity of the general
     Navy population
                    Future Directions

 Conduct a follow-up usability study to evaluate . . .
    The impact of changes to the performance management system
    Vertical document flow process
 Examine process and cultural issues with key Navy personnel and
  Fleet stakeholders
    Explore Navy personnel concerns about the possible loss of
     person-to-person interaction in moving to an electronic
     performance management system
    Develop a communications plan that helps to highlight the
     rationale and benefits of moving to the new electronic format
 Conduct a full pilot study to field test the performance management
  system with a representative command – a “real-life” application of
  the system as it will be used in practice
            Future Directions (cont.)

 Conduct a validation study to compare the new performance
  management system with the EVAL/FITREP system currently being
  used by the Navy. Examine…
   Validity and reliability of performance appraisal scores
   Perceived fairness and equity in performance appraisal process
   Impact on new performance management and appraisal process
    on…
      Measures of quality of work life – job satisfaction and
       organizational commitment
      Sailor retention intent and retention behavior
Questions?
                          Return on Investment

 Total Active Duty & Reserve Sailors = 457,126
 Average Sailor hourly rate = $18.11
 Total number of HPFD & ePerformance opportunities = 1,234,240
     Supervisors = 6 per person (1 self-appraisal; 5 direct reports);
      1/3 of the Navy has supervisory responsibility
     Non-supervisors = 1 per person (self-appraisal); 2/3 of the Navy
      is non-supervisory
 Average amount of time added per task due to usability problems = 30 min
 Error rate = 25.22%
 Total time due to usability errors across supervisors &
  non-supervisors = 152,168 hrs
 Usability testing and system modification costs = $154,629

               cost in labor hours due to usability errors
    ROI  =  ------------------------------------------------
             usability testing & system modification cost

         =  $2,755,759 / $154,629

    ROI  =  1,782%
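The ROI figure can be reproduced from the slide's own numbers; a minimal sketch (the small difference from the slide's $2,755,759 is rounding):

```python
# Sketch of the ROI arithmetic, using only figures from the slide:
# 152,168 hours lost to usability errors, an $18.11 average Sailor
# hourly rate, and $154,629 in testing and modification costs.

HOURLY_RATE = 18.11      # average Sailor hourly rate ($)
ERROR_HOURS = 152_168    # total hours attributed to usability errors
TESTING_COST = 154_629   # usability testing & system modification cost ($)

labor_cost = ERROR_HOURS * HOURLY_RATE   # dollar value of lost labor hours
roi = labor_cost / TESTING_COST          # return per dollar of testing cost

print(f"labor cost ≈ ${labor_cost:,.0f}")  # ≈ $2,755,762
print(f"ROI ≈ {roi:.0%}")                  # ≈ 1782%
```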

				