Developing Consistent Administrative Performance Measures for State Government

                    February 18, 2004
            Department of Administrative Services
                   Oregon Progress Board
                  www.econ.state.or.us/opb


1
                                           Overview

    • Project Description – Cindy
    • Performance Measurement 101 – Jeff
      – Current concepts and definitions
      – How is this different?
    • Establishing the Baseline – Small Groups - Rita
    • Where We Go From Here – Cindy and Jeff

2
    PROJECT DESCRIPTION




3
       This initiative will focus on a specific set of administrative processes.

    First priority
       –   Human Resources
       –   Procurement
       –   Information Technology
       –   Financial
       –   Customer Satisfaction*

    Second priority
       –   Facilities
       –   Budget
       –   Asset Management
       –   Internal Audit
       –   Communications

    * Purists note: Not an “administrative process”.

4
                                       Project timeline

    February   – Form work groups
    March      – Research and select models
    June       – Goals, strategies, P.M. categories
    September  – Performance measures and early data
    October    – Report to Admin Directors
    October    – Admin Directors approve
    November   – DAS requirements (including targets)

5
    PERFORMANCE MEASURES 101




6
         Performance measurement is part of a larger planning process.

     Contextual analysis    – “Where are we?”
     Mission and goals      – “Where do we want to go?”
     Strategies             – “How do we get there?”
     Performance measures   – “How do we know we arrived?”

7
                      Why measure performance?

    It’s one way to know how we’re doing.
    • Are we carrying out our mission?
      Is the ship on course?
    • Are we doing it as effectively as possible?
      Is the ship running well?




8
       Didn’t state agencies just develop a set of
              measures for the Progress Board?

• “Key” performance measures
    – Focus primarily on whether the ship is on course
      (alignment to mission and effectiveness)
    – Measure program-related aspects of agency
    – Measure linkages to Benchmarks and other high level
      policy-related results
• Administrative performance measures
    – Mostly focused on internal support
    – Flow from the administrative planning process
    – Focus primarily on how well the ship/fleet is running
      (efficiency, equity, integrity)
9
         “Agency” vs. “Enterprise” Planning

     • An agency develops goals, strategies and performance measures
       to achieve its individual mandate and mission.

     • The enterprise (state government) collaborates on enterprise-wide
       goals, strategies and measures, where appropriate, to support
       individual agencies in achieving their missions.

10
     What makes a good performance measure?




11
     Good measures must gauge progress
               toward achieving goals.




12
     Often a mix of measures is needed to tell
                             the whole story.
     • OUTCOME = Result
       – High-level (societal) = Recidivism (OBM # 65)
       – Intermediate = % of treated youth exhibiting
         reduced risk factors.
     • OUTPUT = Product or service (“widget”)
       – # of youth completing treatment.
     • EFFICIENCY = Input per output/outcome
       – Average number of staff hours per completed
         training.
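
     As a quick illustration of the efficiency measure above, here is a minimal
     Python sketch with made-up figures (the numbers are hypothetical, not from
     the presentation):

        # Hypothetical staff-hour logs, one entry per completed training.
        staff_hours = [12.0, 9.5, 14.0, 11.5]
        completed_trainings = len(staff_hours)

        # EFFICIENCY = input per output: average staff hours per completed training.
        avg_hours = sum(staff_hours) / completed_trainings
        print(f"Average staff hours per completed training: {avg_hours:.1f}")  # 11.8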
13
       Different kinds of measures play different roles.

     • High-Level Outcomes – Is the world you are
       attempting to affect changing?


     • Intermediate Outcomes – Are strategies
      having the desired result?


     • Outputs – Is the work getting done?
14
  Agencies must decide how “high up” to go for their key measures.

  Goal: Prevent Juvenile Crime

     Output:                % of crime prevention grantees trained INCREASES
        “So That”
     Intermediate Outcome:  % of high-risk youth completing program INCREASES
        “So That”
     Intermediate Outcome:  % of served youth with mitigated risk factors INCREASES
        “So That”
     High-Level Outcome:    Juvenile recidivism DECREASES
     But what about administrative support units?




16
    Administrative measures, generally, support program measures.

    Goal: Prevent Juvenile Crime

     Program chain:
        Output:                % of crime prevention grantees trained INCREASES
           “So That”
        Intermediate Outcome:  % of high-risk youth completing program INCREASES
           “So That”
        Intermediate Outcome:  % of served youth with mitigated risk factors INCREASES
           “So That”
        High-Level Outcome:    Juvenile recidivism DECREASES

     Supporting administrative measures (examples):
        – Hours per training
        – Calls returned on time
        – Grants paid on time
        – Performance contracts implemented
        – Surveys conducted on-line
        – Juvenile dept. contracts signed
        – Youth tracked
                 Performance measures must have targets.

     • TARGET = Desired level at a given point in time
     • Should be ambitious but realistic
     • Target setting is based on:
       – trend data
       – comparisons
       – expert opinion

     Examples: “100% of grantees will have performance contracts by 2005.”
               “70% of surveys conducted on-line by 2005.”
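
     As one illustration of the “trend data” input, the Python sketch below fits a
     straight-line trend to baseline figures and extrapolates it to the target year;
     all numbers are invented for the example, not from this project:

        # Hypothetical baseline: % of surveys conducted on-line in prior years.
        years  = [2001, 2002, 2003]
        values = [40.0, 50.0, 61.0]

        # Least-squares straight-line trend fitted by hand.
        n = len(years)
        mean_x = sum(years) / n
        mean_y = sum(values) / n
        slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(years, values))
                 / sum((x - mean_x) ** 2 for x in years))
        intercept = mean_y - slope * mean_x

        # Extrapolate to the target year (about 82% with these made-up figures).
        suggestion = slope * 2005 + intercept
        print(f"Trend-based suggestion for 2005: {suggestion:.0f}%")

     The trend projection is only a starting point; comparisons, expert opinion and
     the “ambitious but realistic” test still shape the final target.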

18
           Performance measure data must be
                       accurate and reliable.

     • Without trustworthy data, the system is
       meaningless.
       Example: verifiable records trump estimates
     • Each measure should have at least one data
       point, preferably several.
     • Data should describe what is being
       measured.

19
              Why do the measures need to be
           consistent across state government?

     • So we can tell the story of state government
       performance, as a whole.
     • Enterprise-wide measures provide us with a
       new way to compare and improve at the
       agency level.
     • Implementation will be easier if agencies
       can assist one another.
     • It’s the next logical thing to do.
20
     Logic models are a step-by-step approach to linking measures to mission.

       Mission or High-Level Outcome    – gauged by Intermediate Outcome Measures
          ↑ impact
       Goal                             – gauged by Output Measures
          ↑
       Strategies to achieve the goal   – where Common Administrative Measures fit
21
                  Logic Model Example – H.R.

       H.R. Mission: Provide the best possible service at a reasonable cost.
          Intermediate Outcomes: 1. % of managers with key skill sets;
          2. M.M. compensation comparison; 3. % of “new hires” advancing.
             ↑ impact
       H.R. Goal: Improve skills of middle managers.
          Outputs: 1. # of trainings; 2. classifications assessed;
          3. credentials of new hires.
             ↑
       Strategies: 1. Continuous training; 2. Fair compensation;
       3. Aggressive recruitment.  (Common Administrative Measures)
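
     To make the linkage concrete, here is a minimal, assumed sketch (not part of
     the project) of the H.R. logic model as a small Python data structure, where
     each level points “so that” toward the level above it:

        # Assumed illustration only: the H.R. logic model as linked levels.
        from dataclasses import dataclass
        from typing import List, Optional

        @dataclass
        class Level:
            name: str
            measures: List[str]
            so_that: Optional["Level"] = None   # the next level up the chain

        mission = Level(
            "H.R. Mission: best possible service at a reasonable cost",
            ["% of managers with key skill sets", "M.M. compensation comparison",
             "% of new hires advancing"])        # intermediate outcomes
        goal = Level(
            "H.R. Goal: improve skills of middle managers",
            ["# of trainings", "classifications assessed", "credentials of new hires"],
            so_that=mission)                     # outputs
        strategies = Level(
            "Strategies: continuous training, fair compensation, aggressive recruitment",
            [], so_that=goal)

        # Walk the chain from strategies up to the mission.
        level = strategies
        while level is not None:
            print(level.name, "|", "; ".join(level.measures))
            level = level.so_that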
22
     Questions or comments?




23
     ESTABLISHING THE BASELINE: SMALL GROUP DISCUSSIONS




24
                                 “Where are we?”
     • Groups form by function - H.R., I.T.,
       Procurement, Financial, Customer Satisfaction.
     • Group discussion – 30 minutes; worksheet
       – Task #1: Pick a reporter.
       – Task #2: Identify up to 4 sub-functions
       – Task #3: Walk through baseline exercise
         (11x17 worksheets)
     • Report to large group – 5 minutes per group
       – Cross-agency summary (poster worksheet)
       – General observations
25
“WHERE DO WE GO FROM HERE?”




26
                                       Project timeline

    February   – Form work groups
    March      – Research and select models
    June       – Goals, strategies, P.M. categories
    September  – Performance measures and early data
    October    – Report to Admin Directors
    October    – Admin Directors approve
    November   – DAS requirements (including targets)

27
              PROJECT CONTACTS

     • Cindy Becker – (503) 378-5097
     • Jeff Tryens – (503) 378-3201
     • Rita Conrad – (503) 378-3204
     • Reese Lord – (503) 378-5465



28
                COMMITTEE CHAIRS

     Human Resources:         Sheryl Warren
     Procurement:             Jeremy Emerson
     Info. Technology:        John Koreski
     Financial:               Mike Marsh
     Customer Satisfaction:   Scott Harra



29
                        Helpful websites
      Governmental Accounting Standards Board
         www.gasb.org GASB home page
      Sampling of State of Washington’s Administrative
       Measures
         www.ofm.wa.gov/budget/manage/perfrept/0103/111.pdf/
         www.ofm.wa.gov/budget/manage/perfrept/0103/105.pdf
         www.ofm.wa.gov/budget/manage/perfrept/0103/155.pdf
      The Results and Performance Accountability
       Implementation Guide
         www.raguide.org/Default.htm
30

								