					 Session P10 Evaluating Online Learning:
      Frameworks and Perspectives
  (Workshop: Sunday Sept 22nd, Online Learning 2002)


          Dr. Curtis J. Bonk
         President, CourseShare.com
    Associate Professor, Indiana University
         http://php.indiana.edu/~cjbonk,
           cjbonk@indiana.edu

     Dr. Vanessa Paz Dennen
Assistant Professor, San Diego State University
            vdennen@mail.sdsu.edu
      http://edweb.sdsu.edu/people/vdennen
 Workshop Overview
• Part I. The State of Online Learning
• Part II. Evaluation Purposes, Approaches, and Frameworks
• Part III. Applying Kirkpatrick’s 4 Levels
• Part IV. ROI and Online Learning
• Part V. Collecting Evaluation Data & Online Evaluation Tools
Sevilla & Wells (July, 2001), e-learning

We could be very productive by ignoring assessment altogether and assuming competence if the learner simply gets through the course.
          Why Evaluate?
• Cost-savings
  – Becoming a less important reason to evaluate as more people recognize that the initial expense is balanced by long-term financial benefits
• Performance improvement
  – A clear place to see impact of online learning
• Competency advancement
         16 Evaluation Methods
1. Formative Evaluation
2. Summative Evaluation
3. CIPP Model Evaluation
4. Objectives-Oriented Evaluation
5. Marshall & Shriver's 5 Levels of Evaluation
6. Bonk’s 8 Part Evaluation Plan (& the Ridiculous Model)
7. Kirkpatrick’s 4 Levels
8. Return on Investment (ROI)
9. K-Level 6: budget and stability of e-learning team
10. K-Level 7: whether e-learning champion(s) are promoted
11. Cost/Benefit Analysis (CBA)
12. Time to Competency
13. Time to Market
14. Return on Expectation
15. AEIOU: Accountability, Effectiveness, Impact, Organizational Context, Unintended Consequences
16. Consumer-Oriented Evaluation
Part I. The State of Online Learning
Survey of 201 Trainers, Instructors,
Managers, Instructional Designers,
        CEOs, CLOs, etc.
    Survey Limitations
• Sample pool—e-PostDirect
• The Web is changing rapidly
• Lengthy survey, low response rate
• No password or keycode
• Many backgrounds—hard to
  generalize
• Does not address all issues (e.g., ROI
  calculations, how trained & supported,
  specific assessments)
Figure 2. Size of Respondent Organizations (bar chart: percent of respondents by number of employees, ranging from 1 to 30 up to more than 100,001).
Figure 12. Methods Used to Deliver Training in Organization (bar chart: instructor-led classroom, Internet/intranet, multimedia, videotape, paper-based correspondence, other).
Why Interested in E-Learning?
• Mainly cost savings
• Reduced travel time
• Greater flexibility in delivery
• Timeliness of training
• Better allocation of resources, speed of delivery, convenience, course customization, lifelong learning options, personal growth, greater distribution of materials
Figure 25. Percent of Respondent Organizations Conducting Formal Evaluations of Web-Based Learning (pie chart: Yes 41%, No 59%).
A Few Assessment
   Comments
Level 1 Comments: Reactions
“We assess our courses based on
  participation levels and online surveys
  after course completion. All of our courses
  are asynchronous.”
“I conduct a post course survey of course
  material, delivery methods and mode, and
  instructor effectiveness. I look for
  suggestions and modify each course based
  on the results of the survey.”
“We use the Halo Survey process of asking
  them when the course is concluding.”
Level 2 Comments: Learning

“We use online testing and simulation frequently for testing student knowledge.”
“Do multiple choice exams after each section of the course.”
“We use online exams and use level 2 evaluation forms.”
      Level 3 Comment: Job
          Performance
“I feel strongly there is a need to measure
  the success of any training in terms of the
  implementation of the new behaviors on
  the job. Having said that, I find there is
  very limited [interest] by our clients in spending
  the dollars required…”
   More Assessment Comments
    Multiple Level Evaluation
“Using Level One Evaluations for each session followed
  by a summary evaluation. Thirty days post-training,
  conversations occur with learners’ managers to assess
  Level 2” (actually Level 3).
“We do Level 1 measurements to gauge student
  reactions to online training using an online evaluation
  form. We do Level 2 measurements to determine
  whether or not learning has occurred…”
“Currently, we are using online teaching and following
  up with manager assessments that the instructional
  material is being put to use on the job.”
    Who is Evaluating Online
           Learning?
• 59% of respondents said they did not
  have a formal evaluation program
• At Reaction level: 79%
• At Learning level: 61%
• At Behavior/Job Performance level: 47%
• At Results or Return on Investment: 30%
Figure 26. How Respondent Organizations Measure Success of Web-Based Learning (bar chart of percent of respondents by Kirkpatrick evaluation level: learner satisfaction; change in knowledge, skill, attitude; job performance; ROI).
Assessment Lacking or Too Early
“We are just beginning to use Web-based
 technology for education of both
 associates and customers, and do not
 have the metric to measure our success.
 However, we are putting together a
 focus group to determine what to
 measure (and) how.”
“We have no online evaluation for
 students at this time.”
“We lack useful tools in this area.”
Limitations with Current System
“I feel strongly there is a need to measure the
  success of any training in terms of the
  implementation of the new behaviors on the
  job. Having said that, I find there is very
  limited [interest] by our clients in spending the
  dollars required…”
“We are looking for better ways to track
  learner progress, learner satisfaction, and
  retention of material.”
“Have had fairly poor ratings on reliability,
  customer support, and interactivity…”
Pause…How and
 What Do You
  Evaluate…?
     Readiness Checklist
1.   ___ Is your organization undergoing
  significant change, in part related to e-
  learning?
2. ___ Is there pressure from senior
  management to measure the results of e-
  learning?
3. ___ Has your company experienced one
  or more training/learning disasters in the
  past?
4. ___ Is the image of the training/learning
  function lower than you want?
         Part II
   Evaluation Purposes,
Approaches and Frameworks
    What is Evaluation???
“Simply put, an evaluation is concerned
  with judging the worth of a program and
  is essentially conducted to aid in the
  making of decisions by stakeholders.”
 (e.g., does it work as effectively as the standard
 instructional approach).

 (Champagne & Wisher, in press)
   What is assessment?
• Assessment refers to “…efforts to obtain info about how and what students are learning in order to improve…teaching efforts and/or to demo to others the degree to which students have accomplished the learning goals for a course.” (Millar, 2001, p. 11).
• It is a way of using info obtained through various
  types of measurement to determine a learner’s
  performance or skill on some task or situation
  (Rosenkrans, 2000).
Who are you evaluating for?
The level of evaluation will depend on who the stakeholders are. Stakeholders of evaluation in corporate settings may range from…???
       Evaluation Purposes
• Determine learner progress
  – What did they learn?
• Document learning impact
  – How well do learners use what they learned?
  – How much do learners use what they learn?
       Evaluation Purposes
• Efficiency
  – Was online learning more effective than
    another medium?
  – Was online learning more cost-effective than
    another medium/what was the return on
    investment (ROI)?
• Improvement
  – How do we do this better?
       Evaluation Purposes
“An evaluation plan can evaluate the
 delivery of e-learning, identify ways to
 improve the online delivery of it, and
 justify the investment in the online
 training package, program, or initiative.”
 (Champagne & Wisher, in press)
     Evaluation Plans

Does your company have a training
        evaluation plan?
    Steps to Developing an OL
       Evaluation Program
• Select a purpose and framework
• Develop benchmarks
• Develop online survey instruments
  – For learner reactions
  – For learner post-training performance
  – For manager post-training reactions
• Develop data analysis and management
  plan
   1. Formative Evaluation
• Formative evaluations focus on
  improving the online learning experience.
• A formative focus will try to find out
  what worked or did not work.
• Formative evaluation is particularly
  useful for examining instructional design
  and instructor performance.
      Formative Questions
• How can we improve our OL program?
• How can we make our OL program more efficient?
• More effective?
• More accessible?
   2. Summative Evaluation
• Summative evaluations focus on the
  overall success of the OL experience
  (should it be continued?).
• A summative focus will look at
  whether or not objectives are met,
  the training is cost-effective, etc.
         Course Completion
• Jeanne Meister, Corporate University Xchange, found a 70 percent dropout rate for online courses, compared to classroom rates of 15%.
• Perhaps new metrics are needed; need to see if learners can test out.
• “Almost any measure would be better than
  course completion, which is not a predictor of
  anything.” Tom Kelly, Cisco, March 2002, e-
  Learning.
  What Can OL Evaluation
        Measure?
• Categories of Evaluation Info (Woodley
  and Kirkwood, 1986)
     •   Measures of activity
     •   Measures of efficiency
     •   Measures of outcomes
     •   Measures of program aims
     •   Measures of policy
     •   Measures of organizations
          Typical Evaluation
          Frameworks for OL
• Commonly used frameworks include:
  –   CIPP Model
  –   Objectives-oriented
  –   Marshall & Shriver’s 5 levels
  –   Kirkpatrick’s 4 levels
       • Plus a 5th level
  – AEIOU
  – Consumer-oriented
      3. CIPP Model Evaluation
• CIPP is a management-oriented model
  –   C = context
  –   I = input
  –   P = process
  –   P = product
• Examines the OL within its larger
  system/context
      CIPP & OL: Context
• Context: Addresses the environment in
  which OL takes place.
• How does the real environment compare
  to the ideal?
• Uncovers systemic problems that may
  dampen OL success.
  – Technology breakdowns
  – Inadequate computer systems
        CIPP & OL: Input
• Input: Examines what resources are put
  into OL.
• Is the content right?
• Have we used the right combination of
  media?
• Uncovers instructional design issues.
       CIPP & OL: Process
• Process: Examines how well the
  implementation works.
• Did the course run smoothly?
• Were there technology problems?
• Was the facilitation and participation as
  planned?
• Uncovers implementation issues.
       CIPP & OL: Product
• Product: Addresses outcomes of the
  learning.
• Did the learners learn? How do you
  know?
• Does the online training have an effect on
  workflow or productivity?
• Uncovers systemic problems.
       4. Objectives-Oriented
             Evaluation
• Examines OL training objectives as compared
  to training results
• Helps determine if objectives are being met
• Helps determine if objectives, as formally
  stated, are appropriate
• Objectives can be used as a comparative
  benchmark between online and other training
  methods
Evaluating Objectives & OL
• An objectives-oriented approach can
  examine two levels of objectives:
  – Instructional objectives for learners (did the
    learners learn?)
  – Systemic objectives for training (did the
    training solve the problem?)
       Objectives & OL
• Requires:
  – A clear sense of what the objectives are
    (always a good idea anyway)
  – The ability to measure whether or not
    objectives are met
     • Some objectives may be implicit and hard
       to state
     • Some objectives are not easy to measure
     5. Marshall & Shriver's
    Five Levels of Evaluation
• Performance-based evaluation
  framework
• Each level examines a different area of performance
• Requires demonstration of learning
Marshall & Shriver's 5 Levels

• Level I: Self (instructor)
• Level II: Course Materials
• Level III: Course Curriculum
• Level IV: Course Modules
• Level V: Learning Transfer
   6. Bonk’s Evaluation
         Plan…
    Considerations in Evaluation Plan
1. Student
2. Instructor
3. Training
4. Task
5. Tech Tool
6. Course
7. Program
8. University or Organization
     What to Evaluate?
1. Learner—attitudes, learning, use, performance.
2. Instructor—popularity, course enrollments.
3. Training—internal and external components.
4. Task—relevance, interactivity, collaborative.
5. Tool—usable, learner-centered, friendly, supportive.
6. Course—interactivity, participation, completion.
7. Program—growth, long-range plans.
8. Organization—cost-benefit, policies, vision.
  RIDIC5-ULO3US Model of
      Technology Use
4. Tasks (RIDIC):
 –   Relevance
 –   Individualization
 –   Depth of Discussion
 –   Interactivity
 –   Collaboration-Control-Choice-
     Constructivistic-Community
RIDIC5-ULO3US Model of
    Technology Use

 5. Tech Tools (ULOUS):
   – Utility/Usable
   – Learner-Centeredness
   – Opportunities with Outsiders Online
   – Ultra Friendly
   – Supportive
    7. Kirkpatrick’s 4 Levels
• A common training framework.
• Examines training on 4 levels.
• Not all 4 levels have to be included in
  a given evaluation.
      The 4 Levels
• Reaction
• Learning
• Behavior
• Results
   8. Return on Investment
      (ROI): A 5th Level
• Return on Investment is a 5th level
• It is related to results, but is more clearly
  stated as a financial calculation
• How to calculate ROI is the big issue here
    Is ROI the answer?
• Elise Olding of CLK Strategies suggests
  that we shift from looking at ROI to
  looking at time to competency.
• ROI may be easier to calculate since
  concrete dollars are involved, but time to
  competency may be more meaningful in
  terms of actual impact.
  Example: Call Center Training
• Traditional call center training can take 3
  months to complete
• Call center employees typically quit
  within one year
• When OL was implemented, the time to
  train (time to competency) was reduced
• Benchmarks for success: time per call;
  number of transfers
       Example: Circuit City
• Circuit City provided online product/sales
  training
• What is more useful to know:
  – The overall ROI or break-even point?
  – How much employees liked the training?
  – How many employees completed the training?
  – That employees who completed 80% of
    the training saw an average increase of
    10% in sales?
 Matching Evaluation Levels
  with Objectives Pretest
Instructions: For each statement below, indicate
  the level of evaluation at which the objective is
  aimed.
1.   ___ Show a 15 percent decrease in errors
  made on tax returns by staff accountants
  participating in the e-learning certificate
  program.
2. ___ Increase use of conflict resolution skills,
  when warranted, by 80 percent of employees
  who had completed the first eight modules of
  the online training. (see handout for more)
          9. A 6th Level?
        Clark Aldrich (2002)
• Adding a Level 6, which relates to the budget and stability of the e-learning team.
  – Just how respected and successful is the e-learning team?
  – Have they won approval from senior management for their initiatives?

  – Aldrich, C. (2002). Measuring success: In a post-Maslow/Kirkpatrick
    world, which metrics matter? Online Learning, 6(2), 30 & 32.
   10. And Even a 7th Level?
      Clark Aldrich (2002)
• Level 7 asks whether the e-learning sponsor(s) or champion(s) are promoted in the organization.

• While both of these additional levels address
  the people involved in the e-learning initiative
  or plan, such recognitions will likely hinge on
  the results of evaluation of the other five levels.
      11. ROI Alternative:
   Cost/Benefit Analysis (CBA)
• ROI may be ill-advised since not all impacts hit the bottom line, and those that do take time.
• CBA shifts the attention toward longer-term results and to quantifying impacts with numeric values, such as:
    – increased revenue streams,
    – increased employee retention, or
    – reduction in calls to a support center.
• Reddy, A. (2002, January). E-learning ROI calculations: Is a
  cost/benefit analysis a better approach? e-learning. 3(1), 30-32.
   Cost/Benefit Analysis (CBA)
• CBA attends to both qualitative and quantitative measures:
    –   job satisfaction ratings,
    –   new uses of technology,
    –   reduction in processing errors,
    –   quicker reactions to customer requests,
    –   reduction in customer call rerouting,
    –   increased customer satisfaction,
    –   enhanced employee perceptions of training,
    –   global post-test availability.
• Reddy, A. (2002, January). E-learning ROI calculations: Is a
  cost/benefit analysis a better approach? e-learning. 3(1), 30-32.
   Cost/Benefit Analysis (CBA)
• In effect, CBA asks how the sum of the benefits compares to the sum of the costs.
• Yet, it often leads to or supports ROI and other
  more quantitatively-oriented calculations.

• Reddy, A. (2002, January). E-learning ROI calculations: Is a
  cost/benefit analysis a better approach? e-learning. 3(1), 30-32.
  Other ROI Alternatives
12. Time to competency (need benchmarks)
    – online databases of frequently asked questions can
      help employees in call centers learn skills more
      quickly and without requiring temporary leaves
      from their position for such training
13. Time to market
    – might be measured by how e-learning speeds up the
      training of sales and technical support personnel,
      thereby expediting the delivery of a software
      product to the market
Raths, D. (2001, May). Measure of success. Online Learning, 5(5), 20-
  22, & 24.
 Still Other ROI Alternatives
14. Return on Expectation
  1. Ask employees a series of questions related to how training met expectations of their job performance.
  2. When questioning is complete, have them place a dollar figure on that.
  3. Correlate or compare such reaction data with business results, or supplement Level 1 data to include more pertinent info about the applicability of learning to the employee’s present job situation.
  –   Raths, D. (2001, May). Measure of success. Online Learning, 5(5),
      20-22, & 24.
           15. AEIOU
• Provides a framework for looking at
  different aspects of an online learning
  program
• Fortune & Keith, 1992; Sweeney, 1995;
  Sorensen, 1996
         A = Accountability
• Did the training do what it set out to do?
• Data can be collected through
  – Administrative records
  – Counts of training programs (# of attendees,
    # of offerings)
  – Interviews or surveys of training staff
          E = Effectiveness
• Is everyone satisfied?
  – Learners
  – Instructors
  – Managers
• Were the learning objectives met?
            I = Impact
• Did the training make a difference?
• Like Kirkpatrick’s level 4 (Results)
   O = Organizational Context
• Did the organization’s structures and policies
  support or hinder the training?
• Does the training meet the organization’s
  needs?
• OC evaluation can help find when there is a
  mismatch between the training design and the
  organization
• Important when using third-party training or
  content
U = Unintended Consequences
• Unintended consequences are often
  overlooked in training evaluation
• May give you an opportunity to brag
  about something wonderful that
  happened
• Typically discovered via qualitative data
  (anecdotes, interviews, open-ended
  survey responses)
       16. Consumer-Oriented
             Evaluation
• Uses a consumer point-of-view
  – Can be a part of vendor selection process
  – Can be a learner-satisfaction issue
• Relies on benchmarks for comparison of
  different products or different learning
  media

          See the vendors!
    Part III:

Applying Kirkpatrick’s
  4 Levels to Online
 Learning Evaluation
 & Evaluation Design
   Why Use the 4 Levels?
• They are familiar and understood
• Highly referenced in the training
  literature
• Can be used with 2 delivery media
  for comparative results
     Conducting 4-Level
        Evaluation
• You need not use every level
  – Choose the level that is most
    appropriate to your need and budget
• Higher levels will be more costly
  and difficult to evaluate
• Higher levels will yield more meaningful results
      Kirkpatrick Level 1:
           Reaction
• Typically involves “Smile sheets” or
  end-of-training evaluation forms.
• Easy to collect, but not always very
  useful.
• Reaction-level data on online courses
  has been found to correlate with ability
  to apply learning to the job.
• Survey ideally should be Web-based,
  keeping the medium the same as the
  course.
     Kirkpatrick Level I:
          Reaction
• Types of questions:
  – Enjoyable?
  – Easy to use?
  – How was the instructor?
  – How was the technology?
  – Was it fast or slow enough?
  Kirkpatrick Level 2:
       Learning
• Typically involves testing
  learners immediately following
  the training
• Not difficult to do, but online
  testing has its own challenges
  – Did the learner take the test on
    his/her own?
      Kirkpatrick Level 2:
           Learning
• Higher-order thinking skills (problem
  solving, analysis, synthesis)
• Basic skills (articulate ideas in writing)
• Company perspectives and values
  (teamwork, commitment to quality,
  etc.)
• Personal development
       Kirkpatrick Level 2:
            Learning
• Might include:
  – Essay tests.
  – Problem solving exercises.
  – Interviews.
  – Written or verbal tests to assess
    cognitive skills.

  Shepard, C. (1999b, July). Evaluating online learning. TACTIX from
    Fastrak Consulting. Retrieved February 10, 2002, from:
    http://fastrak-
    consulting.co.uk/tactix/Features/evaluate/eval01.htm.
       Kirkpatrick Level 3:
            Behavior
• More difficult to evaluate than Levels 1 & 2
• Looks at whether learners can apply what
  they learned (does the training change
  their behavior?)
• Requires post-training follow-up to
  determine
• Less common than levels 1 & 2 in practice
         Kirkpatrick Level 3:
              Behavior
• Might include:
  – Direct observation by supervisors or coaches
    (Wisher, Curnow, & Drenth, 2001).
  – Questionnaires completed by peers,
    supervisors, and subordinates related to work
    performance.
  – On the job behaviors, automatically logged
    performances, or self-report data.

  Shepard, C. (1999b, July). Evaluating online learning. TACTIX from
    Fastrak Consulting. Retrieved February 10, 2002, from:
    http://fastrak-consulting.co.uk/tactix/Features/evaluate/eval01.htm.
      Kirkpatrick Level 4:
            Results
• Often compared to return on investment
  (ROI)
• In e-learning, it is believed that the
  increased cost of course development
  ultimately is offset by the lesser cost of
  training implementation
• A new way of training may require a
  new way of measuring impact
     Kirkpatrick Level 4: Results
• Might Include:
   – Labor savings (e.g., reduced duplication of
     effort or faster access to needed information).
   – Production increases (faster turnover of
     inventory, forms processed, accounts opened,
     etc.).
   – Direct cost savings (e.g., reduced cost per
     project, lowered overhead costs, reduction of
     bad debts, etc.).
   – Quality improvements (e.g., fewer accidents, fewer defects, etc.).
   Horton, W. (2001). Evaluating e-learning. Alexandria, VA:
     American Society for Training & Development.
Of course, this assumes you have all the documents!
  Kirkpatrick + Evaluation
           Design
• Kirkpatrick’s 4 Levels may be
  achieved via various evaluation
  designs
• Different designs help answer
  different questions
 Pre/Post Control Groups
• One group receives OL training and one
  does not
• As variation try 3 groups
  – No training (control)
  – Traditional training
  – OL training
• Recommended because it may help
  neutralize contextual factors
• Relies on random assignment as much
  as possible
      Multiple Baselines
• Can be used for a program that is
  rolling out
• Each group serves as a control
  group for the previous group
• Look for improvement in
  subsequent groups
• Eliminates need for tight control of
  control group
          Time Series
• Looks at benchmarks before and
  after training
• Practical and cost-effective
• Not considered as rigorous as
  other designs because it doesn’t
  control for contextual factors
   Single Group Pre/Post
• Easy and inexpensive
• Criticized for lack of rigor (absence
  of control)
• Needs to be pushed into
  Kirkpatrick levels 3 and 4 to see if
  there has been impact
           Case Study
• A rigorous design in academic
  practice, but often after-the-fact in
  corporate settings
• Useful when no preliminary or
  baseline data have been collected
Matching Evaluation Levels
 with Objectives Posttest
Instructions: For each statement below, indicate the
   level of evaluation at which the objective is aimed.
1. Union Pacific Railroad reported an increase in
   bottom-line performance--on-time delivery of
   goods--of over 35%, which equated to millions
   of dollars in increased revenues and savings.
2. They also reported that learners showed a 40%
   increase in learning retention and improved
   attitudes about management and jobs.

 (see handout for more)
  Part IV:
ROI and Online
  Learning
  The Importance of ROI
• OL requires a great amount of $$
  and other resources up front
• It gives the promise of financial
  rewards later on
• ROI is of great interest because of
  the investment and the wait period
  before the return
         Calculating ROI
• Look at:
  – Hard cost savings
  – Hard revenue impact
  – Soft competitive benefits
  – Soft benefits to individuals

   See: Calculating the Return on Your eLearning
   Investment (2000) by Docent, Inc.
    Possible ROI Objectives
•   Better Efficiencies
•   Greater Profitability
•   Increased Sales
•   Fewer Injuries on the Job
•   Less Time off Work
•   Faster Time to Competency
  Factors Impacting ROI
• # of employees
• Travel costs
• Opportunity costs (e.g., what does it cost to pull someone off the job)
• Online course development costs
• Infrastructure costs
      Hard Cost Savings
• Travel
• Facilities
• Printed material costs (printing,
  distribution, storage)
• Reduction of costs of business
  through increased efficiency
• Instructor fees (sometimes)
    The Cost of E-learning
• Brandon-hall.com estimates that an LMS for 8,000 learners costs $550,000
• This price doesn’t include the cost of
  buying or developing content
• Bottom line: getting started in e-
  learning isn’t cheap
   Hard Revenue Impact
• Consider
  – Opportunity cost of improperly trained or untrained personnel
  – Shorter time to productivity through
    shorter training times with OL
  – Increased time on job (no travel
    time)
  – Ease of delivering same training to
    partners and customers (for fee?)
    Soft Competitive Benefits
• Just-in-time capabilities
• Consistency in delivery
• Certification of knowledge transfer
• Ability to track users and gather
  data easily
• Increased morale from simultaneous roll-out at different sites
      Individual Values
• Less wasted time
• Support available as needed
• Motivation from being treated as
  an individual
      Talking about ROI
• As a percentage
  – ROI=[(Payback-
    Investment)/Investment]*100
• As a ratio
  – ROI=Return/Investment
• As time to break even
  – Break even
    time=(Investment/Return)*Time
    Period
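A minimal Python sketch of the three expressions above; the investment, payback, and time period figures are hypothetical placeholders, not values from the workshop:

    investment = 100_000.0     # total cost of the online learning program (hypothetical)
    payback = 150_000.0        # total benefit attributed to the program (hypothetical)
    period_months = 12         # period over which the payback accrues (hypothetical)

    roi_percent = (payback - investment) / investment * 100    # 50.0 (%)
    roi_ratio = payback / investment                            # 1.5
    break_even_months = investment / payback * period_months   # 8.0 months

    print(roi_percent, roi_ratio, break_even_months)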
      Net Present Value
• Need to discount the return to present dollars; a $100,000 project that yields $30,000/year for 5 years would have a net present value of $29,364 at 8% interest (Horton, 2001, ASTD)
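A small Python sketch of the net present value example just cited. To reproduce the $29,364 figure attributed to Horton (2001), the sketch assumes the $30,000 return is credited at the start of each of the 5 years (years 0 through 4); that timing is an assumption, not stated on the slide:

    cost = 100_000.0          # up-front project cost
    yearly_return = 30_000.0  # return credited at the start of each year (assumption)
    rate = 0.08               # 8% discount rate
    years = 5

    # Discount each year's return back to present dollars, then subtract the cost.
    npv = sum(yearly_return / (1 + rate) ** t for t in range(years)) - cost
    print(round(npv))         # about 29,364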
      Benefit-Cost Ratio
• Project cost of $100,000 that yields
  $150,000 of benefits would have a
  benefit-cost ratio of 1.5 (Horton,
  2001, ASTD)
      Time to Payback
• If cost is $100,000 and ROI is
  $10,000/month, then the time to
  payback is 10 months (Horton,
  2001, ASTD)
    Learners to Payback
• Training costs $100,000 to develop
  and $100/person to offer.
  Assuming each person trained
  benefits the organization $300 (or
  $200 net); development costs are
  repaid by training 500 people
  (Horton, 2001, ASTD)
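The three Horton (2001) examples above (benefit-cost ratio, time to payback, learners to payback) reduce to one-line calculations; a Python sketch using the figures quoted on the slides:

    cost = 100_000.0

    # Benefit-cost ratio: $150,000 of benefits against $100,000 of cost.
    benefit_cost_ratio = 150_000.0 / cost                    # 1.5

    # Time to payback: return of $10,000 per month.
    months_to_payback = cost / 10_000.0                      # 10 months

    # Learners to payback: $300 benefit per person trained, $100/person to offer.
    net_benefit_per_learner = 300.0 - 100.0                  # $200 net
    learners_to_payback = cost / net_benefit_per_learner     # 500 people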
 Classroom Training vs. ROI
      (William Horton)
1. Per-course costs (course
   development costs)
2. Per-class costs
   (instructor/facilitator, travel, and
   facilities)
3. Per-learner costs (travel, salary,
   instructor/facilitator salary)
  What is ROI Good For?
• Prioritizing Investment
• Ensuring Adequate Financial
  Support for Online Learning
  Project
• Comparing Vendors
The Changing Face of ROI
• “Return-on-investment isn’t what
  it used to be … The R is no longer
  the famous bottom line and the I is
  more likely a subscription fee than
  a one-time payment” (Cross, 2001)
      More Calculations
• Total Admin Costs of Former Program
  - Total Admin Costs of OL Program
  =Projected Net Savings
• Total Cost of Training/# of Students
  =Cost Per Student (CPS)
• Total Benefits * 100/Total Program Cost
  =ROI%
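A sketch of the three calculations above in Python; all dollar amounts and the student count are hypothetical placeholders:

    former_admin_costs = 80_000.0   # total admin costs of former program (hypothetical)
    ol_admin_costs = 50_000.0       # total admin costs of OL program (hypothetical)
    projected_net_savings = former_admin_costs - ol_admin_costs    # $30,000

    total_training_cost = 120_000.0  # hypothetical
    num_students = 400               # hypothetical
    cost_per_student = total_training_cost / num_students          # $300 CPS

    total_benefits = 180_000.0       # hypothetical
    roi_percent = total_benefits * 100 / total_training_cost       # 150%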
     Pause: How are costs
calculated in online programs?
  ROI Calculators
1. Mediapro
   (www.mediapro.com/roi)
2. Mentergy
   (www.mentergy.com/roi)
3. BNH Expert Software
   www.bnhexpertsoft.com
   (free trial version available)
Success Story #1 (Sitze, March 2002, Online Learning):

      EDS and GlobalEnglish
• Charge: Reduce money spent on English training
• Goal: 80% online in 3 months
• Result: 12% use in 12 months
• Prior Costs: $1,500-5,000/student
• New Cost: $150-300/user
• Notes: Email to participants was helpful in expanding use; rolling out additional languages.
Success Story #2 (Overby, Feb 2002, CIO):
  Dow Chemical and Offensive Email
Charge: Train 40,000 employees across 70
  countries; 6 hours of training on workplace respect
  and responsibility.
Specific Results: 40,000 passed
Savings: Saved $2.7 million ($162,000 on record
  keeping, $300,000 on classrooms and trainers,
  $1,000,000 on handouts, $1,200,000 in salary
  savings due to less training time).
Success Story #3 (Overby, Feb 2002, CIO):
   Dow Chemical and Safety/Health
Charge: Train 27,000 employees on
 environmental health and safety work
 processes.
Results: Saved $6 million; safety incidents have declined while the number of Dow employees has grown.
Success Story #4 (Overby, Feb 2002, CIO):
 Dow Chemical and e-learning system
Charge: $1.3 million e-learning system
Savings: $30 million in savings ($850,000 in manual record-keeping, $3.1 million in training delivery costs, $5.2 million in reduced classroom materials, $20.8 million in salaries since the Web required 40-60% less training time).
 Success Story #5 (Ziegler, e-learning, April 2002):
 British Telecom & sales training
Charge: Train 17,000 sales professionals to sell Internet services using Internet simulation.
Result: Customer service rep training
  reduced from 15 days to 1 day; Sales
  training reduced from 40 days to 9 days.
Savings: Millions of dollars saved; sales
  conversion went up 102 percent; customer
  satisfaction up 16 points.
And Blended
 Learning
Results…???
 Blended Learning Advantages
1. Course access at one’s convenience and flexible
   completion
2. Reduction in physical class time
3. Promotes independent learning
4. Multiple ways to accomplish course objectives
5. Increased opportunities for human interaction,
   communication, & contact among students
6. Less time commuting and parking
7. Introverts participate more
Blended Learning Disadvantages
1. Procrastination, procrastination,
   procrastination
2. Students have trouble managing time
3. Problems with technology at the beginning
   (try too much)
4. Can be overwhelming or too novel
5. Poor integration or planning
6. Resistance to change
7. Good ideas but lack of time, money, & support
Success Story #6. Infusing E-Learning
 (Elliott Masie, March 2002, e-learning Magazine)
 A manufacturing company transformed a
 week-long safety program into a three-part
 offering:
 1. One day in classroom
 2. Multiple online simulations and lessons.
 3. One final day of discussions and exams.
Participants must complete the online work before phase 3—this raised the success rate and transfer of skills, and lowered hours away from the job.
Success Story #7. Raytheon, Build Own LMS
   (John Hartnett, Online Learning, Summer 2002)
SAP Training Choice: Vendor ($390,000) or
  Build Internally ($136,000) or Cost of
  Instructor-led Training ($388,000).
     Note: Saved $252,000 compared with instructor-led training
Five Training Components in 18 Weeks (within 6 weeks,
    4,000 courses were taken by 1,400 students)
1. Role-based simulations
2. Audio walk-throughs
3. Online quick reference system
4. Live training support (special learning labs)
5. Online enrollment and tracking
       Success Story #8:IBM
       Special E-Learning Issue, April 2001

• 33,000 IBM managers have taken online
  courseware.
• 5 times as much content at one-third the cost.
• IBM reported $200 million in savings in one year.
• Avoided $80 million in travel and housing expenses during 1999 by deploying online learning.
IBM Training of 6,600 New First-
  Line Managers (Basic Blue)
• Phase I: 26 Weeks of Self-paced Online
  Learning
  –   Cohorts of 24 managers
  –   Lotus LearningSpace Forum
  –   2 hours/week; 5 units/week
  –   18 mandatory and elective management topics
  –   Need minimum score on mandatory topics
  –   14 real-life interactive simulations
  –   LearningSpace tutor guides behavior

       • Karen Mantyla (2001), ASTD.
IBM Training of 6,600 New First-
  Line Managers (Basic Blue)
• Phase II: In-class 5 day learning lab
   – Experiential higher order learning
   – Bring real-life activities from job
   – Focus on self-knowledge and to understand their roles
     as leaders and members of IBM
   – Harvard Business cases, leadership competency
     surveys, managerial style questionnaires, brain
     dominance inventories
    – Coached by a learner-colleague (teaming important!)
   – Less than 1 hour of the 5 days is lecture
IBM Training of 6,600 New First-
  Line Managers (Basic Blue)
• Phase III: 25 Weeks of Online Learning
  – Similar to Phase I but more complex and
    focuses on application
  – Creates individual development plan and
    organizational action plan
  – The manager reviews and signs off on these plans
        IBM Training Results
         (Kirkpatrick Model)
• Level 1
  – High satisfaction and enthusiasm for blended
  – Coaching and climate rated highest
• Level 2:
  – 96% displayed mastery in all 15 subject areas; 5
    times as much content covered in this program
    compared to 5 days of live training
  – 150 Web page requests/learner
          IBM Training Results
           (Kirkpatrick Model)
• Level 3
  – Significant behavior change (in particular in coaching,
    styles, competencies, and climate)
  – Graduates had high self-efficacy and believed that they could make a difference
• Level 4
  –   Linkage between leadership & customer satisfaction
  –   Leadership led to teamwork and satisfaction
  –   Managers reported improvement on job
  –   Improved morale and productivity reported
        IBM Training Results
         (Kirkpatrick Model)
• Level 5
  – Asked graduates to estimate the impact on their
    departments in dollars
  – $415,000 or ROI of 47 to 1.
  – Perceived real and lasting leadership increases
  Blended Learning Advantages for IBM

1. Greater consistency of language, knowledge,
   and corporate culture across the globe
2. Blended approach to training now replicated in
   other units
3. Market its e-learning design
4. Cross functional understanding & teamwork
5. No risk trials and simplicity helps
Success Story #9. Three Phases of
            AC3-DL
I. Asynchronous Phase: 240 hours of
   instruction or 1 year to complete; must
   score 70% or better on each gate exam
II. Synchronous Phase: 60 hours of
   asynchronous and 120 hours of
   synchronous
III. Residential Phase: 120 hours of
   training in 2 weeks at Fort Knox
        AC3-DL Course Tools
• Asynchronous:
   – Learning Management System
   – E-mail
• Synchronous: Virtual Tactical Operations Center
  (VTOC) (7 rooms; 15 people/extension)
   –   Avatar
   –   Audio conference by extension/room (voice over IP)
   –   Text Chat Windows—global and private
   –   Special tools for collaboration
Success #10: Microsoft Excel Training
   (Jeff Barbian, Blended Works, Summer 2002,
                Online Learning)
• Group One: 5 scenario-based exercises that
  offered live use of Excel on real-world tasks,
  online mentors, FAQs, relevant Web sites,
  NETg Excel Fundamentals Learning Objects.
• Group Two: Same as Group One but without the scenarios; the information in the 5 scenarios was embedded in the learning objects.
• Group Three: No-training control group.
Success #10: Microsoft Excel Training
  (Thompson Learning Company Study; Jeff Barbian,
    Blended Works, Summer 2002, Online Learning)

• Group One (the blended group): 30 percent
  increase in accuracy over Group Two (the e-
  learning group) and were 41 percent faster
• Group Two performed 159 percent more accurately than Group Three
• Groups 1 and 2 relied on the online mentors for
  support
  – (Note: with these results, Lockheed Martin became a blended learning convert.)
  Success #11: NCR: Blended Approaches
  (Thompson Learning Company Study; Jeff Barbian,
    Blended Works, Summer 2002, Online Learning)

1. Design of E-Learning (Various methods: Web
   articles; Synchronous points for team
   exercises)
2. Field Guide Binders (Web site guidance, live
   feedback on case studies, live “kick off” that
   promotes collaboration, hands-on role play)
   Over 71 percent of learners were responding to
    customers more effectively (Kirkpatrick Level 3)
      Success #12: Convergys: Blended
  (Jeff Barbian, Blended Works, Summer 2002, Online Learning)


• Leadership Dev, Succession Planning,
  performance management, etc.
• LMS from Knowledge Planet, 3 e-learning
  libraries, virtual classroom tools to 50 locations
  in North America & Europe
• New managers received: Readings, job aids,
  meeting checklists, 5 off-the-shelf courses from
  SkillSoft, virtual classes via LearnLinc (new
  recruits talk to experienced managers), and a 4
  day instructor-led seminar at HQ.
   Success #13: Sallie Mae/USA Group
 (Blended student loan provider program)
  (Jeff Barbian, Blended Works, Summer 2002, Online Learning)

• LEAD (Leadership and Education Development);
  Groom internal staff to fill supervisory-level positions
• 4 hours/week in class with internal and external
  instructors; learn trust, role of managers, etc.
• First must complete 3 online management courses from
  SkillSoft and 6 online project management courses
  (includes panel presentation by IT Project Team to
  illustrate how projects are handled in the company’s
  culture)
• Findings: increased teamwork, camaraderie, shared
  understanding of concepts, respect for individual
  differences, social interaction, and reinforcement for
  class concepts.
       Success #14: Procter and Gamble
  (Jeff Barbian, Blended Works, Summer 2002, Online Learning)


• 1999 = 100,000 employees; 20,000 trained/year
• LMS from Saba, live training from Centra
• CD-based training using Authorware,
  CourseBuilder, & Dreamweaver
• 2002 = 1,200 learning items; 34% Web, 54%
  CD
• Global English saved $2.5 million per year
• Off-the-shelf courses in time management and
  managing for success
             Procter and Gamble
  (Jeff Barbian, Blended Works, Summer 2002, Online Learning)


“Given our learning objectives and needs, should
  we select Web-based live training, versus
  classroom, versus video-based, versus CBT, or
  some blended solution?…It depends, on the
  resources you have, how far geographically you
  have to reach, or whether you can get your arm
  around them and pull them into a classroom.”
  Art DiMartile, Senior IT Manager, Procter and Gamble
  The Worldwide Expansion of E-
           Learning!!!
• Success #15: Circuit City is training 50,000
  employees from 600 stores using customized
  courses that are “short, fun, flexible, interactive and instantly applicable on the job.”
• Success #16: The Army’s virtual university
  offered online college courses to more than 12,000
  students located anywhere in the world in 2001 in
  the first year of a $42 million e-learning program.

Dr. Sylvia Charp, Editor-in-Chief, T.H.E. Journal, March 2002.
  Success #17: Community Health
Network of Indiana; www.ehealthindiana.com
       (July 15, 2002, American Hospital Association)

• Named one of most wired hospitals and most improved
  hospital system nationwide in the use of technology in
  health care
   – Virtual nurse recruitment Web site (live chats with recruiters)
   – Video streams of nursing leaders
   – Virtual tours of individual nursing units
   – Online application and interactive job-posting databases
   – Web portal for physicians
   – First in nation to offer live Web cast of in vitro fertilization
     procedure
   – Real time clinical data repository
Success #18: Cisco and DigitalThink
        Course (employees)

 – Sales training self-assessment
 – Ask via survey to estimate how much time
   training saved them on the job
 – Ask whether it improved performance
 – Select a percentage for each
 – ROI of 900%; for every $1 spent on training,
   Cisco sees a gain of 900% in productivity
Success #18: Cisco and DigitalThink
      Course (Cisco vendors)

 – Most saw significant growth in productivity
 – 74% reported improvement in ability to sell
   or service clients
 – Customer satisfaction jumped 50%
And What about Higher Ed???
   Success #19 Higher Education:
   Student survey results after a hybrid course

• Student feedback N=282
• 69% felt they could control the pace of their
  own learning
• 77% felt they could organize their time better
• 16% felt the time spent online would have been
  better spent in class
• 61% felt there should be more courses like this
  – www.uwsa.edu.ttt/articles/garnham.htm
   At the End of the Day...
• Are all training results quantifiable?
• NO! Putting a price tag on some costs
  and benefits can be very difficult
• NO! Some data may not have much
  meaning at face value
  – What if more courses are offered and annual
    student training hours drop simultaneously?
    Is this bad?
     Evaluation Cases
           (homework…)
1. General Electric Case
2. Financial Services Company
3. Circuit Board Manufacturing Plant
   Safety
4. Computer Company Sales Force
5. National HMO Call Center
    Part V:
   Collecting
Evaluation Data
    & Online
Evaluation Tools
Collecting Evaluation Data
•   Learner Reaction
•   Learner Achievement
•   Learner Job Performance
•   Manager Reaction
•   Productivity Benchmarks
       Forms of Evaluation
•   Interviews and Focus Groups
•   Self-Analysis
•   Supervisor Ratings
•   Surveys and Questionnaires
•   ROI
•   Document Analysis
•   Data Mining (Changes in pre and post-
    training; e.g., sales, productivity)
       How to Collect Data?
• Direct Observation in Work Setting
  – By supervisor, co-workers,
    subordinates, clients
• Collect Data By Surveys,
  Interviews, Focus Groups
  – Supervisors, Co-workers,
    Subordinates, Clients
• Self-Report by learners or teams
• Email and Chat
            Learner Data
• Online surveys are the most effective way
  to collect online learner reactions
• Learner performance data can be collected
  via online tests
   – Pre and post-tests can be used to
     measure learning gains
• Learner post-course performance data can
  be used for Level 3 evaluation
   – May look at on-the-job performance
   – May require data collection from
     managers
   Multiple Assessment Example:
 Naval Training Follow-Up Evaluation

• A naval training unit uses an online
  survey/database system to track
  performance of recently trained
  physiologists
• Learners self-report performance
• Managers report on learner
  performance
• Unit heads report on overall
  productivity
      Learning System Data
• Many statistics are available, but which
  are useful?
  – Number of course accesses
  – Log-in times/days
  – Time spent accessing course components
  – Frequency of access for particular
    components
  – Quizzes completed and quiz scores
  – Learner contributions to discussion (if
    applicable)
             Computer Log Data
Chen, G. D., Liu, C. C., & Liu, B. J. (2000). Discovering decision knowledge from Web log
portfolio for managing classroom processes by applying decision tree and data cube technology.
Journal of Educational Computing Research, 23(3), 305-332.


• In a corp training situation, computer log data
  can correlate online course completions with:
   – actual job performance improvements such as
      • fewer violations of safety regulations,
      • reduced product defects,
      • increased sales, and
      • timely call responses.
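A hypothetical Python sketch of that kind of correlation: course completions pulled from an LMS log are correlated with one job performance measure (safety violations per employee). The data, the field meanings, and the plain Pearson calculation are illustrative assumptions, not drawn from the Chen et al. study:

    def pearson(xs, ys):
        # Plain Pearson correlation coefficient, no external libraries needed.
        n = len(xs)
        mean_x, mean_y = sum(xs) / n, sum(ys) / n
        cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
        var_x = sum((x - mean_x) ** 2 for x in xs)
        var_y = sum((y - mean_y) ** 2 for y in ys)
        return cov / (var_x * var_y) ** 0.5

    courses_completed = [0, 1, 2, 3, 4, 5]   # per employee, from the LMS log (hypothetical)
    safety_violations = [6, 5, 5, 3, 2, 1]   # per employee, from HR records (hypothetical)

    print(pearson(courses_completed, safety_violations))   # strongly negative (about -0.98)

A strongly negative coefficient here would suggest that employees who complete more of the online modules log fewer violations, which is the sort of Level 3/4 linkage described above.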
    Learner System Data
• IF learners are being evaluated based
  on number and length of accesses, it is
  only fair that they be told
• Much time can be wasted analyzing
  statistics that don’t tell much about the
  actual impact of the training
• Bottom line: Easy data to collect, but
  not always useful for evaluation
  purposes
  – Still useful for management purposes
        Benchmark Data
• Companies need to develop benchmarks
  for measuring performance
  improvement
• Managers typically know the job areas
  that need performance improvement
• Both pre-training and post-training data
  need to be collected and compared
Online Survey Tools
  for Assessment
    Web-Based Survey
      Advantages
• Faster collection of data
• Standardized collection format
• Computer controlled branching and
  skip sections
• Easy to answer by clicking
• Wider distribution of respondents
    Sample Survey Tools
•   Zoomerang
    (http://www.zoomerang.com)
•   IOTA Solutions
    (http://www.iotasolutions.com)
•   QuestionMark
    (http://www.questionmark.com/home.html)
•   SurveyShare (http://SurveyShare.com; from
    Courseshare.com)
•   Survey Solutions from Perseus
    (http://www.perseusdevelopment.com/fromsurv.htm)
•   Infopoll (http://www.infopoll.com)
Online Testing Tools
   (see: http://www.indiana.edu/~best/)
         Test Selection Criteria
            (Hezel, 1999; Perry & Colon, 2001)

•   Easy to Configure Items and Test
•   Handle Symbols, Timed Tests
•   Scheduling of Feedback (immediate?)
•   Flexible Scoring and Reporting
    – (first, last, average, by individual or group)
• Easy to Pick Items for Randomizing
• Randomize Answers Within a Question
• Weighting of Answer Options
               Web Resource: http://www.indiana.edu/~best/
     Tips on Authentication
•   Check e-mail access against list
•   Use password access
•   Provide keycode, PIN, or ID #
•   (Futuristic Other: Palm Print,
    fingerprint, voice recognition, iris
    scanning, facial scanning, handwriting
    recognition, picture ID)
Ziegler, April 2002, e-Learning

“…the key is not to measure every
 possible angle, but rather to
 focus on metrics that are
 pragmatic and relevant to both
 human and business
 performance at the same time.”
   E-Learning Evaluation
         Measures
So which of the 16 methods
 would you use???

Something ridiculous???
  Some Final Advice…




Or Maybe Some Questions???

				