Session P16: Evaluating Online Learning: Frameworks and Perspectives
(Workshop: Sunday, February 17th, Training 2002)

           Dr. Curtis J. Bonk
         President, CourseShare.com
    Associate Professor, Indiana University
 http://php.indiana.edu/~cjbonk, cjbonk@indiana.edu


     Dr. Vanessa Paz Dennen
Assistant Professor, San Diego State University
               vdennen@mail.sdsu.edu
       http://edweb.sdsu.edu/people/vdennen
Workshop Overview
• Part I: The State of Online Learning
• Part II: Evaluation Purposes, Approaches, and Frameworks
• Part III: Applying Kirkpatrick's 4 Levels
• Part IV: ROI and Online Learning
• Part V: Collecting Evaluation Data & Online Evaluation Tools
(Time: 8:30-11:30; 12:30-3:30)
Part I. The State of Online
         Learning




 Survey of Corporate Settings
 • What’s Going On?
 • And How Are We Evaluating It?
     Free Corporate Reports
1. Corporate E-Learning: Exploring a New
   Frontier, Hambrecht and Co. (2000, March)
   http://www.wrhambrecht.com/research/coverage/elearnin
   g/ir/ir_explore.pdf (95 pages)
2. Training Magazine Special Issue,
   September 2000, 37(9), The State of Online Learning
3. Fortune Special Issue, 142(13), Nov. 27, 2000,
   Special Insert: E-learning strategies for executive
   education and corporate training.
   http://www.fortuneelearning.com/topics/
Survey of 201 Trainers, Instructors,
Managers, Instructional Designers,
        CEOs, CLOs, etc.
        Among the Key Goals
1. To identify the resources, tools, and activities
   desired in e-learning.
2. To document gaps between tools and
   resources deemed useful and actual use.
3. To survey commitment to e-learning.
4. To document practices related to e-learning
   training and support.
5. To document pedagogical practices and motivational techniques supported in
   e-learning.
    Survey Limitations
• Sample pool—e-PostDirect
• The Web is changing rapidly
• Lengthy survey, low response rate
• No password or keycode
• Many backgrounds—hard to
  generalize
• Does not address all issues (e.g., ROI
  calculations, how trained & supported,
  specific assessments)
Figure 1. Respondents' Organizational Roles
[Pie chart: User and Decision-Maker, 57%; Decision-Maker, 20%; User or Facilitator, 17%; Neither a User nor Decision-Maker, 6%.]
Figure 2. Size of Respondent Organizations
[Bar chart: Percent of Respondents by Number of Employees: 1 to 30, 31-100, 101 to 500, 501 to 1,000, 1,001 to 5,000, 5,001 to 10,000, 10,001 to 100,000, more than 100,001.]
Figure 4. Focus of Respondent Organizations
[Bar chart, Percent of Respondents (0-25); industry categories include Education, Financial Services/Insurance, Info Tech, Industrial/Consulting/Manufacturing, Government, Health Services, Military, Non-Profit Associations, Hospitality, and Other.]
         Primary Job Function
84% = Training (e.g., trainers, training
  managers, training directors, or training
  evaluators)
   –   30% Instructors or Trainers
   –   27% Training Managers
   –   20% Training Evaluators
   –   14% Training Directors
45% = Instructional Designers & Program Developers
5% = Human Resources; 5% Performance
  Managers; and 4% CLOs
       Categorized Job Titles
26% Trainers, Educators, or Instructors
20% Managers (e.g., Training, IT Programs,
  Instructional Designers, or Quality Assurance)
19% Directors (Director of Corp Education, E-
  Learning, Professional Development, etc.)
13% Instructional Designers or Technologists
13% High Ranking Administrators (CEO,
  President, CLO, CTO)
9% Consultants
Professional Reading Interests
• 80% read magazines or journals related
  to e-learning.
• Nearly 100% read training-related
  publications.
Figure 12. Methods Used to Deliver Training in Organization
[Bar chart, percent of respondents: Instructor-Led Classroom, Internet/Intranet, Multimedia, Videotape, Paper-Based Correspondence, Other.]
Figure 14. Interest in Web Learning by Industry Type
[Stacked bar chart, Percent of Respondents (0-100); legend: Agree/Strongly Agree, Unsure, Disagree/Strongly Disagree; industries include Consulting, Financial Services/Insurance, Info Tech, Education, Industrial, Government, and Health Services.]
Figure 15. Commitment to Web Learning by Industry Type
[Stacked bar chart, Percent of Respondents (0-100); legend: Agree/Strongly Agree, Unsure, Disagree/Strongly Disagree; industries include Financial Services/Insurance, Consulting, Industrial, Info Tech, Education, Government, and Health Services.]
Figure 17. Reasons Interested in Web-Based Learning
[Bar chart, Percent of Respondents (0-100); categories include Access, Growth in Skills, Track LMS Progress, Increased Job Performance, Standardization, Interactivity, Learner Satisfaction, Online Tech Support, Improved Learning, Employee Retention, and Keep Up.]
Why Interested in E-Learning?
 Mainly cost savings
 Reduced travel time
 Greater flexibility in delivery
 Timeliness of training
 Better allocation of resources, speed of delivery,
  convenience, course customization, lifelong
  learning options, personal growth, greater
  distribution of materials
Why Interested in E-Learning?
“Exploit the technology to deliver our
 intellectual capital.”
“Reduce time to learn, reduce time to
 productivity.”
“Cost reduction (write once, publish on
 different platforms).”
“Invest less in expensive trips to train for
 3 days without apparent results.”
Figure 19. Purpose of Web-Based Learning in Organization
[Bar chart, Percent of Respondents (0-70): Sole source of learning, Supplement to traditional, Follow-up to traditional, Alternative to traditional, Other.]
Blended Approach Is Most Common
(Ganzel, May 2001, Online Learning Magazine)
[Pie chart: Use blended approaches (live plus online): Yes, 67%; No, 33%.]
Corporate Web Integration Continuum
Level 1: Blended course—self-paced
Level 2: Entire course online—self-paced
Level 3: Tutored or mentored course
Level 4: Blended course—instructor-led
Level 5: Entire course online—synchronous
Level 6: Entire course online—asynchronous
Level 7: Entire course online—synchronous and asynchronous
Level 8: Certificate program online
Level 9: Degree online
Level 10: Corporate university online
Figure 20. Types of Training Respondent Organizations Offer Online
[Bar chart, Percent of Respondents (0-70); categories include Computer Apps/Software, Technical, Job Related, Communication Skills, Systems/Programming, Management, Personal, Customer Service, Sales/Marketing, and Exec Education.]
Figure 22. Aspects of Web-Based Training Developed In-House
[Bar chart, Percent of Respondents (0-100): Content, Delivery System, Training Implementation, Evaluation.]
    Current Courseware System
            Negatives
 “Slow development time.”
 “Not interactive.”
 “Low interactivity, boring.”
 “…lack of bookmarking, tracking, evaluation,
  etc.”
 “Don’t support the instructional design
  process—are course management systems.”
 “XYZ,…, presents obstacles in moving course
  content from one server to another.”
   Current Courseware System:
      Negative and Positive
 “…does provide a number of excellent features, yet
  development time is very clumsy…it is not very
  intuitive.”
 “XYZ is powerful and intuitive. It is not always
  reliable.”
 “Fairly reliable, but not always. At times have had
  to stop training and go back to the beginning to
  start again as it seizes up.”
 “From a cost posture, they are, quite simply,
  unbeatable. Limitations: Can’t save whiteboard
  presentations developed in virtual classroom.”
     Current Courseware System:
              Positives
   “It is comprehensive, scalable, and intuitive.”
   “…seems to be flexible.”
   “XYZ is simple to use & clean in design.”
   “modify to suit individual course needs.”
“It’s reasonably inexpensive, there is a Web-based
     template to design customized courses…easily
     added to existing courseware.”
            Delivery System
 17% developed own systems or tools
 15% did not know what system they were
  using
 30% used Internet application tools (e.g.,
  Designer’s Edge, Dreamweaver, Authorware)
 35% used presentation tools (e.g., Astound,
  WebEx)
 Many used existing courseware systems and
  tools (e.g., WebBoard, Learning Space)
Which Vendors to Select & Why?

• Standardization vs. Innovation
Standard Tool Advantages:
Training easier, jump started, common framework,
  fixed costs
Disadvantages:
Tools do not fit all needs, need technical training,
  lose control
Web-Based Content
Capella; Click 2 Learn; Colleges/Universities; Digital Think; Docent, Inc.; Eduprise; Element K; eMind.com; eSocrates; ExecuTrain; Freeskills.com; Headlight.com; Jones International University; KnowledgeNet; Knowledge Planet; Mentergy (includes LearnLinc products); Microsoft Training and Service; Netg; Prime Learning; Saba; Smart Force; ThinQ (i.e., Trainingnet); TrainSeek; Vcampus; Viviance New Education; Walden Univ./Institute
Figure 24. Aspects of Web-Based Training Outsourced
[Bar chart, Percent of Respondents (0-80): Content, Delivery System, Training Implementation, Evaluation.]
Figure 25. Percent of Respondent Organizations Conducting Formal Evaluations of Web-Based Learning
[Pie chart: Yes, 41%; No, 59%.]
          Why Evaluate?

• Cost-savings
  – Becoming a less important reason to evaluate
    as more people recognize that the initial
    expense is balanced by long-term financial
    benefits
• Performance improvement
  – A clear place to see impact of online learning
• Competency advancement
     Pause: How are costs
calculated in online programs?
      The Cost of E-learning
• Brandon-hall.com estimates that an LMS
  for 8,000 learners costs $550,000
• This price doesn’t include the cost of
  buying or developing content
• Bottom line: getting started in e-learning
  isn’t cheap
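
For scale, the LMS figure above works out to a simple per-learner license cost (this is just the arithmetic on the two numbers cited; content, development, and support costs are extra):

$$ \$550{,}000 \div 8{,}000 \text{ learners} \approx \$69 \text{ per learner} $$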
        Evaluation Process
• Can be likened to the ADDIE instructional
  design model
  – ANALYSIS is needed to determine a purpose
    of the evaluation
  – A DESIGN is needed to guide the process
  – Instruments must be DEVELOPED
  – Without IMPLEMENTATION you have no
    data
  – In the end, the data are analyzed and
    EVALUATED
A Few Assessment
   Comments
Level 1 Comments: Reactions
“We assess our courses based on
  participation levels and online surveys
  after course completion. All of our courses
  are asynchronous.”
“I conduct a post course survey of course
  material, delivery methods and mode, and
  instructor effectiveness. I look for
  suggestions and modify each course based
  on the results of the survey.”
“We use the Halo Survey process of asking
  them when the course is concluding.”
Level 2 Comments: Learning

 “We use online testing and simulation
  frequently for testing student knowledge.”
 “Do multiple choice exams after each
  section of the course.”
 “We use online exams and use level 2
  evaluation forms.”
      Level 3 Comment: Job
          Performance
“I feel strongly there is a need to measure
  the success of any training in terms of the
  implementation of the new behaviors on
  the job. Having said that, I find there is
  very limited [interest] by our clients in
  spending the dollars required…”
   More Assessment Comments
    Multiple Level Evaluation
“Using Level One Evaluations for each session followed
  by a summary evaluation. Thirty days post-training,
  conversations occur with learners’ managers to assess
  Level 2” (actually Level 3).
“We do Level 1 measurements to gauge student
  reactions to online training using an online evaluation
  form. We do Level 2 measurements to determine
  whether or not learning has occurred…”
“Currently, we are using online teaching and following
  up with manager assessments that the instructional
  material is being put to use on the job.”
    Who is Evaluating Online
           Learning?
• 59% of respondents said they did not
  have a formal evaluation program
• At Reaction level: 79%
• At Learning level: 61%
• At Behavior/Job Performance level: 47%
• At Results or Return on Investment: 30%
Figure 26. How Respondent Organizations Measure Success of Web-Based Learning
[Bar chart, Percent of Respondents (0-90) by Kirkpatrick's Evaluation Level: Learner satisfaction; Change in knowledge, skill, attitude; Job performance; ROI.]
Assessment Lacking or Too Early
“We are just beginning to use Web-based
 technology for education of both
 associates and customers, and do not
 have the metric to measure our success.
 However, we are putting together a
 focus group to determine what to
 measure (and) how.”
“We have no online evaluation for
 students at this time.”
“We lack useful tools in this area.”
Limitations with Current System
“I feel strongly there is a need to measure the
  success of any training in terms of the
  implementation of the new behaviors on the
  job. Having said that, I find there is very
  limited [interest] by our clients in spending the
  dollars required…”
“We are looking for better ways to track
  learner progress, learner satisfaction, and
  retention of material.”
“Have had fairly poor ratings on reliability,
  customer support, and interactivity…”
Pause…How and
 What Do You
  Evaluate…?
What else did the
corporate training
  survey show?
Figure 27. Organizational Ownership of Online Courses and Materials
[Stacked bar chart, Percent of Respondents (0-100); legend: Agree/Totally Agree, Unsure, Disagree/Strongly Disagree; categories: Clear Guidelines, Property of Organization.]
Figure 28. Organizational Interest in Knowledge Objects
[Pie chart: Strongly Agree, 25%; Agree, 44%; Unsure, 17%; Disagree, 11%; Strongly Disagree, 3%.]
Figure 29. Percent of Organizations Valuing Online Certificates and Degrees as Much as Those From Traditional Programs
[Stacked bar chart, Percent of Respondents (0-100); legend: Agree/Totally Agree, Unsure, Disagree/Strongly Disagree; categories: Value Online Certificates, Value Online Degrees.]
Figure 31. Course Tools with Growth Potential
[Bar chart, percent of respondents indicating high usefulness for a particular tool or resource but not currently using it (0-25): Databases, Cases or Problems, File Up/Download, Quizzes/Tests, Courseware, Course Evaluations.]
Figure 42. Percent of Instructional Time Spent Training via the Web in the Next Decade
[Stacked 100% bar chart over 1 Year, 2 Years, 5 Years, 10 Years; bands: 0%, 1-25%, 26-50%, 51-75%, 76-100%.]
Figure 44. Freelance or Adjunct Instructor Web-Based Training
[Stacked 100% bar chart, Yes/No, for Past Experience and Future Interest.]
Figure 45. Cultural and Organizational Reasons Limiting the Adoption of Web-Based Learning
[Bar chart, Percent of Respondents (0-50): Learner Time, Instructor Delivery Time, Lack Time to Learn, Lack of Interest, Lack Web Training, Difficult to Measure ROI, Lack of Org Support, Cultural Resistance, Instructor Prep Time, Perceived High Cost.]
 Sample Reasons for Obstacles
• “Skepticism on the benefits within the
  Healthcare environment.”
• “Ignorance about the advantages of using the
  Internet to save money.”
• “Generation gap and bias against anything not
  face to face.”
• “Poor support from IT managers to support
  organizational goals.”
• “Lack of foresight in the industry/no ability to
  see the big pic!”
Figure 46. Technological Reasons Limiting the Adoption of Web-Based Learning
[Bar chart, Percent of Respondents (0-50): Software, Lack Interactivity, Classroom Resources, Lack of Standards, Hardware, Firewalls, Tech Support, Bandwidth.]
Just Why is Bandwidth So
   Darn Important???
  Obstacles: Technology Comments

“Lack of hardware to efficiently use
 Web-based technology.”
“Systems infrastructure.”
“Huge diversity in hardware.”
“Reliable Web access of our training
 audiences.”
“Caught up in the tech not the
 instruction!”
Obstacles: Problems in Delivery Methods

 “Students needs hands on.”
 “High rate of change in IT
  materials—never mature.”
 “Effectiveness of this method.”
 “Some courses are better delivered
  in traditional classrooms.”
Figure 47. Types of Training Provided to Personnel for Designing and Developing Web-Based Courses
[Bar chart, Percent of Respondents (0-35): Web Courses, Vendor Supported, Outside Consultants or Company, Expert Access, Topical Workshops, Topical Conferences.]
Figure 48. Percent of Organizations Where Design and Development Training Leads to Certification
[Pie chart: Yes, 22%; No, 63%; Don't Know, 15%.]
Figure 49. Location Where Learners Access Web-Based Training
[Bar chart, Percent of Respondents (0-80): Office, Home, Road, Other.]
Figure 50. Support Resources Provided for E-Learners
[Bar chart, Percent of Respondents (0-60): E-mail support, Desktops, Online tutorials, Online help, Laptops, Computer labs, 24-hour phone support, None.]
Figure 52. Number of Languages in Which Respondent Organizations Currently Offer Web-Based Courses
[Bar chart, Percent of Respondents (0-45) by Number of Languages: 1, 2, 3, 4 to 6, 7 to 10, 10+, NA.]
Figure 53. Learner Completion Rate in Web-Based Courses
[Bar chart, Percent of Respondents (0-25) by Learner Completion Rate: 0-25%, 26-50%, 50-59%, 60-69%, 70-79%, 80-89%, 90-99%, 99-100%.]
Figure 54. Reasons Learners Fail to Complete Web-Based Courses
[Bar chart, Percent of Respondents (0-50): Time, Lack of incentives, Poorly designed instruction, Costs.]
Figure 55. Incentives for Successful Completion of Web-Based Learning
[Bar chart, Percent of Respondents (0-60): Promotion, Salary, Increased Job Security, Awarding Credits to Degree, Public Recognition, Increased Job Responsibility, None.]
      Issues Raised in Survey
•   Increases in Web instruction anticipated
•   Better tools needed
•   Perceived high cost
•   Need clearer vision & management support
•   Lots of money being spent
•   Low course completion rates
•   Limited organizational support
So, any questions about
  the state of things?
What do we need???
         Part II
   Evaluation Purposes,
Approaches and Frameworks
  One Area in Need of
Frameworks is Evaluation
   of Online Learning
    What is Evaluation???
“Simply put, an evaluation is concerned
  with judging the worth of a program and
  is essentially conducted to aid in the
  making of decisions by stakeholders.”
 (e.g., does it work as effectively as the standard
 instructional approach).

 (Champagne & Wisher, in press)
But who are the evaluators?
The level of evaluation will depend on
 articulation of the stakeholders.
 Stakeholders of evaluation in
 corporate settings may range
 from…???
   What is assessment?
• “Assessment refers to…efforts to obtain info about
  how and what students are learning in order to
  improve…teaching efforts and/or to demo to
  others the degree to which students have
  accomplished the learning goals for a course.”
  (Millar, 2001, p. 11).
• It is a way of using info obtained through various
  types of measurement to determine a learner’s
  performance or skill on some task or situation
  (Rosenkrans, 2000).
Why Evaluate?
       Evaluation Purposes
• Assessing learner progress
  – What did they learn?
• Assessing learning impact
  – How well do learners use what they learned?
  – How much do learners use what they learn?
       Evaluation Purposes
• Efficiency
  – Was online learning more effective than
    another medium?
  – Was online learning more cost-effective than
    another medium/what was the return on
    investment (ROI)?
• Improvement
  – How do we do this better?
       Evaluation Purposes
• An evaluation plan can evaluate the
  delivery of e-learning, identify ways to
  improve the online delivery of it, and
  justify the investment in the online
  training package, program, or initiative
  (Champagne & Wisher, in press).
       Evaluation Purposes
• Evaluation can help quantify the return
  on investment, allowing one to compare
  the costs of acquiring, developing, and
  implementing e-learning to actual
  savings, revenue impact, and other
  competitive advantages that are
  translatable into monetary values.
        Contextual Factors
• Learner progress, impact of training and
  efficiency all may be affected by other
  contextual factors
• Contextual factors unique to online
  learning:
  – Technology breakdowns
  – Inadequate computer systems (learners can’t
    access multimedia components -- and don’t
    know that they’re missing anything)
     Evaluation Plans

Does your company have a training
        evaluation plan?
   Formal Evaluation Programs
• Most training evaluation data are not
  used for evaluation or performance
  improvement purposes.
• Why? There is no plan for using the data
  and no one has the time.
• Why does it matter in online learning?
  Need to be sure that the development
  expense is justified.
    Steps to Developing an OL
       Evaluation Program
• Select a purpose and framework
• Develop benchmarks
• Develop online survey instruments
  – For learner reactions
  – For learner post-training performance
  – For manager post-training reactions
• Develop data analysis and management
  plan
  What Are Your Evaluation
        Questions?
• What does your employer want to know
  about online learning’s impact?
• How interested is your employer in
  evaluation results?
     Formative Evaluation
• Formative evaluations focus on
  improving the online learning experience.
• A formative focus will try to find out
  what worked or did not work.
• Formative evaluation is particularly
  useful for examining instructional design
  and instructor performance.
Formative Questions
• How can we improve our OL program?
• How can we make our OL program more efficient?
• More effective?
• More accessible?
     Summative Evaluation
• Summative evaluations focus on the overall
  success of the OL experience (should it be
  continued?).
• A summative focus will look at whether or not
  objectives are met, the training is cost-effective,
  etc.
  What Can OL Evaluation
        Measure?
• Categories of Evaluation Info (Woodley
  and Kirkwood, 1986)
   • Measures of activity
   • Measures of efficiency
   • Measures of outcomes
   • Measures of program aims
   • Measures of policy
   • Measures of organizations
          Typical Evaluation
          Frameworks for OL
• Commonly used frameworks include:
  –   CIPP Model
  –   Objectives-oriented
  –   Marshall & Shriver’s 5 levels
  –   Kirkpatrick’s 4 levels
       • Plus a 5th level
  – AEIOU
  – Consumer-oriented
       CIPP Model Evaluation
• CIPP is a management-oriented model
  –   C = context
  –   I = input
  –   P = process
  –   P = product
• Examines the OL within its larger
  system/context
      CIPP & OL: Context
• Context: Addresses the environment in
  which OL takes place.
• How does the real environment compare
  to the ideal?
• Uncovers systemic problems that may
  dampen OL success.
        CIPP & OL: Input
• Input: Examines what resources are put
  into OL.
• Is the content right?
• Have we used the right combination of
  media?
• Uncovers instructional design issues.
       CIPP & OL: Process
• Process: Examines how well the
  implementation works.
• Did the course run smoothly?
• Were there technology problems?
• Did facilitation and participation go as
  planned?
• Uncovers implementation issues.
       CIPP & OL: Product
• Product: Addresses outcomes of the
  learning.
• Did the learners learn? How do you
  know?
• Does the online training have an effect on
  workflow or productivity?
• Uncovers systemic problems.
        Objectives-Oriented
            Evaluation
• Examines OL training objectives as compared
  to training results
• Helps determine if objectives are being met
• Helps determine if objectives, as formally
  stated, are appropriate
• Objectives can be used as a comparative
  benchmark between online and other training
  methods
Evaluating Objectives & OL
• An objectives-oriented approach can
  examine two levels of objectives:
  – Instructional objectives for learners (did the
    learners learn?)
  – Systemic objectives for training (did the
    training solve the problem?)
       Objectives & OL
• Requires:
  – A clear sense of what the objectives are
    (always a good idea anyway)
  – The ability to measure whether or not
    objectives are met
     • Some objectives may be implicit and hard
       to state
     • Some objectives are not easy to measure
      Marshall & Shriver's
      5 Levels of Evaluation
• Performance-based evaluation
  framework
• Each level examines a different area of
  performance
• Requires demonstration of learning
Marshall & Shriver's 5 Levels

• Level I: Self (instructor)
• Level II: Course Materials
• Level III: Course Curriculum
• Level IV: Course Modules
• Level V: Learning Transfer
      Kirkpatrick’s 4 Levels
• A common training framework.
• Examines training on 4 levels.
• Not all 4 levels have to be included in
  a given evaluation.
      The 4 Levels
• Reaction
• Learning
• Behavior
• Results
              A 5th Level
• Return on Investment is a 5th level
• It is related to results, but is more clearly
  stated as a financial calculation
• How to calculate ROI is the big issue here
    Is ROI the answer?
• Elise Olding of CLK Strategies suggests
  that we shift from looking at ROI to
  looking at time to competency.
• ROI may be easier to calculate since
  concrete dollars are involved, but time to
  competency may be more meaningful in
  terms of actual impact.
  Example: Call Center Training
• Traditional call center training can take 3
  months to complete
• Call center employees typically quit
  within one year
• When OL was implemented, the time to
  train (time to competency) was reduced
• Benchmarks for success: time per call;
  number of transfers
        Example: Circuit City
• Circuit City provided online product/sales
  training
• What is more useful to know:
  –   The overall ROI or break-even point?
  –   How much employees liked the training?
  –   How many employees completed the training?
  –   That employees who completed 80% of the training
      saw an average increase of 10% in sales?
            A 6th Level?
        Clark Aldrich (2002)
• Adding Level 6, which relates to the budget and
  stability of the e-learning team.
  – Just how respected and successful is the e-learning
    team?
  – Have they won approval from senior management
    for their initiatives?

  – Aldrich, C. (2002). Measuring success: In a post-Maslow/Kirkpatrick
    world, which metrics matter? Online Learning, 6(2), 30 & 32.
And Even a 7th Level?

      Clark Aldrich (2002)
• At Level 7, the question is whether the e-learning
  sponsor(s) or champion(s) are promoted in the organization.

• While both of these additional levels address
  the people involved in the e-learning initiative
  or plan, such recognitions will likely hinge on
  the results of evaluation of the other five levels.
        ROI Alternative:
   Cost/Benefit Analysis (CBA)
• ROI may be ill-advised since not all impacts hit
  bottom line, and those that do take time.
• Shifts the attention from more long-term
  results to quantifying impacts with numeric
  values, such as:
    – increased revenue streams,
    – increased employee retention, or
    – reduction in calls to a support center.
• Reddy, A. (2002, January). E-learning ROI calculations: Is a
  cost/benefit analysis a better approach? e-learning. 3(1), 30-32.
   Cost/Benefit Analysis (CBA)
• CBA attends to both qualitative and quantitative measures:
    –   job satisfaction ratings,
    –   new uses of technology,
    –   reduction in processing errors,
    –   quicker reactions to customer requests,
    –   reduction in customer call rerouting,
    –   increased customer satisfaction,
    –   enhanced employee perceptions of training,
    –   global post-test availability.
• Reddy, A. (2002, January). E-learning ROI calculations: Is a
  cost/benefit analysis a better approach? e-learning. 3(1), 30-32.
   Cost/Benefit Analysis (CBA)
• In effect, CBA asks how the sum of the
  benefits compares to the sum of the costs.
• Yet, it often leads to or supports ROI and other
  more quantitatively-oriented calculations.

• Reddy, A. (2002, January). E-learning ROI calculations: Is a
  cost/benefit analysis a better approach? e-learning. 3(1), 30-32.
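
One conventional way to summarize a CBA is the benefit-cost ratio; the general form below is the standard textbook version, and the dollar figures are hypothetical illustrations, not from Reddy:

$$ \text{BCR} = \frac{\sum \text{benefits}}{\sum \text{costs}}, \qquad \text{e.g., } \frac{\$300{,}000}{\$200{,}000} = 1.5 $$

A ratio above 1.0 means the quantified benefits exceed the costs, which is often the bridge to the fuller ROI calculations mentioned above.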
  Other ROI Alternatives
• Time to competency (need benchmarks)
    – online databases of frequently asked questions can
      help employees in call centers learn skills more
      quickly and without requiring temporary leaves
      from their position for such training
• Time to market
    – might be measured by how e-learning speeds up the
      training of sales and technical support personnel,
      thereby expediting the delivery of a software
      product to the market
Raths, D. (2001, May). Measure of success. Online Learning, 5(5), 20-
  22, & 24.
    Still Other ROI Alternatives
•   Return on Expectation
    1. Asks employees a series of questions related to how
       training met expectations of their job performance.
    2. When questioning is complete, they place a dollar
       figure on that.
    3. Correlate or compare such reaction data with
       business results, or supplement Level 1 data to
       include more pertinent info about the applicability
       of learning to employees’ present job situation.
    –   Raths, D. (2001, May). Measure of success. Online Learning, 5(5),
        20-22, & 24.
              AEIOU
• Provides a framework for looking at
  different aspects of an online learning
  program
• Fortune & Keith, 1992; Sweeney, 1995;
  Sorensen, 1996
         A = Accountability
• Did the training do what it set out to do?
• Data can be collected through
  – Administrative records
  – Counts of training programs (# of attendees,
    # of offerings)
  – Interviews or surveys of training staff
          E = Effectiveness
• Is everyone satisfied?
  – Learners
  – Instructors
  – Managers
• Were the learning objectives met?
            I = Impact
• Did the training make a difference?
• Like Kirkpatrick’s level 4 (Results)
   O = Organizational Context
• Did the organization’s structures and policies
  support or hinder the training?
• Does the training meet the organization’s
  needs?
• OC evaluation can help identify when there is a
  mismatch between the training design and the
  organization
• Important when using third-party training or
  content
U = Unintended Consequences
• Unintended consequences are often
  overlooked in training evaluation
• May give you an opportunity to brag
  about something wonderful that
  happened
• Typically discovered via qualitative data
  (anecdotes, interviews, open-ended
  survey responses)
  Consumer-Oriented Evaluation

• Uses a consumer point-of-view
  – Can be a part of vendor selection process
  – Can be a learner-satisfaction issue
• Relies on benchmarks for comparison of
  different products or different learning
  media
What About Evaluation
  Issues in Higher
   Education???
 My Evaluation Plan…
Considerations in Evaluation Plan
[Diagram, eight elements arranged in a cycle: 1. Student, 2. Instructor, 3. Training, 4. Task, 5. Tech Tool, 6. Course, 7. Program, 8. University or Organization.]
What to Evaluate?
1. Student—attitudes, learning, jobs.
2. Instructor—popularity, survival.
3. Training—effectiveness, integratedness.
4. Task—relevance, interactivity, collaboration.
5. Tool—usable, learner-centered, friendly, supportive.
6. Course—interactivity, completion.
7. Program—growth, model(s), time to build.
8. University—cost-benefit, policies, vision.
    1. Measures of Student Success
        (Focus groups, interviews, observations,
               surveys, exams, records)

•   Positive Feedback, Recommendations
•   Increased Comprehension, Achievement
•   High Retention in Program
•   Completion Rates or Course Attrition
•   Jobs Obtained, Internships
•   Enrollment Trends for Next Semester
    1. Student Basic Quantitative

• Grades, Achievement
• Number of Posts
• Participation
• Computer Log Activity—peak usage,
  messages/day, time of task or in system
• Attitude Surveys
    1. Student High-End Success
• Message complexity, depth, interactivity, questioning
• Collaboration skills
• Problem finding/solving and critical thinking
• Challenging and debating others
• Case-based reasoning, critical thinking
  measures
• Portfolios, performances, PBL activities
   Focus of Assessment?
1. Basic Knowledge, Concepts, Ideas
2. Higher-Order Thinking Skills,
   Problem Solving, Communication,
   Teamwork
3. Both of Above!!!
4. Other…
     Assessments Possible
•   Online Portfolios of Work
•   Discussion/Forum Participation
•   Online Mentoring
•   Weekly Reflections
•   Tasks Attempted or Completed, Usage,
    etc.
More Possible Assessments

•   Quizzes and Tests
•   Peer Feedback and Responsiveness
•   Cases and Problems
•   Group Work
•   Web Resource Explorations &
    Evaluations
      Increasing Cheating Online
($7-$30/page, http://www.syllabus.com/ January, 2002, Phillip Long,
            Plagiarism: IT-Enabled Tools for Deceit?)


• http://www.academictermpapers.com/
• http://www.termpapers-on-file.com/
• http://www.nocheaters.com/
• http://www.cheathouse.com/uk/index.html
• http://www.realpapers.com/
• http://www.pinkmonkey.com/
(“you’ll never buy Cliffnotes again”)
    Reducing Cheating Online
• Ask yourself, why are they cheating?
• Do they value the assignment?
• Are tasks relevant and challenging?
• What happens to the task after it is
  submitted—reused, woven in, posted?
• Due at end of term? Real audience?
• Look at pedagogy before calling the
  plagiarism police!
     Reducing Cheating Online
•   Proctored exams
•   Vary items in exam
•   Make course too hard to cheat
•   Try Plagiarism.com ($300)
•   Use mastery learning for some tasks
•   Random selection of items from item pool
•   Use test passwords, rely on IP# screening
•   Assign collaborative tasks
       Reducing Cheating Online
($7-$30/page, http://www.syllabus.com/ January, 2002, Phillip Long,
            Plagiarism: IT-Enabled Tools for Deceit?)

   • http://www.plagiarism.org/ (resource)
   • http://www.turnitin.com/ (software, $100, free
     30 day demo/trial)
   • http://www.canexus.com/ (software; essay
     verification engine, $19.95)
   • http://www.plagiserve.com/ (free database of
     70,000 student term papers & cliff notes)
   • http://www.academicintegrity.org/ (assoc.)
   • http://sja.ucdavis.edu/avoid.htm (guide)
        Turnitin Testimonials
"Many of my students believe that if they do not
 submit their essays, I will not discover their
 plagiarism. I will often type a paragraph or two
 of their work in myself if I suspect plagiarism.
 Every time, there was a "hit." Many students
 were successful plagiarists in high school. A
 service like this is needed to teach them that such
 practices are no longer acceptable and certainly
 not ethical!”
    Part III:

Applying Kirkpatrick’s
  4 Levels to Online
 Learning Evaluation
 & Evaluation Design
   Why Use the 4 Levels?
• They are familiar and understood
• Highly referenced in the training
  literature
• Can be used with 2 delivery media
  for comparative results
     Conducting 4-Level
        Evaluation
• You need not use every level
  – Choose the level that is most
    appropriate to your need and budget
• Higher levels will be more costly
  and difficult to evaluate
• Higher levels will yield more meaningful results
      Kirkpatrick Level 1:
           Reaction
• Typically involves “Smile sheets” or
  end-of-training evaluation forms.
• Easy to collect, but not always very
  useful.
• Reaction-level data on online courses
  has been found to correlate with ability
  to apply learning to the job.
• Survey ideally should be Web-based,
  keeping the medium the same as the
  course.
     Kirkpatrick Level I:
          Reaction
• Types of questions:
  – Enjoyable?
  – Easy to use?
  – How was the instructor?
  – How was the technology?
  – Was it fast or slow enough?
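
As an illustration of handling such data, here is a minimal Python sketch of tallying Likert-style "smile sheet" ratings from a Web-based reaction survey; the field names, 1-5 scale, and responses are hypothetical:

```python
# Minimal sketch: averaging Likert-style (1-5) reaction ratings
# collected from a Web-based end-of-course survey.
# Field names and scores are hypothetical illustration data.
from statistics import mean

responses = [
    {"enjoyable": 4, "easy_to_use": 5, "instructor": 4, "technology": 3, "pace": 4},
    {"enjoyable": 5, "easy_to_use": 4, "instructor": 5, "technology": 4, "pace": 3},
    {"enjoyable": 3, "easy_to_use": 4, "instructor": 4, "technology": 2, "pace": 4},
]

# Report the mean rating per question across all respondents.
for item in responses[0]:
    avg = mean(r[item] for r in responses)
    print(f"{item}: {avg:.2f} / 5")
```

Keeping the survey in the same Web medium as the course makes this kind of automatic aggregation straightforward.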
  Kirkpatrick Level 2:
       Learning
• Typically involves testing
  learners immediately following
  the training
• Not difficult to do, but online
  testing has its own challenges
  – Did the learner take the test on
    his/her own?
      Kirkpatrick Level 2:
           Learning
• Higher-order thinking skills (problem
  solving, analysis, synthesis)
• Basic skills (articulate ideas in writing)
• Company perspectives and values
  (teamwork, commitment to quality,
  etc.)
• Personal development
       Kirkpatrick Level 2:
            Learning
• Might include:
  – Essay tests.
  – Problem solving exercises.
  – Interviews.
  – Written or verbal tests to assess
    cognitive skills.

  Shepard, C. (1999b, July). Evaluating online learning. TACTIX from
    Fastrak Consulting. Retrieved February 10, 2002, from:
    http://fastrak-consulting.co.uk/tactix/Features/evaluate/eval01.htm
       Kirkpatrick Level 3:
            Behavior
• More difficult to evaluate than Levels 1 & 2
• Looks at whether learners can apply what
  they learned (does the training change
  their behavior?)
• Requires post-training follow-up to
  determine
• Less common than levels 1 & 2 in practice
         Kirkpatrick Level 3:
              Behavior
• Might include:
  – Direct observation by supervisors or coaches
    (Wisher, Curnow, & Drenth, 2001).
  – Questionnaires completed by peers,
    supervisors, and subordinates related to work
    performance.
  – On the job behaviors, automatically logged
    performances, or self-report data.

  Shepard, C. (1999b, July). Evaluating online learning. TACTIX from
    Fastrak Consulting. Retrieved February 10, 2002, from:
    http://fastrak-consulting.co.uk/tactix/Features/evaluate/eval01.htm.
      Kirkpatrick Level 4:
            Results
• Often compared to return on investment
  (ROI)
• In e-learning, the higher up-front cost of
  course development is expected to be offset
  by the lower cost of training delivery over time
• A new way of training may require a
  new way of measuring impact
   Kirkpatrick Level 4: Results
• Might Include:
  – Labor savings (e.g., reduced duplication of
    effort or faster access to needed information).
  – Production increases (faster turnover of
    inventory, forms processed, accounts opened,
    etc.).
  – Direct cost savings (e.g., reduced cost per
    project, lowered overhead costs, reduction of
    bad debts, etc.).
  – Quality improvements (e.g., fewer accidents,
    fewer defects, etc.).
  Horton, W. (2001). Evaluating e-learning. Alexandria, VA:
    American Society for Training & Development.
  Kirkpatrick + Evaluation
           Design
• Kirkpatrick’s 4 Levels may be
  achieved via various evaluation
  designs
• Different designs help answer
  different questions
 Pre/Post Control Groups
• One group receives OL training and one
  does not
• As variation try 3 groups
  – No training (control)
  – Traditional training
  – OL training
• Recommended because it may help
  neutralize contextual factors
• Relies on random assignment as much
  as possible
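
To make the comparison concrete, here is a minimal Python sketch of how gain scores from the three-group variation might be summarized; every group label and score below is hypothetical, not workshop data.

```python
# Hypothetical gain-score comparison for the three-group pre/post design.
# All names and numbers are illustrative.

def mean(xs):
    return sum(xs) / len(xs)

# (pre, post) test scores per learner, by group
groups = {
    "control (no training)": [(52, 54), (48, 50), (55, 53)],
    "traditional training":  [(50, 68), (47, 70), (53, 72)],
    "online training":       [(51, 71), (49, 69), (54, 75)],
}

for name, scores in groups.items():
    gains = [post - pre for pre, post in scores]
    print(f"{name}: mean gain = {mean(gains):+.1f}")
```

If the online-training group's mean gain matches or exceeds the traditional group's, and both exceed the control group's, the design supports a training effect net of contextual factors.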
      Multiple Baselines
• Can be used for a program that is
  rolling out
• Each group serves as a control
  group for the previous group
• Look for improvement in
  subsequent groups
• Eliminates the need for a tightly
  controlled control group
          Time Series
• Looks at benchmarks before and
  after training
• Practical and cost-effective
• Not considered as rigorous as
  other designs because it doesn’t
  control for contextual factors
   Single Group Pre/Post
• Easy and inexpensive
• Criticized for lack of rigor (absence
  of control)
• Needs to be pushed into
  Kirkpatrick levels 3 and 4 to see if
  there has been impact
           Case Study
• A rigorous design in academic
  practice, but often after-the-fact in
  corporate settings
• Useful when no preliminary or
  baseline data have been collected
  Part IV:
ROI and Online
  Learning
  The Importance of ROI
• OL requires a great amount of $$
  and other resources up front
• It gives the promise of financial
  rewards later on
• ROI is of great interest because of
  the investment and the wait period
  before the return
            Calculating ROI
• Look at:
  –   Hard cost savings
  –   Hard revenue impact
  –   Soft competitive benefits
  –   Soft benefits to individuals

      See: Calculating the Return on Your eLearning
      Investment (2000) by Docent, Inc.
    Possible ROI Objectives
•   Better Efficiencies
•   Greater Profitability
•   Increased Sales
•   Fewer Injuries on the Job
•   Less Time off Work
•   Faster Time to Competency
      Hard Cost Savings
• Travel
• Facilities
• Printed material costs (printing,
  distribution, storage)
• Reduction of costs of business
  through increased efficiency
• Instructor fees (sometimes)
   Hard Revenue Impact
• Consider
  – Opportunity cost of improperly trained
    or untrained personnel
  – Shorter time to productivity through
    shorter training times with OL
  – Increased time on job (no travel
    time)
  – Ease of delivering same training to
    partners and customers (for fee?)
    Soft Competitive Benefits
• Just-in-time capabilities
• Consistency in delivery
• Certification of knowledge transfer
• Ability to track users and gather
  data easily
• Increased morale from
  simultaneous roll-out at different
  sites
      Individual Values
• Less wasted time
• Support available as needed
• Motivation from being treated as
  an individual
      Talking about ROI
• As a percentage
  – ROI = [(Payback - Investment) / Investment] * 100
• As a ratio
  – ROI = Return / Investment
• As time to break even
  – Break-even time = (Investment / Return) * Time Period
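
As a quick illustration, the following Python sketch evaluates all three formulations on invented figures; nothing here comes from a real project.

```python
# Illustrative ROI arithmetic using the three formulations above.
# All dollar figures are invented for the example.

investment = 100_000     # up-front cost of the online course
payback = 160_000        # total return over the evaluation period
period_months = 12       # length of the evaluation period

roi_percent = (payback - investment) / investment * 100
roi_ratio = payback / investment
break_even_months = investment / payback * period_months

print(f"ROI: {roi_percent:.0f}%")                     # 60%
print(f"ROI ratio: {roi_ratio:.2f}:1")                # 1.60:1
print(f"Break-even: {break_even_months:.1f} months")  # 7.5 months
```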
  What is ROI Good For?
• Prioritizing Investment
• Ensuring Adequate Financial
  Support for OL Project
• Comparing Vendors
The Changing Face of ROI
• “Return-on-investment isn’t what
  it used to be … The R is no longer
  the famous bottom line and the I is
  more likely a subscription fee than
  a one-time payment” (Cross, 2001)
      More Calculations
• Projected Net Savings = Total Admin Costs of Former
  Program - Total Admin Costs of OL Program
• Cost Per Student (CPS) = Total Cost of Training
  / # of Students
• ROI% = (Total Benefits * 100) / Total Program Cost
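
A minimal sketch running these three formulas on hypothetical numbers:

```python
# The three slide formulas applied to invented figures.

former_admin_costs = 250_000   # total admin costs of former program
ol_admin_costs = 180_000       # total admin costs of OL program
total_training_cost = 200_000
num_students = 400
total_benefits = 300_000

net_savings = former_admin_costs - ol_admin_costs          # $70,000
cost_per_student = total_training_cost / num_students      # $500
roi_percent = total_benefits * 100 / total_training_cost   # 150%

print(f"Projected net savings: ${net_savings:,}")
print(f"Cost per student (CPS): ${cost_per_student:,.2f}")
print(f"ROI: {roi_percent:.0f}%")
```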
   At the End of the Day...
• Are all training results quantifiable?
• NO! Putting a price tag on some costs
  and benefits can be very difficult
• NO! Some data may not have much
  meaning at face value
  – What if more courses are offered and annual
    student training hours drop simultaneously?
    Is this bad?
    Part V:
   Collecting
Evaluation Data
    & Online
Evaluation Tools
Collecting Evaluation Data
•   Learner Reaction
•   Learner Achievement
•   Learner Job Performance
•   Manager Reaction
•   Productivity Benchmarks
       Forms of Evaluation
•   Interviews
•   Focus Groups
•   Self-Analysis
•   Supervisor Ratings
•   Surveys and Questionnaires
•   ROI
•   Document Analysis
• Data Mining (changes from pre- to
  post-training, e.g., in sales or productivity)
How to Collect Data?
• Direct Observation in Work Setting
  – By supervisor, co-workers,
    subordinates, clients
• Collect Data By Surveys,
  Interviews, Focus Groups
  – Supervisors, Co-workers,
    Subordinates, Clients
• Self-Report by learners or teams
            Learner Data
• Online surveys are the most effective way
  to collect online learner reactions
• Learner performance data can be collected
  via online tests
   – Pre and post-tests can be used to
     measure learning gains
• Learner post-course performance data can
  be used for Level 3 evaluation
   – May look at on-the-job performance
   – May require data collection from
     managers
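
One common way to turn pre- and post-test scores into a learning-gain measure is the normalized gain, (post - pre) / (max - pre); the sketch below is illustrative only, and the learner names and scores are invented.

```python
# Sketch: per-learner learning gains from online pre/post tests.
# Normalized gain = (post - pre) / (max_score - pre).
# Learner names and scores are hypothetical.

MAX_SCORE = 100

learners = {
    "learner_a": (40, 75),
    "learner_b": (60, 90),
    "learner_c": (55, 55),
}

for name, (pre, post) in learners.items():
    gain = (post - pre) / (MAX_SCORE - pre)
    print(f"{name}: raw gain {post - pre:+d}, normalized gain {gain:.2f}")
```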
Example: Naval Physiology Training
   Follow-Up Evaluation
• A naval training unit uses an online
  survey/database system to track
  performance of recently trained
  physiologists
• Learners self-report their performance
• Managers report on learner
  performance
• Unit heads report on overall
  productivity
      Learning System Data
• Many statistics are available, but which
  are useful?
  – Number of course accesses
  – Log-in times/days
  – Time spent accessing course components
  – Frequency of access for particular
    components
  – Quizzes completed and quiz scores
  – Learner contributions to discussion (if
    applicable)
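
Most learning management systems can export such records. As a rough sketch of how they might be summarized, assume a flat log of (learner, component, minutes) tuples; this format is invented for illustration, not any particular product's export.

```python
# Sketch: summarizing hypothetical LMS access logs into the statistics
# listed above (accesses, per-component frequency, time spent).

from collections import Counter

# (learner, component, minutes_spent) -- illustrative records only
log = [
    ("learner_a", "quiz_1", 12),
    ("learner_a", "discussion", 30),
    ("learner_b", "quiz_1", 9),
    ("learner_b", "lecture_2", 25),
    ("learner_a", "lecture_2", 18),
]

accesses = Counter(learner for learner, _, _ in log)
component_hits = Counter(component for _, component, _ in log)
minutes = Counter()
for learner, _, mins in log:
    minutes[learner] += mins

print("Accesses per learner:", dict(accesses))
print("Hits per component:  ", dict(component_hits))
print("Minutes per learner: ", dict(minutes))
```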
Learning System Data
• IF learners are being evaluated based
  on number and length of accesses, it is
  only fair that they be told
• Much time can be wasted analyzing
  statistics that don’t tell much about the
  actual impact of the training
• Bottom line: Easy data to collect, but
  not always useful for evaluation
  purposes
  – Still useful for management purposes
        Benchmark Data
• Companies need to develop benchmarks
  for measuring performance
  improvement
• Managers typically know the job areas
  that need performance improvement
• Both pre-training and post-training data
  need to be collected and compared
• Must also look for other contextual
  factors
Online Testing Tools
   (see: http://www.indiana.edu/~best/)
    Test Selection Criteria
             (Hezel, 1999)

• Easy to Configure Items and Test
• Handle Symbols
• Scheduling of Feedback (immediate?)
• Provides Clear Input of Dates for
  Exam
• Easy to Pick Items for Randomizing
• Randomize Answers Within a Question
• Weighting of Answer Options
  More Test Selection Criteria

• Recording of Multiple
  Submissions
• Timed Tests
• Comprehensive Statistics
• Summarize in Portfolio and/or
  Gradebook
• Confirmation of Test Submission
 More Test Selection Criteria
           (Perry & Colon, 2001)

• Supports multiple item types—multiple
  choice, true-false, essay, keyword
• Can easily modify or delete items
• Incorporate graphic or audio elements?
• Control over number of times students
  can submit an activity or test
• Provides feedback for each response
More Test Selection Criteria
         (Perry & Colon, 2001)

• Flexible scoring—score first, last,
  or average submission
• Flexible reporting—by individual
  or by item and cross tabulations.
• Outputs data for further analysis
• Provides item analysis statistics
  (e.g., Test Item Frequency
  Distributions).
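
As a taste of what item-analysis output looks like, here is a small sketch computing item difficulty (the proportion of learners answering correctly) from an invented response matrix; real testing tools report richer statistics than this.

```python
# Sketch: item difficulty from a hypothetical response matrix.
# Rows = learners, columns = items, 1 = correct answer.

responses = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
]

num_learners = len(responses)
for item in range(len(responses[0])):
    correct = sum(row[item] for row in responses)
    difficulty = correct / num_learners
    print(f"item {item + 1}: difficulty p = {difficulty:.2f}")
```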
                Computer Log Data
  Chen, G. D., Liu, C. C., & Liu, B. J. (2000). Discovering decision knowledge from Web log
  portfolio for managing classroom processes by applying decision tree and data cube technology.
  Journal of Educational Computing Research, 23(3), 305-332.


• Determine student behavior patterns
   –   student posting opinions,
   –   asking questions,
   –   replying to opinions,
   –   posting articles, etc.
• Web logs can also help instructors make informed
  pedagogical decisions. For instance, does a
  particular teaching strategy or task improve student
  interaction?
             Computer Log Data
  Chen, G. D., Liu, C. C., & Liu, B. J. (2000). Discovering decision knowledge from Web log
  portfolio for managing classroom processes by applying decision tree and data cube technology.
  Journal of Educational Computing Research, 23(3), 305-332.


• In a corp training situation, computer log data
  can correlate online course completions with:
   – actual job performance improvements such as
      • fewer violations of safety regulations,
      • reduced product defects,
      • increased sales, and
      • timely call responses.
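
A minimal sketch of such a correlation, with invented employee data; note that Python's statistics.correlation requires version 3.10 or later.

```python
# Sketch: correlating course completions with a job-performance metric.
# The employee data are invented for illustration.

from statistics import correlation  # Python 3.10+

completions = [1, 3, 2, 5, 4, 0]   # online courses completed per employee
defects     = [9, 5, 7, 2, 3, 11]  # product defects logged afterwards

r = correlation(completions, defects)
print(f"Pearson r = {r:.2f}")  # a strongly negative r would suggest:
                               # more training, fewer defects
```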
           Email and Chat

• Chats and email messages might
  provide data about the effectiveness of
  the training event.
Online Survey Tools
  for Assessment
    Sample Survey Tools
•   Zoomerang
    (http://www.zoomerang.com)
•   IOTA Solutions
    (http://www.iotasolutions.com)
•   QuestionMark
    (http://www.questionmark.com/home.html)
•   SurveyShare (http://SurveyShare.com; from
    Courseshare.com)
•   Survey Solutions from Perseus
    (http://www.perseusdevelopment.com/fromsurv.htm)
•   Infopoll (http://www.infopoll.com)
     Survey Tool Features
•   Maintain email lists and email invitations
•   Conduct polls
•   Adaptive branching and cross tabulations
•   Modifiable templates
•   Maintain library of past surveys
•   Publish reports
•   Technical support, chat advice
•   Different types of accounts—hosted,
    corporate, professional, etc.
    Web-Based Survey
      Advantages
• Faster collection of data
• Standardized collection format
• Computer graphics may reduce
  fatigue
• Computer controlled branching and
  skip sections
• Easy point-and-click answering
• Wider distribution of respondents
     Web-Based Survey
    Problems: Why Lower
      Response Rates?
•   Low response rate
•   Lack of time
•   Unclear instructions
•   Too lengthy
•   Too many steps
•   Can’t find URL
•   Perceived as aggressive
      Web-Based Survey
    Solutions: Some Tips…
•   Send second request
•   Make URL link prominent
•   Offer incentives near top of request
•   Shorten survey, make attractive, easy to
    read
•   Credible sponsorship—e.g., university
•   Disclose purpose, use, and privacy
•   E-mail cover letters
•   Prenotify of intent to survey
Tips on Authentication
•   Check e-mail access against list
•   Use password access
•   Provide keycode, PIN, or ID #
•   (Futuristic options: palm print,
    fingerprint, voice recognition, iris
    scanning, facial scanning, handwriting
    recognition, picture ID)
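
As an illustration of the keycode tip, here is a minimal sketch of checking a respondent's keycode before serving a survey; the codes and the hashing scheme are invented, not taken from any survey tool.

```python
# Sketch: keycode-gated survey access. Storing only hashes means a
# leaked list does not expose the issued codes themselves.

import hashlib

def sha256_hex(code: str) -> str:
    return hashlib.sha256(code.encode()).hexdigest()

# Hashes of keycodes issued to invited respondents (hypothetical codes)
issued = {sha256_hex("AB12-CD34"), sha256_hex("EF56-GH78")}

def may_take_survey(keycode: str) -> bool:
    return sha256_hex(keycode) in issued

print(may_take_survey("AB12-CD34"))  # True  -> serve the survey
print(may_take_survey("XX00-YY99"))  # False -> deny access
```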
Some Final Advice…
• As venture capital dries up and
  state funding is cut, evaluation
  and accountability take center
  stage in e-learning decision-
  making and discussion.
Questions?

Comments?

Concerns?
