Instructional Design and SCORM 2004

An Overview for GoLearn – January 29, 2009
                   ADL Co-Laboratory Hub
 Presented by: Nina Deibler & Peter Berking
                          Workshop Objectives
 Upon completion of this workshop, you will be able to:
   • Describe SCORM components and their roles
   • Explain the role of data model elements in SCORM
   • Explain differences between SCORM 1.2 and
     SCORM 2004
   • Identify the four different types of reusability
   • Plan complex sequencing strategies to create
     individualized instruction
   • Design assessments within SCORM content
   • Describe process considerations for SCORM 2004
     content development
   • Identify the role of metadata and content registries in
     making content accessible to a wider audience

 9:00-9:15 Welcome and Introductions
 10:30-10:45 Break
 12:30-1:00 Lunch at ADL*
 2:15-2:30 Break
 4:45 Wrap-up
 5:00 Dismissal

 * Lunch discussion with Peter Berking of the ADL Team about
 Choosing Authoring Tools
SCORM Overview
                                 SCORM 2004

 Sharable Content Object Reference
  Model (SCORM)
  • Integrates a set of related technical standards,
    specifications, and guidelines designed to
    achieve ADL’s functional requirements for
    content and systems
  • Enables you to redeploy, rearrange, repurpose,
    and rewrite your content
  • Allows e-learning to be delivered to your
    learners via any SCORM-certified LMS using
    the same version of SCORM
               ADL’s Functional Requirements
 Accessibility
   • The ability to locate and access instructional
     components from multiple locations and deliver them to
     other locations
 Reusability
   • The ability to use instructional components in multiple
     applications, courses, and contexts
 Interoperability
   • The ability to take instructional components developed
     in one system and use them in another system
 Durability
   • The ability to withstand technology changes over time
     without costly redesign, reconfiguration, or recoding
              A Few of SCORM’s Benefits
 Provides an object-based approach for
  developing and delivering instructional content
 Allows interoperability of these objects across
  multiple delivery environments
 Enables sophisticated learning strategies based
  on the learner’s mastery and progress
  (individualized learning)
 Permits the packaging of learning content and
  instructional strategies for import and export
                 What is SCORM content?
 Rendered in a browser via a system that manages learning
  (for example: LMS, CMS, LCMS)
  • Typically web-based
 Displayed within HTML pages
  • Requires JavaScript code that communicates with the
    system that manages learning
 Accepts objects based on plug-ins and 3rd-party players
  • For example: Flash Player, QuickTime, and others
    you’ve never heard of
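The JavaScript communication mentioned above runs through an API object the LMS exposes (named API_1484_11 in the SCORM 2004 Run-Time Environment). A minimal sketch of how a SCO might locate that object and open a session; the function names here are illustrative, and real LMSs may nest frames differently:

```javascript
// A SCO typically finds the SCORM 2004 run-time API object
// (API_1484_11) by walking up the frame hierarchy.
function findAPI(win) {
  let hops = 0;
  // Climb parent frames until the API turns up (7 hops is a
  // common defensive limit against frame loops).
  while (win && !win.API_1484_11 && win.parent && win.parent !== win && hops < 7) {
    win = win.parent;
    hops++;
  }
  return (win && win.API_1484_11) || null;
}

function startSession(win) {
  const api = findAPI(win);
  if (!api) {
    return false; // no LMS found; content could fall back to a standalone mode
  }
  api.Initialize(""); // begin the communication session with the LMS
  return true;
}
```

A matching `Terminate("")` call closes the session when the SCO unloads.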
            SCORM 1.2 vs. SCORM 2004 - 1

 Key Differences
  • Data model element requirements
     Multiple levels of SCORM-compliance in SCORM
      1.2, depending on how many data model elements a
      system implemented
     All SCORM 2004-compliant systems must support
      all data model elements
  • Metadata requirement removed
     Metadata was required in SCORM 1.2
      SCORM 2004 now tests metadata only for being valid XML
             SCORM 1.2 vs. SCORM 2004 - 2

 Key Differences (cont’d)
  • Addition of sequencing capability
      Rules to prescribe which content learners see and in what order
Key Terms
                                        Asset
 Electronic representations of media such as text, images,
  sound, or any other piece of data a web client can deliver
 The most basic form of SCORM content
 Can be reused in multiple contexts and aggregations
            Sharable Content Object (SCO)

 Comprised of one or more
  assets that become an
  independent, defined piece of
  instructional material
 The smallest logical unit of
  information an LMS can track
  • In technical terms, a SCO is
    defined as the only piece of
    information that uses the SCORM
    Application Programming Interface
    (API) for communication with an LMS
                                  Aggregation
 Collection of related content
  • A parent and its children in a tree
  • Also known as a cluster
 Used to group related content for
  sequencing so that it can be
  delivered to learners in the manner
  you prescribe
 Contains SCOs and other aggregations
  • Aggregations of aggregations are allowed
                                  Organization
 The part of a content package
  where SCOs are ordered into a
  tree structure and sequencing
  behaviors are assigned to them
  • Also known as a root aggregation

 Outlines the entire structure
  you created for the content that
  will be delivered as a single
  content package
                                Content Package
 A standardized, interoperable way to
  upload content to a SCORM-compliant LMS
 A SCORM content package contains
  two principal parts:
  • The XML manifest file that lists
      All of the resources or assets you want to include in
        the package
      The content structure diagram you created (called
        the organization)
      The sequencing rules
      All of the metadata for the SCOs, aggregations, and
        the package itself
  • All of the physical SCO and asset files for the content
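The manifest described above might look like the following skeletal imsmanifest.xml. This is a sketch only: the identifiers, titles, and file names are placeholders, and the exact namespace and schema declarations vary by SCORM edition.

```xml
<manifest identifier="com.example.hazmat-course" version="1.0"
          xmlns="http://www.imsglobal.org/xsd/imscp_v1p1"
          xmlns:adlcp="http://www.adlnet.org/xsd/adlcp_v1p3">
  <organizations default="ORG-1">
    <!-- The organization is the content structure (tree) you designed -->
    <organization identifier="ORG-1">
      <title>Sample Course</title>
      <item identifier="ITEM-1" identifierref="RES-1">
        <title>Lesson 1</title>
        <!-- Sequencing rules and metadata would also be attached here -->
      </item>
    </organization>
  </organizations>
  <resources>
    <!-- Each resource lists the physical files that make up a SCO or asset -->
    <resource identifier="RES-1" type="webcontent"
              adlcp:scormType="sco" href="lesson1/index.html">
      <file href="lesson1/index.html"/>
    </resource>
  </resources>
</manifest>
```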
                                  Curriculum
 Outside the scope of SCORM
 SCORM-compliant
  content can be part of a
  curriculum that is
  managed by your LMS
 Typically includes
  courses, lessons, and
  assessments using a
  variety of delivery media
  and instructional strategies
 Enable tracking and storing of data about
  learner performance in, and interaction with,
  instructional content interoperably
  • Every LMS must support all data model elements
  • Use of data model elements in content is optional
                    Data Model Elements - 2
 Allow you to use or collect information including
  • Learner's name for use in the content
  • Last location in the content the learner viewed
   • Learner's language, presentation, or other preferences
  • Score and/or pass/fail status
  • Completion status for a SCO
  • Total time spent in a SCO and/or time in a single
    session of a SCO
  • Responses to assessment items
  • Interactions within a SCO
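The items above map to `cmi.*` data model elements read and written through the run-time API. A sketch of how a SCO might report a result (the `api` parameter is the API_1484_11 object obtained from the LMS; `reportProgress` and the 0.8 passing threshold are illustrative, not part of SCORM):

```javascript
// Sketch: reading and writing a few cmi.* elements via the
// SCORM 2004 run-time API.
function reportProgress(api, scaledScore) {
  // Read-only element populated by the LMS
  const learnerName = api.GetValue("cmi.learner_name");

  // Scores are reported on a -1.0 .. 1.0 scale, as strings
  api.SetValue("cmi.score.scaled", String(scaledScore));
  api.SetValue("cmi.success_status", scaledScore >= 0.8 ? "passed" : "failed");
  api.SetValue("cmi.completion_status", "completed");

  api.Commit(""); // ask the LMS to persist the data now
  return learnerName;
}
```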
                              Data Model Elements
 Technical Initialization
   •   Launch Data
   •   Entry
   •   Location
   •   Mode
   •   Credit
   •   Suspend Data
 Content Initialization
   •   Maximum Time Allowed
   •   Learner ID
   •   Learner Name
   •   Learner Preference
   •   Completion Threshold
   •   Time Limit Action
 Score Reporting
   •   Score
   •   Progress Measure
   •   Scaled Passing Score
   •   Success Status
   •   Objectives
   •   Interactions
   •   Completion Status
 Comments
   •   Comments from Learner
   •   Comments from LMS
 Exit Data
   •   Exit
   •   Session Time
   •   Total Time
 All LMSs must support all data elements in SCORM 2004
                    SCORM Product Status - 1
 Conformance
   • Product was designed and developed with the
     intention of following the SCORM documents

Since SCORM 2004 was published, several specifications in the
SCORM documentation have been accepted as IEEE standards.
When using standards, the appropriate term is compliance: a
product either complies with the standard or it does not.
               SCORM Product Status - 2
 Compliance
  • A product is tested to ensure it performs as
    specified in the ADL SCORM Test Suite
  • Applies to a specific version only
     SCORM 2004 1st Edition
     SCORM 2004 2nd Edition
     SCORM 2004 3rd Edition
     SCORM 2004 4th Edition (currently in Beta)
               SCORM Product Status - 3
 Certification
  • A qualified, neutral third party conducts a
    formal evaluation using the ADL SCORM Test
    Suite and a rigorous, accurate, reliable,
    validated methodology
  • Applies to a specific version only
     SCORM 2004 1st Edition
     SCORM 2004 2nd Edition
     SCORM 2004 3rd Edition
     SCORM 2004 4th Edition (coming soon)
 Certified products display the ADL certified logo
SCORM 2004 Project Team
Designing for Reuse
                     Categories of Reuse - 1
 Redeploy
  • Running the same content, without modification, in
    multiple LMSs
 Rearrange
  • Reordering the same content for new uses
 Repurpose
  • Using the same piece of content in new contexts or in
    different ways
 Rewrite
  • Taking relevant materials and changing the examples,
    imagery, or writing style, or removing irrelevant content
Categories of Reuse - 2
Example of Asset Reuse
                 Designing Reusable Content

 Maximize the potential that your content
  can be redeployed, rearranged,
  repurposed, or rewritten
  • Design smaller, context-neutral SCOs
  • Use sequencing to add context-specific SCOs
 Define data collection and tracking
 Create content structure/flow chart and
  rules for sequencing
 Determine what information is required for
  formal reporting
                  Determining SCO Size – 1

 Compliance Training Example
  • Your learners need general instruction about
    handling hazardous materials
  • You must show that learners “completed” the training
  • “Complete” means learners viewed the content

         One larger SCO is acceptable.
                     Determining SCO Size - 2

 Certification Training Example
  • You need to know that a shipping inspector
    learned to “properly ship radioactive materials
    for transit on a vessel” as part of a larger
    hazardous materials transportation course
  • Learners must complete a series of similar
    learning objectives and pass a comprehensive
    assessment to receive certification

           Numerous smaller SCOs
          may be required for tracking.
               Design Considerations - 1
 How will you optimize the potential for
  content to be redeployed, rearranged,
  repurposed, and rewritten?

 Will SCOs cover a single learning objective or
  multiple learning objectives?

 Will SCOs include an assessment, or will the
  assessment be a separate SCO?
               Design Considerations - 2
 How will SCOs be divided, structured,
  chunked, and sequenced?

 What media types will be incorporated?

 What other organizational policies and
  practices must you comply with (e.g., Section 508)?

 When, where, and how will you collect data
  (e.g., per SCO, content package, curriculum)?
               Design Considerations - 3
 What navigation options will be provided in
  the SCO versus the standard navigation
  options provided by a typical LMS?

 What colors and layouts will work best in the
  target LMS and in other LMSs?

 Will templates and cascading style sheets
  facilitate rearranging, repurposing, and
  rewriting the content?
                       ACTIVITY – 20 minutes

 Reuse Discussion
  • In your assigned small group discuss
     How content you have developed or are developing
      could be used by others in your group
     What changes you might have to make to some of
      your content so that others can reuse it
Metadata and Content Registries
 The information that describes content,
  including things like
  •   Title, description, keywords, and objectives
  •   Seat time, difficulty, and interactivity level
  •   Creator, owner, and copyright information
  •   Location and file format
                 Importance of Metadata
 Helps people searching for content to find it
 Provides a textual “snapshot” of your content
 Captures similar kinds of information about
  all items so content can be compared
                                  ADL Registry
 A central search point for DoD training

 The digital equivalent of a library card catalog
  • Contains all of the cards (registered entries)
    that contain a standard set of information
    (metadata) about all of the books (learning-
    related content) in the library (repository)
            Repositories and ADL Registry
Your organization’s local repository (e.g., Air Force, JFSC)
stores the content. The ADL Registry, accessed through the
Registry Web Portal, stores metadata records about the content.
                    Writing GOOD Metadata
 Use a team approach
  • Instructional designers don’t need to be XML gurus
      They know the content, audience, and application
  • XML gurus don’t need to be training experts
      They know the technical specifications
 Team must work together throughout the process
  to be successful
   Metadata authoring tools can minimize the need for XML gurus
                             Getting Started
 Apply “constraints” to make it faster and
  easier to create metadata
 Standardize and limit
  • Metadata fields
  • Entries within metadata fields
  • Metadata generation process/workflow
     What do instructional designers write?
     What do the XML gurus create?
 Document team policies and procedures in
  a style guide or production manual
Example of a Metadata Form
Designer’s Version of Metadata
                     Metadata Best Practice
 Use a Content Librarian
  • Standardizes the creation and maintenance of
    metadata records for assets, SCOs,
    aggregations, and content packages
  • Works with ISD and XML guru to ensure
    accurate and effective tagging of all materials
                  Multiple Metadata Entries
 One piece of training content could have
  multiple instances of metadata targeted at
  multiple users
  • The same Hazardous Materials course may
    have three metadata instances targeted at
    three different audiences
      Truck drivers who transport HazMat
      Fire fighters and first responders
       Ammunition workers who handle hazardous materials
 SCORM does not address
  • How to create assessments
  • When and how a SCO should be considered an assessment
 Best practices are defined in
  • ADL Guidelines for Creating Reusable Content
    with SCORM 2004
                      SCOs as Assessments

Possible mappings:
 • 1 Test Item = 1 SCO
 • 1 Test Bank = 1 SCO
 • Multiple Test Banks = 1 SCO
 • 1 Assessment = 1 SCO
                                    Test Banks
 Help to ensure security of tests
 Structure
  • Typically grouped by learning objective
  • Randomly select a given number of test items
    from a larger pool of related test items
 Considerations
  • To randomly pull 3 – 5 test items, you’ll need
    7 – 9 items per objective in the test bank
  • Can use different types of test items
    for the same information
                               1 Test Item = 1 SCO

 Advantages
  • Use cmi.interactions to collect formative and
    summative evaluation data in an interoperable way
  • Use sequencing to move between test items
  • Update a test by removing a SCO and re-sequencing
  • Receive a score for each item (multiple scores)
  • Use roll-up for comprehensive score
 Disadvantages
  • Interruption to “test flow” as learners navigate
    between LMS and test item
  • Loss of concentration by learners
  • Must use roll-up for score (rules are very
    complicated to determine)
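The cmi.interactions calls mentioned above can be sketched as follows. The helper `recordChoiceInteraction` is illustrative, not part of SCORM; the API itself exposes only GetValue/SetValue, and new collection entries are appended at the index reported by `_count`:

```javascript
// Sketch: recording one multiple-choice response in the
// cmi.interactions collection of the SCORM 2004 data model.
function recordChoiceInteraction(api, itemId, learnerResponse, correctResponse) {
  const n = api.GetValue("cmi.interactions._count"); // next free index
  const p = "cmi.interactions." + n + ".";
  const result = learnerResponse === correctResponse ? "correct" : "incorrect";

  api.SetValue(p + "id", itemId);
  api.SetValue(p + "type", "choice");
  api.SetValue(p + "correct_responses.0.pattern", correctResponse);
  api.SetValue(p + "learner_response", learnerResponse);
  api.SetValue(p + "result", result);
  return result;
}
```

Because each response is stored with its id, pattern, and result, the LMS can report item-level formative data interoperably.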
                           1 Assessment = 1 SCO
 Advantages
  • Use cmi.interactions to collect formative and
    summative evaluation data in an interoperable way
  • Preserves integrity of learner’s testing experience
  • Randomization of items is controlled inside the SCO
  • Don’t have to write as many sequencing rules
  • Receive a comprehensive score
 Disadvantages
  • Requires more time to program
  • Updating a single item means updating the entire SCO
             Assessment Best Practices
 Data collection and the learner’s experience
  can be balanced
 Use 1 SCO = 1 Assessment
  • Use cmi.interactions to track learners
  • Use objectives from sequencing to control
     Diagnostic pre-tests
     Test-out options
     Completion of learning objectives
     Mastery testing
                    What is Sequencing?
 Ability to prescribe the manner in which
  learners receive content in an interoperable way
Sequencing Process
                         Parents and Children

 Sequencing is based on a hierarchical (tree) structure
 Control mode rules are defined at the
  aggregation (cluster) level, not the SCO level
  • All of the children must follow all of the rules of
    the parent
  • No child is special; all children are equal in the
    eyes of the parent
                       Presentation Modes
 Present the content in a linear order
   • FLOW = True or False
 Allow the learner to choose the order
   • CHOICE = True or False
   • The sequencing default value, same as
     SCORM 1.2
 The learner can’t go back once started
   • FORWARD ONLY = True or False
 Exit the aggregation
   • CHOICE EXIT = True or False

Can use combinations of these
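In the manifest, these modes become attributes of the controlMode element attached to an aggregation. A sketch, assuming the conventional `imsss` prefix is bound to the IMS Simple Sequencing namespace elsewhere in the manifest (per the spec, the defaults are choice="true" and flow="false"):

```xml
<imsss:sequencing>
  <!-- Linear, forward-only presentation: learners cannot pick items
       from a menu and cannot back up once started -->
  <imsss:controlMode choice="false" choiceExit="false"
                     flow="true" forwardOnly="true"/>
</imsss:sequencing>
```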
                      Pre- and Post-Condition Rules

 If item is/has:
  • Satisfied
  • Objective Status Known
  • Objective Measure Known
  • Objective Measure Greater Than
  • Completed
  • Progress Known
  • Score Greater Than
  • Score Less Than
  • Attempt Limit Exceeded
  • Time Limit Exceeded
  • Outside Available Time Range
 Then, before the SCO is launched:
  • Skip
  • Disable
  • Hide from Choice
  • Stop Forward Traversal
 Or, after the SCO terminates:
  • Exit Parent
  • Exit All
  • Retry
  • Retry All
  • Continue
  • Previous
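Such rules are written as sequencingRules elements in the manifest. A sketch of one pre-condition and one post-condition rule, again assuming the `imsss` prefix is declared elsewhere:

```xml
<imsss:sequencingRules>
  <!-- Pre-condition: skip this activity if it is already satisfied -->
  <imsss:preConditionRule>
    <imsss:ruleConditions>
      <imsss:ruleCondition condition="satisfied"/>
    </imsss:ruleConditions>
    <imsss:ruleAction action="skip"/>
  </imsss:preConditionRule>
  <!-- Post-condition: if still not satisfied after the attempt, retry -->
  <imsss:postConditionRule>
    <imsss:ruleConditions>
      <imsss:ruleCondition operator="not" condition="satisfied"/>
    </imsss:ruleConditions>
    <imsss:ruleAction action="retry"/>
  </imsss:postConditionRule>
</imsss:sequencingRules>
```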
                                                 Roll-up Rules

 If [All / Any / None / At least “x” / At least “%”]
 of the children are:
  • Satisfied
  • Completed
  • Attempted
  • Objective Status Known
  • Objective Measure Known
  • Activity Progress Known
  • Attempt Limit Exceeded
  • Time Limit Exceeded
  • Outside Available Time Range

 Then the parent is:
  • Satisfied
  • Not Satisfied
  • Completed
  • Incomplete
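A roll-up rule of this shape is expressed in the manifest as a rollupRule element. A sketch of the common "parent is satisfied when all children are satisfied" case (`imsss` prefix assumed declared elsewhere):

```xml
<imsss:rollupRules>
  <!-- If all of the children are satisfied, then the parent is satisfied -->
  <imsss:rollupRule childActivitySet="all">
    <imsss:rollupConditions>
      <imsss:rollupCondition condition="satisfied"/>
    </imsss:rollupConditions>
    <imsss:rollupAction action="satisfied"/>
  </imsss:rollupRule>
</imsss:rollupRules>
```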
             Global Objectives (variables)
 An objective in sequencing is really a shared
  global variable, not a learning objective
 Global objectives
  • Contain a passed/failed status and a score
   • Are accessed by mapping them to a local objective of an activity
  • Allow the state of one activity to affect (read/write) the
    state of another activity
 Activity to objective mapping is many to many
  • Objectives are shared among Activities
  • An Activity can be associated with multiple objectives
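The local-to-global mapping lives in the activity's objectives element in the manifest. A sketch (the objectiveID and targetObjectiveID values are made-up examples; `imsss` prefix assumed declared elsewhere):

```xml
<imsss:objectives>
  <!-- The local objective reads and writes a shared global variable -->
  <imsss:primaryObjective satisfiedByMeasure="false"
                          objectiveID="local-shipping-obj">
    <imsss:mapInfo targetObjectiveID="global.hazmat.shipping"
                   readSatisfiedStatus="true"
                   writeSatisfiedStatus="true"/>
  </imsss:primaryObjective>
</imsss:objectives>
```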
Portion of Manifest
           Sequencing Design - Simple

[Diagram: Intro → Optional Pre-test → knowledge lessons (Analysis
of Mission, COA / Wargaming, IPB), each checked with a PIX →
Practical Exercise 1 → Practical Exercise 2. Learners remediate to
the knowledge lesson(s) of failed objective(s).]
Sequencing Design - Complex
                      Sequencing Skeleton
 Content-free SCOs and content package used to
  test sequencing rules
 Speed development time by allowing
  programmers to iterate on rules while content is
  being developed
[Slides: sequencing template and model examples]
                               ACTIVITY – 1 hour

 Working with your assigned group
  • Review the contents of your packets
     Group scenario
     Descriptions of existing SCOs
  • Identify SCOs you would reuse for your scenario
  • Design a sequencing strategy to meet the
    requirements of your scenario
SCORM 2004 4th Edition
                      SCORM 2004 4th Edition
 Timeline
  • Sample Run Time Environment and Test Suite final
    release expected ~mid 2009
 Enhancements/Corrections
  • Correct minor bugs
  • Improve interoperability across LMSs
  • Test more stringently and add new test cases
 Impact on Community
  • Requires recertification of LMS vendors (no charge
    for currently certified SCORM 2004 3rd Edition LMSs
    for 90 – 120 days after certification is available)
  • Certification available in mid 2009
Resources and References
                                       Upcoming Webinars
 04 February 2009
   • Creating Reusable Content with SCORM 2004
 11 February 2009: Federated Registry Architectures
 18 February 2009
   • Sequencing SCORM 2004 Content, Part 1 of 3
 25 February 2009
   • Sequencing SCORM 2004 Content, Part 2 of 3
 04 March 2009
   • Sequencing SCORM 2004 Content, Part 3 of 3
 11 March 2009: Using the ADL Registry

 All webinars are on Wednesdays from 12:00 PM - 1:00 PM EST.
  Attendance is limited to the first 25 participants to log in. To
  schedule a webinar at your convenience for a group of 10 or more,
  contact Nina Deibler.
                    Resources for ADL Users

 ADL Initiative

 ADL Registry

 ADL Guidelines for Creating Reusable Content
  with SCORM 2004
 The ADL Registry and CORDRA, Volume 2: ADL
  Registry Overview and User Guide
 DOD Instruction 1322.26, Development,
  Management, and Delivery of Distributed Learning,
  June 2006
                                   Contact Us

 Nina Deibler
  • +1.412.885.4659

 Peter Berking
  • +1.703.575.2017