The Portals (Coolfont) Workshop decided that:

• DLESE will be a two-level collection:
  – The unreviewed collection: a broad collection of
    content that is relevant to Earth System education
    and meets minimum quality and technical standards
  – The reviewed collection: a subset of high-quality
    teaching and learning materials that have been
    rigorously evaluated
• The rationale for having a reviewed collection:
  – For the user: guaranteed high-quality resources,
    even for a teacher without expertise in the field or
    time to “comparison shop”
  – For the creator: inclusion in the reviewed section
    of DLESE can become a recognized stamp of
    professional approval
• The rationale for having an unreviewed
  collection:
  – For the user: access to a wider range of teaching
    and learning resources
  – For the library builders: a pool from which to select
    the reviewed collection
OK, so how do we decide what
goes into the reviewed collection?
            From the Portals (Coolfont) workshop:


•   Selection Criteria (sketched as a rubric below):
     – Accuracy, as evaluated by scientists
     – Importance/significance
     – Pedagogical effectiveness
     – Well-documented
     – Ease of use for students and faculty
     – Motivational/inspirational for students
     – Robustness/sustainability
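
For illustration only, here is a minimal sketch, not from the workshop itself, of how these seven criteria might be captured as a scoring rubric in a community review questionnaire; all names and the 1–5 scale are hypothetical.

    from dataclasses import dataclass, field

    # Hypothetical rubric: the seven Coolfont/Portals selection criteria
    # expressed as fields a community reviewer would score (e.g. 1-5).
    CRITERIA = [
        "accuracy",                    # as evaluated by scientists
        "importance_significance",
        "pedagogical_effectiveness",
        "documentation",               # well-documented
        "ease_of_use",                 # for students and faculty
        "motivation",                  # motivational/inspirational for students
        "robustness_sustainability",
    ]

    @dataclass
    class Review:
        """One educator's questionnaire: numerical scores plus free text."""
        scores: dict = field(default_factory=dict)   # criterion -> rating
        teaching_tips: str = ""

        def is_complete(self) -> bool:
            # A review counts as complete only if every criterion is rated.
            return all(c in self.scores for c in CRITERIA)

        def mean_score(self) -> float:
            return sum(self.scores.values()) / len(self.scores) if self.scores else 0.0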
Familiar review procedures:

• “Traditional peer review”
• “Traditional educational evaluation”
Traditional “Peer-Review”

• Reviewers are selected for their expertise by an editor.
• Reviewers examine the material, or a description of the material, in their home or office.
• Typically two reviews.
Traditional “Peer-Review”

What’s wrong with this picture?

Traditional “Peer-Review”

There are no students in this picture!
“Traditional Educational Evaluation”

• Evaluator (reviewer) is selected by the developer.
• Evaluator observes students in another teacher’s classroom and/or administers evaluation instruments.
• Typically one evaluator, several
“Traditional Educational Evaluation”

What’s wrong with this picture?

“Traditional Educational Evaluation”

Evaluation by independent professional evaluators is labor-intensive and expensive.
                   Community Review Concept
                             Premises

•   The materials in the “inner-circle” of reviewed, DLESE-stamp-of-
    approval-bearing resources must be classroom-tested.
     – However, testimony from the creator of a resource that
       learning has occurred in his or her classroom is insufficient.
     – It is not realistic to pay for professional evaluators to go into
       classrooms to evaluate whether student learning has
       occurred for every potential DLESE resource.
     – Experienced educators can tell whether or not their own
       students are learning effectively from an educational
       resource.
     – It is easier to answer: “Did your students learn?” than “Do
       you think students would learn?”
             Community Review Concept
                 Premises (cont’d)

• In order to be useful, DLESE has to contain lots of
  resources. Therefore it must grow fast.
• In the DLESE ecosystem, teachers, classrooms and
  students will be abundant resources.
• The rate-limiting resources in DLESE’s growth will be
  money, and the time of paid
  librarians/editors/gatekeepers and computer wizards.
• This is a digital library; we can and should take
  advantage of automated information gathering.
Community Review Concept
      Procedure
               Community Review Concept
                  Procedures (cont’d)
• What happens to the questionnaire information?
   – Creator receives all feedback from “YES” and “NO”
     respondents.
   – Builders of the Discovery System receive feedback from
     “NO” respondents.
   – Suggestions typed in the teaching tips field are added to the
     Teachers’ section of the resource.
   – “Editor” or “gate-keeper” is automatically notified and
     receives the full packet of reviews when the number of complete
     reviews exceeds N and the average, or weighted average, of
     the numerical scores exceeds Y (see the sketch below).
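
As one possible reading of the trigger above, here is a minimal sketch of the automated notification rule; the thresholds N and Y, the weighting scheme, and the `notify` callable are all placeholders, not the actual DLESE implementation.

    # Hypothetical sketch of the automatic trigger described above: once a
    # resource has more than N complete reviews and their (weighted) average
    # numerical score exceeds Y, the editor/gate-keeper gets the full packet.

    MIN_COMPLETE_REVIEWS_N = 10    # placeholder for threshold "N"
    SCORE_THRESHOLD_Y = 4.0        # placeholder for threshold "Y"

    def weighted_average(scores, weights=None):
        """Plain or weighted average of per-review numerical scores."""
        if weights is None:
            weights = [1.0] * len(scores)
        return sum(s * w for s, w in zip(scores, weights)) / sum(weights)

    def maybe_notify_editor(resource_id, reviews, notify):
        """Call `notify` with the full packet of complete reviews when both
        thresholds are met; `reviews` is a list of dicts holding at least a
        numeric 'score' and a boolean 'complete' flag."""
        complete = [r for r in reviews if r.get("complete")]
        if len(complete) <= MIN_COMPLETE_REVIEWS_N:
            return False
        if weighted_average([r["score"] for r in complete]) <= SCORE_THRESHOLD_Y:
            return False
        notify(resource_id, complete)    # editor receives the full packet
        return True

The trigger only routes the packet; the editor still makes the final selection decision, consistent with applying scarce talents at the end of the process.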
Community Review Concept
      Procedure
             Community Review Concept
                     Strengths


• Inclusive: The community builds the library.
• Scalable: Hundreds or thousands of resources can
  be classroom-tested.
• Thorough: All seven Coolfont/Portals selection
  criteria are applied.
• Economical: Scarce talents are applied at the end of
  the process, to the smallest number of items.
                 Community Review Concept
                             Issues


•   How do we get educators to send in their reviews?
•   How do we ensure that reviews come from bona fide educators?
•   Would creators “spin” the review process by soliciting reviews
    from their friends?
•   Would the merely-good early arrival tend to keep out the truly
    excellent later arrival?
•   Some topics are inherently less inspirational/motivational than
    others; how do we avoid filtering out resources on such topics?
•   What about off-the-wall, or erroneous, or malicious reviews?
How can I become part of DLESE?

 • … as a resource creator/contributor
 • … as a user
 • … as a reviewer/tester
        Continue the conversation at:

            collections@dlese.org

                      or

http://www.ldeo.columbia.edu/dlese/collections

				