Seton Hall University Rejoinder to Board of Examiners Report





Seton Hall University Rejoinder to

   Board of Examiners Report
   (Visit October 23-27, 2004)


          January 2005


         The purpose of this rejoinder is to provide further information on the areas for
improvement cited in the BOE report for our visit in October 2004. The BOE report
specifically noted that our areas for improvement reflect the “newness” and extensive
revision of our evaluative tools as well as the construction of a first-time assessment
system. Although the rejoinder is designed to comment on the conditions that existed
during the visit, continued work in the cited areas is necessary to show the stability and
growth of the new systems. The rejoinder follows the BOE report layout and refers to
specific pages within it.

Standard One:
     1. page 12 Program Review Process paragraph: Report text “All are in
        compliance with ACTFL, NCTE and NASP currently in the rejoinder process.”

        There is new information since the visit: we have been notified by NCTE
         that we have achieved national recognition.

       Other Data paragraph: Report text “More specific monitoring of candidate
       content knowledge will be part of the new field experience evaluation forms
       presently being piloted.”

        As noted in the BOE report, more specific monitoring of candidate content
         knowledge has been included in the revised field evaluation forms. When the
         team visited, the new forms were being used for the first time. The results for
         the fall semester have been analyzed and show that, on a scale from 1 to 5
         (1=Needs Intervention; 5=Exceeds Expectations), undergraduate candidates
         scored on average: i) sophomores, 4.0; ii) juniors, 4.3; and iii) seniors, 4.6 for
         content knowledge. The same evaluation form is being used this spring, so the
         beginning of aggregated data will emerge at the end of this semester.


     2. page 15 3rd paragraph: Report text “The evaluation form for candidates in the
        EPICS program mirrors that used at the initial undergraduate level. Only data
        from the past academic year was available.”

        Content knowledge and pedagogical content knowledge data using the new
         field evaluation forms are available for Fall 2004 EPICS candidates. The
         EPICS candidates are using the same new 1-5 measurement scale as the
         undergraduates (1=Needs Intervention; 5=Exceeds Expectations), and the
         results for the Fall 2004 semester indicate average scores of: i) Planning and
         Preparation, 4.7; ii) Classroom Environment, 4.8; iii) Instruction, 4.6; and iv)
         Professional Responsibilities, 4.7.


     3. page 19 4th paragraph: Report text “Other graduate programs, such as nursing,
        school psychology, speech pathology, educational media specialist, EPICS, and
   the CEAS, have not used exit or alumni surveys. A goal for all of these
   programs is to have surveys ready for their classes graduating in May 2005.”

  All of the graduate programs cited above are in the process of creating surveys
    and will pilot them in May 2005.


4. page 20, Advanced paragraph: Report text “Within EPICS, versions of a
   professional portfolio have been piloted but they are not as yet fully aligned
   with unit goals that have been established.”

  The EPICS program will have access to the E-portfolio that is in development
    for the undergraduates. It is a standards-based portfolio, and the rollout to
    graduate candidates will occur in Fall 2005.
   a. A flowchart showing the development, connections, and potential reports
       that can be generated from the E-portfolio is provided in Appendix A.


 5. Overall assessment from the BOE report (page 21): “The assessment
   system for the programs that lead to initial and advanced licensure are
   aligned with the conceptual framework as well as state and national
   standards. The unit has designed measures to monitor candidate
   performance and improve programs. All the programs employ multiple
   measures to assess candidate progress from admission into the College
    of Education and Human Services to licensure. However, the unit is in
   its first year of implementing its new assessment system and only a few
   of its elements were fully operational at the time of the visit. Therefore,
   specific data regarding candidates' knowledge, skills and dispositions at
   key transition points have not yet been regularly compiled,
   summarized, analyzed, or aggregated. The evidence to date does
   indicate that a concerted effort is being made to collect and analyze data
   systematically. To the unit’s credit, there is evidence that the results of
   the Praxis II exam are being used to inform program improvements.”

  The report notes that most data are from the past academic year and were most
    often not aggregated. Work has continued in this area by (i) collecting course
    data (links to products and standards) for the Fall 2004 semester, which adds
    to the information available to the team during their visit, and (ii) using the
    revised evaluation forms during the fall semester, which adds to the available
    data. Both the course data and the field evaluation forms specifically reflect
    information such as pedagogical content knowledge, dispositions, student
    learning, and professional and pedagogical knowledge and skills. Specifically,
    the teacher work sample used in the senior seminars provides information
    showing the collection of data and a foundation for future aggregated work.
  Teacher Work Sample faculty feedback from Fall 2004: Secondary
    education has been using a modified form of a teacher work sample over the
    past three semesters, and the feedback from the fall was similar to prior
    semesters: (i) candidates reported difficulty in obtaining pre-assessment data,
    while (ii) the overall project was completed without difficulties. Elementary
    and special education candidates used the teacher work sample for the first
    time in Fall 2004. Feedback from that faculty reflects that candidates had
    difficulty applying the TWS to varied settings, ranging from a multiply
    handicapped setting to a more general education setting. Faculty in these
    programs have agreed to examine the candidate feedback from the fall, as
    well as the products, to develop a TWS that fits the program goals and
    design.

Standard Two:
       The BOE report comments related to Standard Two focus on the developing
nature of our assessment system and related issues such as the use of data and the
availability of aggregated data.

     a. page 30, Overall Assessment of Standard: Report text “The assessment system
        for the programs that lead to initial and advanced licensure are aligned with the
        conceptual framework as well as state and national standards. The unit has
        designed measures to monitor candidate performance and improve programs.
        All the programs employ multiple measures to assess candidate progress from
        admission into the College of Education and Human Services to licensure.
        However, the unit is in its first year of implementing its new assessment system
        and only a few of its elements were fully operational at the time of the visit.
        Therefore, specific data regarding candidates’ knowledge, skills, and
        dispositions at key transition points have not been regularly compiled,
        summarized, analyzed, or aggregated. The evidence to date does indicate that a
        concerted effort is being made to collect and analyze data systematically. To the
        unit’s credit, there is evidence that the results of the Praxis II exam are being
        used to inform program improvements.”

The assessment system in the College of Education and Human Services is new. It was
designed through discussion with faculty, colleagues in arts and sciences, and community
partners over the past two years (2002-2004). Parts of the system have been piloted over
the past year, with full application of all assessment tools beginning in Fall 2004 with the
current freshman class.

The BOE report is correct when it reports that the unit is developing pilot surveys and
assessment tools, particularly in the area of our conceptual framework related to social
consciousness. Our first opportunity to apply the tools to the initial NCATE class under
our assessment system is Fall 2005, as a follow-up to the entrance survey those candidates
completed regarding the opinions about ‘difference’ they brought to the university. A
committee of faculty met in the fall and has planned regular gatherings in the spring to
complete and pilot the role plays and case studies prior to their full application next year.
In addition, this committee is part of a department research council that has formed to
research and publish regarding efforts in support of our conceptual framework and
assessment system. Minutes from the first research council meeting are provided to show
the range of topics being considered by small groups within the Department of
Educational Studies. (Appendix B)

In the summary of Standard Two (page 30 of the BOE report), the assessment system is
said to align with the conceptual framework and standards. Appendix C provides an
overview of the alignment of the assessment tools with the conceptual framework and
INTASC standards, when the tools are applied, and who is directly involved with the data.

At the time of the BOE visit in October 2004 and since, the ITPR (Initial Teacher
Preparation Record) database has been in continual development. The following
information for all currently enrolled undergraduates has been entered: (i) majors,
education and liberal arts; (ii) entering SAT scores; (iii) entering high school GPA; (iv)
transition point GPAs; (v) scores for field tests; (vi) completion of field experiences;
(vii) indication of any retention issue, with a reference back to the Associate Dean’s
office for the file due to its confidential status; (viii) self-identified race; (ix) gender; (x)
correlation of high school GPA with transition GPAs; and (xi) reasons for leaving the
program. Appendix D provides an example of this data for a small cohort of candidates.
Information for graduate students is being added.

As the assessment system continues to ‘fill in’ with items like the Initial Teacher
Preparation Record (ITPR), transition I data, a second round of field quizzes, analysis of
field data, and retention committee records, program development has been influenced.
Specific examples are:

    Based on a retention committee issue (early January) it became clear that links to
     colleagues in EOP (Educational Opportunity Program) were not as strong as
     necessary. Contact was made with the office and communication needs were
     clarified. In addition, the names of EOP mentors for education candidates were
     provided to the Associate Dean of the College of Education and Human Services
     (a confidentiality issue prevents sharing with all faculty) so she can keep track of
     retention concerns and possible links to that office.
    Fall 2004 field evaluation forms were analyzed and the data were shared with
     program directors. Candidates displaying difficulties in their professional
     responsibilities as well as overall performance were contacted. Seven students
     were asked to meet with their program director to talk about the evaluation form
     and to establish goals for the semester. This information is shared with the
     chairperson of the Retention Committee (the Associate Dean) in case further
     issues arise. (Two of the seven also had low GPAs and are now on probation
     from the College.) (Appendix E)
    A third example stems from the expanding ITPR. Questions by elementary and
     special education faculty (early January) about the abilities of some transfer
     students led to a review of all transfer data from the ITPR. The information was
     available within a day of the request and allowed further conversation regarding
     policy development in support of these students based on data rather than
     individual cases. A university conversation regarding similar issues is occurring
     at the same time. Minutes from a recent meeting at that level are included in
     Appendix F.
    A final example is the course data being added to the system, which will allow
     for the development of aggregated information across semesters. The Director of
     Assessment has been working with faculty (January 2005) to develop a template
     for organizing the data to allow for easy analysis across courses, product types,
     and unit goals. (Appendix G: sample course data format)

The E-portfolio was in development at the time of the visit, and substantial progress has
been made on its structure, candidate input, faculty role, and report options. This work
has been done in conjunction with colleagues in the TLTC (Teaching, Learning,
Technology Center). The E-portfolio will be housed in Blackboard (a system familiar to
all candidates and faculty at Seton Hall) and connected to the newly developed content
system. Appendix A provides visual representations of the information provided below.
        All syllabi contain written links between course products and associated
           standards. (Not all assignments in a course are linked to standards; therefore,
           distinctions between products and other learning activities will be noted in
           color-coded forms.)
        Candidates will receive, along with their regular rubrics for each assignment,
           a sheet from faculty indicating the readiness of the work for E-portfolio entry.
           If the work is well developed as is, candidates receive an “X” for closed or
           completed; if additional work is suggested, an “O” for open is given.
           (Comments about what to work on are provided.)
        Faculty, in each course, will help candidates enter their standards-linked
           products into their E-portfolios toward the end of the semester. (Blackboard
           will create a summary sheet of what they have entered for all standards they
           are registered for, e.g., CEC, INTASC, and ACEI, while also automatically
           setting up a portfolio format.)
        At the end of each semester all products entered by candidates will have
           been reviewed by faculty in their courses, with comments regarding their
           readiness offered. Reports from the E-portfolio database will be run at the end
           of each semester to indicate the range of standards addressed, areas where
           candidates have many open products indicating that more work is required,
           links to majors, courses, etc. The reports will be given to the Director of
           Assessment, who will then share them with the Department Chairperson and
           Program Directors.
        During the senior seminar, candidates pick the best examples from their
           standards portfolio to create a final version. The products will have been
           reviewed by faculty in the past, but the justifications for entry may be updated
           and involve peer as well as faculty review in the seminar.
        In addition to the standards link, reflective questions that address
           development across the program are being created. Candidates would be
           required to answer two to three reflective questions at various points
           during their program. The results would be used to examine the development
           of reflection as a tool, the progression of knowledge across time, and links to
           developing social consciousness.
 Freshman candidates will be trained on data entry this semester and will
   have it completed by April 30 (end of the semester). Our first E-portfolio
   report will be available in May 2005.