Template for CED Department, Program, and Office Assessments


									                                  Template for CED Program Assessment & Evaluation System (PAES)

I.   Academic programs: Assessment of Candidate Performance (How are candidates doing?)
     [NCATE Standard #2, p. 21]

     For each item, the template records the NCATE standard, page, and paragraph
     ('target'); the strategic planning document item; an "Initiate planning by" date;
     and status tabs for Planning, Piloting, and Implementing.

     A. [NCATE 2, p. 21; planning doc III.4] One-page description and summary of program
        knowledge base, skills, dispositions, and Student Learning Outcomes (SLOs) that
        are assessed at multiple points before program completion, including entry, key
        points during the program, and exit (e.g., taken from course outlines, syllabi,
        the CED conceptual framework, fieldwork reports, performance assessments, etc.).
        Recommendation: ½-page narrative followed by approx. 8 SLOs.

     B. [NCATE 2, p. 21; p. 22, ¶ 1] One-page description and summary of how students
        are assessed at entry, at key points during program progress, and at exit (e.g.,
        application process, benchmarks, papers, portfolios, syllabi, projects, theses,
        and comprehensive exams). Assessments are explicitly tied to SLOs.

     C. [NCATE 2, p. 21; planning doc III.4] Copies of assessment instruments at entry,
        along the way, and at exit that measure progress toward SLOs, including evidence
        of incorporation of professional, state, and institutional standards (e.g.,
        tests, protocols, templates, rubrics, etc.). If multiple assessments are
        integrated into coursework, select key assessments only.

     D. Data displays showing (and analyses of) benchmarks of student progress linked to
        SLOs, overall and broken out by significant demographic subgroups. Minimum
        points: program entry; entry to culminating fieldwork; program exit.

     E. [NCATE 2, p. 22, ¶ 1] Decision rules and data points for determining candidate
        performance, used by candidates and faculty; what happens if progress is not
        satisfactory at key points during the program (e.g., interventions, review of
        progress, recommendations for improvement, assistance, etc.).

     F. [NCATE 2, p. 22] Documentation of formal candidate complaints and documentation
        of resolutions. Complaints must be written and signed and must comply with
        university grievance procedures. (Can include mention that informal complaints
        are handled on a case-by-case basis.) University regulations can be found at

     p. 1                                                                             2/16/07

II.  Academic programs: Assessment of Program Quality (How effective are programs?)
     [NCATE Standard #2, p. 22, ¶ 1]

     As in Section I, each item records the NCATE standard, page, and paragraph; the
     strategic planning document item; an "Initiate planning by" date; and status tabs
     for Planning, Piloting, and Implementing.

     A. [planning doc II.1 &] One-page description and summary of how program
        evaluations are conducted (nature of data collected; data-collection schedule;
        types of analyses conducted). If instruments/data overlap with student
        assessment, specify this.

     B. [NCATE 2, p. 21; p. 22, ¶ 1] Documentation of internal reviews (grad review
        self-study comments and responses, new program proposal/review, executive
        summaries of self-studies) and external reviews (WASC external comments and
        response; CCTC approval of program changes, e.g., 2042; NCATE external comments
        and responses).

     C. [NCATE 2, p. 22, ¶ 1] Evidence of our graduates' impact on students or clients,
        including during the first years of practice.

     D. Evidence of programs'/graduates' impact on their professional communities
        (e.g., school-university partnerships).

     E. [NCATE 2, p. 22, ¶ 1; planning doc II.1 & 2] Copies of both internal and
        external evaluation instruments for discerning program quality (surveys,
        protocols, templates, rubrics, state licensure exams, state “Program Completers”
        documentation, surveys of principals, evidence of internal and external review
        of grants and school-university partnerships, etc.).

     F. [planning doc II.1 & 2, 6] Data displays and analyses of program evaluation
        data, including the number of respondents broken out by significant demographic
        subgroups, and response rates.

     G1. [NCATE 2, p. 21; p. 22, ¶ 2] Decisions and actions taken as a result of
         evaluation data analysis (e.g., program improvement plan; course revisions;
         etc.).
     G2. Evaluation of decisions and actions (e.g., was the intended program enhancement
         achieved? Were there no adverse consequences?).
     G3. Continuous examination of the validity and utility of the evaluation process.

     H. [NCATE 2, p. 22, ¶ 2] Evidence of regular and part-time faculty review of data
        on their performance, and plans for improvement.
