Using Online Survey Software to Streamline Assessment Reporting and Documentation – The RP Group


Inge Bond, Research Analyst
Heidi Diamond, Faculty Accreditation Liaison
10.   Because we really don’t have enough paperwork to keep ourselves busy!

9.    Because when ACCJC says “Jump!” we say, “How high?”

8.    Because we’re tired of people thinking SLO is just an abbreviation for San Luis Obispo

7.    Because why should K-12 teachers have all the accountability fun?!

6.    Because program review is so last accreditation cycle

5.    Because creating assessment rubrics can be more fun than a trip to the dentist

4.    Because creating SLO maps is almost like mapping out your summer vacation in Europe

3.    Because writing SLOs is less confusing than figuring out who to blame for gas prices

2.    Because keeping a sense of humor in the face of challenging accountability
      standards lets us have a laugh and still move in the right direction

1.    Because it’s important to move our assessment process forward so our conversations
      about improving student learning are meaningful - and isn’t that what it’s all about?
   National & State
    ◦ Push for greater accountability in higher education
    ◦ ACCJC deadlines approaching for reaching the SLO
      Proficiency and Sustainable Continuous Quality
      Improvement (SCQI) levels

   Locally
    ◦   No clear path for completing the SLO reporting task
    ◦   Clunky system
    ◦   Inconsistent compliance
    ◦   No dissemination/faculty support structure
   User-friendly reporting system
    ◦ Low cost
    ◦ Supports both quantitative and qualitative reporting
    ◦ Supports authentic assessment
    ◦ Integrated with program review
    ◦ Accessible both on campus and off

   A tool in two parts
    ◦ SLO Tracker: faculty and staff reporting tool
    ◦ SLO Status Report: College data collection tool
   Demonstration: Business Department
   Demonstration: Student Activities and
    Campus Center

SLO assessment status by department. Unstarred columns count all courses,
including Directed Studies and Special Topics; starred columns exclude
inactive, Directed Studies, and Special Topics courses.

Department                | Assessed | Total | % Assessed | Assessed* | Total* | % Assessed* | Notes
--------------------------|----------|-------|------------|-----------|--------|-------------|------
Applied Arts and Sciences |          |       |            |           |        |             |
Administration of Justice |    0     |  27   |    0.0%    |     0     |   25   |     0.0%    | 2 Directed Studies/Special Topics
Architecture              |   17     |  28   |   60.7%    |    17     |   26   |    65.4%    | 2 Directed Studies/Special Topics
Child Studies             |    4     |  38   |   10.5%    |     4     |   37   |    10.8%    | 1 Special Topics
Engineering               |    1     |  31   |    3.2%    |     1     |   30   |     3.3%    | 1 Special Topics
Fashion Design            |    8     |  25   |   32.0%    |     8     |   23   |    34.8%    | 2 Directed Studies/Special Topics
Health Care Technology    |    8     |  23   |   34.8%    |     8     |   22   |    36.4%    | 1 Special Topics
Interior Design           |    4     |  34   |   11.8%    |     4     |   33   |    12.1%    | 1 Special Topics
Paralegal                 |   10     |  15   |   66.7%    |    10     |   14   |    71.4%    | 1 Directed Studies
Park Management           |    0     |  30   |    0.0%    |     0     |   30   |     0.0%    |

Accounting                |    5     |   6   |   83.3%    |     5     |    5   |   100.0%    | 1 course is ACCTG 100, infrequently offered
Business                  |   18     |  18   |  100.0%    |    18     |   18   |   100.0%    |
Computer Applications     |   12     |  18   |   66.7%    |    12     |   16   |    75.0%    | Deactivated CA 017, 030M, 032C, 046D, CA051, 063C, 067, 073, 079, 081B, 091-093, 094A-C & 100; 2 Special Topics/Directed Studies
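The percentage columns above are simple ratios of assessed courses to total courses, with and without Directed Studies/Special Topics. A minimal sketch of that calculation (the function name and structure are illustrative, not part of the college's actual tooling):

```python
def assessment_rates(assessed, total, excluded):
    """Return (% of all courses assessed, % assessed excluding
    Directed Studies/Special Topics), each rounded to one decimal."""
    all_pct = 100.0 * assessed / total
    # Excluding Directed Studies/Special Topics shrinks the denominator,
    # so the adjusted rate is always >= the all-courses rate.
    active_pct = 100.0 * assessed / (total - excluded)
    return round(all_pct, 1), round(active_pct, 1)

# Architecture: 17 of 28 courses assessed, 2 Directed Studies/Special Topics
print(assessment_rates(17, 28, 2))  # → (60.7, 65.4)
```

This reproduces, for example, the Paralegal row: 10 of 15 courses assessed (66.7%), rising to 71.4% once the 1 Directed Studies course is excluded.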
   Department/Program level dialogue about
    SLO Assessments with recommended changes
    and associated resource requests is
    summarized in the annual program review
    process (see next slide for an example)
   Effort led by Dean of Instruction, Faculty
    Liaison, Researcher

   Introduced to and approved by participatory
    governance bodies

   House calls to department chairs
    ◦ Ongoing customer support
      Remember, patience is a virtue!
   Each term, department chairs refer to their
    SLO Status Report to determine which courses
    need to be assessed, then inform faculty
    ◦ Assessments performed
    ◦ Assessments reported

   Spring Semesters:
    ◦ Departmental discussion of findings, changes
      planned, needs identified
      Reported in program review
   Faculty resistance
    ◦ Union contract
    ◦ Informing and supporting associate faculty
    ◦ Centralization of teaching data

   Administrative Services Outcomes
    ◦ In development
    ◦ Variety among departments makes development of
      a common reporting tool challenging
   Technical issues: course deactivations,
    cross-disciplinary assessments (e.g., English
    faculty assessing IS courses)
    ◦ Recommendation: ask each department which
      courses to include

   Finding the balance with a beta tool (the author's
    desire to upgrade vs. users' need for stability)
   SLO Evidence Binder
    ◦ Created for each department/program
    ◦ Assessment resources, URLs for SLO Tracker and
      SLO Status Report, rubrics, evidence
    ◦ Accrediting teams can refer to SLO evidence binders
   Assessment as an institutional norm

   56% course-level outcomes assessment (up from
    20% one year ago)

   College wide discussions of how results
    inform decision making

   Faculty seeking integration of SLO
    assessment and program review
   Improve communication among the SLO
    Committee, Division Chairs, and Departments
    ◦ Division demonstrations
    ◦ Data collection for annual ACCJC report

   Program/Degree Learning Outcomes
       Using Online Survey Software to Streamline Assessment Reporting and Documentation

                                 RP Group Conference: April 5, 2012

                       Inge Bond, Research Analyst

              Heidi Diamond, Faculty Accreditation Liaison


Sample SLO Tracker, Instructional:

Sample SLO Tracker, Student Services:

Sample SLO Status Report:

SLO Evidence Binder Contents:

      Department’s/program’s SLO Tracker URL (for distribution to faculty)
      Department’s/program’s SLO Tracker results URL (for department chair)
      Department’s/program’s SLO Status Report URL (for distribution to faculty)
      Department’s/program’s SLO Status Report results URL (for department chair)
      Text of student learning outcome and recommended assessment for each course in department
      Instructions for Faculty Regarding SLO Assessment & Reporting
      Instructions for Department Chairs Regarding SLO Reporting and Use of Results (departmental
       discussion template, program review summary instructions)
      ACCJC Rubric for Evaluating Institutional Effectiveness – Part III: Student Learning Outcomes
      Reference articles: authentic assessment, SLO Terminology Glossary (ASCCC)
      SLO Course and Program Assessment Schedule
      SLO Assessment Sampling Guidelines
      Example of Department Summary Report
      Rubrics and other assessment instruments, assessment data
