					    Central New Mexico Community College
       School of Applied Technologies

            Executive Summary


                               Table of Contents

Part I: Applied Technologies Student Academic Achievement

A. Executive Summary
B. CNM Student Academic Achievement Philosophy/Mission
C. Listing of Programs
D. Model of CNM Student Academic Achievement Assessment Plan
E. Background and History of Student Academic Achievement Initiative
F. Roles and Responsibilities for Academic Achievement Activities
G. Reporting System for Communicating Results

Part II: Academic Achievement Assessment Results

A. Assessment Histogram Results for 2006-2007 Reports
B. Improvement Goals for 2006-2007
C. Improvement Goals 2006-2007 Results
D. Improvement Goals for 2007-2008 Assessment Cycle
E. Report Assessment Rubric
F. Programs Comprising Histogram Data

Part III: Student Academic Achievement Program Reports

Individual program reports may be viewed on the Applied Technologies Division
web site.

 Part I: Applied Technologies Student Academic Achievement Profile

A. Executive Summary

       Academic assessment in Applied Technologies during 2006-2007 was
characterized by further growth in the expertise of established assessment
efforts and by solid achievement from some programs whose earlier efforts had
been weaker. The “new” CNM School of Applied Technologies combined two former
TVI Divisions. After some initial confusion and uncertainty regarding
different administrative and academic cultures, faculty and staff have worked
to adapt to and learn each other’s traditions and styles. A new Strategic
Plan for the School was created through a collaborative process during the
summer and fall of 2007. As a result, efforts to implement the plan and
create a strong amalgam to serve community and students are underway. This
report contains summative outcomes data collected for the academic year
(2006-2007) by the CNM School of Applied Technologies.

                               Meta-Assessment Efforts

        The Student Academic Achievement/Curriculum team of the former TVI
Trades Department developed an assessment rubric for academic achievement
program reports that was used to guide and measure the quality of each
reporting program’s assessment efforts starting in the 2001-2002 academic year.
The current CNM School of Applied Technologies adopted essentially the same
rubric. The rubric allows the aggregated results to be quantitatively expressed
and generates a school-wide database. This technique has been emerging
nationally and is known as meta-assessment. Meta-assessment is used to
assess progress as a School in assessing learning outcomes. CNM is currently
using a similar rubric to assess efforts across the college.

       Starting in 2005-2006, an additional line was added to the rubric.
CNM has been working toward the measurement of “core competencies.” Core
competencies are defined as the skills, knowledge, and attitudes exhibited by
every CNM degree graduate. Exit competencies are the specific summative
learning outcomes unique to each program. Individual programs are charged
with measuring both their exit competencies and CNM core competencies for
AAS degree students. Certificate programs measure exit competencies. The
revised rubric provides a means of capturing the School of Applied
Technologies programs’ efforts in the assessment of CNM’s core competencies.

        The Academic Achievement Chair uses the rubric to assess each
participating program’s report. The aggregate results are displayed in this
report as histograms. These data provide the information to target
improvements for the coming year and create the study component of the PDSA
cycle. In the words of the rubric itself, future changes can then be
“purposeful improvements based on multiple data sources.”

       WorkKeys continues as one tool that can be used to assess portions of
program exit competencies and core competencies. Several programs included
data from WorkKeys in their reports during this assessment cycle. One
continuing challenge is the alignment among WorkKeys scores, program exit
competencies, and college core competencies. The exact relationship between
the three summative data sets remains unresolved.

                               Assessment Effort History

        The assessment of program exit competencies and CNM core
competencies was the biggest challenge of the 2004-2005 year. The Student
Academic Achievement Committee (SAAC), which created the degree competencies,
designed rubrics for some of the institute-wide degree competencies and
continues working on others. SAAC asked programs to collect data from degree
graduates using its newly finished communication rubric during the 2004-2005
academic cycle.

      The request for AAS graduate preliminary communication rubric data
created an impetus for change. Few, if any, programs that grant AAS degrees
had a mechanism in place to collect data from degree graduates. Most faculty
working within a technical program exhibited little ownership of AAS degrees.
The value added by additional Arts and Science course work was universally
regarded as something that happened outside the School and not within the
purview of technical faculty.

         After much discussion and encouragement, a capstone course solution
was adopted by several programs. Students entering in the fall of 2005 were
bound by this requirement. The one-hour capstone course provides students
with a facilitated opportunity to prepare a professional portfolio of their best work
that illustrates both certificate exit competencies and CNM core competencies.
The portfolio will also be used by the degree-granting discipline as the foundation
for core competency assessment.

       Just one program, Automotive Technology, used the communications
rubric during 2004-2005 to assess students in their final automotive
certificate course. This information will be used as baseline data to capture
the value added by the completion of an AAS degree.

       The 2005-2006 cycle, by contrast, found many more programs involved
in preliminary efforts to measure core competencies. Nine of fourteen
programs (64%) actually assessed at least one core competency. For the
2005-2006 academic cycle, SAAC asked for data to be collected, using provided
rubrics, on two core competencies: communication and teamwork. The
information was collected in last-semester technical courses leading to a
program certificate. Two programs, Carpentry and Welding, piloted initial
degree capstone courses. Capstone course enrollment increased starting with
the 2006-2007 year, as more students approached graduation and needed to meet
the requirement.

        Program exit competency measurement for 2005-2006 showed a
bifurcation of results. Programs from the former Trades department were
already doing summative assessment of certificate-level graduates. Programs
in the former Technologies department were not doing certificate-level
assessment. Exit competencies had been written, but assessment efforts had
foundered. As a result, these programs were faced with being asked to
implement both exit and core competency assessment in a single year.
Universally, they chose core competency assessment. This accounted for the
good compliance as a School in core competency assessment but the seemingly
weaker effort in exit competency assessment. A four-meeting summative
assessment class taught by the Academic Chair was offered again during the
2006-2007 academic year, and Program Chairs struggling with exit competency
measurement were urged to attend.

                        2006-2007 Assessment Efforts

        The 2006-2007 academic cycle was an important year for the School.
The college adopted a two-year course catalog. There is a lag between
requirement adoption and actual compliance, since students are bound by the
degree requirements of the catalog in place at the time of their enrollment.
Because of this lag and the new two-year catalog, programs that did not add a
capstone course for future degree graduate assessment (or designate some
other course as a last-semester requirement before graduation) are challenged
in their future summative assessment efforts.

       During 2006-2007, several programs that had established sound
assessment plans were successful in their implementation. Electronics
Technology welcomed a new Chair, who prepared the first detailed assessment
plan for the program. Other programs actively experimented and tried varied
summative assessment methods. The Geographic Information Technology (GIT)
program completed its first year with a full-time faculty member who also
served as Chair. The new GIT Chair completed training in summative
assessment, worked with the community advisory board to create program
outcomes and measurement rubrics, and actually pilot tested one of the
rubrics.

      Manufacturing Technology piloted WorkKeys as an assessment tool and
decided against further use. In contrast, Construction Management
Technologies successfully used WorkKeys as one part of the rubric designed to
measure its program exit competency involving communication, teamwork, and
problem solving. Several programs have incorporated industry-based
end-of-program exams. These exams provide external, exam-based subject-area
assessment that allows graduate score comparison to national norms. For
example, Air Conditioning, Heating, and Refrigeration uses ICE exams, and
Automotive uses NATEF exams. Although exams of this type lack authenticity,
they do provide one method of triangulation to strengthen databases. Industry
and community-based assessment of student performance, combined with internal
classroom and external exams, remains the most powerful means of obtaining
reliable, valid, and authentic data. Several programs are including industry
input via job shadows, co-ops, industry visits, industry-sponsored
competitions, and industry comment on student projects. Continued
experimentation to find program-appropriate assessment tools should be
anticipated in the coming year. The school’s Academic Achievement and
Curriculum Team will be developing a database of best practices regarding
authentic faculty and student involvement with industry. The overall goal
identified in the new Strategic Plan is a School with every student and
faculty member authentically connected to industry.

B. Applied Technologies Academic Achievement Philosophy/Mission

       The School of Applied Technologies faculty view students as lifelong
learners. To that end, we are driven to understand student goals, develop a
plan that will guide them to the successful achievement of those goals, and
create positive, value-added experiences that will move them to continually
seek to improve their professional skills and knowledge.

      The School of Applied Technologies Academic Achievement/Curriculum
Team supports the preparation of individuals for challenging positions in the
community workforce by assisting faculty in developing, implementing,
monitoring, and evaluating systems to measure and improve student learning.

C. Listing of Programs

       The School of Applied Technologies supports twenty-two career and
technical education programs (many with separate skill-set certificates),
seven apprenticeship programs, twelve Associate of Applied Science degree
programs, various student organizations, and a thriving cooperative education
program.

      Certificate Programs (22)

             Airframe Maintenance Technician
             Architectural/Engineering Drafting Technology
             Air Conditioning, Heating, and Refrigeration
             Automotive Technology
             Computer Animation
             Diesel Equipment Technology
             Electrical Trades
             Electronics Technology
             Film Crew Technician
             Geographic Information Technology
             Geomatics Technology
             Machine Tool Technology
             Manufacturing Technology
             Photonics Technology
             Powerplant Maintenance Technician
             Professional Pilot and Flight Instruction
             Residential Wiring
             Truck Driving

Associate of Applied Science Degree Programs (14)

             Aerospace Technology
             Architectural/Engineering Drafting Technology
             Aviation Technology (Pending FAA Approval)
             Computing Technology (Animation)
             Construction Management Technology
             Construction Technology
             Engineering Design Technology
             Electronics Technology
             Geographic Information Technology
             Manufacturing Technology


             Mechanical Technology
             Metals Technology
             Photonics Technology
             Transportation Technology

Apprenticeship Programs (7)

         Commercial Carpentry
         Electrical Trades
         General Trades
         Industrial Plant
         Iron Workers
         Plumbing
         Sheet Metal

D. Applied Technologies Academic Achievement Assessment Plan

     1. CNM core competencies and program exit competencies are identified
     in a collaborative effort between faculty, advisory committee (community
     and/or industry) representatives, recognized program models, students
     and other associated stakeholders.

     2. Measurement criteria for the exit competencies and CNM core
     competencies are established, and assessment rubrics are designed or
     selected.

     3. Program/Course documentation (syllabi, lesson plans, catalog copy,
     etc.) is designed around the competencies.

     4. The program is delivered.

     5. Student exit competencies are measured. (Certificate)

     6. CNM core competencies are measured. (Degree)

     7. Outcomes are evaluated.

     8. Curriculum is revised using the summative data.

     9. The process begins anew with the next year’s academic cycle.

E. History of Student Academic Achievement Assessment

            The Academic Achievement/Curriculum Team meets several times each
     term and practices the principles of CQI (Continuous Quality Improvement) by
     encouraging input from Faculty, Staff, and Students. The team is composed of
     Program Chairs and interested faculty. The team has developed a mission,
     vision, goals, and objectives statement for SAA congruent with that of the
     College; drafted measurement guidelines for each of the programs within the
     Division; established assessment criteria; scrutinized Skill Standards and
     National Occupational Testing instruments for applicability; scheduled the
     implementation and overseen practice of the assessment program; and collected
     and processed data for inclusion in the Annual Applied Technologies Academic
     Achievement Report.

             The team has developed a rubric that is used school-wide to assess each
     degree or program’s assessment efforts. Every program continues to refine
     rubrics to measure each of the five program exit competencies during the
     academic year. Several programs have implemented WorkKeys as part of their
     assessment plans. CNM core competency measurement is being implemented,
     with more programs and degrees participating each year.

            Faculty are trained to identify where each exit competency is taught in
     their program. Program outcomes are used by each instructor to modify
     individual course curriculum. A CNM course for faculty (SAA 101) is offered
     to assist faculty in the process. The course has acted as a cultural change
     agent.

F. Roles and Responsibilities for Student Academic Achievement
Assessment Activities

       Faculty are the driving force behind the creation, delivery, monitoring,
and evaluation of the Division’s degrees and programs. Administration provides
guidance as necessary to assure Institute/School goals and objectives are being
met. Staff provide support services. Students and community (Advisory Teams)
provide valued input. The SAA/Curriculum Team membership is composed of
program Chairs. Program Chairs write the actual Academic Achievement report
with help and input from discipline faculty. Current program Chairs are listed
below.
            School of Applied Technologies Program Chairs

   1. Alain Archuleta-Skills USA
   2. Ernest Arko-Electrical Trades
   3. Amy Ballard-GIT/Geomatics
   4. Paul Baxter-Truck Driving
   5. Gordon Bennett-Photonics
   6. Paul Brownlow-Carpentry
   7. Phyllis Cece-Architectural Engineering Drafting
   8. Ed Fotouhie-Engineering Design Technology
   9. Vardis Gaus-Truck Driving
   10. James Gore-Automotive & Diesel Equipment
   11. Ron Hackney-Welding
   12. Gordon Hall-Architectural Engineering Drafting
   13. Scott Henriksen-Curriculum Team/Academic Chair
   14. Andrew Huertaz-Electronics Technology
   15. Darrell Leland-Animation
   16. Fabian Lopez-Manufacturing Technology
   17. Jason Manzanares-Aviation
   18. Antonio Olguin-Plumbing
   19. Larry Quiggle-Air Conditioning Heating and Refrigeration
   20. David Ruff-Construction Management
   21. Wayne Woody-Machine Tool Technology
   22. Paul Zalesak-Landscaping

   G. Reporting System for Communicating Results

     The Applied Technologies SAA/Curriculum Team members communicate with
appropriate stakeholders via cluster meetings, staff meetings, division
meetings, other team/committee meetings, community advisory committee
meetings, student activities group meetings, etc.

     The SAA/Curriculum Chair(s) communicate with faculty and staff regularly
through e-mail, at Division meetings, and at individual program cluster
meetings. The Chair presented results at the North American Council of
Automotive Teachers conference held in Edmonton, Alberta, in July 2006 and at
the New Mexico Higher Education Assessment and Retention Conference in March
2007. The Division was honored to receive an outstanding assessment
initiative award from the National Council of Instructional Administrators
(NCIA) in 2004.

     Student Academic Achievement information is published in the annual
Applied Technologies Assessment Report. The executive report and individual
program reports are published on the CNM Applied Technologies web site and
the Division’s internal K drive. The information is also shared on a college-wide
basis by distributing printed copies to all of CNM’s Academic Achievement
Chairs. The reports are presented annually to the College Leadership Team.
Aggregate data is combined with other CNM Schools and published on the CNM
web site.

          Part II: Academic Achievement Assessment Results

A. Assessment Histograms
        Each histogram or chart presented represents one criterion measured by
the division academic achievement report assessment rubric. The members of
the Academic Achievement Team chose the specific criteria. Because some
reports had unclear or unreported sections, the totals for each histogram may
differ slightly. The rubric used to collect the data for the displayed
histograms appears at the end of this section.

                    CNM School of Applied Technologies
                        2005-2006 and 2006-2007
                Academic Achievement Reports Assessment

   [Histogram: Exit Competencies Measured — number of programs by number of
   competencies measured (five, four, three, one or two, zero);
   2005-2006 (N = 15) and 2006-2007 (N = 17 for all histograms)]

      This chart shows the number of program exit competencies actually
 being assessed by the reporting programs. Five competencies are normative.

   [Histogram: Multiple Measures Over Time — number of programs by rubric
   category; 2006-2007]

         This chart shows the number of programs assessing with multiple
measures. The use of multiple assessment measures over time increases data
reliability.

   [Histogram: Summative Data Sources — number of programs using internal and
   external data vs. internal data only; 2005-2006 and 2006-2007]

       This chart shows the source for assessment data reported by programs.
Internal data means data gathered by the program, usually in the classroom.
External data means data from sources outside of CNM. Employer surveys of
student skills during a shadow or coop experience, or an accrediting agency
licensing exam are typical external data sources. External data sources increase
confidence in outcomes measurement.

   [Histogram: Data Summary Quality — number of programs by rubric category;
   2005-2006 and 2006-2007]

        This chart compares the quality of report data summaries. Programs to
the left had more complete information in their data summaries. See the
reporting rubric for actual category definitions.

   [Histogram: Quality of Curriculum Improvements — number of programs by
   rubric category; 2005-2006 and 2006-2007]

       This chart compares the quality of reported curriculum improvements.
Programs to the left instituted targeted data-based improvements. Programs to
the right reported no improvements or anecdotally-based improvements. See the
reporting rubric for actual category definitions.

                     Applied Technologies 2006-2007
               CNM Core Competencies Measurement (N = 15)

   2-3 core competencies measured, three terms:          2
   2 core competencies measured, 1-2 terms:              2
   Limited core competency measurement:                  6
   Plans made for measurement:                           4
   No evidence of measurement:                           1

                    CNM Applied Technologies Division
                  Core Competency Assessment 2005-2006
        (CNM Core Competency Measurement,* Degree or Certificate Level)

   Two or three CNM core competencies assessed for three
   terms of the academic cycle:                              4
   One or two CNM core competencies assessed for one or
   two terms:                                                5
   Limited core competency assessment:                       0
   Planning for core competency assessment:                  5
   No evidence of or plans for core competency assessment:   0

   * May include WorkKeys with core competencies.

B. Improvement Goals for 2006-2007

       Increase reporting participation.

       Increase the number of programs reporting exit competency outcomes.

       Improve report format compliance.

       Increase the number of programs with certificate/degree capstone
       courses.

C. Improvement Goal Results 2006-2007

     Programs reporting increased from 15 to 17.

     Programs reporting measurement of all five exit competencies increased
     from 8 to 10.

     Programs in the highest category of data summary quality increased from
     3 to 7.

     Programs reporting at least some core competency measurement
     increased from 9 to 10.

D. Improvement Goals 2007-2008

     Start implementation of School Strategic Plan by identifying best practices
     for authentic industry involvement by faculty and students.

     Offer training for new faculty in summative assessment techniques to
     improve data collection and facilitate curriculum improvement.

E. Report Assessment Rubric

              Applied Technologies Academic Improvement Plan Rubric
        For ______________________________ Year ___________________

Use this rubric to assess your academic improvement measures.

Measures all five program exit outcomes
   4 Distinguished: Measures 5 outcomes.
   3 Accomplished:  Measures 4 outcomes.
   2 Apprentice:    Measures 3 outcomes.
   1 Novice:        Measures 1 or 2 outcomes.
   0:               No learning outcomes measured.

Uses multiple measures over time
   4 Distinguished: Many measures over time.
   3 Accomplished:  A few measures over time.
   2 Apprentice:    A couple of measures over time.
   1 Novice:        One measure over time.
   0:               No measures over time.

Uses both internal and external data
   4 Distinguished: Uses both internal and external data.
   Lower levels:    Uses one data source.

Has a complete data summary
   4 Distinguished: Tells how, who, what, when, and why.
   3 Accomplished:  Tells how, who, when, and what.
   2 Apprentice:    Tells who, what, and when.
   1 Novice:        Tells what and when.
   0:               No data summary.

Improvements to curriculum
   4 Distinguished: Purposeful improvements based on multiple data sources.
   3 Accomplished:  Simple improvements. Some data.
   2 Apprentice:    Improvements. Anecdotal data.
   1 Novice:        Improvements. No data.
   0:               No improvement changes.

F. Programs Comprising Histogram Data (2006-2007)

          1. Air Conditioning Heating and Refrigeration
          2. Architectural Engineering Drafting Technologies
          3. Automotive Technology
          4. Carpentry
          5. Computer Animation
          6. Construction Management Technologies
          7. Diesel Equipment Technology
          8. Electrical Trades/Residential Wiring
          9. Electronics Technology
          10. Geographic Information Technologies
          11. Landscaping
          12. Machine Tool
          13. Manufacturing Technology
          14. Photonics Technology
          15. Plumbing
          16. Truck Driving
          17. Welding

                                 Programs Not Reporting

          1. Aviation (all)
          2. Film Crew Technician (new)
          3. Engineering Design Technology (discontinued)
