Liberal Studies Department
California State University Stanislaus
(Revised June 2009)
What data was collected? How were data evaluated and recommendations made?
Native CSU Stanislaus Liberal Studies majors begin compiling personal portfolios as freshmen
in LIBS 1000: Beginning Field Experience. Additional portfolio work is submitted for
evaluation in LIBS 2000: Intermediate Field Experience and LIBS 3000(WP): Community and
Diversity. Summative assessment is completed in LIBS 4960: Senior Seminar, intended for the
final undergraduate term.
LIBS majors transferring from four feeder community colleges have the opportunity to develop
personal portfolios in articulated courses on those college campuses. Students transferring from
other institutions may not begin the portfolio development process until the junior year or later;
these transfer students’ portfolios may not be created until LIBS 3000 (WP): Community and
Diversity.
(WP). Final assessment occurs in LIBS 4960: Senior Seminar. Portfolio requirements and
assessment forms are included in this document.
The Liberal Studies Department Assessment Plan is based in large part on required portfolio
elements. For the past several years, LIBS faculty have developed and refined student learning
goals and collaborated on which data would provide evidence of each learning goal and the steps
necessary to collect and evaluate those data. The following discussion is a brief summary of
LIBS faculty assessment activities.
In 2006-2007 Liberal Studies Department faculty examined how effectively our majors made
explicit connections between California K-8 Content Standards and subject matter knowledge in
LIBS major coursework. LIBS faculty unanimously agreed that eighty to eighty-five percent of
LIBS majors would be assessed as competent in the Senior Seminar course. Faculty
agreed to collect data in sections of LIBS 4960: Senior Seminar where majors’ subject matter
competency is assessed. Faculty decided to use the LIBS Subject Matter Assessment Worksheet
and LIBS 4960 Rubric for Assessment, which are included in this document, as the basis for this
assessment.
In spring 2007, LIBS 4960 faculty assessed student subject matter competencies and recorded
scores on departmental forms, reserving a copy of each for review at the summer 2007
LIBS Department workshop. At the workshop, faculty members examined student scores on
subject matter assessments and their connections to California K-8 Content Standards. Faculty
sought to determine if Senior Seminar students accurately utilized specific subject matter
vocabulary to establish their comprehension of and connection with the State Content Standards.
It became evident that grades in subject matter courses did not correlate to competency scores.
To establish subject matter competency in State-mandated subject matter areas (Language and
Literature, History and Social Science, Mathematics, Science, Visual and Performing Arts, and
Human Development), each major portfolio included thirteen to twenty-two artifacts from
previous coursework. Each artifact required a written introduction explaining how it related to
the course taken and how it related to specific subject matter content standards. For each subject
matter requirement faculty recorded: course taken, grade received, and if applicable, community
college attended. LIBS faculty sought to answer the following:
Is there a correlation between grade received in the course and the competency score?
Do students from particular classes demonstrate competency more often than students from others?
Is there a correlation between the course taken and the competency score?
Originally, LIBS faculty sought to determine if conclusions regarding courses at feeder colleges
could be connected to subject matter competence. Upon completion of the review of subject
matter competence data, LIBS faculty agreed that there was no correlation between assessment
scores and specific classes, grades, community college campus, or any other element in this
process. For example, a LIBS major who earned a “C” grade in community college chemistry
successfully demonstrated “high” competence in the chemistry subject matter assessment.
LIBS 4960 students who were assessed as “minimally competent” in subject matter areas
had the opportunity to improve their competence level by completing research projects and
presentations during the course. By the end of the spring 2007 term, all LIBS majors enrolled in
Senior Seminar successfully demonstrated subject matter competence in all subject matter areas.
Competence levels of LIBS majors significantly exceeded the expectations of LIBS faculty.
After reviewing data, LIBS faculty concluded that our majors enrolled in LIBS Senior Seminar
at the correct time, after subject matter coursework was complete. In addition, LIBS faculty
unanimously agreed that evidence of student learning improved as faculty members
communicated more precisely in explaining the goals of the assignment. As faculty explained
the importance of integrating discipline vocabulary in subject matter summaries and then
directed student reviews of subject matter standards in classes, student assessments improved.
As noted earlier, LIBS faculty sought to determine whether a course at one community college
more effectively prepares LIBS majors in a specific subject matter area than an equivalent course
at another college; the review of subject matter competence data confirmed that there is no
correlation between assessment scores and specific classes, grades, community college campus,
or any other element considered. Faculty planned to re-evaluate
subject matter competence assessment data in 2007-2008 with emphasis on primary and
secondary assessments. Faculty agreed to modify the assessment of LIBS seniors to determine
how many were assessed as “competent” in their first submission of subject matter artifacts as
opposed to their final review. The modified LIBS assessment worksheet, for use in 2007-2008,
is included in this document.
The later review of subject matter competency determined that four of sixty students failed to
achieve competency. These results might have stemmed from a failure to follow directions or
from the need for additional time to complete the assignment. The results were consistent with
earlier findings: clear instructions yield high scores.
In 2007-2008 LIBS faculty agreed to collect data on LIBS majors’ reflections on volunteer
service activities in K-8 classrooms. Faculty collected random portfolio artifacts regarding
observations in K-8 classrooms from each section of LIBS 1000, 1010, 2000, 2010, and 3000.
Artifacts were identified by class and code and divided into two groups: lower division (1000,
1010, 2000, and 2010) and upper division (3000). Distinctions of class levels were made since
assignments and faculty expectations varied by course level (lower division vs. upper division).
At the 2008 Summer Workshop, all LIBS full and part-time faculty participated in our
assessment activities to sustain consistency in curriculum and grading across department courses.
As in the previous year, LIBS faculty expected eighty to eighty-five percent of majors to be
assessed successfully on their reflection and analysis of fieldwork.
LIBS faculty discussed possible criteria and the “signs,” or evidence, of each criterion.
Fieldwork reflection assignments varied by course and grade level, freshman versus junior, and
majors’ service learning notes had wide-ranging differences. After reviewing field work
assignments in K-8 classrooms, faculty reached consensus on evaluation criteria with the intent
of rating each lower division artifact on a 5-point scale as: “strong evidence of” (5 points), “some
evidence of” (3 points), or “no evidence of” (0 points) the following:
Field notes show attention to details of the experience.
Student reflects on experience beyond simple reporting.
Student includes connection to coursework in examination of experience.
LIBS faculty read a sample case study from the lower division group to establish congruity in
each score and the corresponding “signs” of each element. Over several hours, the group
reviewed a ten percent sample (sixteen lower division and twenty-two upper division) of field
work reflection papers and field notes. Each paper was evaluated by two faculty members.
Efforts were made to assure that faculty members did not assess their own students’ work. When
faculty scores were the same, the score was entered. If the faculty scores varied, a new score
between the two was assigned.
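The two-reader scoring and reconciliation procedure described above can be sketched as follows. This is a minimal illustration only: the midpoint rule and the sample ratings are assumptions (the report states only that a score “between the two” was assigned when raters disagreed, without specifying how).

```python
from statistics import median

# Department's 5-point scale for fieldwork artifacts:
# "strong evidence of" = 5, "some evidence of" = 3, "no evidence of" = 0.

def reconcile(score_a: int, score_b: int) -> float:
    """Combine two faculty ratings of the same artifact.

    If the two raters agree, that score is entered; if they differ,
    a score between the two is assigned (here, the midpoint -- an
    assumption, since the report does not specify the rule used).
    """
    if score_a == score_b:
        return float(score_a)
    return (score_a + score_b) / 2

# Hypothetical paired ratings for one criterion (not actual data):
paired_ratings = [(5, 5), (5, 3), (3, 3), (5, 5), (3, 0)]
final_scores = [reconcile(a, b) for a, b in paired_ratings]
print(median(final_scores))  # median score for this criterion
```

The report summarizes each criterion by its median on the 5-point scale, which is why the sketch ends with a median rather than a mean.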
LIBS faculty concluded that student field notes at all levels demonstrated attention to detail in
reflections of field work, that upper and lower division students reflected on their field
experiences beyond simple reporting, and that all papers included connections to the LIBS
coursework in the discussion of the experience. The median scores for each element of each
lower division and upper division group were 4.0 on a 5-point scale. Our majors have
thoughtfully and critically examined their volunteer service activities in California K-8
classrooms, and majors’ portfolios included personal evidence of such. In addition, LIBS
majors exceeded faculty expectations for successful assessment.
At the same workshop, LIBS faculty discussed potential learning goals for review in 2008-2009.
The group agreed to examine the success of LIBS majors in their demonstration of educational
technologies relevant to teaching and learning.
In fall, winter, and spring 2008-2009, LIBS 4960: Senior Seminar faculty agreed to collect a ten
percent sample of portfolio artifacts submitted by LIBS majors as evidence of Portfolio Tab 6:
Technology. Artifacts from LIBS 1000, LIBS 1010 and LIBS 4960 are traditionally included in
the technology section of the LIBS portfolio.
In the June 2009 workshop, all full time and part-time LIBS faculty agreed to review the LIBS
learning goal on demonstration of educational technologies relevant to teaching and learning. At
the outset of the assessment session, faculty unanimously projected an eighty to eighty-five
percent success rate with this learning goal, anticipating that up to fifteen percent of majors
would not achieve expected results.
Two copies of each student technology artifact were coded in sequence (by class and term) and
separated for evaluation. Each artifact was assessed two times by different faculty, making
certain that faculty did not assess their own students. Each artifact was identified by the type(s)
of technology employed, e.g., Word document, spreadsheet, computer-generated chart, computer-
generated pictures, PowerPoint presentation, or web page. In each case, artifacts were rated as: “strong
evidence,” “some evidence,” or “no evidence” of technology relevant to teaching and learning.
When faculty evaluations were complete, artifacts were returned to the original numerical
sequence and the two scores for each were compared. If the two scores were identical, the item
received that score. If scores varied, a new score between the two original scores was identified.
Of the twenty examples reviewed, sixteen (80%) were rated as “strong evidence,” three (15%)
were rated between “strong evidence” and “some evidence,” one (5%) was rated as “some
evidence,” and no artifact received a score of “no evidence.”
Multiple examples of different educational technologies were reviewed:
PowerPoint 11
Word document 6
Computer-generated chart 4
Web resources list 3
Publishing software 2
It is clearly evident that LIBS majors demonstrate computer-based technologies relevant to
teaching and learning and include representative samples in their personal portfolios.
Considerable discussion followed as faculty considered a 2009-2010 learning goal for review at
the summer 2010 LIBS Workshop. Faculty achieved consensus on the following goal:
demonstrate sensitivity to the diverse cultural, linguistic, and learning abilities of students, and
understanding of a variety of teaching strategies to effectively teach all learners. Because of the
complex nature of this learning goal, faculty agreed that successful assessment of this goal will
require the following: (1) modification of the LIBS (fieldwork) Performance Appraisal Form to
include an additional category on “sensitivity to diverse abilities of students,” and (2) addition of
question(s) to the LIBS 4960 Exit Survey regarding understanding of various teaching strategies.