General Education Assessment Guideline
George Mason University
Revised on August 24, 2009
Mason’s culture of learning outcomes assessment is characterized as course-embedded, faculty-led, and
improvement-focused. As with other program-level assessment activities (e.g., writing, quantitative
reasoning, critical thinking, etc.), the assessment of the general education program at Mason is not an
evaluation of individual students or faculty members. It focuses on the overall effectiveness of the
general education program.
Mason’s General Education program comprises twelve areas of study, and each area is referred to as a
“category” throughout this guideline. The assessment follows four major steps (see Figure One):
1. Defining Common Learning Outcomes: for each general education category, there should be a
set of common learning outcomes across all courses regardless of the discipline. The
assessment focuses on two questions: to what extent faculty address these learning outcomes
in their courses and how well students achieve these outcomes.
2. Assessing Learning Outcomes and Collecting Data: all assessment is embedded in the course.
Faculty members or course coordinators provide evidence of course content and pedagogy, and
collect and submit samples of student work, in a process detailed later in this guide.
3. Analyzing and Reviewing Assessment Data: faculty teams develop review criteria and standards
and conduct the review; the Office of Institutional Assessment provides assistance with data
analysis. The results are shared with the General Education Committee and the faculty who
participate in the assessment. The aggregated results are reported to the State Council of
Higher Education for Virginia (if required) and are used for the SACS reaffirmation of accreditation.
No individual faculty results are made public.
4. Implementing Curricular Improvement: The ultimate goal of the general education assessment
is to use data to identify the strengths and weaknesses of the program and plan for curricular
improvement. For example, faculty may discover ways to modify existing course content, tests,
or assignments to better align the outcomes of the tests/assignments with the common learning
outcomes for the category.
Faculty suggestions and recommendations regarding this Guideline are welcome. Contact Rick Davis
(firstname.lastname@example.org), Karen Gentemann (email@example.com), or Ying Zhou (firstname.lastname@example.org).
Figure One: Learning Outcomes Assessment Cycle
Step One: Specifying Common Learning Outcomes → Step Two: Assessing Learning Outcomes →
Step Three: Reviewing Assessment Data → Step Four: Implementing Curricular Improvement
Targeted General Education Categories
The State Council of Higher Education for Virginia (SCHEV) requires all state institutions to assess
student learning in six areas: written communication, quantitative reasoning, scientific reasoning,
critical thinking, oral communication, and information technology.
Five of these areas have a corresponding category in Mason’s general education program and the sixth
area, critical thinking, is addressed in synthesis courses. Faculty committees have identified explicit
learning outcomes for these areas and the assessment is embedded in relevant general education
courses and carried out cyclically.
This General Education Assessment Guideline applies to the remaining general education categories:
Arts
Literature
Western civilization/world history
Global understanding
Social and behavioral sciences
Synthesis
Specifying General Education Learning Outcomes
The first step in conducting learning outcomes assessment is to specify expectations for student
learning. Learning outcomes (also called “objectives” or “goals”) are the knowledge, skills, attitudes,
and habits of mind that students take with them from a learning experience (Suskie, 2007).
Interdisciplinary faculty groups identify common learning outcomes for each targeted category across
courses/disciplines. Once these learning outcomes are approved by the General Education Committee,
they become the basis for learning outcomes assessment. Faculty who teach general education courses
are expected to include a majority of these learning outcomes in their syllabi, in addition to their
course-specific learning outcomes. The assessment process focuses on demonstrating that these common
learning outcomes are emphasized and that student learning in these areas is assessed in the course.
Assessment Method: Electronic Course Portfolios
Mason’s assessment approach is adapted from a successful Course Portfolio System developed by the
College of William and Mary and recently used as part of a SACS reaccreditation process. Course
portfolios serve three purposes:
1. General Education Assessment: Taken together across courses and categories, the portfolios
provide a clear picture of the overall effectiveness of the General Education program at Mason.
2. Learning Outcomes Assessment: The portfolios provide direct evidence of student achievement
in general education learning outcomes through samples of student work.
3. Course Review: The portfolios provide the most updated information about what is being taught
in the course, what kind of learning experiences are offered to students, and what assessment
strategies faculty use.
Course portfolios will NOT be used to evaluate faculty teaching, review individual faculty, or assess
individual students. In preparation for the SACS reaccreditation review, the targeted general education
categories are assessed in two academic years: 2008-09 (arts, literature and western civilization) and
2009-10 (global understanding, social and behavioral sciences, and synthesis). After 2010, these
categories will be assessed on a six-year cycle reviewing one category each year.
Who will Submit a Course Portfolio?
When a general education category is under review, all faculty (regardless of appointment status, full-
time or adjunct) who teach an approved course or a section of the course during fall and spring
semesters may be required to submit a course portfolio. For courses with numerous sections, the Office
of Institutional Assessment samples about 15% of the sections. For faculty who teach the same course
in both fall and spring semesters, only one course portfolio is required. Faculty who teach multiple
sections of a course in a given semester submit a portfolio for one section. Some faculty members (9-
month term faculty, adjunct, teaching assistant and post-doc) are eligible for a modest honorarium for
completing a portfolio.
Who will Review Course Portfolios?
Faculty Review Teams are appointed by the General Education Committee to conduct portfolio reviews
for each general education category. Reviewers receive a modest stipend in recognition of their work.
The reviewers are faculty who either teach/coordinate general education courses or serve on the
General Education Committee. All review teams have faculty members from different disciplines,
including both an expert and a non-expert in the field(s) under review. They use a common, category-
specific rubric to rate whether a course adequately meets the general education learning goals through
the resources/information provided in the course portfolio. Portfolio review rubrics for each category
are posted on the General Education Assessment website.
Major Components of a Course Portfolio
An electronic course portfolio has the following components:
Course/Section Syllabus
Faculty Reflection
Course Assignments/Projects
Samples of Student Work
Student Survey Results
Review Team Ratings
1. Course/Section Syllabus
Faculty are encouraged to include in their syllabi the selected learning outcomes from the general
education category in addition to course specific learning outcomes.
2. Faculty Reflection
A 1-2 page narrative, focusing on the following questions:
a. Which general education learning outcomes were addressed in the course? How were they
addressed?
b. How did the faculty assess these learning outcomes?
c. How well did the students achieve these general education learning outcomes, i.e.,
approximately what percentage of students were judged to be highly competent, competent, or
less than competent according to the faculty member’s own criteria? (This is NOT a description
of the overall grades, but rather an opportunity to share observations on student achievement
of specific learning outcomes.)
d. Optional: if the faculty member teaches the course again, are any changes being contemplated?
3. Course Assignments/Projects
Faculty submit a maximum of 3 assignments/projects they have created and used in class to address
or assess the general education learning outcomes for the category. The assignments/projects may take
different forms, e.g., individual/group papers, lab reports, websites developed by students,
presentations, performances, videos, essay/short answer exams, multiple-choice exams, etc. Along with
these, faculty are also required to submit:
a. Instructions provided to students, if not included with the assignment
b. Grading guides, checklists or rubrics (if applicable) (NOT answer keys for the exams)
c. A short description mapping these assignments to the corresponding learning outcome(s) for
the category
4. Samples of Student Work
From among the assignments/projects above, faculty select ONE assignment and provide samples of
student work:
Faculty provide the actual work of 3-6 randomly selected students from the course/section.
The Office of Institutional Assessment does the random sampling and provides faculty with a list
early in the semester. If more than one randomly selected student drops out during the
semester, a replacement sample will be provided.
For each student work provided, faculty specify the level of achievement: highly competent,
competent, or less than competent. If no “highly competent” work was randomly selected,
faculty can provide an additional sample showing high competence.
Faculty are also encouraged to provide additional comments about the assignment or the
student work sample that may help with the review process.
Written work (i.e., papers, lab reports, essay/short answer exams, etc.) is highly encouraged.
Both electronic and hard copies are acceptable.
For student products accessible via the Internet: faculty members only need to provide URLs
and specify the competence level for each.
For student presentations or performance: faculty can provide PowerPoint files, videos or audio
samples. Technical assistance with editing and uploading media is available upon request.
If multiple-choice tests are the only method used in class, faculty should identify which
questions address the general education learning outcomes and provide a grade distribution of
the test based on the entire class (not only the randomly selected students).
5. Student Survey
At the end of the semester, all students enrolled in the general education category under review are
surveyed by the Office of Institutional Assessment. The survey includes questions about course
experiences directly related to the general education learning outcomes. The survey results provide
indirect measurement of student learning in the course and are reviewed by the General Education
Committee. Faculty members are encouraged to remind students of the importance of participating in
the survey.
6. Portfolio Review Ratings
A faculty team, appointed by the General Education Committee, reviews each course portfolio using a
common review rubric. At the end, the review team assigns a single rating for each course/section. The
rating reflects the extent to which the course provides the learning experiences that result in specified
student learning outcomes. It should be reemphasized that this review is NOT an assessment of the
course as a whole or the instructor, but only of the course’s demonstrated alignment with the general
education category in which it resides. Portfolio review results are sent to the General Education
Committee, individual faculty, and the appropriate department chairs.
What Happens after the Portfolio Review?
The reviewers’ ratings are used in the following ways:
by the General Education committee to verify that a majority of the goals and learning
outcomes related to gen ed are being met.
by the SACS compliance team as evidence for reaffirmation of accreditation.
by department chairs to improve the effectiveness of gen ed courses.
by the General Education Committee, the Office of Institutional Assessment and the Associate
Provost for Undergraduate Education to develop recommendations for the gen ed program as a
whole.
The ratings will NOT be used, by faculty, chairs, deans, or the provost’s office, as elements in merit pay
or reappointment, promotion, and tenure dossiers. They relate specifically to the alignment of a course
(and, in the aggregate, a set of courses) with the corresponding gen ed outcomes and goals. In some
cases, a faculty member may receive one or two recommendations from his or her chair toward
improving alignment with gen ed; in a very few cases, and only after consultation with the relevant
department, it is possible that the General Education committee may remove a course from the
approved gen ed inventory. Departments may also initiate the withdrawal of a course from the
inventory. It must be emphasized that such actions are not intended as evaluation of or commentary on
the value and effectiveness of a particular course or instructor per se, but only on the issue of alignment
with the goals of general education. Many outstanding courses meet a number of learning outcomes
that relate to a specific discipline but may not be judged suitable for inclusion in the university’s
general education program.
Technology Support for Electronic Course Portfolios
All course portfolios are stored in Blackboard, protected by password. Faculty members have access to
their own portfolio only; the Review Team members, General Education Committee members, and staff
from the Office of Institutional Assessment have access to all portfolios. An outstanding portfolio may
be made available to the university community given the faculty member’s approval.
A “course” (with a generic name such as “Gen Ed – Arts,” “Gen Ed – Literature,” etc.) is automatically
assigned to the faculty members who are required to submit a portfolio; they will be, in Blackboard
terminology, “students” in that course. They are instructed to upload syllabi, reflection, course
assignments/projects, and student work samples under designated “assignments” of that “course.”
Logistical instructions for constructing a portfolio and uploading documents are available on the General
Education Assessment website. Each semester, the Office of Institutional Assessment provides
training for creating electronic course portfolios.
Background: why this, why now, why us?
Reasons for Conducting Direct Assessment of Student Learning
George Mason University is accredited by the Southern Association of Colleges and Schools (SACS).
Every ten years the university performs a self-study, the goal of which is to reaffirm that we meet the
standards of accreditation set by SACS. Since our last accreditation in 2001, SACS has put increasing
emphasis on student learning outcomes. For general education, it explicitly requires institutions to
“identify college-level general education competencies and the extent to which graduates have attained
them” (Principles of accreditation: Foundations for quality enhancement, 3.5.1, p. 15).
For a successful re-affirmation of accreditation, Mason must demonstrate, among other things, the
effectiveness of our general education program through direct assessment of student learning in each of
the twelve general education categories. This means that, by the time Mason’s self-study report is due
(in fall 2010), we must have developed an effective general education assessment system and conducted
direct assessment of all of the general education categories.
In addition to the SACS requirements, the State Council of Higher Education for Virginia (SCHEV) also
requires state institutions to directly assess six areas of core competency: written communication,
quantitative reasoning, scientific reasoning, critical thinking, oral communication, and information
technology. In Jan. 2008, SCHEV further required institutions to conduct value-added assessment in
these areas. Institutions must revise/revisit their assessment plans to embody the following operating
conception of value-added assessment:
Value-added assessment measures indicate progress, or lack thereof, as a consequence of the
student’s institutional experience (Guidelines for Assessment of Student Learning, p. 7).
Another important audience for this assessment activity is ourselves. One of the hallmarks of
assessment activities at Mason is the extent to which we use the data to implement improvements in
the form and content of academic programs (the nationally-ranked Writing Across the Curriculum
program being perhaps the most visible example). Since General Education constitutes such a major
and important portion of a student’s educational career, and therefore commands a large commitment
of resources (financial, physical, and intellectual), we need good direct data to assess the effectiveness
of this enormous effort. What we learn as an institution from this cycle of assessment may have
profound benefits as we engage in a thorough examination of our practices in this area.
General Education Assessment Required by SCHEV
Since 2000, Mason has been conducting competency assessment in six SCHEV-required areas in selected
general education courses. Direct testing or evaluation of samples of student work (research papers,
presentations, tests, etc.) is embedded in the following courses and the results are analyzed and
reported to SCHEV on a cyclical basis.
1. Written Communication: Writing-Intensive courses (since 2000, serving as a post-assessment)
and English 100/101 (fall 2008, serving as a pre-assessment)
2. Quantitative Reasoning: Math 106 and STAT/IT 250 (fall 2007 and spring 2008, pre- and post-
testing embedded in the course)
3. Scientific Reasoning: ASTR 103/104, BIOL 103/104, CHEM 101/102 and CHEM 103/104, EVPP
110/111, GEOL 101/102, PHYS 103/104 (fall 2007 for pre-assessment and spring 2008 for post-
assessment)
4. Critical Thinking: in selected synthesis courses (spring 2006 and spring 2007, not in value-added
format)
5. Oral Communication: COMM 100/101 (fall 2005 and spring 2007, not in value-added format)
6. Information Technology: IT 103 (on-going since 2003, pre- and post- testing starting in fall 2008)
In the next two years, we will develop new value-added assessment plans and complete pre-
assessments, at least, for critical thinking, oral communication, and information technology. The
assessment results can be used to partially fulfill our SACS requirement.
Previously Conducted Portfolio Assessment in General Education
A new process using course portfolios to assess the General Education Program was successfully piloted
in Spring 2008. It was fully implemented in the following three general education areas in Spring 2009:
Literature, Western Civilization/World History, and Arts. Course portfolios, collected from 32 randomly
selected faculty members, were reviewed by faculty review teams in summer 2009. In fall 2009, the
assessment results will be distributed to units and faculty will be involved in discussions about how to
improve teaching and learning in general education based on what has been learned through the
assessment.
Indirect Assessment in General Education
Over the past few years, we have conducted the following indirect assessments of general education.
The results can be used as supporting evidence for the effectiveness of the program.
1. Graduating Senior Surveys: the general education learning outcomes constitute a repeating
theme on the graduating senior surveys between 2003 and 2007. The graduating seniors were
asked to rate either Mason’s contribution to their growth or their competence in each of the
general education categories.
2. Student focus groups: between 2004 and 2005, focus groups were conducted among students who
were taking a general education course at that time to understand their experiences in general
education.
3. Global Understanding Faculty Survey (spring and fall 2007): a survey was conducted among
faculty who taught an approved global understanding course in the 2006 academic year. The
survey asked about learning outcomes for these courses and how faculty assessed these
outcomes.
4. Faculty Survey on Student Writing: the survey was originally conducted in fall 2000 and a
modified version will be administered in fall 2008.