                           Reflective Essay: Capstone Experiences Rubric

Harvey Mudd’s required curriculum is divided into four components: the Common Core, which provides
the foundation for advanced study; the program in Humanities, Social Sciences and the Arts, which
completes the liberal arts nature of a Harvey Mudd College education by providing humanistic and social
scientific perspectives; the Major, which builds depth and technical competence; and the Integrative
Experience, which explicitly addresses the interface between society and science and technology.
Unifying all of these is an emphasis on strong oral and written communications, the development of
computational skills, and a capstone experience.

Students at Harvey Mudd College fulfill the requirements of a capstone experience during their senior
year through one of two venues: participation in the College’s Clinic Program, or through independent
study and research for a senior thesis.

Clinic

The Clinic Program, a hallmark of the College, engages juniors and seniors in the solution of real-world,
technical problems for industrial, government and non-profit clients. Founded as an innovation in
engineering education in 1963, this program has been expanded to other HMC academic departments
and copied by institutions worldwide. Since the Clinic Program’s inception, more than 1,150 projects have
been completed. Companies retain all intellectual property rights that arise from the project, and it is not
uncommon for HMC students to be named on patents. In 2005, for example, Clinic sponsors filed 13
patent disclosures at the end of their projects.

Research

Since research began on campus in the Department of Chemistry in 1958, the college has offered its
undergraduates the hands-on laboratory and field experience usually reserved for graduate students. The
quality and productivity of research here also surpass those at most undergraduate institutions; this is
reflected by the fact that, among colleges and universities in the US, Harvey Mudd’s alumni rank second
per capita in earning PhDs.

Anchored by a research-supportive curriculum, students can start research early in their education. They
may collaborate with faculty during the academic year, as well as during the summer through the HMC
Summer Research Program. As described in each department’s capstone requirement (below), several
academic majors require senior research.

For HMC faculty, research is a powerful teaching tool that leads to students’ disciplinary learning and
professional and personal growth well beyond the traditional classroom setting. Students and faculty
collaborate on many projects that are presented jointly at professional scientific conferences and in peer-
reviewed journals, earning distinguished national awards.

Senior Capstone Experiences and the WASC Rubric

WASC’s Rubric for Assessing the Use of Capstone Experiences for Assessing Program Learning
Outcomes has been useful to the College as we reflect upon the scope, objectives and impact of our
capstone experience. While we feel that we have a potent capstone curriculum, the rubric has challenged
us to consider the extent to which departments are able to integrate and assess their defined student
learning objectives in the context of the capstone experience.

1. Relevant Outcomes and Evidence Identified. Each academic department at Harvey Mudd College
has defined the scope, content and purpose of required cumulative project opportunities that meet the
College’s capstone experience requirement.
Response: Biology, Chemistry, Mathematics, and Physics provide their majors with the opportunity to
complete independent research projects to fulfill capstone requirements. The goal of the senior research
experience is to introduce students to the type of open-ended investigative work that is the primary
activity of practitioners in the field. Senior research projects must represent a significant and original piece
of work that students conduct independently. Through their research, students are expected to progress
through the initial stages of project development and experimental design to the collection and analysis of
data, culminating in formal presentations of results, both in a written format and in an oral presentation to
an audience of faculty, peers, and other members of the community. These expectations are
documented in the College website, and, to varying degrees, within departmental goals.

During the fall semester, students completing a senior research project prepare a written research
proposal and present it orally to an audience of faculty and peers. At the end of the second
semester, students present the final results of their research in a written thesis. Students are
also required to present their findings in a 15-minute talk during HMC Projects Days.

Engineering, Math, Computer Science and Physics invite their students to fulfill the senior capstone
project requirement through participation in the Clinic Program. As defined in the 2008-2009 Engineering
Clinic Handbook, the educational goals for all students engaged in a Clinic project are as follows:

    Gain experience with the nature, demands and ramifications of real-world problems
    Develop leadership/membership in team efforts
    Increase student understanding of the engineering design process
    Increase students’ ability to apply course material
    Gain real world insights
    Assist HMC toward its institutional goals, including increasing private/public financial support,
    encouraging interdisciplinary exchange and cooperation, and keeping current with new technology

2. Valid Results. Faculty collect and assess valid evidence for each targeted outcome, and use well-
established, agreed-upon criteria, such as rubrics, for assessing the evidence for each outcome.

Response: Student work in the Clinic is evaluated and assessed using established criteria that have been
approved by all departments whose students participate in Clinic. The criteria reflect essential
competencies considered appropriate for successful professionals in that field as determined by faculty
and industry representatives. The grading and assessment criteria include the following:

Technical Contribution - adequacy and appropriateness of analyses, syntheses, tests, conclusions,
covering library, field, laboratory, computer or shop work

Project Management and Control
    Initiative and imagination in taking responsibility either as a leader, or in volunteering as a team
    member
    Giving and taking criticism is a part of the job
    Keeping team, client, and advisor informed
    Quality of written, graphic and oral work
    Taking obligations seriously in meeting deadlines
    Active participation in team meetings
    Attendance at the Tuesday presentations

Overall Effect - Useful results from the individual’s efforts

Examples of the skills that students are expected to develop through the senior research experience
include:
         Enhanced ability to put classroom knowledge into practice
         Enhanced communication skills, both oral and written
         Enhanced technical skills within the discipline
        Enhanced development of personal initiative
        Increased confidence

While the rubric assessment of the Clinic is more formalized, assessment of the senior research
experience varies across departments and tends to be more informal. As mentioned above, presentation
of a senior research experience has two facets: a public talk and a written thesis. The Mathematics
department uses a four-page survey that students fill out as a self-assessment of their thesis experience;
the Chemistry department uses a rubric to assess students’ oral presentations; and the Biology
department uses another rubric to assess the senior thesis. Goal and outcome oriented assessment of
senior research experiences is an area where we need to improve in the next few years.

3. Reliable results. Well-qualified judges should reach the same conclusions about individual students’
achievement of a learning outcome, demonstrating inter-rater reliability. The purpose is to ensure that all
raters apply the criteria in the same way so that each student’s product receives the same score,
regardless of rater.

Response: Clinic is evaluated by several means: peer evaluations, project presentation evaluations, end-
of-year student evaluations, and evaluations by the company liaison. Below is a brief description of each
of these methods of evaluation.

Peer evaluations:
Each semester, faculty ask students to write at least one round of peer evaluations on each other's
performance and on the team as a whole. The faculty advisor uses this information to help assess how the
team is performing, both technically and from a management perspective. This assessment is also fed
back anonymously to each student to help them improve both individual performance and team
dynamics. These peer evaluations also give the faculty advisor insight into ways to help the
team progress and learn. Subsequent peer evaluations (at least two are administered per year) help
gauge project, team, and student adjustment and improvement in response to previous evaluations.

Project presentation evaluations:
Student presentations are evaluated by students, faculty, and staff from across the disciplines. Clinic
leaders, teams, and faculty advisors use this feedback to improve the technical content of the clinic and
the presentation of the results.

End-of-year student evaluations:
All students involved in clinic evaluate the program at the end of the year-long project. Clinic directors
across the departments meet and use this information to improve the program.

Evaluations by company liaisons:
Each company sponsor evaluates their experience with the clinic program at the end of a project. Clinic
directors use this information to both improve the program and to give feedback to faculty advisors.

In addition to these formal evaluations, faculty advisors supplement this information with weekly meetings
with the team leader and the team to receive and give feedback for improvement. Faculty members also
meet informally to discuss methods that work – and ones that don't.

Historically, there has not been a formal process that ensures inter-rater reliability in the evaluation of
student research. The College has relied upon the professional and disciplinary expertise of its faculty to
measure the extent to which students have successfully achieved the learning objectives appropriate to
their capstone experience. This is clearly not ideal. As noted above, goal and outcome oriented
assessment of senior research experiences is an area where we hope to improve in the next few years.

4. Results are used. Faculty collect, discuss and reach conclusions about assessment results. Faculty
develop and implement explicit plans to improve student learning, collaborate with other campus
professionals to improve student learning, and provide opportunities for follow-up studies to confirm that
changes have improved learning.
Response: As discussed in our answer to point 3, above, the Clinic uses end-of-year student evaluations
and evaluations by company liaisons as vehicles for discussing the successes and failures in a given year.
These discussions, at least informally, lead to changes in future Clinic practices.

Aside from informal and anecdotal discussion, the senior research experience is an area where we fall
short of collecting and discussing evaluation data and feeding improvements back into the system. As we
develop a more formal assessment of the senior research experience we expect that these practices will
improve beyond their current state.

5. The Student Experience. Students should understand the purposes different educational experiences
serve in promoting their learning and development and know how to take advantage of them; ideally they
should also participate in shaping those experiences. Thus it is essential to communicate to students
consistently and include them meaningfully.

Response: The Clinic Program is quite explicit about the educational purpose it serves, as described in
our response to #1 in this essay. The goals of the senior research experience are tied to the goals of
each department.

Independent research and Clinic projects are, by design, student-driven; students are expected to take
responsibility and ownership of their projects. While faculty serve as advisors and mentors throughout the
research and inquiry process, HMC students understand that in a capstone experience they lead the
project and are ultimately responsible for how it evolves and what is accomplished.

                      Reflective Essay: General Education Assessment Process

The founders of Harvey Mudd College envisioned the college as a “Liberal Arts College of Engineering
and Science.” HMC’s educational goals are based on the tradition of liberal learning, which encourages
the growth of broadly educated citizens and promotes self-understanding and self-worth in all students.

The WASC Rubric for Assessing General Education comprises five major components. This
reflective essay illustrates the extent to which HMC’s continuing efforts at implementing assessment
practices suitable for HMC’s academic program are in line with WASC’s rubric to measure general
education outcomes.

1. GE Outcomes. The set of GE learning outcomes should be a short but comprehensive list of the most
important knowledge, skills, and values students learn in the GE program.

The Common Core – composed of foundational courses from each of HMC’s academic departments – is
similar in concept to a “general education” program at other institutions in that it is required of all students,
provides essential knowledge for upper-division courses, and exposes students to the various disciplines
at HMC. All graduates of Harvey Mudd College have, since the College’s founding, completed the
technical Core curriculum, which builds a solid foundation and broadly educates our students. The Core
curriculum is periodically assessed, including a review in 2003 that examined the success of the Core in
developing higher-order educational objectives (effective oral and written communication, critical thinking,
teamwork and collaboration, project management, and leadership) beyond the acquisition of technical
skills and discipline-based knowledge. Recently, after a comprehensive strategic planning initiative in
2006, the Strategic Vision Curriculum Committee (SVCC) was appointed to address workload and
flexibility issues through both a review and revision of the curricular model and a modification of the
College’s culture to value achieving an appropriate work-life balance.

Through this process, the SVCC identified five goals for HMC’s Core curriculum – a comprehensive list of
the most important knowledge, skills and values that our students learn at the College which establish a
broader, goal-oriented framework for assessing this part of the curriculum:

1. Exercise technical expertise developed through rigorous foundational work and an emphasis on
   problem solving in learning communities
2. Appreciate and employ different kinds of knowledge and expressive sophistication as the basis for
   critical analysis, synthesis, and self-examination
3. Serve society by addressing the complex problems of the world creatively, passionately, and
   humanely
4. Flourish in a multi-cultural community and global environment
5. Lead examined and meaningful lives

These broad goals, in turn, provided our Strategic Planning Curriculum Implementation Committee and its
Writing Subcommittee with a foundation upon which to identify more specific and measurable student
learning outcomes. The Writing Subcommittee compiled a list of eight (8) learning outcomes specific to
the new Writing course that will be implemented in fall 2009. These include students’ ability to:

1.   Use informal writing to develop their thinking at different stages of inquiry
2.   Deploy some main elements of persuasive and expository writing (see below) in formal papers
3.   Recognize and use rhetorical purpose, voice, and audience analysis in academic reading and writing
4.   Write clear, coherently structured papers that use appropriate evidence and diction toward forceful
     intellectual discourse
5.   Demonstrate understanding of some of the main cross-disciplinary similarities and differences in
     conventions of expression and article formats
6.   Develop an effective writing process that includes repeated revision of writing
7.   Make use of the feedback process, both as reviewers and as recipients
8.   Identify passages in their writing that call for citation, attribution, or acknowledgment, and apply
     appropriate forms of citation where needed
These specific learning outcomes for HMC’s new Writing course provide a working outline upon which the
assessment for the new Core will be based during the 2009-2010 academic year. We think it is worth
noting that the Writing Subcommittee report, which proposes a new curriculum and includes student
learning goals, learning outcomes and an assessment plan, is an example of Harvey Mudd's evolution
toward a more rigorous assessment culture. While this is our most outstanding example, it is fair to say
that we now operate in the mode of including learning goals, learning outcomes, and assessment plans in
our early discussions of curriculum. This was not true ten years ago.

2. Curriculum Alignment. Students cannot be held responsible for mastering learning outcomes unless
the GE program systematically supports their development. The GE curriculum should be explicitly
designed to provide opportunities for students to develop increasing sophistication with respect to each
outcome.

HMC faculty are mindful of the College’s academic priorities, and have taken steps to identify best
practices in measuring students’ increased sophistication with academic skills taught in the Core
curriculum. To do this, the first step was to identify institutional priorities that provide a larger framework
within which to articulate and align College-wide goals. The College’s Mission Statement sets the tone for
all institutional and educational goals, which are most clearly defined by three planning documents: 1) the
College’s Mission Statement; 2) the Strategic Planning summary, HMC 2020: Envisioning the Future
(Appendix II-D); and 3) the revised Core curriculum proposal and the educational priorities described
therein.

The Strategic Vision Curriculum Implementation Committee has thought carefully about how the revised
Core curriculum will, as is the case with the current Core sequence, integrate thorough academic
preparation for upper-division courses. It is anticipated that students will, in fact, achieve greater levels of
academic and technical preparation within the revised Core in several ways: 1) the schedule and content
of all Mathematics courses in the Core are purposefully structured in a way that helps build upon and
emphasize sophistication in the subject area; 2) Mathematics courses completed by students in the first
year are intentionally designed to prepare students for Engineering 59 (taken in the third semester) and
Physics 51; 3) The revised Core will offer a selection of Interdisciplinary Choice Labs that will emphasize
interdisciplinary experiential learning, and will be completed by the end of the fifth semester. Such
courses would emphasize the confrontation of abstract models of the world with carefully acquired
data and experimental observations, allowing students to deepen their understanding of the essential
dialog inherent in experimental work. These Choice Labs extend the innovative integration of research
and education that was the basis for our highly-acclaimed Interdisciplinary Laboratory taught at the
College from 1999-2004 following the receipt of the National Science Foundation’s Award for the
Integration of Research and Education in 1998.

Alignment between the academic goals of HMC and the educational and professional expectations of our
students in the 21st century was further demonstrated through HMC’s strategic planning efforts. During
the week of October 16, 2006, more than 400 people, including trustees, students, alumni, staff, faculty,
parents, and other individuals, listened to each other’s ideas, commented, reflected and made recommendations.
The hands-on participation of students, in particular, reflected the College’s commitment to giving them
responsibility for and a role in shaping their personal and academic development. The Strategic Planning
Steering Committee prepared the first-draft outline of the Strategic Vision, which was presented to the
HMC community several times, and ratified by the Board of Trustees on December 9, 2006. The six
themes of the Strategic Vision are:

1.   Innovation, leadership and impact, especially in engineering, science and mathematics
2.   Focus on experiential and interdisciplinary learning
3.   Unsurpassed excellence and diversity at all levels
4.   Nurturing and developing the whole person
5.   Global engagement and informed contributions to society
6.   Improvement of infrastructure and resources to support HMC’s commitment to excellence and
     building community

3. Assessment Planning. Faculty should develop explicit plans for assessing each GE outcome.

The creation and implementation of a comprehensive assessment program that integrates all aspects of
the academic curricula will continue through the 2009-2010 academic year. To date, it remains a work in
progress and a high priority to the College. Determining whether we are achieving stated student learning
outcomes will be the focus of our assessment of the new Core curriculum.

In conjunction with the assessable outcomes of the revised Core curriculum, an assessment plan,
including a description of possible assessment instruments, is provided in Section 9 of the SVCIC report.
In addition, more specific learning goals are articulated in the report describing the Core’s new writing
curriculum. Faculty will focus on assessing the writing component of the new Core curriculum as the
College progresses toward our Educational Effectiveness Review (EER).

Assessments of the revised Core will focus first on the writing course because it captures many of the
goals of our new Core curriculum, and was designed with learning goals and outcomes built in. For
instance, the course is by design interdisciplinary, as it is taught by faculty from different departments who
build the course around a topic of mutual interest. Writing captures many of the critical-thinking skills we
value, and improved writing proficiency by our students is a major emphasis of our new curriculum. An
assessment of other student learning outcomes will be pursued in the coming semesters, including those
related to the first-year mathematics courses designed to expand technical and academic sophistication
with the subject matter and preparation for upper-division courses in the majors.

4. Assessment Implementation. GE assessment data should be valid and reliable. A valid assessment
of a particular outcome leads to accurate conclusions concerning students’ achievement of that outcome.

As noted in the Strategic Vision Curriculum Implementation Committee’s report to the faculty (February
2009), the changes to the Core curriculum were motivated in large part by direct feedback from student
assessments that indicated students wanted more electives and greater responsibility for setting their
own educational objectives. At the same time, the reduction in Core requirements has produced some
concern about the preparation of students for the post-Core curriculum and their experience in majors. As
we move forward, our assessment plan and implementation will focus on measuring how (or whether)
reductions to the Core curriculum are impacting students’ academic abilities as they enter their majors.

While the design and implementation of the Core curriculum assessment plan are piloted during the 2009-
2010 academic year, our goal as a college is to ensure its simplicity and sustainability. The following list
of expectations and outcomes provides the basis for HMC’s assessment of the revised Core curriculum:

Core Goal 1: Demographic trends
   Retain and graduate a greater percentage of the students that we enroll
   Attract, enroll, retain, and graduate a greater percentage of students who contribute to the diversity of
   the College, as measured by gender, ethnicity, and economic background

Core Goal 2: Benefits from increased electivity
   Students will be more satisfied with their ability to choose courses that satisfy their interests
   Students will be more satisfied with their ability to shape their own academic programs
   The number of students participating in language study during their first year will increase
   Students will be able to create breathing space within their first two years to accommodate academic,
   social, or emotional needs

Core Goal 3: Preparation for the post-Core curriculum
   Students will be as able to achieve success in their majors as they were prior to the Core reform
   Students will be more able to employ interdisciplinary thinking
   Students will be more proficient writers

5. Use of Results. Assessment is a process designed to monitor and improve learning, so assessment
findings should have an impact. Faculty should reflect on results for each outcome and decide if they are
acceptable or disappointing.

Because the 2009-2010 academic year will be the first in which a selection of revised Core courses will
be launched at HMC, including the new Writing seminar, we anticipate that data collected from initial
assessment efforts will be used to measure the quality of the revised instruction and its effectiveness as a
means of communicating key lessons and concepts relevant to the subject matter. The College does not
anticipate that the entire revised Core curriculum will be fully implemented at HMC before the 2011-2012
academic year. At that time, assessment efforts will begin to focus on the extent to which
students are prepared for upper-division courses in their major.

The Assessment Committee is currently developing means of analyzing the data to be collected from
Core assessments (such as mapping NSSE data to specific goals, and rubrics to assess writing);
additional information on these methods is pending.

Specific data and instruments to be used for the purposes of the Core curriculum assessment will include:

    Institutional database with diversity indexes
    Registration data
    Senior Survey
    Sophomore Survey
    National Survey of Student Engagement (NSSE)
    Faculty Survey of Student Engagement (FSSE)
    CIRP Freshman Survey, HERI
    College Senior Survey, HERI
    National College Health Assessment
    Assessment instruments that measure success of Choice Labs in developing interdisciplinary thinking
    Collections of papers from Hum 1, a Humanities, Social Sciences, and the Arts elective, and other Core
    courses (for a baseline), and papers from the new writing course, HSA 1, and other Core courses
    Faculty end-of-term evaluations

In addition, individual departments will choose one outcome they would like to use to measure post-Core
"success" for their majors, then decide on what measures they will use. This could include data from
exams in first courses in the major, and possibly first exams or placement exams in the Core.

                             Reflective Essay: Learning Outcomes Rubric

The founders of Harvey Mudd College envisioned the college as a “Liberal Arts College of Science and
Engineering.” HMC’s educational goals are based on the tradition of liberal learning, which encourages
the growth of broadly educated citizens and promotes self-understanding, self-reflection, and a sense of
self-worth in its students.

The WASC Rubric for Assessing the Quality of Academic Program Learning Outcomes comprises
five major components. This reflective essay illustrates the extent to which HMC’s continuing efforts at
implementing assessment practices suitable for HMC’s academic program correspond to WASC’s rubric
to measure learning outcomes.

1. Comprehensive List. The set of program learning outcomes should be a short but comprehensive list
of the most important knowledge, skills, and values students learn in the program, including relevant
institution-wide outcomes such as those dealing with communication skills, critical thinking, or information
literacy.

The College’s Strategic Vision Curriculum Committee (SVCC) was appointed to address workload and
flexibility issues; the committee considered these issues through both a review and revision of the Core
curriculum and a modification of the College’s culture to value achieving an appropriate work-life balance.
The work pursued by this faculty committee was conducted in the spirit of ensuring that intellectual rigor
and excellence would be maintained or enhanced by any changes made to the curriculum.

The SVCC identified five educational priorities for the College – a comprehensive list of the most
important knowledge, skills, and values that our students learn at the College:

1. Exercise technical expertise developed through rigorous foundational work and an emphasis on
   problem solving in learning communities
2. Appreciate and employ different kinds of knowledge and expressive sophistication as the basis for
   critical analysis, synthesis and self-examination
3. Serve society by addressing the complex problems of the world creatively, passionately, and
   humanely
4. Flourish in a multicultural community and global environment
5. Lead examined and meaningful lives

2. Assessable Outcomes. Outcome statements should specify what students can do to demonstrate
their learning.

Upon the faculty’s October 2008 vote to approve and implement a revised Core curriculum at HMC, the
Strategic Vision Curriculum Implementation Committee (SVCIC) was appointed and worked closely with
the College’s Assessment Committee and WASC Steering Committee to identify institutional goals of the
revised Core:

Core Goal 1: Demographic trends
   Retain and graduate a greater percentage of the students that we enroll
   Attract, enroll, retain, and graduate a greater percentage of students who contribute to the diversity of
   the College, as measured by gender, ethnicity, and economic background

Core Goal 2: Benefits from increased electivity
   Students will be more satisfied with their ability to choose courses that satisfy their interests
   Students will be more satisfied with their ability to shape their own academic programs
   The number of students participating in language study during their first year will increase
   Students will be able to create breathing space within their first two years to accommodate academic,
   social, or emotional needs

Core Goal 3: Preparation for the post-Core curriculum
   Students will be as able to achieve success in their majors as they were prior to the Core reform
   Students will be more able to employ interdisciplinary thinking
   Students will be more proficient writers

These goals for the revised Core, in turn, provided HMC’s Strategic Planning Curriculum Implementation
Committee and its Writing Subcommittee with a foundation upon which to identify more specific and
measurable student learning outcomes. In their February 2009 report, the Writing Subcommittee
compiled a list of eight (8) learning outcomes specific to the new Writing course that will be implemented
in fall 2009. These include students’ ability to:

1.   Use informal writing to develop their thinking at different stages of inquiry
2.   Deploy some main elements of persuasive and expository writing (see below) in formal papers
3.   Recognize and use rhetorical purpose, voice, and audience analysis in academic reading and writing
4.   Write clear, coherently structured papers that use appropriate evidence and diction toward forceful
     intellectual discourse
5.   Demonstrate understanding of some of the main cross-disciplinary similarities and differences in
     conventions of expression and article formats
6.   Develop an effective writing process that includes repeated revision of writing
7.   Make use of the feedback process, both as reviewers and as recipients
8.   Identify passages in their writing that call for citation, attribution, or acknowledgment, and apply
     appropriate forms of citation where needed

These specific learning outcomes for HMC’s new Writing course provide a working outline upon which the
assessment for the new Core will be based during the 2009-2010 academic year. We think it is worth
noting that the Writing Subcommittee report, which proposes a new curriculum and includes student
learning goals, learning outcomes and an assessment plan, is an example of Harvey Mudd's evolution
toward a more rigorous assessment culture. While this is our most outstanding example, it is fair to say
that we now operate in the mode of including learning goals, learning outcomes, and assessment plans in
our early discussions of curriculum. This was not true ten years ago.

In addition, HMC faculty members have spent the last year deciding how best to assess the educational
effectiveness of student learning outcomes within all academic departments at HMC. Throughout the
2008-2009 academic year, department chairs have worked with the Office of Institutional Research to
identify measurable goals and student learning outcomes, and methods by which to assess them. Each
department has reviewed, and in some cases revised, its academic goals and attendant student
outcomes to reflect their current curricular and instructional priorities. The current goals and learning
outcomes for each academic department at the College are linked below:

     Biology
     Chemistry
     Computer Science
     Engineering
     Humanities, Social Sciences, and the Arts
     Mathematics
     Physics

3. Alignment. Students cannot be held responsible for mastering learning outcomes unless they have
participated in a program that systematically supports their development. The curriculum should be
explicitly designed to provide opportunities for students to develop increasing sophistication with respect
to each outcome.

HMC’s work to identify and implement a comprehensive assessment program that integrates all aspects
of the academic curricula will continue through the 2009-2010 academic year. We have, however,
remained mindful of the academic priorities of the institution and have taken steps to identify how to best
measure students’ increased sophistication with academic skills in a manner that is complementary to our
curriculum.

Our first step in this respect was to identify institutional priorities that provide a larger framework within
which to articulate and align college-wide goals. The College’s Mission Statement sets the tone for all
institutional and educational goals, which are most clearly defined by three planning documents: 1) the
College’s Mission Statement; 2) the Strategic Planning summary, HMC 2020: Envisioning the Future
(Appendix II-D); and 3) the revised Core curriculum proposal and the educational priorities described
therein. In addition, the HMC curriculum, as structured, has always been one that incrementally prepares
our students for subsequent challenges and advanced studies. Specifically, the Core curriculum is
designed to prepare students academically for the discipline-specific courses in their chosen major. In
turn, upper-division courses in the major prepare students for the capstone experience – Clinic or
research - which is completed during their senior year.

Alignment between the academic goals of HMC and the educational and professional expectations of our
community in the 21st century was also demonstrated through our strategic-planning efforts. The week of
October 16, 2006 was set aside for institutional reflection, anchored in discussions of the College’s
Mission Statement. Over four days, more than 400 people including trustees, students, alumni, staff,
faculty, parents and other individuals listened to each other’s ideas, commented, reflected and made
recommendations. The hands-on participation of students, in particular, reflected the College’s
commitment to giving students responsibility for and a role in shaping their personal and academic
development. The Strategic Planning Steering Committee prepared the first-draft outline of the Strategic
Vision, which was presented to the HMC community several times between November 16 and December
9, 2006. The six themes of the Strategic Vision are:

1.   Innovation, leadership and impact, especially in engineering, science and mathematics
2.   Focus on experiential and interdisciplinary learning
3.   Unsurpassed excellence and diversity at all levels
4.   Nurturing and developing the whole person
5.   Global engagement and informed contributions to society
6.   Improvement of infrastructure and resources to support HMC’s commitment to excellence and
     building community

4. Assessment Planning. Faculty should develop explicit plans for assessing each outcome.

As noted above, the creation and implementation of a comprehensive assessment program that
integrates all aspects of the academic curricula will continue through the 2009-2010 academic year.
Determining whether we reach student-learning objectives will be the focus of our assessment of both the
new Core curriculum and departments’ student-learning outcomes.

Revised Core – In conjunction with the assessable outcomes of the revised Core curriculum, an
assessment plan, including a description of possible assessment instruments, is provided in section 9 of
the SVCIC report. In addition, more specific learning goals are articulated in the report describing the
Core’s new writing curriculum. Faculty members will focus on assessing the writing component of the new
Core curriculum as the College progresses toward our Educational Effectiveness Review (EER).

Assessments of the revised Core will focus on the writing course because it captures many of the goals of
our new Core curriculum and was designed with learning goals and student learning outcomes built in.
For instance, the course is by design interdisciplinary, as it is taught by faculty from different departments
who build the course around a topic of mutual interest. Writing captures many of the critical-thinking skills
we value, and improved writing proficiency by our students is a major emphasis of our new curriculum.
The Writing Subcommittee compiled a list of eight (8) learning outcomes specific to the new Writing
course that will be implemented in fall 2009. These include students’ ability to:

1. Use informal writing to develop their thinking at different stages of inquiry
2. Deploy some main elements of persuasive and expository writing (see below) in formal papers
3. Recognize and use rhetorical purpose, voice, and audience analysis in academic reading and writing
4. Write clear, coherently structured papers that use appropriate evidence and diction toward forceful
   intellectual discourse
5. Demonstrate understanding of some of the main cross-disciplinary similarities and differences in
   conventions of expression and article formats
6. Develop an effective writing process that includes repeated revision of writing
7. Make use of the feedback process, both as reviewers and as recipients
8. Identify passages in their writing that call for citation, attribution, or acknowledgment, and apply
   appropriate forms of citation where needed

Academic Departments – The Office of Institutional Research is assisting the department chairs in
structuring their programs’ assessment efforts by identifying one department goal to be assessed during
the 2009-10 academic year, listed in the table below. The intention is for each department to create an
assessment schedule that will enable their department’s goals to be assessed on a regular schedule.
With the assistance of the Office of Institutional Research and the Assessment Committee, departments
are developing student learning outcomes that are associated with these goals, and creating instruments
to assess those learning outcomes. Results from these initial assessment efforts should be in place for the
Educational Effectiveness Review.

5. The Student Experience. At a minimum, students should be aware of the learning outcomes of the
program(s) in which they are enrolled.

Included in our comprehensive study of the WASC Standards and Criteria for Review are references to
specific courses and their syllabi that show directly how well our faculty are articulating the learning
outcomes for their courses.

We also take seriously information provided to us by HMC students that highlights areas in which the
academic and co-curricular programming at the College is weak or in need of revision. As stated earlier,
data from on-going assessments of students provided us with indicators that suggested the curriculum
was perhaps limiting students’ academic ambitions. For example, a survey conducted by the faculty
Curriculum Committee showed that approximately 82% of 331 students had an interest in taking foreign
language courses, but that the current HMC first-year curriculum made this nearly impossible. Another
survey completed by 64 rising sophomores taking summer math in 2008 confirmed students’ desire for
electivity in the first year (Appendix E-1). Among those surveyed, 72% would have found it valuable to
have an elective in their first semester at HMC. Of those who saw such electivity as valuable, 35%
indicated they might have used that elective to take a foreign language, 26% said they might have taken
E4 or another engineering course, and the remaining comments reflected interests in a wide variety of
subjects in the sciences, social sciences, humanities, and arts. In light of the College’s Strategic Vision
and a desire to develop graduates who can flourish in a global environment, the SVCC saw the Core
curriculum as an important place to direct curricular revision efforts.

                                   Reflective Essay: Portfolio Rubric


Harvey Mudd College’s initial implementation of portfolio assessment practices has targeted a few
specific courses across several academic departments. This document provides examples of the
purposes and goals of two such portfolio projects in place at HMC: the Integrative Experience electronic
portfolio course, and the portfolio assessments used in Humanities 1 classes.

In line with the five criteria measured by the Portfolio Rubric provided by WASC – clarification of students’
tasks, valid results, reliable results, results are used, and adequate support for e-Portfolios (as
appropriate) – there is strong evidence to suggest that these courses are well positioned to meet stated
expectations. The portfolio projects completed in the College’s Humanities 1 course have a proven record
of helping students to pay focused and detailed attention to their writing skills.


Integrative Experience 179P – ePortfolio

The IE 179P Integrative Experience ePortfolio Project is a pilot course in which participants maintain an
ongoing electronic collection of student work, self-reflection, and critical analysis that explores the
interaction of science, technology and society. The course is a three-semester sequence and was first
offered in Fall 2007. One unit of course credit is awarded for each of the three semesters. A final
electronic presentation is submitted at the conclusion of the third semester.

The syllabus for IE 179P, which includes the objectives to be measured and assessed, is appended to this
document.


Student Portfolios for Humanities 1

A Humanities 1 portfolio is a collection of three papers that are written, revised and polished during the
semester-long Humanities 1 course that all first-year students take during their first semester at HMC.
Each paper is peer reviewed and then revised; graded and revised for a second time; and revised a third
time in response to a third round of comments. Ideally, the portfolio represents the student’s progress in
writing and critical thinking over the course of their first semester in college.

A detailed summary of this portfolio format used for Humanities 1 is appended to this document.

                                       Electronic Portfolio Project
                                  A Pilot Integrative Experience Project

For educational purposes, an electronic portfolio (e-portfolio) is a technology-based collection of student
work over a specified period of time that can serve as an authentic, performance-based assessment tool.
Beginning in fall 2007, a pilot group of students used an electronic portfolio to satisfy the College’s Integrative
Experience (IE) requirement. These students maintained an ongoing collection of student work, self-
reflection, and critical analysis that explored the interaction of science, technology, and society. A vast
range of materials were included in the electronic portfolio. Items associated with formal courses
constituted a significant component of the e-portfolio, including written assignments, reflections on
readings and class discussions, electronic presentations, recordings of oral presentations, etc. The
portfolio, however, also enabled students to draw on their co-curricular activities as opportunities to
engage in the IE objectives. Service projects, attendance at lectures and seminars, activities associated
with various clubs and organizations, summer research experiences, etc. were all possible venues for
engaging in considerations of the relationship of science and technology with society. In addition to the
traditional formats listed above for materials to be included in the portfolio, other components also
included scanned or digital photos, video and sound clips, animations, drawings, etc. By enabling
students to use both curricular and co-curricular experiences to demonstrate their understanding of both
the impact of science and technology on society and the “human” dimension of science and technology,
we created a truly exciting way for students to fulfill this aspect of the College’s mission statement that
also addresses the Strategic Planning theme of "nurturing and developing the whole person."

Course Designation

The IE 179P Integrative Experience ePortfolio Project was offered for fall 2007, spring 2008, and fall
2008, with each section designed for 10-12 students. Enrollment required permission of the instructor(s)
and was limited to members of the Class of 2009. Ordinarily, a student enrolled for one graded unit of
credit per semester to satisfy the 3-unit IE requirement. However, a student studying abroad for one or
two semesters in 2007-2008 could participate in the project and enroll for two or three units, respectively,
in fall 2008.

Student Participants

For fall 2007, participants included 10-12 rising juniors (Class of 2009) per faculty member participant.
These students were selected from those who expressed an interest in maintaining an electronic portfolio
for three semesters (through fall 2008) with a focus on IE objectives. An application process was devised
to solicit interest and to select student participants prior to pre-registration. Continuation of the project or
possible scale-up to involve all students at the College were considerations for a later date.

Review, Assessment, and Grading of ePortfolios

Several different forms of review and assessment were used, including peer assessment each semester
as well as evaluation by alumni volunteers for formative assessment. The faculty instructors devised a
scheme for peer assessment and trained student participants to provide meaningful feedback. Two
alumni were assigned for every 10-12 portfolios.

Course Credit

As noted above, students received one unit of credit for each semester of participation. A drawback of
this approach was that the IE course could not count toward another requirement, such as an HSS course
or a major elective. The advantage of this approach was the scheduling flexibility it afforded the student.

                              Summary: Portfolios in Humanities 1
                Harvey Mudd College Department of Humanities and Social Sciences


The term “portfolio,” for the purposes of Humanities 001, refers to the collection and review of writing
samples from the course of a semester; it is not based on the more common “e-portfolio” model.
A Humanities 001 (hereafter referred to as
“Hum 1”) portfolio is a collection of three papers that are written, revised, and polished during the
semester-long course. Each of these papers has been peer reviewed and then revised; graded and
revised for a second time; and revised a third time in response to a third round of comments. These
papers represent the entirety of the student’s work authored over the course of the semester, rather than
selections of work accompanied by reflective essays (as is often the case in e-portfolio models).

Hum 1 is taught in ten different sections, each with different content – depending on the instructor's interest
and discipline -- but all with the same writing assessments and deadlines. Students submit three graded
papers and graded revisions of the first two papers. Each Hum 1 section has in-class writing workshops,
writing assignments, and peer review sessions. The third paper is built on a limited amount of the student’s
independent research, and all three papers address the section theme. By the end of the semester, the
three papers have been extensively revised, and all three papers are included in the portfolio. In addition,
the students write an introduction that ties the papers together. This introduction is not a “reflective
essay,” per se, but rather is another writing assignment whose purpose is to explain connections between
the three papers in terms of content, approach, or progress made over the course of the student's first
semester in college. As a standard practice for assessing student work, faculty members consider writing,
evaluating, and revising a paper as an integrative process. The comments from each round of evaluation
inform the next draft of the paper, which in turn is fodder for new comments, and so on.

It is important to note that the HMC Hum 1 portfolio is a portfolio in the traditional, artistic sense: a
showcase that allows the author's work to be reviewed and graded by an outside reader. In order to make
students aware of audience, and in order to keep course quality high and expectations of student work
consistent, the portfolios are assessed by an external member of the Hum 1 teaching staff who has not
previously engaged the student's work. The outside reader's commentary has advisory status: it is not
binding but it is taken very seriously. In the end, the instructor determines grades on the basis of the
quality of the student's work as exemplified in the portfolio, the outside reader's grade and comments, and
the student's ability and willingness -- as exhibited throughout the semester -- to engage commentary and
improve writing and thinking accordingly.

The ultimate goal of Hum 1 is not only to improve students' writing, thinking and reading skills, but also to
train them to make revision an integral part of their writing routines, to familiarize them with the peer
review process and so with the principles of scholarly practices, and to socialize them into an academic
mode of working and thinking.

                                Reflective Essay: Program Review Rubric

On January 27, 2002, the faculty of Harvey Mudd College adopted a formal policy for assessing the
extent to which academic programs fulfill the mission of the College. This policy, Academic Review
Guidelines (hereafter referred to in this essay as the “Guidelines”), provides faculty members and
departments with a comprehensive set of guidelines that must be followed in order to complete this
essential component of the College’s short- and long-term academic planning.

The policies and procedures enumerated in the Guidelines are closely aligned with the five criteria that
are measured within the WASC Rubric on Program Review. This essay documents that alignment.

The WASC Rubric for Assessing the Integration of Student Learning Assessment into Program Reviews
has five major components that are addressed by the Guidelines:

1. Self-Study Requirements. The institution’s departmental review protocol must have “explicit
requirements for the program’s self-study, including an analysis of the program’s learning outcomes and a
review of the annual assessment studies conducted since the last program review. Faculty preparing the
self-study should reflect on the accumulating results and their impact; and they should plan for the next
cycle of assessment studies. As much as possible, programs should benchmark findings against similar
programs on other campuses.”

HMC’s Academic Program Review process meets these requirements. Pages 2-3 of the Guidelines
explain the required elements to be included in a departmental review:

    Each department completing a program review will identify student-learning outcomes and goals, and
    will identify the quantitative and qualitative methods to be used to determine how fully these goals are
    being met.

    Goals will provide the framework of anticipated results; the evidence collected and analyzed may be
    cognitive (e.g., critical thought, analytical abilities suited to a discipline), affective (e.g., valuing the
    ethics of scientific research), or behavioral (e.g., leadership, the operation of technical equipment).

    Anticipated results will then be compared to actual results, as revealed through information collected
    by the department, to help assess the effectiveness of the academic program.

    Faculty indicators will be used to determine whether there is a sufficient number, mix, and quality of
    faculty members to communicate the departmental curriculum. Some examples of these indicators
    include: student-faculty ratio within the department; students’ access to faculty; average class size;
    faculty course loads; data from course evaluations; balance among faculty expertise/fields in the
    program; information abut professional development; faculty committee assignments at HMC; and
    degree of faculty consensus on the department’s goals and objectives for student learning.

    Curriculum indicators will be used to determine whether the curriculum is appropriately designed to
    meet departmental goals. To develop such indicators, assessment processes may focus on the
    following inquiries: Is there an appropriate sequencing from introductory to advanced courses? Are
    introductory courses appropriate to the general profile of HMC students? Are there appropriate
    capstone courses? Are there appropriate levels of career and graduate school advising? Are
    courses accurately described in the catalog? Are affective and behavioral goals taught and/or
    modeled? How successful is the curriculum in enabling students to meet departmental goals?

    Examples of curriculum indicators include: comparison of program goals with course objectives and
    requirements; comparison of course sequencing to the order in which students generally move
    through the courses; comparison of the curriculum to the discipline’s accreditation standards;
    comparison of the curriculum to claims in college admissions publicity; and criteria for adding new
    courses and revising or deleting old ones.

    Resource indicators can be used to determine whether instructional resources are adequate and/or
    appropriate to departmental goals. Some examples of resource indicators include: processes for
    determining budget support for the curriculum, for reviewing instructional resources, and for
    recommending instructional resources; and the comparison of ideal versus actual departmental space
    allocation.

2. Self-Study Review. The program review process should include “internal reviewers (on-campus
individuals, such as deans and program review committee members) and external reviewers (off-campus
individuals, usually disciplinary experts) (who will) evaluate the program’s learning outcomes, assessment
plan, assessment evidence, benchmarking results, and assessment impact; and …provide evaluative
feedback and suggestions for improvement.”

Pages 3-4 of the Guidelines explain how the department self-study and external review are incorporated into
the program-review process, and how the assessment of direct and indirect data that inform the
achievement of student learning goals is conducted.

    Self-study - the program review must include a self-study that focuses on the extent to which
    evidence and data (“learning indicators”) are in place to assess the relationship and differences
    between anticipated and actual impact of the departmental curriculum on student learning. Examples
    of direct learning indicators that could be used for such assessment include: answers to questions in
    course examinations; term paper quality; portfolio evaluation; and standardized test scores. Examples
    of indirect learning indicators that could be incorporated into this assessment may include: interviews
    with/surveys of students leaving the college before graduation; observation, documentation, and
    analysis of student behavior; retention studies; employment and graduate program placement;
    student and faculty focus groups about learning experiences in the curriculum; student satisfaction
    surveys; and data derived from surveys of employers, graduate schools, or alumni.

    External review - the program-review committee must establish a tentative timeline for the external
    review and create a list of external scholars who might serve on a three- or four-person review team.
    The committee provides a brief statement explaining the appropriateness of each suggestion, which
    is then forwarded to the Dean of Faculty, who approves and formally invites potential team members.
    Prior to the arrival of the review team, the department sends its gathered evidence and self-study to
    the team members, including contextual information about the College and its goals. External
    reviewers are asked to read these materials prior to their visit. After their visit, external reviewers are
    asked to submit a report in which they consider the evidence and self-study, provide evaluative
    comments about the curriculum, the academic program, and the department, and suggest ways of
    improving organizational effectiveness generally as well as teaching and learning in particular.


3. Planning and Budgeting. Program-review processes should be directly “tied to planning and
budgeting processes, with expectations that increased support will lead to increased effectiveness, such
as improving student learning and retention rates.”

Pages 4-5 of the Guidelines explain how the review process meets the criteria of this component of the
Program Review rubric.

    The Vice President and Dean of Faculty is a member of the President’s Cabinet and the College’s
    Budget Committee. Both groups consider the results from the self-study and external review,
    particularly as they relate to metrics reported for resource indicators. Annual operating budgets may
    be adjusted based on the recommendations made both within the department’s self-study and the
    external review and in discussions within the Department Chairs Committee.

4. Annual Feedback on Assessment Efforts. Program reviews may require the integration of
“immediate feedback, usually based on a required, annual assessment report. This feedback might be
provided by an Assessment Director or Committee, relevant Dean or Associate Dean, or others …
Whoever has this responsibility should have the expertise to provide quality feedback.”

Page 5 of the Guidelines explains how the review process meets the criteria of this component of the
Program Review rubric.

    Short- and long-term assessment tools – As noted above, the process incorporates evidence and
    data that are derived from direct and indirect learning indicators. While many of these indicators can
    be most accurately measured over a period of time (e.g., students’ improved grasp of certain
    material, concepts or theories within a discipline over the course of several semesters), data are also
    available that provide more immediate and timely feedback regarding the extent to which academic
    departments are helping students achieve stated learning objectives and goals. These include: end-
    of-semester course evaluations, student performance on midterm and final exams, and course
    assignments.

    Assessment accountability - to ensure that these metrics are interpreted and analyzed accurately,
    program review teams partner directly with the College’s Assistant Vice President for Institutional
    Research and Assessment. The AVP is charged with the responsibility of assisting faculty and staff
    “develop program-review plans, analyze data and resources, assess student learning, and interpret
    findings.” The department’s program review chair is expected to consult with the AVP for Institutional
    Research and Assessment about “interpreting these guidelines and learning from the Assessment
    Officer’s experience in working with other departments that have undergone review.”

5. The Student Experience. Students “have a unique perspective on a given program of study: they
know better than anyone what it means to go through it as a student. Program review should take
advantage of that perspective and build it into the review.”

The review process depends on the direct and indirect data that are ultimately generated by and
from our most important constituency: our students. Students’ academic performance in classes, ability to
demonstrate mastery of their subject matter, and feedback provided on course evaluations and other
broad-spectrum assessments conducted at HMC¹, provide the data metrics needed to assess the extent
to which our academic departments are indeed achieving their stated student learning objectives.

What is not clear, however, is the extent to which our students are aware of the on-going program review
process in place, and how their feedback and participation in the academic enterprise directly impacts our
assessment efforts. We acknowledge, then, that the student experience component of the Program
Review Rubric requires more consideration and study. The students’ academic and personal experiences
within our academic departments must be more clearly integrated into, and articulated within, our
Academic Program Review Guidelines.




¹ HMC participates in the National Survey of Student Engagement (NSSE) every three years, and will participate in
the 2009 NSSE study, as well as the 2008 Beginning College Survey of Student Engagement (BCSSE) and the 2008
Faculty Survey of Student Engagement (FSSE). HMC also regularly participates in other national surveys and
studies, including the HERI College Senior Survey, the HEDS Alumni Survey, and the Academic Pathways of People
Learning Engineering Survey (APPLES), a component of the NSF-funded Academic Pathways Study.

								