

                                      FINAL REPORT

2008-09 Special Ad Hoc Committee for TRACE Implementation
                                      April 2009

Committee Members

     Brendan Bannister, Chair, College of Business Administration
     James Alan Fox, College of Criminal Justice
     David A. Rochefort, College of Arts & Sciences
     Richard Mickool, Executive Director, IS Business Operations
     Tina Penman, Graduate Student, Bouve College of Health Sciences
     Susan Powers-Lee, Vice Provost for Undergraduate and Cooperative Education
     Robert W. Sikes, Bouve College of Health Sciences
     Aaron Zononi, Undergraduate Student, College of Engineering


Effectiveness in teaching is at the core of any reputable institution of higher learning. The
evaluation of teaching is an important process in the pursuit of meeting an effectiveness goal.
TRACE (Teacher Rating and Course Evaluation) was designed as one indicator of teaching
effectiveness. It was piloted in Spring 2007 and launched campus-wide in Fall 2008. The
present committee was charged with a review of progress to date. Formally, the charge read:

Committee Charge

The overall task of this Committee is to monitor and improve the implementation of the online
teaching rating/evaluation system (TRACE). The Special Ad Hoc Committee on TRACE
Implementation shall exist until such time as its work is successfully completed, and it has been
discharged by the Senate Agenda Committee.

The specific tasks for the 2008-09 TRACE Implementation Committee are as follows:

      •   To review the implementation problems of the previous academic year, and
          implement and/or recommend improvements to the system.
      •   To evaluate and recommend procedures for encouraging and ensuring participation by
          students at a level sufficient to engender trust in the validity of the new instrument.
      •   To work with the faculties of the various academic units to fine-tune the content of the
          TRACE instrument, modifying and adding questions as needed.
      •   To re-evaluate and, if necessary, to revise and/or recalibrate the TRACE instrument and
          delivery process in light of the results of the previous academic year.
      •   To provide the Senate Agenda Committee and the Office of the Provost with a plan for
          how to oversee TRACE’s ongoing implementation.
      •   To provide the Senate Agenda Committee and the Office of the Provost regular updates
          on progress in implementation and issues that may arise.

The Senate Agenda Committee respectfully requests that the Committee submit its report on
these charges, both in electronic and hard copy, no later than 1 April 2009.

In the course of our deliberations, it became clear that the new TRACE system has enjoyed some
important successes in its first year (e.g., rapid feedback). At the same time, there have been
“bumps in the road” of first-year implementation. Our committee was not charged to make a
recommendation about the viability of online teaching evaluation. Instead, our charge was to
review TRACE and begin the process of making recommendations to improve its
implementation. We point this out because the evaluation of teaching is a topic that generates
substantial, sometimes emotional, response. Across the nation, the topic of online evaluation has
exacerbated these feelings. NU is much like other institutions moving to online evaluation.

Although there are some minor variations in presentation below, the central structure of the
review covers:

Background and Current Situation

Review Issues


In general, we have found numerous practical recommendations that can improve TRACE
implementation, and these are addressed below. Two larger issues, response rates and
open-ended online comments, are dealt with in more depth. We start with Administrative Issues
around TRACE, move through issues of Content and Flexibility of TRACE, and conclude with
discussion and recommendations around Response Rates and Open-Ended Comments. To assist
us in gaining an understanding of how Northeastern faculty view TRACE, we conducted a
survey on this topic through the Office of Institutional Research. The survey was conducted via
a web-based tool in the spring of 2009. For the initial survey invitation, the deans of each
college were asked to send the survey to their faculty members via an email with an embedded
survey link. In addition, an email survey reminder, also with an embedded survey link, was sent
to the faculty by the Vice Provost for Undergraduate and Cooperative Education. Overall, 287
faculty responded to this survey. Although that total is substantial, it is impossible to say what
percentage of all those contacted this figure represents, due to the nature of the survey
procedures that were employed. The committee also makes no assumption that these 287
respondents constitute an adequate representative sample of all faculty teaching at the
university.

Our job was to suggest improvements to TRACE and we respectfully submit our analyses and
recommendations to that end.

                              ADMINISTRATIVE ISSUES
Background and Current Situation

TRACE (Teacher Rating and Course Evaluation) was piloted in Spring 2007 and launched
campus-wide in Fall 2008. Responsibility for TRACE design is shared by the SGA, GPSA and
Faculty Senate. Responsibility for TRACE implementation is shared by these three groups, as
well as the Provost’s Office, CIETL (Center for Innovation and Excellence in Teaching and
Learning) and Information Services.

TRACE is the successor of TCEP (Teacher and Course Evaluation Process), and was designed
with the following distinctions from TCEP:

     •   The TRACE questionnaire incorporates some TCEP queries but is intended to provide
         more focus on how students learn in the course, and includes more multiple-choice and
         open-ended queries aimed at that goal.
     •   TRACE is delivered on-line and outside the classroom setting, in contrast to the in-class
         delivery of the paper TCEP.
     •   TRACE is open for elective student participation for a several-week window, including
         the finals period, in contrast to the pre-finals in-class delivery of TCEP.
     •   TRACE surveys for their own courses, including the open-ended comments, are available
         to faculty at myTRACE on myNEU the day after term grades are released, in contrast to
         the several-month processing time for TCEP surveys.
     •   Faculty currently have one week into the subsequent term to email a request for
         redaction of any comments that harass, threaten, defame, slander or otherwise
         fall outside the university’s appropriate use policy. After the redaction period, IS
         prepares an archive of the quantitative data from all faculty surveys for the term to be
         posted on the myNEU portal at TRACE (2008-present). Faculty retain access to
         complete copies (including open-ended comments) of their own TRACE surveys from
         all terms at myTRACE.
     •   TRACE is intended to serve as a richer student guide to courses than the most recent
         version of the TCEP archive by sharing student comments with other students, as well as
         continuing to share the quantitative survey results, via the student myNEU portal.
         Although the older publicly-available paper archives of TCEP surveys included student
         comments, it was not readily feasible to include the comments when the paper TCEP
         archive was originally moved on-line.

Like TCEP reports, TRACE reports are used as evidence of teaching effectiveness in
promotion/tenure considerations, as well as in the merit reviews for many units, according to the
guidelines of the 3/29/06 Faculty Senate resolution on student evaluations of teaching:

“… every unit shall carry out adequate, good faith teaching evaluations of all instructors of
record as part of the annual merit review, as part of the tenure evaluation process, and/or as part
of the promotion evaluation process as applicable. The teaching evaluation results will be
compiled by CEUT (3/09 note: now CIETL) and sent in a timely manner to each instructor
evaluated. A second copy of these results will be sent to the instructor’s unit head, who will then
see to it that those results are incorporated into the merit review process according to the Unit’s
procedures. For probationary faculty, adequate good-faith evaluation procedures will include
annual evaluation by two or more means, one of which must include student teaching
evaluations. The other means could include:
• peer classroom visits;
• peer evaluations of class materials
• teaching portfolios
• evaluations by earlier graduates of the program
• other means appropriate to the discipline
For tenured faculty, adequate good-faith teaching evaluations will include annual student
teaching evaluations and, at least once every 3 to 5 years, evaluations by one or more additional
means. Written copies of the unit’s procedures will be approved by the appropriate Dean and the
Provost’s office, and copies will be kept on file in the Provost’s office.”

Review Issues

There are several very important questions that have arisen in the first year of TRACE
implementation, having to do with the administration and “structure” supporting TRACE.
These include:
Who is responsible for TRACE?
Who is the “face” of TRACE?
How can correct faculty assignments for TRACE be assured?
How can access to TRACE data be assured?
When should TRACE be administered?


   1. SGA, GPSA and the Faculty Senate should continue to share responsibility for TRACE.
      These three groups should provide leadership in maximizing effective student
      participation in the survey and effective faculty/administrator response to the surveys.
      However, ownership and accountability for TRACE should extend to all students, faculty
      and administrators in order to create a culture of participation, respect, and learning.

   2. Shared response to TRACE issues, involving Information Services, CIETL (Center for
      Innovation and Excellence in Teaching and Learning) and the Provost’s Office, is
      effective for specific issues, and should be continued. However, there needs to be a
      clearer “face” of TRACE, and that role should be more clearly assigned to CIETL.
      CIETL should continue to provide support for educating the faculty about TRACE by
      such mechanisms as: virtual resources on the CIETL web site, including the
      questionnaire, FAQs about the TRACE system, and a TRACE blog; TRACE workshops;
      and work with individuals on teaching issues identified via TRACE. Additionally,
      CIETL should work with chairs/deans to design effective summaries of TRACE data for
      various administrative activities.

   3. The Registrar’s instructional database serves as the input for TRACE faculty and student
      information. Failure of departments/units to provide correct data to the registrar’s system
      has required a large amount of “clean up” for TRACE use of the data, and a number of
      problems have remained in spite of these efforts. Maintaining correct staffing and
      enrollment data on the registrar’s system is critical for many aspects of institutional
      reporting, including faculty workload analysis, faculty/student interaction analysis and
      TRACE, and should be considered a responsibility of each department/unit.
      Implementation of the Banner system should facilitate maintenance of correct
      staffing/enrollment information by the departments/units. Some TRACE-specific
      changes to the registrar’s data, such as including TA names as well as the instructor of

       record, will remain necessary and the TRACE administration should continue to provide
       these specific data services.

   4. The IS group should continue to provide support for maintaining faculty and student
      access to TRACE reports. Faculty access to their own TRACE data, at myTRACE, and
      to the archived TRACE data, would be facilitated by a consistent desktop image, with
      clearly-identified portal access available to all faculty, rather than the present desktop
      distinctions for those teaching or not teaching in a given term.

   5. The original process for TRACE incorporated a several-week survey window, including
      the finals period, with the goal of optimizing student participation. In the recent survey
      of faculty opinion on TRACE, there was an almost equal split of votes for keeping this
      timing as is (104 faculty) and making TRACE available only before finals begin (108
      faculty), with less interest in making TRACE available only after finals are over (21
      faculty). There is a belief that the longest possible survey window offers greater
      opportunities for higher response rates. Given our continued focus on maximizing student
      participation and our goal of creating a campus-wide understanding of the TRACE
      process, we recommend that the timing of TRACE administration remain unchanged
      until the next review of TRACE, when this might be reconsidered.

There are currently a number of improvement possibilities around how TRACE is implemented
when courses or approaches are not “standard”. Because each issue has its own background
situation, each is presented separately for both description and recommendations. The Review
Issues include:

      •   TRACE support for idiosyncratic course designs
      •   Improved reporting
      •   Customized questions

Issue 1: TRACE Support for Idiosyncratic Course Designs

The TRACE system relies on the Registrar’s database to identify which instructor will be
evaluated for each course. For most courses this system works well since the majority of courses
at Northeastern have a single instructor. For courses with multiple instructors, however, the
Registrar’s system does not currently capture the role of each instructor adequately. For example,
a course may have a coordinator who is correctly identified by the Registrar as the person
ultimately responsible for multiple sections of a course. The problem is that only one person can
be listed for each course although some or even all of the individual sections are taught by other
faculty members. Split courses and linked modules, where different faculty are responsible for
each half of a semester, raise other complications. While the number of courses that have such

complex designs is not extensive, they can have large enrollments and thus affect a significant
proportion of Northeastern students.

CIETL (Center for Innovation and Excellence in Teaching and Learning) has developed a system
to collect data on secondary faculty and teaching assistants. An increased presence and
knowledge of the role of CIETL should help resolve some of the difficulties, but it does present
increased workload for both departments and CIETL. Additionally, these options are not fully
understood by all academic units, and the system of checking the collected information before
going online is incomplete. The upcoming Banner enrollment system offers the possibility of
resolving many problems related to complex course design, both for TRACE and for better
capturing information on teaching activity across the university. Finally, with increased
innovation in course design, more and more courses can have different ending dates
(e.g., 7-week courses, courses extending longer than a single semester). Currently there are no
procedures to easily administer a TRACE evaluation immediately after these courses end.


       1. The academic units should work with TRACE and CIETL to ensure appropriate and
          relevant evaluations for all faculty. Each academic unit should provide accurate
          staffing information to TRACE in a timely manner, and TRACE and CIETL should
          work to develop efficient methods of collecting this information and translating it into
          appropriate evaluations. TRACE should improve the system for notifying faculty
          about the courses and sections for which they will be evaluated.

       2. New administration procedures should be developed by TRACE and CIETL that
          take into account differences in course design (e.g., varying length) that result in
          differences in ending date and/or changes in faculty.

       3. TRACE and CIETL should work with the Registrar’s office to ensure that the new
          Banner system is designed to better identify and classify all individuals and courses
          which need TRACE assessment.

       4. When the design of a course makes TRACE evaluation impractical, the academic
           units should be able to opt out of TRACE evaluation. The criteria and mechanism for
           opting out of TRACE need to be clearly defined. CIETL should work with
          Blackboard administrators and other relevant parties to develop alternate evaluation
          tools for these courses.

Issue 2: Improved Reporting of TRACE Results

Currently, TRACE produces voluminous reports that are difficult to read online or print, and
customization of the reports is limited. Some faculty find access to their results confusing.
Furthermore, there is considerable concern among surveyed faculty about the accessibility of
these reports, especially the open-ended responses, and their use in annual and tenure review.
Improved reporting is needed, with customization appropriate to specific academic units, so that
these results can better meet the goals of informing and improving teaching, as well as use in
annual and tenure/promotion review when appropriate.


   1. TRACE and CIETL should work with academic units and faculty to develop improved
      and varied summary reporting that meets the needs of different units. Summary reports
      appropriate for all units may be designed and developed by the survey system vendor.

   2. For ongoing effectiveness and flexibility, mechanisms should be explored to allow the
      academic units access to the database to develop their own customized report designs.
Issue 3: Customized Question Pools

Currently TRACE consists of two sets of questions. One set assesses course content while the
other assesses teaching. Faculty generally considered these questions helpful in modifying their
teaching (70% agreeing or strongly agreeing, with only 9% disagreeing or strongly disagreeing).
Additionally, there was strong interest in adding customized questions to the survey (63% agree
vs. only 7% disagree). Indeed, 37% of faculty are currently using Blackboard or paper based
surveys to collect additional information about their courses. The TRACE system has the
capacity to add customized questions, but this process has not been implemented due to an
increase in the administrative processing burden as well as technical development that needs to
be done. Providing a mechanism for adding customized questions to the TRACE survey may
increase usefulness and also improve response rate by providing a single and convenient survey
for students.


   1. Procedures to facilitate adding customized questions to the TRACE evaluation should be
      explored by representatives from TRACE, CIETL and faculty/academic units.

   2. If cost-effective and feasible, consideration should be given to giving an individual in
      each academic unit administrative authority to customize TRACE reports and/or
      customized questions.

                                    RESPONSE RATES
Background and Current Situation

One result, to date, in moving from the classroom-based TCEP evaluation process to the on-line
TRACE system has been a general decline in student participation in completing teaching
evaluations. The average response rate among courses evaluated with TCEP was approximately
80% in the period from Fall 2003 to Summer 1 2007. (This calculation does not include the
unknown number of classes for which TCEP surveys were never distributed to students, were
administered improperly, were not submitted for processing, or confronted processing problems.)
This average 80% response rate is at the higher end of the range reported in the literature on
in-class evaluations. According to data compiled by the committee, the response rate for courses
evaluated with TRACE in Fall of 2008 averaged 54%, with 16% of all courses having a response
rate below 40%, and 49% of all classes having response rates above 50%. This level of return is
in line with national results based on data provided by the vendor (CourEval) for a diverse
collection of colleges and universities, some large and some small, that also use on-line teaching
evaluation systems.
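The summary statistics above (an average response rate, plus the share of courses falling below or above given thresholds) are simple arithmetic over per-course counts. The sketch below illustrates that arithmetic; the course figures in it are invented for illustration and are not actual TRACE data.

```python
# Sketch: summarizing per-course response rates, as in the figures above.
# The course data below are hypothetical placeholders, not TRACE records.

def summarize_response_rates(courses):
    """courses: list of (responses, enrollment) pairs.
    Returns (average rate, share of courses below 40%, share above 50%)."""
    rates = [responses / enrollment for responses, enrollment in courses]
    average = sum(rates) / len(rates)
    below_40 = sum(1 for r in rates if r < 0.40) / len(rates)
    above_50 = sum(1 for r in rates if r > 0.50) / len(rates)
    return average, below_40, above_50

courses = [(27, 50), (12, 40), (33, 60), (8, 25), (45, 75)]
average, below_40, above_50 = summarize_response_rates(courses)
print(f"average rate {average:.0%}; below 40%: {below_40:.0%}; above 50%: {above_50:.0%}")
```

Note that the campus-wide figures reported above are averages across courses, so a small seminar and a large lecture weigh equally; a per-student aggregate would weight by enrollment instead.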

Review Issues

The committee received various kinds of formal and informal input highlighting this decline in
student response as a problem under TRACE. Given the mandatory inclusion of student
teaching evaluation data within merit, tenure, and promotion reviews, some unit heads expressed
uncertainty about using teaching evaluation data based on very low response rates. In the
committee’s March survey of faculty, a strong majority (81%) agreed or strongly agreed that
“Receiving high response rates for TRACE is important to me.” At the same time, the
perceptions around response rate indicated uneasiness, with about one-third expressing
agreement with “The response rates for my classes are generally too low to give an accurate
picture of student reaction to my teaching” (35%) and another one-third agreeing with “The
response rates in my classes are inconsistent” (33%).

When asked “Do you think there should be a minimum percentage of students responding to
student evaluations for a course before data can be included for merit and promotion purposes,”
70% responded “yes.” The average cut-off value for usable data, according to these survey
participants, was a response rate of 60-70% (with 75% as the modal response among our
sample). Again, it is important to emphasize here that these data represent no more than an
aggregation of opinions about desirable response rates, and their correspondence to the views of
Northeastern faculty overall is uncertain. Indeed, our impression from this survey, as well as
other sources of information consulted by the committee, is that most faculty members do value
the data being generated by TRACE, they use the results to inform their teaching practices, and
they encourage students to complete the TRACE survey. But faculty also want to be more
confident that TRACE evaluation findings provide an accurate representation of student opinion.

It is noteworthy in this context that only a small minority of faculty (24%) told us they felt they
had “the power to influence the response rate for TRACE in my classes.”

As the committee sought to formulate a position of its own on the adequacy of current TRACE
response rates, two perspectives emerged. Since neither point of view captured a clear majority
within our group, both views are summarized here.

       The First View: Caution Regarding Response Rates

As has been stated, the level of student response at NU under TRACE appears to be consistent
with on-line evaluation systems at other universities. In and of itself, however, such information
does not demonstrate that TRACE response rates are consistently “good enough” for the various
serious uses to which teaching evaluation data are now being put on this campus. We have
already reported that only 49% of all classes surveyed with TRACE this past Fall exceeded a
50% response rate. This might not be significant if we could reassure fellow faculty, based on
some established principle of statistical analysis, that a 29%, 35%, or 46% response rate still
yields valid data on the reactions of a cross-section of their students. However, no such
categorical assurance can be made.

No evidence has come to the committee’s attention indicative of a systematic bias against the
instructor in courses with low response rates. Indeed, some counterevidence exists that faculty
members may be rated slightly higher on-line than in-class, as a rule. Yet this information, if
reliable, is only partially reassuring, because it suggests that response bias can enter into on-line
teaching evaluation data due to inadequate response. If one is concerned with the accuracy of
the teaching evaluation process—not just its favorability—then this realization is troubling. In
general, it seems that much more research will be needed on this issue of potential response bias
under on-line evaluations before it can be cited as definitive.

Because no precise answer exists concerning the adequacy of different teaching evaluation
response rates, it is useful—and relevant—to consider what faculty members desire, and would
find credible, when they are being evaluated. As noted, the standard is high, typically in the
neighborhood of 75%, based on those who answered our survey. Are these data representative
of what most faculty members at Northeastern feel on this matter? One can no more say “yes” to
this question than one can put full confidence in a class survey receiving a low response.
Nonetheless, a substantial number of the 188 faculty members who chose to weigh in on this
issue clearly are distressed by the level of student involvement under TRACE so far. It is
sometimes argued that students who don’t participate in a system like TRACE are weaker
academically, or less interested in contributing to the improvement of university teaching, than
students who do respond. This conclusion may or may not be warranted. Whatever the case
may be, it is still important to do everything possible to maximize the number of students in the
teaching evaluation process if one goal is student empowerment.

       The Second View: Optimism Regarding Response Rates

Comparative assessments from other institutions as well as our pilot testing from a year ago
show that moving from paper to on-line administration does not tend to disadvantage faculty. To

the contrary, on-line ratings typically are slightly more favorable. Moreover, on-line assessments
garner more and lengthier comments from students, and comments tend to be more positive.

The committee produced several statistical analyses suggesting that non-response bias is small in
magnitude and slightly in favor of faculty. For example, the correlation between the response
rate and overall instructor effectiveness score across the nearly 2,000 course sections evaluated
in Fall 2008 was virtually zero. Another analysis compared the course grades received by
students who had and who had not participated in TRACE within a class section with an
enrollment of several hundred; the average grade for TRACE participants was slightly higher
than for non-participants. This result is consistent with the general findings of research in this
area that better students—as measured by grades—have higher rates of participating in on-line
surveys. In general, the concern that large numbers of students will use the on-line system to
criticize an instructor unfairly even while not having attended class regularly is largely
unsupported by the evidence. Higher GPA students tend to take the time and effort to complete
the TRACE survey; marginal students often show less interest.
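The two checks described above (a near-zero correlation between response rate and overall effectiveness score across sections, and a comparison of mean course grades for TRACE participants versus non-participants within one section) can be sketched as follows. All numbers here are invented placeholders standing in for the committee's section-level data, which are not reproduced in this report.

```python
# Sketch of the two non-response-bias checks described above.
# All figures below are hypothetical, for illustration only.
from statistics import mean

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Check 1: per-section response rate vs. overall instructor effectiveness score.
rates  = [0.35, 0.50, 0.62, 0.44, 0.71, 0.58]
scores = [4.1, 3.8, 4.3, 4.0, 3.9, 4.2]
r = pearson_r(rates, scores)  # near zero would suggest little non-response bias

# Check 2: mean course grade (4.0 scale) for TRACE participants vs.
# non-participants within a single large section.
participants     = [3.7, 3.3, 3.9, 3.0, 3.5]
non_participants = [3.1, 2.8, 3.4, 2.9]
gap = mean(participants) - mean(non_participants)
print(f"r = {r:.2f}; participant grade advantage = {gap:.2f}")
```

A correlation near zero is consistent with, but does not prove, the absence of non-response bias; the grade comparison only speaks to one section and one covariate.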

       Concluding Comment

In the end, all that can be said—statistically speaking—is that the lower the response rate, the
greater the possibility of unrepresentative or skewed evaluation findings due to outliers and other
data problems. This is true even in the absence of any patterned response bias concerning the
students who choose to participate in TRACE. In this light, the committee recommends that
TRACE evaluations with low response rates be viewed cautiously and used in combination with
other assessments of teaching effectiveness (including TRACE results for other classes taught by
the faculty member). Where the committee is in complete agreement is on the advisability of
taking multiple steps to increase awareness of the importance of teaching evaluation and the
criticality of student involvement in that process. To that end the committee advises strong steps
to boost TRACE response rates, among other related recommendations stated below.


   1. The Registrar should assist in implementing an incentive system for TRACE
      participation (e.g., students who complete TRACE could then be given immediate access
      to their grade for that course if it has been submitted by the faculty member).

   2. With the assistance of student government organizations, the undergraduate and graduate
      student populations should be reminded of the importance of TRACE. Additionally, they
      should be reminded of the availability of TRACE data (scores and comments) to guide
      their own decisions at course registration time. Among the many benefits of TRACE, this
      may be of most direct value to students.

   3. The university, under the leadership of CIETL, should highlight the importance of
      TRACE by designating one day near the end of each semester as “TRACE Day” with
      accompanying publicity, special events, prizes, etc., all with the purpose of focusing
      student attention on completing the on-line evaluation process.

   4. Although inclusion of TRACE data within faculty reviews may be required,
      administrators at all levels need also to emphasize an alternative use of TRACE as a tool
      for teaching improvement. The goal should be promotion of TRACE within the culture
      of our university as a vehicle for collecting valuable data and student commentary that
      can contribute to the teaching mission of individual faculty members and their units.

   5. Concerns about the accuracy of TRACE data highlight the need for other methods of
      teaching evaluation to be more actively employed at the unit level than they may be at
      present. As already identified by the Faculty Senate, these other methods could include
      peer reviews, evaluation of course materials, and teaching portfolios.

   6. Departments need to do everything possible at their level to help raise student
      participation rates under TRACE. Those departments that maintain their own systems of
      course evaluation, in addition to TRACE, are cautioned against any procedures, or
      messages, that could discourage student participation in TRACE due to completion of an
       alternative instrument. Particularly as TRACE response rates improve with maturation of
       the system, as well as the implementation of the new incentives described here,
       departments should explore the use of customized TRACE surveys to handle all of their
       evaluation needs with a single system, rather than continuing to operate an alternative.

   7. With respect to proper administrative use of TRACE data from those courses with low
      response rates, we encourage further examination of this issue by the Faculty Senate
      based on the current practices and preferences of administrators who are involved in
      conducting faculty reviews.

                              OPEN-ENDED ONLINE COMMENTS

Background and Current Situation

The TRACE teaching evaluation system incorporates a way for students to make comments and
provide feedback (i.e., qualitative ratings) in several areas of the instrument. Currently, the
design of TRACE incorporates the following:

a) students can make anonymous comments at any time during the normal evaluation period;
b) these comments are reviewed against the university’s “appropriate use” policy (see below);
   currently the Provost’s office designates who will conduct this review;
c) the comments, after review by TRACE administration against appropriate use policies, are
   released soon after the quantitative ratings;
d) students have access to both the quantitative and qualitative ratings and comments; faculty
   members have access to their own quantitative and qualitative ratings but only to the
   quantitative ratings of other faculty members;
e) department chairs and other select administrators have access to both the quantitative and
   qualitative ratings.

Review Issues

TRACE, as a system of teaching evaluation, has now been in operation for close to a year. The
open-ended, online commentary system is the design issue that has generated the strongest
faculty reaction. This is partly because the “rules” on open-ended comments have changed.
Under the previous TCEP system of teaching evaluation, students could also make written
commentary on both the class and the faculty member, but these comments were delivered only
to the faculty member, department chair and other administrators. What has changed is the
public nature of these comments and their release to the general student population.

The major issues are:

   Should the teaching evaluation system have open-ended comments?
Prior to TRACE, the previous system of teaching evaluation (i.e., TCEP) incorporated open-
ended student feedback. This was done in class, in hand-written form, and the feedback was
passed on to faculty members and department chairs to inform them about the effectiveness of
both the class and the instructor’s teaching. Many instructors find open-ended comments and
feedback valuable in reflecting on their own course design and delivery. In the recent faculty
survey, 94% of faculty reported receiving open-ended feedback. Faculty reported that TRACE
open-ended feedback was useful to them (70% in agreement and 13% disagreeing) and to
students (64% of faculty in agreement and 13% disagreeing). Faculty may find that the standard
questions generating quantitative results fail to reflect the individual and distinctive aspects of
their course and delivery, and well-constructed open-ended feedback can provide a useful
remedy.

   Who should have access to the open-ended comments?
The major change in the TRACE process has to do with the change in dissemination policies
regarding open-ended comments. Under the current TRACE system, students have been given
access to these open-ended comments, something that was not available before. Students would
contend that the open-ended comments provide a rich new source of information and
performance data that can be helpful in course and instructor selection. Some faculty agree,
while others believe that comments should not be released without further review and control of
content. Some faculty believe that the comments create a culture resembling a popularity
contest, in which uncontrolled personal commentary can be damaging and unprofessional. In
the recent survey, 25% of faculty believed it appropriate for students to view open-ended
comments, while 54% disagreed.

  Is the policy of “appropriate use” sufficient?
Students who choose to write comments about either the course or a faculty member are
currently guaranteed anonymity, and can write what they want as long as it stays within the
appropriate use policies of the university. The Appropriate Use Policy prohibits the use of
university information systems to:
 * Harass, threaten, defame, slander or intimidate any individual or group;
 * Generate and/or spread intolerant or hateful material, which in the sole judgment of the
   University is directed against any individual or group, based on race, religion, national
   origin, ethnicity, age, gender, marital status, sexual orientation, veteran status, genetic
   makeup, or
Arguments have been raised that this policy is important but does not cover all situations in
which open-ended comments could be inappropriate. Suggestions have been made that argue
for increased accountability and professional expectations around written comments. Some
have further argued that there should be procedures to remove comments that are unduly
unprofessional or overly personal and damaging to course delivery (even if within appropriate
use policies). In a recent Sense of the Senate of the Graduate and Professional Student
Association (GPSA), this position and the need for better protocols in reviewing the
professional content of open-ended comments were supported (see Appendix 1).

   How should the review process be handled?
TRACE is a new system and has been implemented through the persistent, substantial efforts of
a team of individuals, assembled in response to the jointly agreed charge of the Faculty Senate,
Student Government Association, Graduate & Professional Student Association, and Office of
the Provost to design and implement an online teaching evaluation system. Despite successes in
initial implementation, how questions, concerns, or challenges to TRACE can be handled may
not, to this point, be well understood. The “face” of TRACE, as mentioned previously, has
largely been an email address. In addition, how comments might be reviewed and possibly
removed has largely been misunderstood by faculty, while the locus of ownership and
responsibility for TRACE data has been several levels removed from the individual faculty
member. The argument has been made that open-ended, online comments are not just opinions
but performance data. As such, the argument has also been put forth that, for more effective
handling of that data, there should be changes to ensure that this open-ended performance data
is the joint responsibility of faculty, department chairs and other key administrators.


The recommendations below represent a conservative approach that maintains open-ended
comments as a source of data for both faculty and students, while clearly shifting ownership and
locus of responsibility to the faculty/chair level for initial review and positioning authority at
that level for analysis and action on open-ended comments. Ultimately, if the open-ended
performance data is useful, it is at this level that meaningful dialogue and action planning take
place. In the course of our deliberations and discussion it became clear that multiple
“situations” can arise that might make online comments inadvisable. Furthermore, not all of
these are “negative”. For example, a faculty member who builds intrinsic “surprises” into his or
her course delivery does not want those identified before the next course delivery. The extent
and type of situations that might warrant exceptions need to be better understood, and this starts
at the faculty member level. Our recommendations also position the CIETL as an important
“face” of TRACE, representing the Provost’s office. On the “downside”, the recommendations
below will not satisfy those who firmly believe that open-ended comments should not end up in
the public domain at all. At this point, some may be responding in this manner due to the lack
of additional “controls” or review processes to ensure professionalism. Those review processes
are worthwhile and recommended below. Finally, the extended review processes may slow
down how quickly open-ended comments end up in a public forum. It is our belief that this does
not have to be an extensive delay and that the caution motivating more review is warranted and
worthwhile.

       1. Students will continue to have the opportunity to offer commentary and feedback
          about both the course and teaching behavior of the faculty. Feedback can add value
          and improve teaching if professional and specific to teaching behavior.

       2. How to write constructive, professional feedback can be taught, and consideration
          should be given to adding this to the curriculum of a unit’s freshman “Intro to the
          College” course (or other appropriate forum). Students will be encouraged to be
          professional, helpful and specific in their comments; this encouragement should be
          added to the TRACE instructions.

       3. After the evaluation period ends and final grades are reported, review will commence
          at the level closest to faculty and those responsible for evaluating faculty (e.g., chairs
          or other appropriate evaluators). At that level, faculty can initiate a request for
          removal of comments considered unprofessional, inaccurate or potentially damaging
          to course delivery.

       4. A set of protocols needs to be established between CIETL and academic units to
          implement the removals. For example, chairs may be granted authority to remove
          comments, or may make recommendations to an individual in the unit who has that
          authority. Whoever has the authority would be expected to interact with the Director
          of CIETL to document the type and extent of changes. This process should be as
          streamlined as possible, while increasing faculty/department chair ownership and
          creating the expectation that the performance data is to be used for teaching
          improvement.

       5. CIETL, reporting to the Provost’s Office, would have authority to review the entire
          TRACE system of online comments, including deletion practices, and to enter into a
          dialogue with academic units at the local level if disparities in practice seem evident.

       6. Finally, for a two-year period of further data collection and review, the open-ended
          online system should include an opt-out option that originates at the
          faculty/department chair level. This opting out would occur after evaluations are
          completed, so that the performance data is not lost. Consideration should be given to
          clarifying online whether comments are absent because none were submitted or
          because of an opt-out. Review procedures need to be developed so that academic
          deans and CIETL monitor the opting-out decisions during this two-year period. After
          that period of review, the Faculty Senate should reconsider the efficacy and form of
          an ongoing open-ended online comments system and work with the SGA, GPSA and
          the Provost’s office to modify it if necessary.


The 2008-09 Special Ad Hoc Committee for TRACE Implementation was charged with
reviewing the progress, successes and areas of potential improvement of the new online system
of teaching evaluation known as TRACE. It was clear before we began, and reinforced during
our deliberations, that teaching evaluation generates focused attention and varying, strong
opinions from a wide range of constituent groups. Despite this, there is almost unanimous belief
that effective teaching is essential and that the student role in its evaluation is important.
Building on those areas of consensus, we have tried to set forth recommendations, some
practical and immediate, and some requiring ongoing monitoring, to improve TRACE. In
highlighted form we believe:

   a) The recommendations around very practical concerns, such as idiosyncratic course
      designs and customized content, are worthwhile and can commence almost immediately.
   b) The identification of a very visible “face of TRACE”, CIETL, will provide necessary,
      personalized leadership and a clear location to go to when systemic “issues” emerge
      around TRACE.
   c) The shift in ownership and locus of responsibility for initial diagnostics, dealing with
      TRACE non-standard issues, and review of open-ended comments to the faculty
      member/chair level offers much in the way of increased effectiveness and positive
      action planning.
   d) The need to continue to monitor response rates, once actions are implemented to raise
      them, and opt-out behavior is warranted and worthy of further Faculty Senate review.

We respectfully submit our report.

                                 APPENDIX 1: Sense of the Senate


Expressing the sense of the Northeastern University Graduate & Professional Student Association (GPSA) Senate
that student comments are an integral part of Teacher Ratings and Course Evaluations (TRACE) and should
continue to be posted online, but only if the comments are professional and adhere to the Northeastern University
Appropriate Use Policy updated on January 23, 2009.

Whereas TRACE is the current tool used for evaluating teachers and courses;

Whereas the student comments for TRACE are posted online;

Whereas The Northeastern University Appropriate Use Policy updated on January 23, 2009, states “11. Users are
responsible for the timeliness, accuracy and content/consequences of their personal information, web pages and
other electronic writings”;

Whereas The Northeastern University Appropriate Use Policy updated on January 23, 2009, states “26. The
Appropriate Use Policy specifically prohibits the use of Northeastern University information systems or facilities to:
Harass, threaten, defame, slander or intimidate any individual or group;”

Whereas TRACE is administered on myNEU, which is covered under The Northeastern University Appropriate Use
Policy;

Whereas the GPSA has responsibility for TRACE along with the Student Government Association and the Faculty
Senate;

Whereas the GPSA has supported and continues to support the notion that student comments are an integral part of
TRACE and can be very helpful for both faculty and students; hereby be it

Resolved, that it is the Sense of the Northeastern University Graduate & Professional Student Association Senate
that:

    1.   student comments for TRACE continue to be posted online;

    2.   student comments posted online must be professional and adhere to the Northeastern University
         Appropriate Use Policy;

    3.   the 2008-2009 Special Ad Hoc Committee for TRACE Implementation include the following items in their
         final report to the Faculty Senate:
              a. a proposed protocol detailing a process involving faculty to challenge specific comments that they
                  feel are unprofessional or in violation of the Northeastern University Appropriate Use Policy
              b. the sentiment that students must be made aware of the Northeastern University Appropriate Use
                  Policy prior to submitting student comments for TRACE.

Tina Penman 2/17/2009
Passed with a majority vote at the 02/26/2009 GPSA Senate Meeting
