DATE:          June 19, 2007

TO:            Roy Koch, Provost

FROM:          Hildy Miller
               Duncan Carter
               Hugo du Coudray
               Christina Toth
               Jon McClintick
               Elizabeth Harazim

RE:            Writing Research

Enclosed please find an interim report of the faculty and student survey portions of our
“Research and Development Project for the Improvement of Writing Instruction at Portland State
University.”

We have been conducting research on writing instruction at Portland State ever since the Faculty
Senate mandated that we do so in 1993, when the Writing-Intensive Course
Program was first approved. By 2000 we had decided to cast a wider net, rather than restricting
the focus of our inquiry to WIC courses. The current version of this project
examines student, faculty, and community partners’ experiences with and attitudes toward
writing.

We call this an “interim” report because (1) our samples are not representative, (2) we have not
yet surveyed community partners, (3) we have just begun content analysis on free response
items, and (4) we will soon have 300-400 more student or faculty surveys to add to our data.

Writing programs around the country tend to be based on tradition or theory; few rest on any
kind of empirical information. We hope that learning more about what our students know about
writing and where they learned it—as well as what kinds of problems they experience—might
suggest future directions for writing instruction at PSU.

We would like to express our appreciation for support from a number of Faculty Enhancement
Grants and Provost's PSU Foundation Faculty Development Awards.

It is our hope that this interim report will begin a dialogue (or more accurately,
add substance to a dialogue already under way) about our students’ writing and what we can do
to improve it. If you have any questions about this report or suggestions for future research,
please address them to Hildy Miller, Duncan Carter, or Hugo du Coudray.

cc:    Michael Reardon
       Marvin Kaiser
       Elisabeth Ceppi
       Kathi Ketcheson
       Shawn Smallman
       Sukwant Jhaj
       Ann Marie Fallon
       Dan DeWeese
       Greg Jacob
       Yves Labissiere
       Michael Flower


Writing Survey Interim Report
         June 18, 2007

        Christina Toth
   Zapoura Calvert de Ramos
        Duncan Carter
      Hugo du Coudray
      Elizabeth Harazim
     Jonathon McClintick
         Hildy Miller

Acknowledgments

        This report and the research it describes are the result of many people’s hard work over

the last several years. This research would not have been possible without the support of several

Faculty Enhancement Grants and the Provost’s PSU Foundation Faculty Development Awards.

Special thanks to Betty Thompson, Zapoura Calvert de Ramos, and the Community Psychology

Class at PSU for the enormous efforts they put into designing and distributing the surveys, and

then collecting the data for analysis. Ongoing thanks to Jon McClintick and Elizabeth Harazim

for the hours spent refining earlier drafts of this report, and for all the labor that will be put into

processing the free response questions. Finally, many thanks to the 55 faculty members who

graciously volunteered their class time (and themselves) to conduct this survey, to the 1453

anonymous students who participated in this research (without whom there would be no data

set), and to the current and future scholars and participants involved in this ongoing effort to

produce strong writers to serve the city.


Executive Summary

       What follows is a report on the initial results of a survey of students and faculty regarding

student writing at PSU, part of an on-going curriculum assessment conducted by Writing

Program administrators in the English Department, with assistance from faculty in the

Psychology Department. In Winter 2006, our research team surveyed 1453 students and 55

faculty members. While our samples turned out to be unrepresentative of the student and faculty

bodies as a whole on several demographic points, these first findings present some interesting

trends and correlations that will inform our future research. Complete lists of the questions on

the survey by category, as well as reproductions of the survey forms, are attached as appendices

at the end of this document.

These are some of the highlights of our findings:

       - Students reported encountering the most difficulty in their academic writing with
         narrowing the topic, starting the draft, coming up with ideas, organizing their papers,
         and revising. These areas of struggle could indicate a lack of instructional emphasis on
         the initial invention stages of the writing process, which in turn might foster student
         procrastination and leave little time for revision.

       - Faculty reported that, from their perspective, students struggle the most with starting
         their papers, organizing, editing/proofreading, revising their drafts, and supporting
         points with specific examples. Faculty placed more emphasis on trouble with technical
         and mechanical errors than did students.

       - Students reported going to reference books, friends or peers, professors, or their
         parents for help with writing much more frequently than they went to teaching
         assistants, University Studies mentors, or the Writing Center. This might reflect a lack
         of awareness about or access to institutional resources, or a preference for more
         personal (less institutional) assistance with writing.

       - Faculty respondents indicated a high degree of confidence in their own abilities to give
         effective feedback on student writing, but were less enthusiastic about students’ use of
         that feedback. This could be an indication that some faculty lack awareness about what
         kinds of feedback on writing are useful or meaningful to students.

       - A majority of faculty (55%) indicated that they consider the development of writing
         skills to be a responsibility shared equally between students and instructors. Student
         responses to this question were remarkably similar, reflecting a shared understanding
         between students and faculty on this issue.

       - Students gave a middling assessment of writing instruction at PSU, but more than half
         (53.2%) expressed a high degree of satisfaction with their own writing ability. Faculty,
         on the other hand, expressed a significantly lower degree of satisfaction with student
         writing abilities, suggesting both a difference in the two groups’ perceptions of
         student writing and the application of different criteria in making such an assessment.

       - Faculty assessment of writing instruction at PSU was remarkably low; however, the
         nature of their dissatisfaction appears to be related more to the lack of instruction
         that students receive than to the quality of the actual writing and writing-intensive
         courses that are offered.

Introduction

        College graduates must have literacy skills in order to succeed at school and in the new

economy they will enter after graduation. Yet at Portland State University, as at many public

institutions that serve diverse groups of students, there has been a growing sense that the literacy

skills students currently develop are inadequate. Faculty report that students at all levels are

ill-equipped to meet the writing demands of their courses and disciplines. Business partners in the

community add that many graduates are unable to advance beyond entry level because they lack

sufficient writing skills (Miller 1).

        Traditionally, in colleges and universities across the country, literacy skills have been

taught in one or two writing classes housed within the English department and required at

mandated points in the general curriculum, typically during students’ first quarter or semester. In

recent years, this “inoculation” approach to composition instruction has been faulted for not

integrating writing into curricular content within the disciplines. First-year seminars and

writing-intensive courses, in which writing instruction is embedded in content courses, are two

curricular innovations that attempt to remedy the problem. Yet, these innovations present

problems of their own when writing instruction winds up being de-emphasized in favor of course
content.


        PSU has been a leader in this recent pattern of curricular reform in writing instruction. In

1994, it abandoned the traditional two-writing-course requirement and instituted University

Studies, a large, innovative, cross-disciplinary unit housed outside the English department, as

well as a program of other upper-division, discipline-based Writing-Intensive Courses in which

writing is taught and integrated at every level. Our institution went further than most in

decentralizing writing: we dramatically reduced the English department-sponsored writing


program to entrust all mandatory composition instruction to this diffuse, cross-disciplinary

model. This pioneering attempt at university-wide curricular reform was supported by grants

from the Pew Charitable Trusts and Kellogg, and became a national model. However, more than a

decade after implementation and despite many successes in improving overall student learning,

the general consensus on and off campus has been that the writing skills of students remain

inadequate to meet the demands of school and employment.

       Since 2000, we have been engaged in a systematic program of research and development

to examine broadly the many sites in which writing instruction takes place throughout the

university curriculum, to describe the current status of student writing abilities, and to gather the

perspectives of all stakeholders—faculty, students, and community business partners. Our

findings should provide a solid research-based foundation on which to improve existing methods

of writing instruction, and to propose new ones. We designed this research program to move

through three stages: Stage 1—the Preliminary Survey Research; Stage 2—Design and Test of

Experimental Writing Courses and/or Programs; and Stage 3—Measuring Outcomes of

Experimental Programs. This research program has the potential to “reform” our recent

curricular reform. Our initial literature review found no clear case of a university writing

program based on empirical research into attitudes and experiences of the main stakeholders.

This on-going research at PSU should, therefore, have national implications.

       In this initial report, we present for institutional circulation our tentative findings as we

approach completion of the Preliminary Survey Research, or Stage 1. The goal of Stage 1 has

been to survey students, faculty, and potential employers before designing experimental writing

courses and programs. The Preliminary Survey Research questionnaire that we distributed

consisted of both multiple-choice and free response items, with the purpose of discovering and


documenting the experiences, attitudes, and needs of students, faculty, and potential employers.

By surveying these groups, we hoped to discover what students know about writing, where they

learn it, what problems they face in their academic writing, and what resources they find most

helpful in improving those skills. Our team was especially interested to see how student

responses compare to those of faculty and employers. Our hope was that these results would

shed light on both the deficits and strengths of PSU’s current writing program.

       Although the Community Partners, or employers, portion of the survey is still underway,

the preliminary results are in for the multiple-choice portions of the student and faculty writing

surveys, which we collected from Fall 2005 to Winter 2006. This report provides an overview of

the initial multiple-choice results; a team of English graduate students has recently begun

tabulating the open responses. As described in detail below, the survey sample turned out to be

unrepresentative of the PSU student body as a whole on a number of demographic points; even

so, many of the results were interesting and surprising, particularly when we compare student

and faculty responses on related questions. These initial faculty and student responses

demonstrate the kinds of data that a wider distribution of the surveys would yield.

       The Preliminary Research Survey adhered to Federal and University policies regarding

research involving human subjects, and received on-going approval from the Institutional

Review Board’s Human Subject Research Review Committee. All survey respondents

participated voluntarily and under assurance of anonymity, with no potential risks to the subject.



Developing the Survey

        Our research team developed three distinct surveys for three different populations with an

interest in student writing at PSU: the students themselves, the faculty for whom students write,

and the community partners that employ PSU students after they graduate. While we

successfully distributed the student and faculty questionnaires through Winter 2006, the community

partner survey has proved more complicated to design and distribute effectively, and that portion

of the research has not yet been completed.

        The final draft of the survey form, completed in 2004, evolved through a number of pilot

questionnaires over several years, revised to refine the kinds of data the survey would yield and

to improve the document’s clarity for respondents. The drafting process included one focus

group, led as a demonstration by PSU professor David L. Morgan; we rewrote many of the

survey items, especially the phrasing of the open answer questions, based on his focus group’s

feedback. We intended to conduct more focus groups to further refine the questionnaire;

however, the funding for that part of the development process was not forthcoming, and we

moved forward with what we had.

        The multiple-choice questions on the final version of the student survey [1] fall into four

categories. The first category is demographics, which includes basic information on age (item

2), gender (item 1), class standing (item 4), and major or professional school (item 6), as well as

questions relating to language background. This demographic information allows us to

cross-reference students’ attitudes and experiences related to writing with their educational and

linguistic histories.

[1] For a complete list of student survey questions by category, see Appendix A. For an exact
reproduction of the student survey questionnaire and possible responses, see Appendix B.


        The second category of inquiry on the student survey asks questions regarding aspects of

composition theory and practice. These include a question about the student’s knowledge of the

“steps or stages” of writing (item 13), to gauge his or her understanding of the writing process.

Another question in this category asks whether the student changes his or her writing to “suit

individual professors” (item 24)—affirmative responses to this question reflect a rhetorical

understanding of audience.

        The third category of questions on the student survey deals with the respondent’s

attitudes toward writing. This category included inquiries into the students‟ satisfaction with

their own writing skills (item 12) and how much writing they do for their own pleasure (items 18

and 19). Questions of this kind give us a better understanding of how students view themselves

as writers, which can provide both a profile of the attitudes that students who attend PSU tend to

bring to their writing, and also a sense of what kind of attitudes toward writing we as an

institution are fostering in our students.

        The final category of questions on the survey relates to the student’s personal history with

writing instruction, with several items dealing specifically with the student’s experiences with

writing at PSU. These items ask, for example, where the student goes for help with writing (item

15), how often different kinds of writing assignments are given in their PSU courses (item 21),

and what aspects of the writing process give them the most trouble (item 23). These questions

are designed to give us insight into how much and what kinds of writing students are being asked

to do in the diffuse writing program at PSU, and how they are using and experiencing the

resources that are in place to help them develop their writing skills. Taken as a whole, the

questions on the student survey provide a multifaceted picture of how students from very diverse


backgrounds are experiencing writing at this institution, and what kinds of attitudes towards

writing they develop in response.

       The questions on the faculty survey [2] break down into categories similar to those on the

student questionnaire: demographic items, composition theory-related items, items inquiring into

the faculty member’s personal history with writing, and items related to how the respondent uses

writing in the classroom. Many items on the faculty survey repeat or correspond to the questions

posed on the student survey. The demographic questions include age (item 2), gender (item 1),

disciplinary affiliations (item 5), tenure status (item 3), number of years at PSU (item 4), and the

degree to which English was spoken in the faculty person’s household growing up (item 6). The

questions relating to composition theory, which correspond to similar items on the student

survey, ask how much the faculty person knows about the “steps or stages that experienced

writers go through as they write” (item 7) and whether he or she requires students to “change their

writing to suit their audience” (item 23).

        Several items deal with the respondent’s personal history with and attitudes toward

writing. These include questions about where the faculty member learned a lot about writing

(item 8), how much and what kinds of writing his or her professional life requires (items 16 and

17), and how often the respondent uses his or her writing outside of work (item 13). These kinds

of questions give us a sense of the professional writing demands that PSU faculty face, which

inform their approach to student writing, and might also suggest what kinds of writing tasks

students need to be prepared to take on in their own professional lives.

       The remainder of the questions on the faculty survey relate to attitudes toward student

writing and writing instruction at PSU. These include questions regarding the faculty member’s

[2] For a complete list of faculty survey questions by category, see Appendix C. For an exact
reproduction of the faculty survey questionnaire and possible responses, see Appendix D.


satisfaction with students’ writing abilities (item 9), the degree to which that faculty member

believes that it is his or her responsibility to provide writing instruction to students (item 19), and

whether he or she feels able to give helpful feedback to students on their writing (item 21).

These kinds of questions give us a broader picture of how PSU’s diffuse writing program is

playing out in individual classrooms: if writing instruction is now the shared responsibility of all

departments and faculty, then it is important to know whether faculty believe that they should be

providing that instruction, and whether they feel equipped to do so effectively. Furthermore, a

sense of how faculty experience student writing at PSU provides one kind of measure in

determining how well students are writing based on their instruction in our diffuse model.

       The as-yet-undistributed community partner survey approaches many of the same writing

issues as the student and faculty writing surveys, including rhetoric and composition

theory-based questions about process, as well as questions about the company’s history with

employee writing and its satisfaction with employees’ writing skills.

Conducting the Survey

       The research team relied on the voluntary participation of faculty members and their

students to obtain completed surveys. Participating faculty allowed an assistant to come in

during class time to conduct the student survey, but students in those classes were not required to

take part. All participants were assured that their responses would remain anonymous. The data

gathered about student and faculty writing through these surveys, therefore, are derived from an

opportunistic sample, rather than from a random sample. The survey responses are also subject

to the inherent limitations of any form of voluntary self-reporting: there is no way to verify


whether participants responded truthfully or accurately, and the selective effect of volunteering is
unknown.


          Survey administrators presented the questionnaire to students according to a standard

application protocol.[3] The survey administrator came in during class time; after introducing

herself, she instructed the respondents to indicate the Course Registration Number for the class

on the survey form, as well as a number identifying that the separate pages of each

questionnaire and open answer form came from the same respondent. After assuring students of

their anonymity, and reminding them that there were no right or wrong answers on the survey,

the administrator thanked the students and invited them to begin. The administrator also used

standard prepared responses to answer student questions about how to fill out both the

multiple-choice (Scantron) and free response sections of the survey form.

          Late in the Spring 2007 term, our research team had the opportunity to collaborate with

the Office of Institutional Planning and Research to distribute more than five hundred additional

student surveys (and seven faculty surveys) as part of a Pilot Writing Assessment Project.

While data from this new source are just beginning to come in, these additional surveys will

provide an even larger pool of student responses, and perhaps contribute to a more representative

overall sample. Partnership with the Pilot Writing Assessment Project also creates an opportunity

to link student survey responses with actual student writing samples, which will allow us to

further examine the link between student responses on questions like satisfaction with their

ability to write, and their actual writing abilities as determined by a University Studies portfolio

assessment team. These additional data will be forthcoming during the 2007-2008 academic
year.


[3] For a transcript of the standard application protocol, see Appendix E.



       When our research team began the student survey, we made repeated attempts to obtain a

random sample of students attending PSU, but these efforts were unsuccessful. We were,

however, able to collect responses from an opportunistic sample of 1453 students in classes

where we could find cooperative instructors. Because participation in the survey was voluntary

within the surveyed classes, only those students willing to answer the anonymous questionnaire

provided data. This highly selective process of collecting a sample limits the extent to which we

can generalize our findings to the population of interest: the entire student body at PSU.

       To estimate how well our sample represented the whole student body, we compared the

two groups on all demographics where we had comparable data. The list of variables includes

gender, age, class standing, and declared major. We also made a rough comparison between the

percentage of ESL students in the sample and the percentage of international students at PSU,

taking those two variables to be roughly correlated.

       The comparison of data in the sample and the entire student body appears in Table 1,

below. Data for the PSU student population were taken from the Winter Term Factbook, 2006

(4th Week), compiled by the PSU Office of Institutional Research and Planning.


      Table 1: Comparison of PSU Student Population in Winter 2006 with Survey Sample
                                         % PSU
         Variable                        W2006           % Sample         Sample/PSU
         Gender (% women)                55.3            58.2             1.05
         Gender (% men)                  44.7            41.8             0.94
         Age (Mean, all students)        29.1            26.4             0.91
         Age (Mean, undergrad)           26.8            25.6             0.96
         Age (Mean, grad)                35.3            35.2             1.00
         Internatl/ESL (% students)      5.5             7.9              1.44
         Class: % Freshman               8.4             20.7             2.46
         Class: % Soph                   11.4            19.4             1.70
         Class: % Junior                 17.7            24.1             1.36
         Class: % Senior                 25.6            28.3             1.11
         Class: % PostBac                7.6             3.9              0.51
         Class: % Masters                16.9            1.8              0.11
         Class: % Doctoral               2.2             0.2              0.09
         Major: % Undeclared             15.7            11.4             0.73
         Major: % Humanities             13.6            17.4             1.28
         Major: % Science                9.3             18.5             1.99
         Major: % Social Science         13.9            13               0.94
         Major: % Bus Admin              15.3            12.2             0.80
         Major: % Education              4.7             8.8              1.87
         Major: % Engineering            8.6             3.9              0.45
         Major: % Fine Perf Arts         7.6             8.8              1.16
         Major: % Social Work            1.9             2                1.05
         Major: % Urban Pub Aff          8.7             4                0.46
         Major: % CLAS                   36.8            48.9             1.33

       In Table 1, the last column shows how well the data from the sample agree with data

compiled directly from the entire population of PSU students in Winter Term 2006. An entry of

1.00 in the last column means that the sample estimate of the population is exactly correct.

Entries less than 1 show that the sample underestimates the population value; entries greater than

1 show that the sample overestimates the population value.

       The value in the last column is directly proportional to the under- or overestimate. For

example, the number in the last column of the first row of the table shows that the sample

estimate of the proportion of women in the student body was 105% of the actual proportion of

women students at PSU in Winter 2006. So the sample overestimates the proportion of women


students at PSU in Winter 2006 by 5%. The corresponding entries in the second row of the table

show that the sample underestimates the proportion of men students at PSU in Winter 2006 by

6%: the sample estimate was only 94% of the actual proportion of men students at that time.
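The arithmetic behind the last column can be sketched in a few lines of Python (a minimal illustration; the helper name `sample_ratio` is ours, and the two gender figures are taken from Table 1):

```python
# Sample/PSU ratio: the survey sample's estimate of a demographic variable,
# expressed as a fraction of the actual PSU population figure.
def sample_ratio(sample_pct: float, population_pct: float) -> float:
    """Return the sample estimate as a fraction of the population value."""
    return sample_pct / population_pct

# Figures from Table 1: percentage of women and men students.
women = sample_ratio(58.2, 55.3)  # ~1.05: sample overestimates women by ~5%
men = sample_ratio(41.8, 44.7)    # ~0.94: sample underestimates men by ~6%

print(round(women, 2), round(men, 2))
```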

       One may scan down the last column to immediately see how well or how poorly the

sample represents the actual student body at PSU on each of the variables listed in column one.

As a rough rule, numbers in the last column between .95 and 1.05 may be considered

correct estimates, and numbers between .90 and 1.10 fair estimates. Numbers that fall

outside that range mean that the sample is not representative of the actual student body, and

conclusions from the survey relating to those demographic features cannot be generalized to the

entire student body at PSU. Eight of the twenty-four measures show good or fair estimates, but

the two on gender only count as one, so only seven of twenty-three comparisons (30%) show

acceptable agreement between sample and population.
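The rule of thumb above amounts to a simple three-way classification. A short sketch (the helper name `rate_estimate` and its labels are ours; the thresholds and the example ratios come from the text and Table 1):

```python
def rate_estimate(ratio: float) -> str:
    """Classify a Sample/PSU ratio using the report's rule of thumb."""
    if 0.95 <= ratio <= 1.05:
        return "correct"
    if 0.90 <= ratio <= 1.10:
        return "fair"
    return "not representative"

# A few entries from the last column of Table 1:
print(rate_estimate(1.05))  # gender, % women -> "correct"
print(rate_estimate(0.91))  # mean age, all students -> "fair"
print(rate_estimate(2.46))  # % freshman -> "not representative"
```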

       The values in the last column also allow us to evaluate the sample itself. For example,

we have almost two and one-half times as many freshmen in the sample as we would expect to

get by chance, but only one-half the number of post-baccalaureate students we would expect, and

one-tenth the number of masters and doctoral students. This certainly says something about the

classes that completed the survey. A similar concern applies to majors: the sample contains

about twice as many science and education majors as we would expect by chance, but less than

one-half as many students from Engineering and Urban and Public Affairs. The number of

majors from the College of Liberal Arts and Sciences in the sample is 33% greater than we

would expect from their actual numbers in the student body at PSU in Winter 2006.

       Looking at the last column, we can see a precise index of how much the survey sample

differs from the actual student body in Winter 2006. These matters affect how much we can


generalize survey results to all the students at PSU. They do not, however, affect much of the

valuable information in the responses themselves. Within the limits of the sample, we can still

make comparisons among majors, class standing, etc., which are as significant as the number of

cases allows. These findings are useful in themselves, and can be very informative. However,

we would need a better sample than the one we obtained in the Preliminary Research Survey to

draw conclusions about students at PSU as a whole.

        The faculty who were sampled also turned out to be unrepresentative of PSU’s entire

faculty body on a number of the points for which we could find comparative data. We collected

responses from an opportunistic sample of 55 instructors who agreed to complete the survey

form upon request. Some of the volunteer faculty members were instructors of students who

completed the survey in classroom sessions, and others were approached by research assistants

through personal contact or through solicitation of volunteers by email and departmental

announcements. To test how well the sample represented the PSU faculty, we compared the two

groups on the four variables for which we had common data: gender, tenure status, years at PSU,

and academic affiliation.

       The comparison of the sample with the whole PSU faculty at the beginning of Winter

Term 2007 appears in the following table:


            Table 2: Comparison of PSU Faculty in Winter 2007 with Survey Sample
                                          % PSU
                     Variable             W2007          % Sample        Sample/PSU
             Women                               48.1            54.7              1.14
             Men                                 43.8            45.3              1.03
            Tenure track
             Yes                                  45             83.7              1.86
             No                                   54             16.3              0.30
             Unknown                               1                0
            Years at PSU
             1 to 3                                28            20.4              0.73
             4 to 6                              19.1            20.4              1.07
             7 to 10                             19.5             8.2              0.42
             11 to 20                            24.2            38.8              1.60
             >20                                  9.1            12.2              1.34
            Academic affiliation
             Humanities                           14             14.5              1.04
             Science                              16             18.2              1.14
             Social Science                       14             14.5              1.04
             University Studies                    4              5.5              1.38
             Urban Public Affrs                   10             10.9              1.09
             School of Business                    8             14.5              1.81
             Engr/Computer Sci                    12                0              0.00
             Fine/Perform Arts                     6              1.8              0.30
             ESL                                   3             36.4             12.13

In the above table, the last column shows how well the data from the sample agree with data

compiled directly from the entire population of PSU faculty in Winter Term 2007. As in Table

1, an entry of 1.00 in the last column means that the sample estimate of the population is exactly

correct, entries <1 show that the sample underestimates the population value, and entries >1

show that the sample overestimates the population value. The number in the last column of the

first row of the table shows that the sample estimate of the proportion of women among the

faculty was 114% of the actual proportion of women faculty at PSU in Winter Term 2007, so the

sample overestimates the proportion of women faculty at PSU in Winter 2007 by 14%. The

corresponding entries in the second row of the table show that the sample likewise overestimates

the proportion of men faculty at PSU in Winter 2007 by 3%: this apparently anomalous result is


because the gender of 8% of the faculty could not be determined from their first names, the

criterion used to code gender in the database.

        As in Table 1, numbers in the last column that range between .95 and 1.05 may be

considered correct estimates, and numbers between .90 and 1.10 fair estimates. Numbers

that fall outside that range mean that the sample is not representative of the actual faculty, and

conclusions from the survey cannot be generalized to the University. Six of the 18 measures

show correct or fair estimates, but the two on gender count as only one, so only 5 of 17

comparisons (29%) show acceptable agreement between sample and population.
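The last-column index and the cutoffs described above can be sketched in a few lines. This is a minimal illustration using figures copied from Table 2, not part of the original analysis; the function and label names are ours.

```python
# Sketch: reproducing the last-column index from Tables 1 and 2.
# The ratio sample% / population% is 1.00 for a perfect estimate;
# the report treats 0.95-1.05 as "correct" and 0.90-1.10 as "fair".

def representation_ratio(sample_pct, population_pct):
    """Index of how well the sample estimates the population share."""
    return round(sample_pct / population_pct, 2)

def classify(ratio):
    """Apply the report's cutoffs for judging representativeness."""
    if 0.95 <= ratio <= 1.05:
        return "correct"
    if 0.90 <= ratio <= 1.10:
        return "fair"
    return "unrepresentative"

# Selected rows of Table 2: (sample %, PSU %).
table2 = {
    "Women": (54.7, 48.1),
    "Men": (45.3, 43.8),
    "Tenure track": (83.7, 45),
    "ESL": (36.4, 3),
}

for label, (sample, population) in table2.items():
    r = representation_ratio(sample, population)
    print(f"{label}: {r} ({classify(r)})")
```

Running this reproduces the last-column entries for those rows (1.14, 1.03, 1.86, 12.13) and shows that only the estimate for men falls in the "correct" band.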

        It is striking that the sample is so skewed on some of the variables: most of the faculty at

PSU are on fixed appointment, but 84% of the survey sample are on tenure track. The sample

also contains more faculty than expected by chance with long service at PSU. This certainly

says something about those who completed the survey. A similar observation is true about

academic affiliation: the sample contains more faculty than expected from UNST, SBA, and

ESL, but none from Engineering and Computer Science. The proportion of faculty reporting

affiliation with ESL programs appears to be especially over-represented, at twelve times the

expected number. However, this can be explained at least in part by the fact that our survey

permitted faculty to report multiple affiliations; in fact, thirteen of the fifty-five surveyed faculty

(24%) reported multiple affiliations. Eleven reported two affiliations, and two reported three

affiliations. Seven of the “other” affiliations are ESL (including both “triple” cases), which

clarifies the large overrepresentation of ESL found in the sample. In these cases, faculty with

some other primary academic identity may have reported ESL as an academic affiliation if they

ever worked with ESL students in their classrooms.
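The inflation produced by multi-select affiliation items can be sketched as follows. The five responses below are invented purely for illustration; only the mechanism (one respondent counted in several categories) reflects the survey.

```python
# Sketch: how multi-select affiliation items inflate category percentages.
# Responses are invented; each inner list is one respondent's affiliations.
from collections import Counter

responses = [
    ["Science"], ["Science", "ESL"], ["Humanities"],
    ["Social Science", "ESL"], ["Business", "ESL", "Humanities"],
]

counts = Counter(a for r in responses for a in r)
pct = {a: 100 * n / len(responses) for a, n in counts.items()}

# Because respondents can appear in several categories, the
# percentages sum to more than 100.
print(pct, sum(pct.values()))
```

Here ESL is reported by 60% of respondents even though no one lists it as a sole affiliation, which is the pattern the report describes for the ESL row of Table 2.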


       Our list of variables used for comparison is not long, and may not address important

differences. However, these comparisons do limit the degree to which we can generalize survey

results to all faculty at PSU. As with the student sample, though, the faculty responses we have

are interesting and useful in and of themselves, and when cross-referenced on points of

demography and with the responses of students on related questions. They also provide an

important sense of the kind of data we could obtain with a broader distribution of the survey.


       The preliminary survey results fall into three categories: results from the student survey,

results from the faculty survey, and a comparison of student and faculty responses to similar or

related questions that appeared on both questionnaires. These results shed significant light on

the experiences and attitudes toward writing professed by the surveyed students, as well as the

pedagogical approaches and professional writing experiences of the faculty. They also provide

some fascinating points of comparison between these two major constituencies of PSU’s writing

programs.


Student Survey

       The results of the student survey highlight several interesting aspects of the respondents’

histories with the English language, with writing instruction, and with their educational tracks at PSU.

The majority of this sample of students (71.6%) lived in the West Coast region of the United

States between the ages of 6 and 12 (item 8); another 11.8% of respondents said they resided in a

non-English-speaking country during those ages. A strong majority of students (88.8%) said

they were raised speaking Standard English, and 83.7% lived in families where English was


spoken “only” or “mainly” (item 9). On the other hand, 16.2% of respondents grew up in

families where English was spoken only “sometimes” or “never.”

       Of the students surveyed, 35.7% began as freshmen in the University Studies Program,

while 36.9% had transferred in with previous credits and joined UNST after enrollment (item 5).

On the other hand, 19.7% of students had transferred in and were following non-UNST

graduation requirements, and 7.7% were in the Liberal Studies or Honors program. Therefore,

72.6% of the respondents were satisfying UNST requirements and 27.4% were following some

other course of study.

       This range of graduation tracks among the students surveyed reflects the diversity of PSU

students’ educational backgrounds. When asked where they “learned a lot” about writing (item

14), with multiple responses permitted, 63.8% of respondents said high school, 61% said a four-

year college or university, 40.7% said Grades 1 to 8, 28.9% said at home, and 25.3% said they

learned a lot about writing at community college. Given that so many of the students were

fulfilling UNST graduation requirements, which do not include any mandatory composition

courses, we were surprised to find that 67.8% of respondents had taken some required writing

course in their careers (item 7), and 62.3% had taken at least one non-required writing course, as

well. Additionally, 32.1% of respondents had taken a Writing Intensive Course.

       The student survey also shed light on the respondents‟ attitudes and opinions about

writing in the academic context of PSU. While most students tended to believe strongly that

writing would be important to their careers (82.3% rated it “very much” or “extremely

important,” item 17), this tendency increased with age; that is, the older the student was, the

more important he or she judged writing to be. This is true to a significant degree (p<.001), with

a consistent increase in the judged importance of writing with each increment in age.
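The age trend reported here amounts to checking that mean judged importance rises with each age bracket. The sketch below uses invented ratings to illustrate the computation only; the actual data and the p < .001 test are the report's.

```python
# Sketch: the kind of check behind "importance increases with age."
# Ratings are invented; item-17 responses coded 1 ("not at all")
# through 5 ("extremely important"), grouped by age bracket.

def mean_rating(ratings):
    return sum(ratings) / len(ratings)

by_age = {
    "16-25": [3, 4, 4, 5, 3],
    "26-35": [4, 4, 5, 5, 4],
    "36+":   [5, 5, 4, 5, 5],
}

means = [mean_rating(r) for r in by_age.values()]
# A consistent increase with each age increment, as the report describes:
is_monotone = all(a < b for a, b in zip(means, means[1:]))
print(means, is_monotone)
```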


Interestingly, more than half of the students surveyed (55.2%) indicated that they “always” or

“often” used their writing skills outside of school (item 18); only 15.2% of the respondents said

they “never” or “seldom” wrote beyond their academic work. In fact, 31.2% of those surveyed

said that they “always” or “often” wrote for pleasure (item 19). Students appear to be doing a

surprising amount of extracurricular writing, perhaps reflecting the proliferation of electronic

written communication among peers.

       Students also had the opportunity to diagnose their own writing problem areas. When

asked what gave them the most trouble in their writing, with multiple answers permitted (item

23), students responded as follows in Table 3:

                            Table 3: Self-Identified Student Problem Areas
                Problem Area               “Always,” “Often,”       “Always” or “Often”
                                             or “Sometimes”
     Narrowing the topic                  69.1%                    29.5%
     Starting the draft                   64.9%                    32.7%
     Coming up with ideas                 61.4%                    24.7%
     Organizing the paper                 60.1%                    27.6%
     Revising the paper                   58.9%                    28.6%
     Editing the paper                    55.6%                    25.9%
     Putting thoughts on paper            51.1%                    23.6%
     Supporting points with examples      50.3%                    19.8%
     Spelling and grammar                 45.9%                    26.7%
     Understanding the assignment         32.1%                    7.8%
     Other                                20.5%                    11.1%

ESL students expressed more difficulty than non-ESL students in every category, and reported

significantly more trouble with “grammar and spelling” and “coming up with ideas.” It is

notable that a majority of students said they experience difficulty with eight of

the eleven categories. Tellingly, the top choices (narrowing the topic, starting the draft, and

coming up with ideas) involve the early, invention stages of writing. These difficulties reflect,

and perhaps partially explain, the classic student composition pattern of procrastination and last-


minute panic (nearly a third of the students said they always or often had trouble starting the

draft). Greater instructional support during the invention stages might give students more time to

focus on other problems that the majority expressed trouble with: organizing, revising, editing,

putting thoughts on paper, and finding examples to support their points.

       The survey also revealed which personal and institutional resources students turned to for

writing assistance. When asked where they went for help with their writing (item 15), with

multiple answers permitted, students responded as shown in Table 4:

                           Table 4: Where Students Go for Help with Writing
               Writing Resource        “Always,” “Often,” “Always” or “Often”
                                         or “Sometimes”
          Reference books              62%                    23.9%
          Friends or peers             60.6%                  24.8%
          Professors                   56.9%                  23.2%
          Parents                      36.7%                  16.6%
          Spouse or partner            31.1%                  15.6%
          Teaching assistants          24.8%                  7.1%
          Siblings                     20.2%                  7.5%
          Other                        18%                    6.7%
          University Studies mentors 15.1%                    3.5%
          Writing Center               14.7%                  5.4%

Another 16% of respondents said that they never get help with their writing.

       We found the large percentages of students seeking writing assistance from parents and

spouses or partners surprising. It turned out that younger students (those between the ages of 16

and 25) tended to go to parents for help with writing more than older students did. Older

students were significantly more likely to consult reference books or go to a spouse or partner for

help, instead. Unsurprisingly, ESL students were much more likely to seek help from the

Writing Center (43%) than from their parents (17%).

       When asked how adequately their education so far has prepared them for writing (item

16), 53% of students responded “very much” or “extremely” adequate, 44.5% said “somewhat”


or “moderately” adequate, and only 2.5% said “not at all” adequate. There was no significant

difference between UNST and non-UNST students. ESL students were significantly less

satisfied (p=.001) with their education in writing than non-ESL students, although 38% of ESL

respondents still rated that education either “very much” or “extremely adequate.” When asked

how much they considered their skills to have been improved by writing classes at PSU, 28.2%

of respondents said “very much” or “extremely improved,” 62.8% said “moderately” or

“somewhat improved” and 9% said “not at all improved.” No distinction was made in this

question between composition classes within the English Department and upper-division

Writing-Intensive Courses taught by faculty in other departments with English Department-

trained assistants.

        When asked how satisfied they were with their ability to write (item 12), 11.4% of

students said they were “extremely satisfied” and another 42.1% said that they were “very much

satisfied.” In other words, more than half (53.5%) of the students expressed a high degree of

satisfaction with their writing abilities. This opinion became significantly more positive (p < .001)

for students who were further along in their studies, from freshmen through post-baccalaureate

and graduate studies. Meanwhile, 12.6% of respondents indicated that they were only

“somewhat” (10.5%) or “not at all” (2.1%) satisfied with their writing ability, leaving over one-

third (33.7%) who rated their satisfaction right in the middle, indicating that they were

“moderately” satisfied with their ability to write. Of course, in the absence of writing samples, it

is impossible to know how accurately the students’ professed satisfaction reflects their actual

writing abilities, except by comparing them to the degree of faculty satisfaction with student

writing, as we will do below. A regression analysis of the survey data showed that students

tended to be more satisfied with their writing ability if: (a) they knew more of the steps and


stages experienced writers go through as they write (item 13), (b) they used their writing skills

outside of school often (item 18), (c) they believed that writing skill comes more from personal

effort than getting instruction (item 22), and (d) they took a required writing course (item 7), in

that order of predictive strength. It may seem odd that taking required writing courses and

knowing the “steps and stages” of writing occur alongside a conviction that instruction is less

important than personal effort. However, the order of the predicting variables is illuminating: it

is beneficial to have basic tools, but few things beat practice, personal dedication, flexibility and

inventiveness to achieve academic success. Perhaps students perceive that a combination of

basic instruction and self-reliance is what they need to be capable writers.

       Finally, the student survey gives us some insight into the varieties and amounts of writing

that students are being asked to do in their classes at PSU. When asked what kinds of writing

their undergraduate courses required, excluding writing courses (item 21), students responded as

follows in Table 5:

                        Table 5: What Kinds of Writing Students’ Courses Require
               Kind of Writing        “Always,” “Often,” “Always” or “Often”
                                        or “Sometimes”
             Short response paper    87.2%                   58.7%
             Research paper          81%                     43.4%
             Term paper              80.2%                   49.2%
             Essay exam              80.2%                   48.2%
             Short, in-class paper   58.1%                   26.9%
             Journal                 52.3%                   20.9%
             Other                   39.3%                   14.1%
             Laboratory report       38.3%                   20.6%

When asked how much writing was required on average in courses that were not writing-

instruction courses (item 20), 10.5% of respondents marked “more than 15 pages,” 29.1%

marked “10-15 pages,” 32.1% marked “6-10 pages,” and 36.9% marked “1-5 pages.” If students

understood this to mean the amount of writing they produced throughout the term for an average


class, this leaves 69% writing ten or fewer pages total for most of their courses each quarter.

Taken together, these results seem to indicate that most students are getting at least some

experience writing research papers and term papers within their disciplinary classes, although

less than half report having this experience often. Nearly as many students are frequently being

asked to write essay exams. It is unclear whether those essay exams are timed writing situations

or take-home exams; the former are generally considered by composition theorists to be highly

artificial writing situations that make poor measures of student writing ability. Comparatively

fewer students seem to be regularly engaging in informal, low-stakes “writing to learn” activities

such as journaling and in-class exercises.

Faculty Results

       As indicated in the section on the faculty sample, above, the demographic results of the

faculty survey showed our sample to be disproportionately on tenure track, with 84% marking

“yes” when asked whether they were tenure-track (item 3). The respondents also tended to have

spent many years at PSU (item 4): 51% indicated that they had spent eleven years or more at this

institution. The median age of respondents (item 2) was 50, and 55% were female (item 1); their

academic affiliations are outlined in Table 2, above. Among the faculty surveyed, nearly 78%

indicated that English was the “only” language spoken in their home growing up; another 15%

marked that English was “mainly” spoken, and 7% marked “sometimes.” No respondent

indicated that English was “never” spoken in his or her childhood home.

       The second category of questions posed to the faculty related to their personal

experiences with writing. These items provide some interesting insights into the writing training

and habits of academic professionals, the same people who are generating writing-based


assignments and providing some measure of writing instruction and assessment to students,

regardless of their disciplines; this is especially important knowledge given the diffuse model of

writing instruction at PSU. For instance, in order to design writing assignments that encourage

students to use the most effective process-oriented approaches to composition, faculty need to be

aware of how those processes operate. Item 7 on the faculty survey asked how much the

respondent knew about “the steps and stages that experienced writers go through as they write.”

While 40% of surveyed faculty indicated that they know “much” or “very much” about those

steps and stages, more than 38% marked that they only knew “some,” which suggests a degree of

uncertainty about the writing processes of “experienced writers” among more than a third of the

faculty respondents.

       When asked where they learned a lot about the writing process, with multiple responses

permitted (item 8), faculty indicated the following (Table 6):

                         Table 6: Where Faculty Learned about Writing
                            Site                    Affirmative
                            College or university   76%
                            Profession              67%
                            High school             66%
                            Grades 1 through 8      49%
                            Home                    24%
                            Community college       2%

The high rate of response for “college or university” and “profession” is unsurprising, given that

most respondents would have spent many years in graduate school, and that, with a median age

of 50, many respondents would have spent decades in the professional sphere.

       Of those surveyed, 26% indicated that they had taken non-required writing classes

while they were in school (item 15). Faculty members who said they had taken non-required

writing classes showed a trend toward doing significantly more writing outside of work.


However, there was no statistically significant relationship among faculty between taking non-

required writing classes in school and writing for pleasure, or knowing the steps and stages that

experienced writers go through.

       When asked how much writing their professional lives required (item 16), 11% of faculty

indicated that they wrote “some” or a “moderate” amount, and nearly 84% marked “a lot.”

Amusingly, almost 6% marked “too much”: a telling commentary on a minority faculty attitude

toward writing. The kinds of professional writing that faculty said they engaged in (item 17)

broke down into the categories illustrated in Table 7, below:

                        Table 7: What Kinds of Writing Faculty Produce
          Kind of Writing           “Always,” “Often,”          “Always” or “Often”
                                       or “Sometimes”
   Lectures and syllabi            100%                     96%
   Reviews or criticism            83%                      49%
   Grant proposals                 81%                      54%
   Research reports                76%                      48%
   Committee reports               74%                      45%
   Book chapters                   67%                      20%
   Articles for a general audience 61%                      24%
   Books                           42%                      12%
   Research logs or field reports  37%                      19%
   Editorials or opinion pieces    32%                      8%


      4% of respondents indicated that they “often” wrote in personal journals, with 19%

       marking “sometimes.” Nearly 82% indicated that they “seldom” or “never” kept a

       personal journal. No respondent marked “always” for this item.

      8% indicated that they “sometimes” wrote creatively, with 26% marking that they

       “seldom” did this, and 66% indicated that they “never” engaged in creative writing. No

       respondent marked “always” or “often” for this item.


      55% of respondents marked “always,” “often,” or “sometimes” for the category “other.”

       While we can only hazard guesses as to what faculty might include in this category, some

       possibilities might be correspondence (letters and email), web content (maintaining blogs,

       on-line bulletin boards, or websites), recommendation letters, and responses to student

       writing.


Clearly, professional academics engage in a wide variety of written tasks; writing plays a major

part in their roles as teachers, researchers, and participants in institutional life. Faculty responses

to Item 12, “How much do you really believe that good writing skills are useful in YOUR OWN

life or career,” reflect the importance of writing in their professional lives: nearly 80% of

respondents marked “extremely,” and another 18% marked “very much,” for a total of 98%

selecting the two highest degrees of usefulness.

        This emphasis on writing carries over into faculty members’ personal lives. Nearly 91%

indicated that they “sometimes,” “often,” or “always” used writing outside of work (item 13);

66% of respondents marked that they did this “often” or “always.” Sixty-four percent of faculty

“sometimes,” “often,” or “always” wrote for their own pleasure or satisfaction (item 14), with

20% indicating that they did so “often” or “always.” With this amount of professional and

personal writing going on, it is remarkable that more than a third of faculty expressed doubt

about their own knowledge of the “steps and stages” of experienced writers; this dissonance may

reflect a belief that their own writing processes are somehow idiosyncratic or different from

some codified, idealized process they believe other professional writers use.

       The faculty survey also included a number of questions designed to gauge the

pedagogical approaches that faculty from a range of disciplines bring to writing in the courses

they teach. This gets right to the heart of the issues surrounding the kind of diffused writing


instruction that PSU has worked to implement—if the responsibility for writing instruction is

shared across departments, then faculty need to be aware that they carry such a responsibility,

and need to be willing and prepared to take on that role. When asked to what degree they

believed it was their job as instructors to teach writing (item 29), 24% marked “extremely” or

“very much,” another 71% marked “moderately” (43%) or “somewhat” (28%), and 6% marked

“not at all.” This preference for the middle option is reflected in a related question, Item 19,

which asked, “In your opinion, is developing skill in writing the personal responsibility of your

students, or does it come mostly from their writing instruction?” Fifty-five percent of faculty

responded “about equal” to this question, with another 24% marking “mostly on own” and 20%

marking “mostly instruction.”

       Item 23, “Do you require your students to change their writing to suit their audience?”

was meant to determine whether faculty have an understanding of the rhetorical concept of

audience, and whether they are encouraging their students to consider audience in their writing in

order to foster rhetorical agility. Of those who responded to this item, 86% indicated that they

“always,” “often,” or “sometimes” require students to change their writing for audience (with

47% marking “always” or “often”). However, nearly 22% of surveyed faculty did not give any

response on this question. This could be an indication that they did not understand what was

meant by the term “audience,” which might reflect a gap in their theoretical knowledge of

writing instruction.

       The survey also inquired into the quality of feedback that faculty gave students with

regards to their writing. When asked whether they felt able to give helpful feedback to students

on their writing (item 21), 91% indicated that they were “always,” “often,” or “sometimes” able

to do so, with 55% marking “always” or “often.” When asked how much students made use of


that feedback (item 22), 30% marked “very much,” while another 56% marked “somewhat” or

“moderately.” Fifteen percent indicated that they didn’t know. This suggests that faculty tend to

have a high opinion of their ability to give helpful feedback, but that they do not feel that

students always make good use of that feedback. Such a gap might indicate that many faculty

members lack an awareness of what kinds of feedback on writing are useful or meaningful to

students.


       Faculty were also asked about where they sent students to get help on their writing, with

multiple responses permitted (item 10). Their responses were as follows in Table 8:

                   Table 8: Where Faculty Send Students for Help with Writing
                    Writing Resource             Faculty Referral
                    Friends or peers             86%
                    Teaching assistants          82%
                    Reference books              80%
                    Editors                      79%
                    Parents                      75%
                    Professors                   71%
                    Writing Courses              70%
                    University Studies mentors 66%
                    Writing Center               46%
                    Other                        85%

It is noteworthy that comparatively few faculty respondents referred students to the Writing

Center for assistance; this might reflect a lack of awareness among faculty of the institutional

resources available beyond their departments to help students with writing, and a need to further

publicize the Writing Center to faculty throughout the university.

       The final category of questions for faculty dealt with their assessment of student writing

at PSU, and how effectively they believe PSU’s writing programs address students’ needs.

When asked how much effort they felt students put into writing assignments (item 18), 55%

marked “moderate” or “a lot” (not one respondent marked “too much”). Forty-two percent felt


that students put “some” effort into their writing; less than 4% marked “none.” Faculty were

also asked to indicate which areas of writing gave students the most trouble, with multiple

answers permitted (item 20). Their responses are exhibited in Table 9, below:

                  Table 9: Faculty-Assessed Problem Areas in Student Writing
       Problem Area                       “Always,” “Often,”     “Always” or “Often”
                                          or “Sometimes”
       Starting the draft                 100%                   57%
       Organizing the paper               98%                    69%
       Editing the paper                  98%                    81%
       Revising the paper                 96%                    76%
       Proofreading the paper             96%                    74%
       Supporting points with examples    96%                    66%
       Narrowing the topic                92%                    50%
       Mechanics                          92%                    80%
       Coming up with ideas               86%                    26%
       Spelling                           82%                    51%
       Understanding the assignment       62%                    15%
       Other                              20%                    21.4%

Clearly, faculty members perceive that students often procrastinate on their writing assignments.

If their perceptions are accurate, that might help explain the trouble with organizing, revising,

editing, and proofreading that the faculty respondents report: students who wait until the last

minute to start writing have less time to rework and correct their papers. This tendency on the

part of students could be countered by designing assignments that require the submission of

multiple drafts.
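Because items 10 and 20 permitted multiple answers, the percentages in Tables 8 and 9 are per-option shares of respondents rather than shares of a fixed total, so a column can sum to well over 100%. A minimal sketch of that tabulation, using invented data rather than the survey's:

```python
# Tabulating a multiple-response survey item: each percentage is the share of
# respondents who selected that option, so the column may exceed 100% in total.
# The response sets below are hypothetical, not figures from this survey.
from collections import Counter

responses = [                     # one set of checked options per respondent
    {"editing", "mechanics"},
    {"editing"},
    {"mechanics", "spelling"},
    {"editing", "spelling"},
]

counts = Counter(opt for resp in responses for opt in resp)
percents = {opt: 100 * n / len(responses) for opt, n in counts.items()}

print(percents)                   # editing chosen by 3 of 4 -> 75.0
print(sum(percents.values()))     # 175.0: the column exceeds 100% by design
```

This is why, in Table 9, every row can approach 100% in the first column without implying that the categories partition the respondents.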

       The survey also asked faculty to rate their satisfaction with their students’ writing
abilities (item 9). More than 46% of respondents opted for the middle choice, “moderately,” and
another 37% went one notch lower, with “somewhat.” More than 9% indicated that they were
“not at all” satisfied; less than 4% marked “very much” satisfied, and no respondent marked
“extremely.” Faculty appear to be underwhelmed by student writing abilities at PSU; we
compare their rates of satisfaction with students’ self-assessment below.


       Faculty were also asked a series of questions about their experiences with students with
writing-related learning disabilities (items 24-26). While 72% of respondents indicated that they
had taught students whom they knew had learning disabilities, 55% of those who marked “yes”
said they did not know whether the students had ever sought assistance at PSU; another 33% said
that those students had sought assistance, 6% said the students did not need any assistance, and
6% said the students did not seek help. When asked whether learning-disabled students seeking
assistance had encountered any problems getting the help they needed, 64% of respondents said
they did not know, 17% said their students had encountered problems, and 15% said they had no
problems getting assistance. The degree to which faculty did not know whether their students
were seeking help, and whether they were getting the help they needed, seems to indicate a lack
of communication on this issue between students and faculty; perhaps students consider obtaining
assistance their own responsibility, rather than something they should trouble the professor with.

       Perhaps the most startling results were faculty assessments of the writing education that
students are getting at PSU. In response to item 11, which asked whether they felt their students’
formal schooling had given them adequate writing skills, 12% said “not at all,” 43% said
“somewhat,” and another 43% chose the middle option, “moderately.” Only 2% marked “very
much,” and no respondent marked “extremely.” When asked more specifically about their
satisfaction with writing instruction at PSU (item 27), a remarkable 36% marked “not at all.”
Another 32% said “somewhat,” and 24% marked “moderately.” Only 8% said they were “very
much” satisfied with writing instruction at PSU, and no respondent marked “extremely.” Despite
this low level of satisfaction with PSU’s writing instruction, when asked whether writing courses
or writing-intensive courses had improved student writing skills (item 28), 55% said they did not


know. Another 28% said their students’ skills were “extremely” or “very much” improved, 11%

marked “moderately,” and only 6% marked “somewhat” or “not at all.”

       Overall, the surveyed faculty members tended to assess their students’ writing abilities as
middling, and they seem to place a significant amount of blame on inadequacies in the writing
education students are receiving at PSU. Those who felt knowledgeable enough about the
quality of writing and WIC courses to respond to item 28 seemed to be reasonably satisfied with
how much those courses improved students’ skills; their broader dissatisfaction with writing
instruction at PSU might reflect their perception that students are not taking enough writing
classes, rather than a sense that the instruction being given in writing courses is poor.

Comparison of Student and Faculty Responses

       While the faculty and student surveys differ in many respects, based on the particular

kinds of knowledge we hoped to gain from the two constituencies of PSU’s Writing Programs,

some items were designed to yield the different perspectives of students and faculty on similar

issues. These comparable items fall into two categories: in some cases, the two groups are

surveyed on exactly the same issue; in other cases, the groups are surveyed on analogous or

complementary issues.

       An example of the first type of comparison is student item 12 and faculty item 9. Both

assess “satisfaction with student writing.” Student item 12 asks the respondent to rate

satisfaction with his or her own writing ability, and faculty item 9 asks the respondent to rate the

writing skills of PSU students in general, from an instructor‟s point of view. The combined data

from the two items give two different perspectives on student writing. Do students and faculty

agree or disagree on the quality of student writing? How accurately are students (in general)


gauging their own skill at academic writing, in comparison to their instructors’ general

expectations? Comparing data from these two items may afford some insight.

         An example of the second type of question is student item 13 and faculty item 7. Both

items ask the respondent how much s/he knows about the “steps or stages experienced writers go

through as they write.” This type of comparison measures the personal self-assessed states of

each kind of respondent (student or faculty). The purpose of this kind of comparison is generally

to see how well people in training (students) resemble the professionals they later become (for

example, faculty). We expect the training of an apprentice to be aimed at the skill levels of

journeyman and master. If that is also true of writing instruction, how much does schooling in

writing correspond to the skills and knowledge of later professional life? Do students get

training in proportion to the kinds of writing they must do after they graduate? This sort of

comparison, in conjunction with the community partner surveys (to be completed in the future),

could help us design more effective writing courses and curricula.

         In the first category of comparisons, student item 12 measured each respondent‟s

evaluation of his or her own ability to write, and faculty item 9 measured each instructor‟s

assessment of student writing at PSU in general. Ratings ranged from “not at all” satisfied (with

writing ability) to “extremely” satisfied. Results on these two items are compared in Chart 1, below.



                        Chart 1: Satisfaction with Student Writing Abilities
       [Bar chart comparing the percent of student and faculty responses at each
       satisfaction level, from “not at all” to “extremely.”]
       Students have a higher opinion of their own writing than the faculty do. We see it first in

the peaks of the two distributions. As noted above, the most common (modal) student

assessment is “very much” satisfied with writing ability, but the modal faculty assessment is

“moderately” satisfied. The percentage of student response on the low end of the scale (“not at

all” and “somewhat” satisfied) is very low, but that is where most of the other faculty responses

pile up. Finally, students picked “very much” satisfied at a rate of 42%, while less than 4% of

faculty made that assessment; likewise, 11% of students were “extremely” satisfied, compared to

none of the faculty.

       The fundamental disagreement of the two groups is moderated by a degree of like-

mindedness. We see it in the peak ratings, which are only one category apart: the faculty

respondents are not persuaded that student ability is “very much” satisfactory, but they do agree

it is “moderately” so at a rate of 46%, exceeding the student rating at that level by 12%. Both


groups tended to avoid the lowest rating of “not at all” satisfactory: only 30 students (2%) said so

of themselves, and they were joined by only five faculty “curmudgeons” (9%).

       How can we understand the disagreement between the two groups on this subject? By

the faculty‟s assessment, the students appear deluded, or at least complacent, about their abilities.

For their part, the students may feel underestimated by the faculty. Considering the different

perspectives of the groups may be helpful in understanding their different ratings. Students may

be evaluating their own writing with a less complex set of criteria (e.g. “Spell check says it looks

good!”) than faculty. There is also a generational factor here: as Mike Rose has pointed out,

college and university educators have been complaining about the upcoming generation’s writing

abilities with great regularity since 1841 (5-6). Furthermore, there are the different contexts of

the items on the distinct survey forms. On their item 12, the students give an evaluation of a

single person’s writing ability (their own); on their item 9, faculty respondents make an average

judgment of a group of people (all the students they have taught at PSU). So students judge a

single case and the faculty judge a group average; students make a subjective judgment

(“myself”) and the faculty a more objective one (“all students”). Both of these differences would

tend to favor a higher rating by students.

       Second is the important difference in the perspective of the two groups on the subject of

“student writing ability.” Faculty encounters with student writing are mainly in their classes, and

all faculty members are on their home ground in the courses they teach. All students in a class

must write to suit the teacher’s disciplinary standard. So, the context for judgment of all student

writing for any particular instructor is a single writing venue: history professors expect their

students to write like historians, and chemistry professors expect students to write like chemists.

Each faculty member has only one venue of student writing in which to make a judgment.


       But students go, and write, from course to course. In a single week, a student might write

a critical essay for an English course, a lab report for Chemistry, an ethnographic observation for

an Anthropology seminar, and a business plan for a Finance class. This kind of rapid, diverse

production of written assignments is what “writing performance” means to students, and it is

probably what they have in mind when rating their abilities as writers. We know students do this

because we asked them. Item 24 of the student survey posed the question: “In school, do you

change your writing to suit individual professors?” Ninety-two percent said they did, to some

degree, and 35% said that they did so either “always” or “often.”

       Some other data from the surveys seem consistent with this view of the student

perspective on writing and their self-ratings on item 12. As noted above, we find a significant

relation (p<.001) between the class standing of students and their degree of satisfaction with their

writing. As students advance from freshman to senior, satisfaction with their ability to write

steadily increases. This is consistent with increasing skill at dancing among writing contexts

(rhetorical agility), and perhaps also reflects more time spent in fewer disciplines as they

progress. Post-baccalaureate students suffer a loss of satisfaction, but students in graduate

programs continue the increase in satisfaction with their writing beyond the level of the senior

undergraduates. Graduate students have a double benefit: they have all the skills of experienced

undergraduates plus an even greater degree of specialization, and they are beginning to approach

the single-discipline writing conditions of most faculty members.
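The report does not name the significance test behind the p<.001 relation between class standing and satisfaction; for two categorical variables like these, a chi-square test of independence is one conventional choice. A sketch of that computation, using an invented contingency table rather than the survey data:

```python
# Chi-square test of independence for two categorical variables, e.g. class
# standing vs. satisfaction with writing ability. Implemented from scratch so
# the arithmetic is visible; the contingency table is invented for illustration.
def chi_square(table):
    """Return (statistic, degrees of freedom) for a contingency table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (observed - expected) ** 2 / expected
    df = (len(table) - 1) * (len(table[0]) - 1)
    return stat, df

# Rows: freshman, junior, senior; columns: low, middle, high satisfaction.
table = [
    [30, 50, 20],
    [20, 50, 30],
    [10, 40, 50],
]
stat, df = chi_square(table)
print(round(stat, 2), df)   # 25.43 with df = 4
# The critical value for p = .001 at df = 4 is 18.467, so a pattern like this
# one would be significant at that level.
print(stat > 18.467)        # True
```

With real survey counts in place of the invented table, the same computation would yield the kind of class-standing-by-satisfaction significance reported above.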

       We made another comparison of this type (student self-assessment versus faculty

assessment of general student performance) between student item 23, “What is it about writing

that gives you the most trouble?” and faculty item 20, “What is it about writing that gives your

students the most trouble?” The results of this comparison, displayed in the graph below, show


faculty nearly doubling some of the self-reported rates at which students “always,” “often,” or

“sometimes” struggle with several aspects of writing. While 69.1% of students reported

having trouble narrowing their topics, 92% of faculty reported their students struggling in this

area. Although 55.6% of students said they had trouble editing, 98% of faculty respondents

indicated that students had problems in this area. Every single faculty respondent said that his or

her students had trouble starting their drafts, compared to 64.9% of students who self-reported

this problem.

                         Chart 2: What Gives Students Trouble with Writing
       [Bar chart comparing the percent of students and faculty reporting each problem
       area (narrowing the topic, starting the draft, getting ideas, organizing the paper,
       thinking of examples, understanding the assignment) “always,” “often,” or “sometimes.”]

       We could explain this gap between faculty and student responses in several ways. First

of all, faculty may, as readers, be able to see problems in student writing more clearly than the

students themselves. As experienced writers and readers in their fields, faculty members are

better equipped to see where students are missing the mark. However, an additional explanation

could lie in how students and faculty read the question. It may be another instance of the

student‟s single self-reference, versus the faculty‟s view of a population. A student might read

that item and decide, “I seldom have trouble editing my paper.” However, a faculty person may


look at the question and think, “Yes! There are always students who have trouble editing in my

class!” The faculty person is more likely to think in terms of the piles of error-riddled papers

that she suffers through at the end of each quarter, rather than the one slaved-over research paper

that the individual student remembers putting so much effort into. Almost every kind of writing

trouble that a student can have is likely to show up at least “sometimes” in the faculty member’s

grading pile, which would explain why more than 90% of faculty report seeing most of the

problem areas “always,” “often,” or “sometimes.”

       Another comparison we made of this sort is between student item 15, “Where do you go

to get help with writing?” and faculty item 10, “Where do you send students to get help with

their writing?” Again, as illustrated in the graph below, faculty report sending students to a

variety of outside resources “sometimes,” “often,” or “always” at much higher rates than the

students themselves report using those resources. In addition to the surprisingly low rates of

referral to and use of the Writing Center, as noted in previous sections, it appears that students

are much more likely to take advice that refers them to familiar or non-institutional sources of

help (e.g. friends or peers, reference books, or the faculty person they are already talking to);

they seem to be less inclined to follow recommendations that they consult a Teaching Assistant,

University Studies Mentor, or the Writing Center.


                                  Chart 3: Where students go for help
       [Bar chart comparing the percent of students who use, and faculty who recommend,
       each writing resource (Writing Center, UNST mentors, reference books, and others)
       “always,” “often,” or “sometimes.”]

         Again, however, these differences between faculty advice and student action can be

partially explained by the fact that faculty are able to offer a variety of suggestions to a number
of different students, and can thus say that they “sometimes” recommend each of the possible
resources, while students have a limited number of writing projects with which they need help,
and are likely to return many times to just a few resources that have proven helpful in the past.
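One way to make the referral-versus-use mismatch concrete is to subtract student use rates from faculty referral rates, resource by resource. In the sketch below the faculty figures are taken from Table 8, but the student-use figures are placeholders, since this section does not tabulate the student side:

```python
# Gap between faculty referral rates and student self-reported use of each
# writing resource. Faculty rates come from Table 8 of the report; the
# student_use values are hypothetical placeholders for illustration only.
faculty_refer = {"Writing Center": 46, "Friends or peers": 86,
                 "Reference books": 80, "Teaching assistants": 82}
student_use = {"Writing Center": 20, "Friends or peers": 75,
               "Reference books": 60, "Teaching assistants": 30}  # invented

gaps = {r: faculty_refer[r] - student_use[r] for r in faculty_refer}
for resource, gap in sorted(gaps.items(), key=lambda kv: -kv[1]):
    print(f"{resource}: {gap:+d} points")
```

Sorting by gap size would highlight which referrals students follow least, the pattern the paragraph above describes for institutional resources such as teaching assistants and the Writing Center.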

         The second kind of comparison we can draw between the two data sets is between the

nearly identical knowledge-based and attitudinal questions that were asked of both students and

faculty on their surveys. One of these comparisons looks at student and faculty knowledge of the

“steps and stages,” or writing process, that experienced writers go through. This comparison

yields some interesting results, shown in the chart below.


          Chart 4: Knowledge of the Steps and Stages Experienced Writers Go Through
       [Bar chart comparing the percent of student and faculty responses at each level of
       self-assessed knowledge, from “very much” to “nothing.”]

While 5% more faculty respondents than students were willing to mark that they knew “very

much” about the steps and stages, 9% more student respondents than faculty said they knew

“much,” and slightly more students (40.9%) than faculty (38.2%) indicated that they knew

“some” about the steps and stages of writing. Somewhat more faculty (18.3%) than students

(12.3%) admitted to knowing “very little,” and 3.6% of faculty, compared to 2.8% of students,

said they knew “nothing” about the writing process.

       There are a few possible explanations for this difference. One is that students are being

exposed to more process-oriented writing instruction, while faculty, with a median age of 50,

were likely to have been educated under older composition pedagogies that did not teach from

the current process models. If that is the case, then perhaps students do actually know more

about the steps and stages that experienced writers go through. On the other hand, it is possible

that this is a “conscious incompetence” versus “unconscious incompetence” issue. Maybe


faculty, as professional writers, are more aware of their own challenges and difficulties as

writers, and feel that they do not have as strong a handle on the writing process as they could,

while students think to themselves, “Sure, I know the steps and stages that writers go through.
First you drink three cups of coffee, then you stay up all night writing, and last you hit ‘print.’”

If this is the case, students’ higher rates of positive response could be a result of a different

understanding of the terminology, rather than a stronger grasp of the writing process.

        Another of these same-question comparisons, one which sheds some light on teacher-

student expectations of their own roles in writing instruction, is between their responses to the

question, “In your opinion, is developing skill in writing the personal responsibility of…students,

or does it come mostly from…writing instruction?” (student item 22, faculty item 19). The

responses that students and faculty gave for this question, shown in the graph below, are

remarkably similar.

              Chart 5: Responsibility for the Development of Student Writing Ability
       [Bar chart comparing the percent of student and faculty responses across the
       responsibility scale, including “entirely own,” “mostly own,” and “about equal.”]

       Although a handful of students believed that they should be able to rely entirely on
instruction, a sentiment that no faculty respondents affirmed, by and large students and faculty
seem to have a reasonably shared understanding that they are equal partners in students’ writing
development, with proportionate percentages of faculty and students leaning either slightly more
toward instruction or toward individual effort.

       The final comparison between nearly identical questions, one that reflects the
apprenticeship concept of writing in academia, looks at where faculty and students say they
learned “a lot” about writing (student item 14, faculty item 8). This comparison, charted below,
comes out about as one would expect between a group of highly educated professionals and a
collection of students in the midst of their higher education.

                     Chart 6: Where Respondents Learned a Lot about Writing
       [Bar chart comparing the percent of student and faculty responses for each source
       of writing education: high school, grades 1 to 8, at home, community college, and
       writing workshops.]


While somewhat less than half of both groups indicated that they learned a lot about writing in

grades one through eight, and both groups hovered around 65% of respondents marking “high

school,” 15% more of the faculty group selected “college/university,” most likely because those

who hold PhDs have spent a lot more time at such institutions than college freshmen. More than

67% of faculty, all of whom are professionals, marked “profession,” which was not a response

option for students. This was the second-highest response of faculty, and may show the

importance of learning “rhetorical agility” while a student. Interestingly, more than 25% of

students indicated that they had learned a lot about writing in community college, while only one

faculty respondent said the same. This may be because faculty tended, more than students, to
begin their college careers at four-year institutions; or perhaps, given generational shifts in
college-going driven by rising tuition costs, comparatively few of the PhDs now in their 50s
started out at community colleges.

        It is worth noting that our comparison of students with faculty is a comparison limited to

a single profession: academic life. This comparison indicates that the university experience

becomes an increasingly important part of an individual‟s writing education and experience the

longer an individual stays within that system. It also shows that many students will continue to

learn a great deal about writing in their post-graduation professional lives, but the nature of that

progress, and the particular skills acquired and refined, may differ greatly among various

professional careers, both within and outside of the academy.


        These initial findings demonstrate the rich understanding that our research team is

gaining regarding student writing at PSU. As the OIRP data and writing sample assessments


come in, the free response questions are tabulated, the community partner surveys distributed

and compiled, and as several graduate students complete their projected thesis work built on

these data, we will be able to add even more to the emerging picture of students’ experiences of

our institution‟s writing instruction. Resources permitting, this will allow us to move into Stage

2, Design and Test of Experimental Writing Courses and/or Programs, in order to better prepare

PSU students to assume effective positions in the community, and let their knowledge of writing

serve the city.


                                            Appendix A

                           Student Survey Questions by Category

Demographic data:

       Gender (item 1).

       Age (item 2).

       At what point the student entered the PSU curriculum and what graduation track they

        followed once they came in (item 5).

       Academic major or area of interest (item 6).

       Whether the student speaks English as a first language (item 3).

       The geographic regions where the student lived from the ages of six to fifteen (item 8).

       Whether and what kinds of English the student spoke while growing up (item 9).

       Whether the student is now comfortable writing Standard English (this question, item 11,

        is prefaced with an explanation of the term “Standard English,” which emphasizes that

        “many kinds of English are spoken in the U.S., and no one is better than another”).

       Whether the student has any learning disabilities that affect his or her ability to write,

        such as dyslexia, attention deficit disorder, hearing or visual impairments, or brain injury

        (item 25).

       Whether the student has taken any college composition or Writing Intensive courses

        (item 7).

Understanding of Writing Process:

       How much the student knows about the steps and stages experienced writers go through
        as they write (item 13).


       Whether they change their academic writing to suit individual professors (item 24)

Attitudes toward Writing:

       How satisfied the student is with his ability to write (item 12).

        The degree to which the student feels that his education has given him “adequate writing

        skills” (item 16).

       How often the student writes for his own “pleasure or satisfaction” (item 19) and how

        often the student uses writing skills outside of school “for any reason, personal or work

        related” (item 18).

       Whether the student considers “developing skill in writing” something that he does on

        his own or learns mostly through writing instruction (item 22).

       The degree to which the student “really” believes that good writing skills will be useful

        in his life or career (item 17).

History of Writing Instruction:

       Where the student learned “a lot” about writing (item 14).

       What areas of writing give the student “the most trouble” (item 23).

       Where the student goes for help with writing (item 15).

       How much writing the student is typically required to do in classes that are not writing-

        instruction courses (item 20).

       What kinds of writing most undergraduate courses “other than writing courses” require

        (item 21).


   The degree to which writing or WIC courses at PSU have improved his writing skills

    (item 27).

   For those students who indicate that they have learning disabilities, “how easy it is to get

    the help [they] need to complete writing tasks” (item 26).



                                           Appendix C

                           Faculty Survey Questions by Category

Demographic Data:

      Gender (item 1).

      Age (item 2).

      Tenure status (item 3).

      Number of years at PSU (item 4).

      Faculty academic affiliations (item 5).

      The degree to which English was spoken in the respondent’s household while growing up

       (item 6).

Personal History and Attitudes toward Writing:

      Where the faculty person learned what she knows about the writing process (item 8).

      Whether she ever enrolled in a non-required writing class as a student (item 15).

      How often she writes for pleasure or satisfaction (item 14) and how often she uses

       writing skills outside of work for any reason (item 13).

      How much writing her professional life requires (item 16) and how often a variety of

       specific writing tasks “occurs or is in progress” during her work (item 17).

      How much the faculty person “really” believes that good writing skills are useful in her

       own life or career (item 12).

Attitudes toward Student Writing and Writing Instruction at PSU:

      How satisfied the faculty person is with her students’ writing abilities (item 9).

      Whether her students’ schooling has given them adequate writing skills so far (item 11).


   How much effort she feels her students put into their writing (item 18).

   What aspect of writing gives her students the most trouble (item 20).

   Where she sends students for help with writing (item 10).

   Whether she is able to give helpful feedback to students on their writing (item 21) and

    how much students make use of that feedback (item 22).

   Whether she thinks developing writing skills is the personal responsibility of students or

    comes mostly from writing instruction (item 19).

   Whether the faculty person considers it part of her job to provide writing instruction in

    courses (item 29).

   Whether the faculty person has ever had students with learning disabilities (item 24),

    whether those students received assistance needed to complete writing tasks (item 25),

    and whether they experienced problems when they sought that assistance (item 26).

    How satisfied the faculty person is with students’ writing instruction at PSU (item 27)

    and whether she feels that writing or Writing Intensive courses at PSU have improved

    student writing skills (item 28).


                                            Appendix E

                           Orientation to the Student Writing Survey

Have ready: supply of survey forms; supply of No. 2 pencils; a box to collect completed forms
Write on the chalkboard: the CRN for the class

Hello, my name is ____________________________. Thanks for your help.

You must use a No. 2 pencil to mark your answers. Anybody need one? [distribute pencils as
needed]
Before you begin, please do two things.

1. Do you see the number marked Quest ID in the upper-left corner of page one? Please write
that same number in the space indicated on page 4 now. [hold up form and point out the
places] This number does not identify you; it's to keep the two parts of the survey together.

2. Now look at the space marked CRN on the upper-left corner of page one. [hold up form and
point] Please write the number on the board in that space [CRN for the class]. Then fill in the
bubble in each column to match the number at the top.

This survey is designed to be completely anonymous, and there are no right or wrong answers to
any item. We want to know what you have to say.

When you are done, please turn your form face down until we ask for it. Thanks for helping.
Please begin.

                                    Other remarks, as needed.

Please look at the upper-right corner of the first page to see how to fill in your answers. You
must fill in the bubble completely for the answer you choose. Please don't check it, or X it, or
circle it, because our scanner cannot read those marks.

If you don't see an answer to an item that is exactly right for you, pick the answer that is best or
closest to the right one. If you feel you want to mark two answers, force yourself to pick one or
the other, even if you feel they are equally good.

Some items allow you to fill in more than one answer. They will always say so, if you can do
that. Otherwise, please mark only one answer for each item.

In items like number 15 on page 2 (please look there now), you should mark the one best answer
on each line. Items 21 and 23 are similar to this one.

Do you have any questions before we begin?


OK. You have as much time as you need to finish. [Or consult instructor]

                         ===== To collect forms when finished =====
Please put your completed forms into the box on the table, face down. Thanks for your help.



