By Alice G. Brand
ERIC Clearinghouse on Reading, English, and Communication

In high school, multiple-choice and short-answer tests are often used as writing exams.
For several years now, however, writing specialists have agreed that, when it
comes to testing, nothing gets at writing better than writing itself. This testing
calls for evaluating writing samples. First-draft writing to a set topic is closer
to the real writing--the kind students are apt to do in college, in graduate school,
or on the job--than any multiple-choice question could possibly be. Moreover,
a fairly dependable picture of a student's writing skills can be drawn from writing
completed in anywhere from 20 or 30 minutes (as on the SAT) to two hours. On the principle that "forewarned is forearmed," this digest reviews writing assessment--what it means and how it works--at a selected number of colleges and universities in the United States.

OUTCOMES ASSESSMENT

Outcomes assessment means measuring an individual's writing ability after writing has been
studied formally. At the college level such measures of writing skill serve important
functions: admission, placement, course equivalency, junior status, exit from
courses, or barriers to graduation. What this means is that much of what happens
academically to a student from admission to graduation can actually depend on
writing performance.
An effective test can certify not only a student's competence as a freshman writer, but also his or her writing ability as a transfer student or a rising junior--a second-semester sophomore who has completed between 45 and 54 college credits and is entering the junior year and declaring a major. Once a student is into a major, writing tests can confirm skill in that major and/or certify the writing of a graduating senior. They may even determine whether or not a student graduates.
PLACEMENT TESTING
Placement testing in writing is the first form of outcomes assessment colleges
undertake when a student arrives on campus. Because it tells the college how well
the student has learned to write in high school, it helps identify the appropriate
college writing course for him or her. Placement testing also allows for more
refined placement once the freshman writing class begins.
For example, in large systems like City University of New York, a freshman writing
assessment may consist simply of a 50-minute placement test requiring students
to respond to one of several topics. The University of Georgia uses a single essay for placement at three skill levels, as do technical institutes such as
the New Jersey Institute of Technology and California Polytechnic Institute. Students
entering Cal Poly take a two-part examination consisting of an objective test
as well as a placement essay.
UCLA's entry level exam includes a reading passage followed by a choice of two
questions, one based closely on the text and the other on personal experience.
SUNY (State University of New York) Geneseo's testing program, on the other hand,
works exclusively with less-prepared students. Administered by a Language Skills
Center, placement consists of a one-hour essay that screens non-native speakers of English for an ESL course or for the one-semester required writing-process course.

Placement at many community colleges is similar: a writing sample is collected for placement into a course for less-prepared students or into a standard one- or two-course sequence.
Along with an exam, colleges often factor into a placement decision the student's
high school average and/or the SAT verbal score and the Test of Standard Written
English (TSWE). Such a system is currently in place at several SUNY schools, including
SUNY Brockport (Brand, 1992).

PLACEMENT AND THE WRITING PROCESS
Writing a single essay seems easy enough. But what students may not be aware of
is that more and more colleges are looking not only at what is written but at how
it is written. Over the last two decades interest in the written product has widened
to the writing process. Writing specialists believe that better work is more likely
to be produced when the student is engaged in an effective process. This means
that a student's writing develops over time: time to draft ideas, receive feedback,
revise repeatedly--even scrapping parts of a piece and rewriting others--and then edit and proofread.
And this in turn has meant a shift in college testing. If a student has been taught
a writing process approach--now standard in many high school English programs--a
timed, single-session essay test alone is not a valid method for evaluating writing
performance. The problem is simply that what is tested is not what is presumably
taught.

The single writing sample has come under attack because it captures only the first draft, the start of the writing process. Test essays should reflect the conditions under which a student has been instructed. This means first drafts or prewriting, multiple revisions, and incubation or rest time--at least in classes that emphasize process
(Sanders and Littlefield, 1975).
How well a student engages in the writing process can be approximated by (1) building
revision into the single essay exam, or (2) evaluating the work by portfolio.
Let's take the first option: the single placement essay.
When a student writes a placement essay as a freshman, he or she may find some
writing process activities incorporated into the exam situation. Some colleges
announce their topic (for example, the environment) to freshmen several weeks
in advance and invite them to choose an environmental problem, describe its impact, and then propose a solution. They may draw on several patterns
of development (e.g., description, narration, analysis) to make their point. And
they are given time to prepare, that is, prewrite. Students may be given several
topics ahead of time.
The University of Missouri-St. Louis simulates the revision process under test
conditions by allowing less able writers to discuss a topic and prewrite on the
first day of the exam. The essays are then collected and returned to them the
next day in order to "complete" the written product.
Placement at SUNY Stony Brook is more complicated because it attempts to compress
the writing process into freshman orientation. During summer orientation Stony
Brook conducts a "regular" English composition class of two hours for all incoming
freshmen. Instead of merely completing the usual impromptu essay, students participate
in free writing, discussion, draft writing, and peer response groups. Students
also write about how they feel about this experience, thus reflecting on the writing
process itself. The final draft is completed and used for placement in the appropriate
course.

END-OF-COURSE EVALUATION

At
the other end of the college composition course the work must be evaluated. Students
are certainly familiar with a final essay exam. And to some extent that is still
happening at college. How a student wrote at the beginning of a semester can be
compared with how well he or she writes after a course of study in writing.
The principal goal of an outcomes assessment in writing is to answer these two
questions: Did the writing course actually help the student write better? If it
did, can that growth be measured? The attempt to measure the gains a student made
from a particular course may be called value-added assessment
(White, 1990). Improved scores between pre- and posttests are expected to
show the value a course has had for the student. If a writing course has brought
about gains, then those gains should be observable, appearing in behaviors that
can be measured. Although many improvements in an individual's writing process
take place in the mind and are therefore not observable, the changes inferred from them become the value added to the individual by the course
(White, 1990).
At Chicago State University students take the English Qualifying Essay Exam, which
follows a two-course sequence. Despite highly individualized classes, the University
of Southern California uses a two-hour uniform final exam consisting of a single
question based on a small group of readings. Students receive the topic in advance
but not the question, and can discuss the topic with their instructors and fellow
students. SUNY Brockport does not test skills outcomes as such but provides a
two-part final exam as an option for instructors. Some colleges provide a final
or exit exam only for students writing below their expected level.
But an increasingly popular option for judging a student's writing performance
that takes into account the writing process is the portfolio. In portfolio assessment
several representative pieces written over a given course of study are evaluated.
Depending on how the requirements are designed, the portfolio generally brings
together several pieces of writing collected at intervals over the semester. The
portfolio may even include early drafts. The great advantage of the portfolio
approach is that it emphasizes writing that occurs over time--the process--not
simply the product.
While placement at SUNY Stony Brook is based on a single essay and conducted during freshman orientation, the university operates one of the most venerable portfolio programs
in the country, having replaced the final exam with the portfolio as an outcome
measure in freshman composition a decade ago. Many schools have adapted Stony
Brook's model for their own needs, including Miami University of Ohio, which now even accepts portfolios with admission. (For the use of portfolios in elementary and secondary schools, see Farr, 1991.)

A BRIEF REMINDER
For college-bound students, here is a "quick-and-dirty" list of what writing specialists
look for to determine writing skill:

- fluency, or the amount written
- quality and quantity of detail
- complexity of ideas
- organization
- correctness
Writing is considered a good indication of how well a person thinks. For most
people, there is no shortcut to effective thinking on paper. It is the person, the words, and the labor between both.

REFERENCES
Brand, Alice G. (1992). A Director of Composition Talks to Students about College Writing Assessment. ED 340 038.

Elbow, Peter (1973). Writing Without Teachers. New York: Oxford University Press. ED 078 431.

Farr, Roger (1991). "Portfolios: Assessment in Language Arts." ERIC Digest. Bloomington, IN: ERIC Clearinghouse on Reading and Communication Skills. ED 334 603.

Sanders, Sara E., and John Littlefield (1975). "Perhaps Test Essays Can Reflect Significant Improvement in Freshman Composition: Report on a Successful Attempt." Research in the Teaching of English, 9(2), 145-53. EJ 135 865.

White, Edward M. (1990). "Language and Reality in Writing Assessment." College Composition and Communication, 41(2), 187-200. EJ 414 690.

This publication was prepared (Digest #73, EDO-CS-92-06, July 1992) with funding from the U.S. Department of Education under contract number RI88062001, and published by the ERIC Clearinghouse on Reading, English and Communication.