WPA Program Assessment

According to Ed White, “it is both reasonable and responsible for administrators and public officials to inquire into the effectiveness of the writing instruction programs in which students are enrolled, particularly at two crucial transfer points: between high school and college, and between graduation and career development” (12). As writing becomes an increasingly essential skill in the university, administrators will continue to support assessment, and American University’s writing program recently received a university grant to conduct a more in-depth assessment of its program.

Lacey explains that the process used to center on “collecting a random sample of papers and then evaluating them according to standards.” From there they would “choose a line from our program rubric (such as ‘research support’) and then evaluate the papers for that area.” A committee called “The Assessment Committee” (aptly named) would manage the rubric-based assessment work and then “prepare a report for the university’s assessment requirements and for the program.” This would help Lacey and the PD committee organize faculty development sessions, so the assessment seems to have been mostly formative.

I’m not sure the Assessment Committee still exists, as Lacey didn’t mention it as part of faculty development/service; however, they may have ramped it up for this new research. She does note that, before this new plan, she realized the committee had done a great deal of reading (if not too much) for assessment purposes before she was WPA. This illustrates what Gallagher calls a need for “embrac[ing] writing assessment leadership” on Lacey’s part (32).

She explains the new method as follows:

“We are assessing one learning outcome per year (this is easier since we revised our learning outcomes), and we’re focusing on self-efficacy.  We did research into self-efficacy and learned that there’s correlation with ability, including in writing.  This year’s outcome centers on metacognition… We did a survey of all students in WRTG 100 in the first couple weeks of the semester (and got a very high response rate), with a survey tool that measured self-efficacy in metacognition (we relied on Bandura [1982] to develop the self-efficacy scale).”

As a quick note, it’s helpful to define Bandura’s theory of self-efficacy here. According to Bandura, self-improvement and self-efficacy evolve from one’s ability to “execute courses of action required to deal with prospective situations.” This idea of being able to learn, act accordingly, and execute actions on one’s own is a valuable “writing to learn” type of skill.

“Then, around the same time, we conducted some student focus groups to dig more deeply into the self-efficacy measures on the survey.  This semester, at the end of the semester, we’ll administer the same survey to all students in WRTG 101, and we’ll invite the same students back to the focus groups.  And we’ll collect some writing samples from those students.  (All of this is IRB approved, too.)”

While they don’t have a lot of data at this point (they just started this method), Lacey seems really excited about this research, and I’m excited for them!

She notes that all departments and programs are required to do assessment, but because the Writing Studies Program takes assessment more seriously than other departments, the Office of Institutional Research (OIR) gives it a bit more freedom. She says, “We don’t have narrow requirements for how to carry it out, and we exceed any requirements that exist. So, for example, we asked for a year ‘off’ from the usual assessment reports and processes so we could pilot this new method, and that request was approved easily.”

This assessment, which analyzes the first-year writing courses against programmatic learning outcomes, illustrates Anson’s point that “course-level assessment can provide valuable information for broader program-related goals” (11), and that’s precisely what AU’s writing program aims to do: enhance the overall program’s goals and outcomes, especially their consistency and fluency.

Lacey looks forward to this assessment, and it sounds like it will be an interesting one to track as the semester goes on. I believe it will take them a bit longer to finish collecting the data, though.

Works Cited

Anson, Chris M. “Assessment in Action: A Mobius Tale.” Assessment in Technical and Professional Communication, edited by Margaret Hundleby and Jo Allen, Baywood, 2009, pp. 3–15.

Bandura, Albert. “Self-Efficacy Mechanism in Human Agency.” American Psychologist, vol. 37, no. 2, 1982, pp. 122–147, https://doi.org/10.1037/0003-066X.37.2.122.

Gallagher, Chris W. “What Do WPAs Need to Know about Writing Assessment? An Immodest Proposal.” WPA: The Journal of the Council of Writing Program Administrators, vol. 33, no. 1/2, 2009, pp. 29–45.

White, Edward M., et al. “Trends.” Very Like a Whale: The Assessment of Writing Programs, Utah State UP, 2015, pp. 12–36.

Wootton, Lacey. “Report 6: Program Assessment.” Received by Bethany Van Scooter, 19–28 Feb. 2020.