Praxis: A Writing Center Journal • Vol. 15, No. 1 (2017)
"AT FIRST IT WAS ANNOYING": RESULTS FROM REQUIRING WRITERS IN DEVELOPMENTAL COURSES TO VISIT THE WRITING CENTER
Wendy Pfrenger
Kent State University
wpfrenge@kent.edu
Rachael N. Blasiman
Kent State University
rvolokho@kent.edu
James Winter
Kent State University
jwinter2@kent.edu
Abstract
From fall 2013 through spring 2016, 1,301 students were enrolled in composition courses on our regional campus, with 349 of these enrolled in developmental courses. Our writing center serves approximately 14% of the campus population every year, a number we have seen increase since two professors in 2013-2014 began requiring students in their developmental courses to attend a minimum number of writing sessions each semester. The D-F-withdrawal rates for developmental writing courses on our campus have averaged 32.7% over the past six semesters, an improvement over previous years. Analysis of data from a study of student outcomes during this period demonstrates that requiring frequent visits to the writing center in early semesters has a statistically significant, positive relationship with increased passing rates and with subsequent voluntary usage of the writing center.
Our writing center is located within a multi-subject learning center on a regional campus of an open-enrollment, land-grant public institution. The campus mission is to bring higher education to an age-diverse, rural student population in an Appalachian corner of Ohio, yet the fact that 81% of our students require remediation of some kind as entering freshmen suggests the challenges and contradictions inherent in such a mission. Despite the favorite adage of writing centers, “better writers, not better writing” (North 438), our writing center staff are acutely aware of the high stakes for writers coming through our doors. Their need to succeed is driven not only by a desire for higher learning, but also by immediate economic necessity. We hope that we are helping . . . but until now we have not attempted to examine quantitatively whether what we observe anecdotally is, in fact, suggested by the data. In this article, we will describe a quantitative analysis of the progress of students enrolled in developmental writing courses at our university, comparing the results of those required to attend writing center sessions to those of students who were not. In particular, we are interested not only in student perception of the impact of writing center sessions, as previous studies of required visits have documented (Bishop; Clark; Gordon), but also in examining the progress of students as they become better academic writers.
In a meta-analysis of writing center research, Casey Jones described the multidimensional challenges of assessing writing center efficacy. Jones observes that “the most intractable problem involved in trying to evaluate the relative effectiveness of writing centers lies in the personal, subjective nature of the phenomenon under scrutiny and the problems that this raises in terms of evaluation criteria,” adding that measuring “gains” in writing ability, in addition to requiring subjective judgment, also becomes problematic in light of the writing center’s pedagogical emphasis on the writer rather than the writing product (8). Along with struggling with what, exactly, to assess (writer or product), earlier attempts at quantitative assessment of efficacy have also struggled with small sample sizes and with controlling variables within the inherently subjective context of writing instruction, tutoring, and evaluation. Such studies have frequently concluded that writing center interventions show no statistically significant impact on global improvement in writing (Sadlon; Sutton and Arnold), although in at least one study (Sadlon) students demonstrated improved ability in the area of higher-order concerns (ideas and organization), and in others students demonstrated improvement in mechanical skills (grammar, punctuation) and/or test-taking ability (David and Bubolz; Naugie; Wills).
The use of grades as metrics, given this context, may be problematic for obvious reasons. On our campus, professors participate every semester in a portfolio review process for all students enrolled in the first of our developmental writing sections. In this way, we ensure that, for the most part, our department offers a degree of consistency in rubrics and standards between course sections. Completing a course successfully with one professor means that a student is very likely prepared for the standards of the next course, regardless of who teaches it. However, the problems described above with regard to measuring “improvement” rather than performative competence seemed to us a few steps removed from the central question we are asking: are the writers we work with better able to achieve their goals in the university as a result of visiting the writing center? Given this context, focusing on the writer rather than the writing product seemed to us an approach more likely to yield insight. This focus required examining evidence of the writer’s adaptation to the university, including successful completion of required writing courses (not simply grades), shifts in the writer’s attitude and approach to academic writing, and sustained pursuit of progress as a writer through repeated visits to the writing center.
As we are interested not only in the institution’s goals of educating students but also in the students’ goals of making progress toward degree attainment, we have focused on evidence of progress within the university in addition to shifts in attitude and behavior in a large sample of students, using university data as well as our center’s records. For this study, we have determined grades and course completion to be appropriate indicators of success along with repeated (voluntary) use of writing center appointments. A 2006 Department of Education study determined that students who have successfully completed college composition credits by the end of their second year of college graduate from four-year institutions at a rate of 82.3% as opposed to 53.4% of students who have not completed composition credits in that time (Adelman 63). Such statistics document the power of forward progress while leaving the mechanism of such progress frustratingly obscure. Rather than determining efficacy through the evaluation of writing sessions or ethnographic case studies, this study then examines efficacy in terms of improved academic outcomes in composition courses during the semester when the student used the writing center and increased usage of the writing center in later semesters when not required to do so. Although we cannot account with these measures entirely for those students who, for example, improve as writers but are still unsuccessful in completing a course, at the very least, together these two measures correspond to our beliefs that better writers tend to 1) continue to seek help when they need it, whether or not they are required to do so, and 2) succeed academically at higher rates than those who do not adopt the practice of engaging in extended, collaborative composition processes.
The limitations of such assessment criteria, however, are obvious when one considers that those most motivated to visit the writing center are likely to be engaged in other efforts that impact their success. To some extent, these limitations may be mitigated by the fact that on our campus some sections of developmental composition courses require students to attend writing center sessions while others do not; thus, we may compare the outcomes and behaviors of students who are required to make use of the writing center with those of students not required to make use of the writing center. We will briefly discuss later the comparative merits of required sessions as opposed to voluntarily initiated sessions, but for now it is sufficient to say that this degree of control was useful in establishing groups of students with roughly comparable academic and motivational characteristics. It is also limiting that we cannot account with these methods for those students who improve as writers but do not succeed in progressing through their required courses; however, as both universities and students place high value on forward progress, we feel it is appropriate to use passing grades as an indicator of success in this study.
From the fall of 2013 through spring of 2016, 1,301 students were enrolled in composition courses on our campus, with 349 of these being writers in developmental courses. Our writing center serves approximately 14% of the campus population every year, a number we have seen increase since two professors in 2013-2014 began requiring students in their developmental courses to attend a minimum number of writing sessions each semester. The D-F-withdraw (DFW) rates for developmental writing courses on our campus have averaged 32.7% over the past six semesters, an improvement over previous years (Kent State University Research Planning and Institutional Effectiveness).
Writers in developmental courses are those who have been identified by the university as in need of remediation through standardized testing or through qualitative evaluation of their previous writing. These writers are placed in a two-semester version of our first college writing course (College Writing I) in the belief that a slower pace will allow for greater attention to building the foundational reading and writing skills needed for their university education. This two-semester course sequence (“College Writing I Stretch”) is followed by our second-tier course, which is taken by all students intending to complete a four-year degree. As many have suggested, the category of basic or developmental writing is necessarily difficult to define, though in general writers whose work has been categorized in this way may share common challenges with reading, interpreting, and writing in an academic context due to linguistic, educational, and cultural differences (Lunsford; Matsuda; Smith; Sternglass). On our campus, these writers typically lack adequate college preparation in their educational background and linguistically may be distinguished by local dialectal variation.
We approached this study with the belief that there is a significant and positive correlation between frequent visits to the writing center in early semesters and greater academic success for students enrolled in developmental writing courses. We hypothesized that
students who were required to go to the writing center in their first semester would be more likely to visit the center in subsequent semesters;
students with lower baseline ACT Reading scores (as compared to baseline ACT English scores) would be less likely to earn a passing grade in their first writing course;
students enrolled in courses with required writing center sessions would be more likely to pass these courses than students who were enrolled in courses without such requirements;
students who were required to visit the writing center and did so would be more likely to pass the course than students who were required but did not visit the writing center;
there would be no difference in course outcome between students who did not visit the writing center, regardless of whether they were required to visit or not required.
methodology
Quantitative data regarding student baseline standardized testing scores, course placement, course outcomes, and writing center usage were collected for fall and spring semesters of the 2013-2016 academic years at the Kent State University Salem campus. During this period, 1,301 individual students were enrolled in college writing courses, including developmental sections of our first-tier writing course, regular sections of first-tier writing, and our second-tier writing course. Of the 1,301 participants in this study, 16% self-identified as first-generation students. Students taking developmental writing classes accounted for 26.7% of the sample in the first semester of the study and 29.9% in the second semester. Only 15.8% of the participants were required to visit the writing center in the first semester of data collection, while 24.0% of participants were required in the second semester.
The data were analyzed in order to determine the relationship between writing center usage and successful course completion, with success defined as a passing grade of C- or better. Other outcomes (Ds, Fs, and withdrawals) were defined as not passing in order to be consistent with the university’s institutional research data. The study examined variables including required/not-required writing center usage, baseline test scores, and the frequency of writing center sessions in a given semester.
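For readers who wish to see how this coding works in practice, the following minimal sketch collapses letter-grade outcomes into the pass/not-pass categories described above. The data frame, column names, and example records are hypothetical illustrations, not our institutional data.

```python
import pandas as pd

# Hypothetical records; column names and values are illustrative only.
records = pd.DataFrame({
    "student_id": [101, 102, 103, 104, 105],
    "grade": ["A", "C-", "D", "W", "SF"],   # W = withdrew, SF = stopped attending
    "wc_visits": [3, 1, 0, 0, 2],           # writing center visits that semester
    "required": [True, True, False, True, False],
})

# Grades of C- or better count as passing; Ds, Fs, withdrawals, and
# stopped/never-attended outcomes are coded as not passing, consistent with
# the university's institutional research categories.
passing_grades = {"A", "A-", "B+", "B", "B-", "C+", "C", "C-"}
records["passed"] = records["grade"].isin(passing_grades)

print(records[["student_id", "grade", "passed"]])
```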
Qualitative data in the form of student interviews were also collected in the 2016-2017 academic year. Interviews were conducted face-to-face and recorded, and students were guaranteed anonymity if desired. Interview participants were selected on the basis of participation in at least one writing center session. In all, 10 students were interviewed for this study. The interview used open-ended questions as well as Likert-scale ratings. Due to the small sample size, we have not referenced the Likert-scale ratings for this study. The interview script may be found in the Appendix.
results
General Descriptive Information
Table 1 (see Appendix) summarizes course outcomes for the first two semesters of writing classes for students registered. Data from successive semesters are not included in this table due to low sample size; that is, few students registered for more than two semesters of writing. Students with a grade of C or better received a P (passing) grade. We also report grades of D and F, withdrawals (W), students who stopped attending (SF), and students who registered for the course but never attended (NF). As seen in Table 1, students were more likely to pass their second-semester writing course than their first (79.6% versus 69.4%) and less likely to withdraw (8.5% versus 14%).
Table 2 (see Appendix) provides data on the number of visits to the writing center over the course of four successive semesters of writing classes. Visits per student were lowest for the first semester, peaked in the second semester, then decreased from that point. Means are low due to the large number of students who did not utilize the writing center at all.
Hypothesis Testing: Whole Sample
We hypothesized that students who were required to go to the writing center in their first semester would be more likely to visit the center in subsequent semesters. To test this hypothesis, we used a one-way analysis of variance (ANOVA). Our independent variable was required/not-required to visit the writing center during the first semester, and the number of visits in the second, third, and fourth semesters were used as dependent variables. We found that students who were required to visit the writing center in their first semester were more likely to visit in their second-semester course (F [1, 414] = 154.27, p < .001), more likely to visit the writing center in their third-semester course (F [1, 59] = 10.9, p = .002), but not more likely to visit in their fourth-semester course (F [1, 9] = 0.22, p = .65). However, the fourth semester suffers from a very small sample size of only 10 students. Taking these results together, we find strong support for the hypothesis that students who are required to visit the writing center in their first semester are more likely to visit in subsequent semesters.
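As a concrete illustration of the analysis reported above, the sketch below runs a one-way ANOVA comparing second-semester visit counts between students who were and were not required to attend in their first semester. The data are invented stand-ins for our records; only the form of the test matches what we report.

```python
import pandas as pd
from scipy import stats

# Invented example data: second-semester visit counts, grouped by whether the
# student's first-semester course required writing center attendance.
df = pd.DataFrame({
    "required_first_sem": [True] * 6 + [False] * 6,
    "visits_second_sem": [4, 2, 5, 3, 0, 6, 1, 0, 0, 2, 0, 1],
})

required = df.loc[df["required_first_sem"], "visits_second_sem"]
not_required = df.loc[~df["required_first_sem"], "visits_second_sem"]

# A one-way ANOVA with a two-level factor yields F and p values of the kind
# reported in the text (e.g., F [1, 414] = 154.27, p < .001).
f_stat, p_value = stats.f_oneway(required, not_required)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
```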
We also hypothesized that students with lower baseline ACT Reading scores (as compared to baseline ACT English scores) would be less likely to earn a passing grade in their first writing course. Over half of the students in this study took the ACT, and the remaining students completed an in-house placement test. For those students who did take the ACT (N = 795), the sample mean for Reading was 20.75 (SD = 4.79) and the sample mean for English was 19.08 (SD = 4.66). We used a one-way ANOVA to compare ACT scores between students who passed their first writing course and students who did not pass. Both the ACT English score and the ACT Reading score significantly predicted passing rate (English: F [1, 794] = 22.58, p < .001; Reading: F [1, 794] = 8.23, p = .004). As expected, ACT English was a slightly stronger predictor of pass rate than ACT Reading. Figures 3 and 4 (see Appendix) show the difference in ACT scores between students who passed and did not pass their first writing course.
We also used a one-way ANOVA to compare the number of visits to the writing center between students who passed their first writing class and those who did not. As seen in Figure 5 (see Appendix), students who passed their first writing course visited the writing center at a higher rate than students who did not pass, although this difference was not statistically significant when using the entire sample, which included students not in developmental courses (F [1, 1298] = 2.87, p = .09).
Hypothesis Testing: Developmental Writing Courses
In addition to whole sample analysis, we also examined students who were only enrolled in developmental writing courses during their first two semesters (N = 348). Although the whole sample analysis did not reach significance, we found that students enrolled in developmental writing courses who passed their first- and second-semester courses made significantly more visits to the writing center compared to students who did not pass (F [1, 347] = 10.86, p = .001). Figure 6 shows the results of students’ first-semester outcomes, Figure 7 collapses outcomes into pass/fail, and Figure 8 displays second-semester outcomes (see Appendix).
Not all instructors who taught developmental writing courses required their students to visit the writing center. We hypothesized that students enrolled in courses in which the writing center was required would be more likely to pass the course than students who were enrolled in courses in which the writing center was not required; that is, that requiring students to visit the writing center would positively impact course outcomes. However, this hypothesis was only partially supported. Using a chi-squared analysis, we found that students who were required to visit in their first-semester course were no more likely to pass than students who were enrolled in courses in which going to the writing center was not required (χ2 [350] = 0.13, p = .72). However, we found that students who were required to visit the writing center were more likely to pass their second-semester writing course (χ2 [194] = 13.22, p < .001).
We then further concentrated our sample to only examine students enrolled in developmental writing courses that required visits to the writing center (i.e., we eliminated students enrolled in courses in which the writing center was not required). Of this subset of students (N = 194), 77 students visited the writing center at least once in the semester and 117 did not. We hypothesized that students who were required to visit the writing center and did so would be more likely to pass the course than students who were required but did not visit the writing center. This hypothesis was supported. As predicted, students who were required to visit the writing center and did so were significantly more likely to pass the course than students who were required but did not visit the writing center (χ2 [194] = 10.54, p = .001). Only 48% of students who did not use the writing center when required passed the course while 71% of students who used the writing center when required received passing grades. In addition to simple pass/fail, we also examined specific grade outcomes. Students who were required to use the writing center and did so were not only more likely to pass the course, but less likely to withdraw or stop attending the course (χ2 [194] = 13.78, p = .008).
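For readers who want to see the shape of this test, the sketch below reconstructs an approximate 2x2 contingency table from the percentages reported above (roughly 71% of the 77 required students who visited passed, versus 48% of the 117 who did not visit) and runs a chi-squared test of independence. The cell counts are rounded reconstructions for illustration, not the study's exact table.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Approximate cell counts reconstructed from the percentages reported above;
# rounded for illustration, not the study's exact contingency table.
#                 passed   did not pass
table = np.array([[55, 22],    # required and visited at least once (n = 77)
                  [56, 61]])   # required but never visited (n = 117)

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-squared = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
```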
Finally, we hypothesized that there would be no difference in course outcome between students who did not visit the writing center, regardless of whether they were required to visit or not required. Of students enrolled in the developmental writing courses in the first semester, 255 students made no visits to the writing center. We examined the course outcomes of students who were required to visit the writing center but did not versus students who were not required to visit the writing center and did not. We expected to find no difference in pass/fail rate between these two groups, and this hypothesis was supported for the pass/fail rate (χ2 [225] = 1.32, p = .25).
discussion
To summarize the major findings of this study:
Students required to visit the writing center in their first semester of taking a writing class were more likely to visit the writing center in future semesters.
The ACT English and ACT Reading both predicted passing rates for the first semester of taking a writing class, with English being a slightly stronger predictor than Reading.
Students who passed their first writing class visited the writing center more than students who failed. This was true for the entire sample and for students enrolled in developmental writing courses.
For students in developmental writing courses, requiring students to visit the writing center did not predict first semester pass rate, but did improve their second semester outcomes.
For students in developmental writing courses, students who were required to visit and did so were more likely to pass than students who were required to do so and did not visit. There was no difference between students who were required and did not attend writing center sessions and those who were not required and did not.
A lower percentage of students had to retake their second-semester course. Conversely, our rate of retention in second-semester writing courses was higher, and the “stopped attending” failures were almost cut in half.
Overall, these results suggest a significant and positive relationship between requiring visits to the writing center in early semesters (the first two semesters of our campus’ writing sequence) and greater academic success for students enrolled in developmental writing courses throughout their time in the sequence. A key point of emphasis here is that merely requiring students to visit does not inevitably result in greater success; some students will of course fail to visit whether or not they are required to use the writing center. However, requiring visits does lead students to visit more often, pass at higher rates, and return to use the writing center in later semesters, even when they are no longer required to do so.
Our findings show that ACT English and ACT Reading scores both predicted passing rates for the first-semester writing class, with English being a slightly stronger predictor than Reading. This finding stands in contrast to some other studies, which found that ACT English and Reading scores were not reliably predictive of student passing rates (Hassel and Giordano; Belfield and Crosta). However, a 2012 report in the ACT Research Report Series (Radunzel and Noble) produced results consistent with ours. Students who “hit” their Reading benchmark were more likely to achieve a B average GPA in their first year of college (9), while grade outcomes for those who did not reach the benchmark (a score of 18-21 or lower) were in turn twenty percent lower (18). As with several similar studies using ACT and SAT scores,¹ the 2012 findings indicate that the lower the test scores, the lower the rates of satisfactory course completion and graduation (48). For economic and personal reasons, our students cannot afford a lengthy, years-long path to graduation or, perhaps more drastically, the prospect of not completing a degree at all. To avoid this, they need to learn the writing and reading skills required to pass the courses of our writing sequence in the allotted time. Perhaps our writing center aids such progress because our tutors are trained to focus on the higher-order writing concerns emphasized in the latter half of our campus writing sequence, namely in College Writing I Stretch and College Writing II. The predictive power of ACT scores and their use in writing course placement for entering freshmen leads us to recommend that instructors require multiple visits for all students placed in developmental writing courses.
We suspect that repeated visits not only helped students become better writers through instrumental support, but also allowed students to feel more comfortable in the writing center itself. We interpret these results to show, in part, that continual utilization of the writing center by students in developmental writing courses shifts educational attitudes and behaviors in advantageous ways. Although required visits may present challenges both practical and pedagogical, as Jaclyn Wells has pointed out (2016), the research strongly suggests that the student-perceived benefits of required tutoring outweigh these challenges; still, we must implement required sessions with nuance in order to achieve the benefits suggested in the data above. During interviews, one student, who was required to attend the writing center only once, spoke of the process of her shift in attitude: “I was required to go,” she said, “but . . . I found it useful. So, I kept coming not only because I was required, but it gave me . . . direction on what to fix and if I was on the right track.” This observation voices a student perspective that we believe is demonstrated in the data above as well as in previous studies (Bishop; Gordon): requiring students to attend can often result in subsequent voluntary attendance, and in fact 81% of students in Barbara Lynn Gordon’s study recommended that instructors require future classes of students to attend writing center sessions (157). Although Wendy Bishop has noted the tendency of students to regard required sessions as deserving some form of “recompense” (37) and Gordon acknowledges the logistical as well as pedagogical challenges inherent in required visits (159-161), the consensus among RAD (replicable, aggregable, data-supported) studies of writing centers appears to confirm a positive student reaction to required visits.
As students become more comfortable with the writing center, they also adopt a more sophisticated understanding of writing and of themselves as writers, due to sustained interaction with writing tutors. Several other students surveyed spoke of the writing center as a place for “guidance,” especially noting how it either supplemented course lessons or provided benefits the students found lacking in the classroom. Student 2 told us: “Last semester, I wasn’t really getting much feedback. So [visiting the writing center helped] me to know what to do . . . It was [sic] guidance.” Establishing the writing center as a comfortable place, one of facilitation instead of punishment, where students can have agency over their work and their time in a way unlike the classroom, lays the groundwork for building better student attitudes and the development of writing skills needed for future classes in our campus’ writing sequence. Student 1 commented: “I was kind of annoyed at first, but then I realized it was helping.” Other students similarly noted that being required led them to do something they had initially found either unappealing or impractical, and that they were likely to continue using the writing center as a resource in future courses. Brian Huot has suggested that good writers tend to be able to evaluate writing well, and that instruction that produces good writers “requires that students and teachers connect the ability to assess with the necessity to revise, creating a motivation for revision that is often difficult for students” (174). We interpret responses like those above, as well as repeated, voluntary visits to the writing center, to be evidence of students responding to the combination of informal evaluative conversations with tutors and more formal evaluative responses from their instructors. They are becoming better at assessing writing and thus better at, and more engaged in, revision.
Beyond a general sense of goodwill towards tutoring sessions, two other students mentioned specific writing skills in their responses. Such responses offer insight into how the writers themselves link their progress as writers with their evolving (and portable) skills:
Student 4: I think even through College Writing 2 that they should require you to [go to] the Learning Center because when they’re not forced to I feel like they still wait [until] the last minute and their writing isn’t as good. I have . . . friends . . . that have a lot of trouble and if they come in once and don’t get the help they want they don’t tend to go back and they might fail or they might not do as well, so I think it’s very helpful for them to be required to do it.
Note that this supports the notion that one visit to the writing center is not enough. Again, the students who showed the greatest success had multiple required writing center visits and were therefore more inclined to use the center in future semesters and even to recommend it to their peers. Shifts in attitude and behavior toward education, of course, take time. To this end, the student response above raises the question of whether instructors should require writing center visits beyond the first-tier courses. As our results indicate, students who visited the writing center during their second-tier course were more likely to pass than those who did not. In our second-tier writing course, students not only have to read more complex texts, but they must also complete extensive research assignments. Although research writing is not necessarily new to them, these assignments, given their length and instructor expectations, can feel overwhelming and even unfamiliar. As indicated above, students feel that although tutors give them helpful advice about specific aspects of their work, most significant of all is the feeling of a writing support system, a place for guidance and even refuge, a safe space to discover, as David Bartholomae describes, “that the errors in [their] writing fall into patterns, that those patterns have meaning in the context of [their] own individual struggle[s] with composing, and that they are not, therefore, evidence of confusion or a general lack of competence” (86).
conclusion
A 2010 survey of remediation in California’s extensive community college network offers two key recommendations: 1) that there should be greater uniformity of practice among remedial instructors within institutions and 2) that there should be greater focus on providing individualized support for these students (Grubb). The study concludes by observing, “One of the enduring problems in remedial classes, therefore, is how to impose adequate demands on students while simultaneously providing the moral and academic support so that they will continue their education” (13). On our campus, support is paramount to achieving successful educational outcomes. As previous studies of required appointments have noted, the primary reason given by most students for not using the writing center is a lack of time (Bishop; Clark; Gordon). Many of our students work part- or full-time jobs while juggling the challenges of school, finding childcare, paying bills, et cetera. It is the usual regional-campus story. Many of our students feel as if they have been left out of the national conversation for, in some cases, several generations, making exposure to the unfamiliar university setting emotionally and socially fraught. In addition, older, nontraditional students sometimes have to make a technological leap simply to keep up with their coursework. Writing center appointments do provide them with space to learn and practice writing skills, but they also allow our students to see that college itself is not a set of rigged hoops through which they must jump. Requiring appointments puts them in extended contact with other students with similar backgrounds who can model skills and translate goals, make those goals seem less imposing, and assist writing clients in finding their sense of agency. With an understanding of the goals and pedagogical approaches taken in these courses and with a sense of agency cultivated by peer tutoring, these students are better positioned to succeed in the university.
Further study is needed, particularly with regard to whether and how writing center usage impacts persistence and degree completion for these students; a more nuanced examination of how motivation and agency shift over time for students who are required to attend writing center visits would also be useful. The finding that requiring students to participate in writing sessions can have a positive and significant impact on their development as writers also has implications for tutor training, which must be taken into consideration. How can we prepare tutors to shift the dynamic that results when a student fulfills a requirement rather than actively seeks assistance? Which narratives do we construct in collaboration with these students about their progress and agency as writers? We would suggest that requiring students enrolled in developmental writing courses to take advantage of the support available in writing centers improves the likelihood that they will meet the rigorous demands of academic writing courses while also, perhaps contrary to expectation, strengthening their sense of agency. Particularly on small campuses like ours, a minor shift in practice may have magnified, visible effects on campus climate and improved retention rates. Most significantly, it allows us to better fulfill the promise of our open-enrollment mission, extending higher education to a community in need of greater access and opportunity.
notes
1. See Allen et al.; DeAngelo et al.; Sackett et al.
works cited
Adelman, Clifford. The Toolbox Revisited: Paths to Degree Completion From High School Through College. U.S. Department of Education, 2006. ERIC. http://files.eric.ed.gov/fulltext/ED490195.pdf
Allen, Jeff, et al. “Third-Year College Retention and Transfer: Effects of Academic Performance, Motivation, and Social Connectedness.” Research in Higher Education, vol. 49, no. 7, 2007, pp. 647-664. ERIC.
Bartholomae, David. “Teaching Basic Writing: An Alternative to Basic Skills.” Journal of Basic Writing, vol. 2, no. 2, 1979, pp. 85-109. ERIC. https://wac.colostate.edu/jbw/v2n2/bartholomae.pdf
Belfield, Clive, and Peter M. Crosta. “Predicting Success in College: The Importance of Placement Tests and High School Transcripts.” Working Paper. Community College Research Center. Columbia U, Feb. 2012. ERIC. http://www.eric.ed.gov/contentdelivery/servlet/ERICServlet?accno=ED529827
Bishop, Wendy. “Bringing Writers to the Center: Some Survey Results, Surmises, and Suggestions.” The Writing Center Journal, vol. 10, no. 2, 1990, pp. 31-44. The Writing Centers Research Project. http://casebuilder.rhet.ualr.edu/wcrp/publications/wcj/wcj10.2/10.2_Bishop.pdf
Clark, Irene Lurkis. “Leading the Horse: The Writing Center and Required Visits.” The Writing Center Journal, vol. 5/6, no. 2/1, pp. 31-34. The Writing Centers Research Project. http://casebuilder.rhet.ualr.edu/wcrp/publications/wcj/wcj5.2_6.1/wcj5.2_6.1_clark.pdf
David, Carol, and Thomas Bubolz. “Evaluating Students’ Achievements in a Writing Center.” The Writing Lab Newsletter, vol. 9, no. 8, Apr. 1985, pp. 10-14.
DeAngelo, Linda, et al. Completing College: Assessing Graduation Rates at Four-Year Institutions. Higher Education Research Institute at UCLA, 2011. https://heri.ucla.edu/DARCU/CompletingCollege2011.pdf
Gordon, Barbara Lynn. “Requiring First-Year Writing Classes to Visit the Writing Center: Bad Attitudes or Positive Results?” Teaching English in the Two-year College, vol. 36, no. 2, Dec. 2008, pp. 154-163.
Grubb, W. Norton. “The Quandaries of Basic Skills in Community Colleges: Views from the Classroom.” NCPR Developmental Education Conference, 23-24 Sep. 2010, New York, NY, National Center for Postsecondary Research, Sep. 2010. ERIC. http://files.eric.ed.gov/fulltext/ED533875.pdf
Hassel, Holly, and Joanne Giordano. "The Blurry Borders of College Writing: Remediation and the Assessment of Student Readiness." College English, vol. 78, no. 1, 2015, pp. 56-80. National Council of Teachers of English. http://www.ncte.org/library/NCTEFiles/Resources/Journals/CE/0781-sep2015/CE0781Blurry.pdf
Huot, Brian. “Toward a New Discourse of Assessment for the College Writing Classroom.” College English, vol. 65, no. 2, Nov. 2002, pp. 163-180. doi: 10.2307/3250761
Jones, Casey. “The Relationship Between Writing Centers and Improvement in Writing Ability: An Assessment of the Literature.” American Journal of Education, vol. 122, no. 1, Sep. 2015, pp. 3-20. Education Full Text.
Kent State University Research Planning and Institutional Effectiveness [RPIE]. Grade Distribution Report Fall 2013 ENG 01001, ENG 01002, ENG 11011, Salem Campus. Data File. Kent State U, 2016.
---. Grade Distribution Report Spring 2014 ENG 01001, ENG 01002, ENG 11011, Salem Campus. Data File. Kent State U, 2016.
---. Grade Distribution Report Fall 2014 ENG 01001, ENG 01002, ENG 11011, Salem Campus. Data File. Kent State U, 2016.
---. Grade Distribution Report Spring 2015 ENG 01001, ENG 01002, ENG 11011, Salem Campus. Data File. Kent State U, 2016.
---. Grade Distribution Report Fall 2015 ENG 01001, ENG 01002, ENG 11011, Salem Campus. Data File. Kent State U, 2016.
---. Grade Distribution Report Spring 2016 ENG 01001, ENG 01002, ENG 11011, Salem Campus. Data File. Kent State U, 2016.
Lunsford, Andrea. “Cognitive Development and the Basic Writer.” College English, vol. 41, no. 1, 1979, pp. 38-46. JSTOR. doi: 10.2307/376358
Matsuda, Paul Kei. “Basic Writing and Second Language Writers: Toward an Inclusive Definition.” Journal of Basic Writing, vol. 22, no. 2, 2003, pp. 67-89. ERIC.
Naugie, Helen. “How Georgia Tech’s Lab Prepares Students for the Georgia Mandated Proficiency Exam.” The Writing Lab Newsletter, vol. 5, no. 4, 1980, pp. 5-6.
North, Stephen M. “The Idea of a Writing Center.” College English, vol. 46, no. 5, Sep. 1984, pp. 433-446. JSTOR. doi: 10.2307/377047
Radunzel, Justine, and Julie Noble. Predicting Long-Term College Success through Degree Completion Using ACT[R] Composite Score, ACT Benchmarks, and High School Grade Point Average. ACT Research Report Series, 2012. ERIC. http://files.eric.ed.gov/fulltext/ED542027.pdf
Robinson, Heather M. “Writing Center Philosophy and the End of Basic Writing: Motivation at the Site of Remediation and Discovery.” Journal of Basic Writing, vol. 28, no. 2, Fall 2009, pp. 70-92. ERIC. http://files.eric.ed.gov/fulltext/EJ877256.pdf
Sackett, Paul R., et al. “Does Socioeconomic Status Explain the Relationship Between Admissions Tests and Post-Secondary Academic Performance?” Psychological Bulletin, vol. 135, no. 1, Jan. 2009, pp. 1-22. doi: 10.1037/A0013978
Sadlon, John. “The Effect of a Skills Center upon the Writing Improvement of Freshman Composition Students.” The Writing Lab Newsletter, vol. 5, no. 3, 1980, pp. 1-3.
Smith, Cheryl Hogue. “‘Diving in Deeper’: Bringing Basic Writers’ Thinking to the Surface.” Journal of Adolescent and Adult Literacy, vol. 53, no. 8, May 2010, pp. 668-676. JSTOR. http://www.jstor.org/stable/25653927
Sternglass, Marilyn. “The Need for Conceptualizing at All Levels of Writing Instruction.” Journal of Basic Writing, vol. 8, no. 2, 1989, pp. 87-98.
Sutton, Doris G., and Daniel S. Arnold. “The Effects of Two Methods of Compensatory Freshman English.” Research in the Teaching of English, vol. 8, no. 2, Summer 1974, pp. 241-249. JSTOR. http://www.jstor.org/stable/40170528
Wells, Jaclyn. “Why We Resist ‘Leading the Horse’: Required Tutoring, RAD Research, and Our Writing Center Ideals.” The Writing Center Journal, vol. 25, no. 2, Spring/Summer 2016, pp. 87-114.
Wills, Linda Bannister. “Competency Exam Performance and the Writing Center.” The Writing Lab Newsletter, vol. 8, no. 10, 1984, pp. 1-4.
appendix
Note: Students who passed their first-semester writing course had significantly higher ACT Reading scores than students who failed their first-semester writing course.
Note: Students who passed their first-semester writing course had significantly higher ACT English scores than students who failed. The difference in scores between students who passed and students who failed is greater for ACT English than ACT Reading.
Note: Students who passed their first-semester writing course visited the Writing Center at a higher rate than students who failed their first-semester writing course.
Note: Students who passed their first-semester developmental writing course visited the Writing Center significantly more often than students who stopped attending (SF), received a D or F, withdrew from the course (W), and never attended (NF).
Note: This figure collapses the results of Figure 6 to pass/fail.
Note: Students who passed their second-semester developmental writing course visited the Writing Center significantly more often than students who received a grade of D or F, withdrew from the course (W), stopped attending (SF), or never attended (NF).
Interview Script
PI/Co-Investigator: Would you be willing to participate in a brief interview regarding your visit to the Learning Center? It will only take about 15 minutes. If you’re willing to participate, please fill out this brief consent form, taking time to read it through and ask any questions you might have. You may remain anonymous in the study or choose to be identified by indicating your preference on the consent form.
I will be the person who conducts the interview. Your responses will not be shared with the tutors or your professors.
Interview questions:
What was your first writing course at the university?
Which course are you in now?
Why did you first start coming to the Learning Center?
If you came voluntarily, how did you hear about it and what were you told beforehand?
If you were required to visit the Learning Center, who told you to come and what were you told beforehand? How did you feel about being required?
If you were required to visit, how did your attitude about being required change over the course of the semester (if it did)?
How would you describe your tutors and the tutoring sessions?
In your tutoring sessions, did your tutors talk with you about your “outside class” reading and writing habits? If so, did they make connections between them and the things you need to do in college courses?
On a scale of 1-5 with 1 being not at all helpful and 5 being very helpful, how helpful would you say the tutoring was?
If it was helpful, what was helpful about it?
If it was not, why wasn’t it? What do you think would have been more helpful?
Have you used the Learning Center even when you weren’t required? Why or why not?
On a scale of 1-5, with 1 being not at all likely and 5 being very likely, how likely are you to use the Learning Center for help with future writing assignments?
On a scale of 1-5, with 1 being “don’t agree” and 5 being “very much agree,” please rate the following statements:
The Learning Center helped me become a better writer.
The Learning Center changed how I thought about the writing process.
The Learning Center changed how I thought about myself as a writer.
The Learning Center helped me understand university writing.
The Learning Center helped me adapt my usual reading and writing skills to the new demands of university classes.
(If applicable) I am glad my professor required me to use the Learning Center.