Supporting Data-Driven Conversations about Institutional Cultures of Writing

Colleges and universities have long sought to identify ways to improve students’ writing abilities, as communication skills are central to a liberal education. As institutions assess how students evolve as communicators, questions arise from different stakeholders on campus about the role of writing in the curriculum and the most appropriate approaches for supporting student writers. Faculty committees or individual departments ask these questions:

  • Are students graduating with the ability to communicate effectively based on a set of outcomes?
  • Who decides on the outcomes, and what are the best methods for assessing them?
  • Where and how are institutions helping students to meet these outcomes?
  • Is it through general education requirements, writing throughout the entire curriculum, or the support of a writing center?

The answers to these and other related questions are the foundation for understanding an institution’s culture of writing.

Defining an institution’s culture of writing can be a daunting task, and this culture is often informed by what transpires at peer institutions or in national conversations. The National Census of Writing (NCW), a database of responses from nine hundred US colleges and universities, serves as a resource for institutions and individual researchers seeking data from over two hundred questions related to the administration, teaching, and support of academic writers. The NCW (writingcensus.swarthmore.edu) provides data that help illuminate what is, or what could be, the culture of writing on a given campus by naming the different sites of writing and highlighting where resources have been allocated to support and develop that culture. This naming and identification process may raise additional questions on a given campus and in the field as people grapple not only with identifying the different sites of writing but also with the rationale for the institutional structures surrounding them. The NCW can be one space where institutions look for answers to begin contextualizing these conversations. In this piece, we introduce the genesis and methodology of the NCW project and discuss how the naming of practices aids in understanding an institution’s culture of writing, which can lead to improved student outcomes.

Genesis of a National Database on Writing Programs

The NCW project began with the straightforward task of documenting the existing sites of writing at different US colleges and universities. The project emerged at the crossroads of three discussions: (1) In Writing Program Administration at Small Liberal Arts Colleges, Gladstein and Regaignon (2012) hypothesized that the approach to writing at small liberal arts colleges (SLACs) differed from that at other institutional types, but limited comparable data were available. (2) We had been involved in discussions about the diversity of membership of the Council of Writing Program Administrators, and we questioned what data were available to document the diversity of writing programs and their administrators. (3) Each week we observed people on listservs requesting data they needed for proposals or reports. These three discussions prompted us to adapt a survey used in the SLAC book (Gladstein and Regaignon 2012) for a larger audience. We wanted to create an open-access database that people could use in their own work as administrators and researchers. The survey covers eight areas:

  • sites of writing,
  • first-year writing/English composition,
  • identifying and supporting diversely prepared students,
  • writing across the curriculum (WAC) and writing beyond the first year,
  • the undergraduate and graduate major and minor,
  • writing centers,
  • administrative structures, and
  • the demographics of the respondents.

Most empirical writing scholarship has been based on surveys focused on a limited number of institutions (Peterson 1987; Charlton and Rose 2009) or on a particular site of writing instruction on campus, such as first-year writing, writing centers, or writing across the curriculum programs (Thaiss and Porter 2010; Writing Centers Research Project 2014). In creating one survey that covered the different administrative, teaching, and support structures around writing, we aimed to put the data for each area in conversation with the others.

After formulating a list of schools through the Integrated Postsecondary Education Data System (IPEDS), we attempted to identify potential respondents to whom we could send the census invitation directly, thereby avoiding the self-selection biases inherent in recruiting respondents through professional listservs. To locate these individuals, we searched institutional websites for the email addresses of campus writing professionals, sometimes finding multiple people administering different aspects of writing on a single campus and sometimes finding no one because that information was not available on the website.

Through the generous support of the Andrew W. Mellon Foundation and Swarthmore College, we hired an IT firm to design and build a searchable database using the processed data. The database includes the ability to filter results by Carnegie classification, size, geographic region, and mission (e.g., minority-serving institutions, SLACs, and the Catholic Consortium). These filters allow users to constrain the results to better align with their own institutional contexts. Some schools authorized a public program profile that allows users to see how a specific institution answered the survey.
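
To make the filtering model concrete, the sketch below shows one way records with these attributes could be filtered in Python. It is purely illustrative: the field names, sample values, and matching rules are our assumptions for demonstration, not the NCW’s actual schema or implementation.

    # Illustrative sketch only: the field names and sample values below are
    # hypothetical, not drawn from the NCW's actual data model.
    from dataclasses import dataclass
    from typing import List, Tuple

    @dataclass
    class InstitutionRecord:
        name: str
        carnegie_class: str        # e.g., "Baccalaureate Colleges"
        size: str                  # e.g., "small", "medium", "large"
        region: str                # e.g., "Northeast"
        missions: Tuple[str, ...]  # e.g., ("SLAC",) or ("minority-serving",)

    def filter_records(records: List[InstitutionRecord], **criteria) -> List[InstitutionRecord]:
        """Return the records that satisfy every supplied criterion."""
        def matches(rec: InstitutionRecord) -> bool:
            for field, value in criteria.items():
                if field == "mission":
                    # A mission matches if it appears anywhere in the record's tuple.
                    if value not in rec.missions:
                        return False
                elif getattr(rec, field) != value:
                    return False
            return True
        return [rec for rec in records if matches(rec)]

    sample = [
        InstitutionRecord("College A", "Baccalaureate Colleges", "small", "Northeast", ("SLAC",)),
        InstitutionRecord("University B", "Doctoral Universities", "large", "Midwest", ()),
    ]

    # Constrain the results to small SLACs, as a user might via the site's filters.
    for rec in filter_records(sample, size="small", mission="SLAC"):
        print(rec.name)  # -> College A

Treating mission as a membership test rather than an exact match reflects the fact that a single institution can carry more than one designation at once.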

We sent the survey to 1,621 four-year institutions and 924 two-year institutions. As responses were returned, we cleaned and processed the data, often sending follow-up emails to seek clarification. Ultimately, we received responses from 680 four-year institutions (a 42 percent response rate) and 220 two-year institutions (a 24 percent response rate). Through our conversations with respondents during data collection and processing, we learned of the difficulties some respondents had in naming the writing practices on their campuses. Some of these difficulties related to how the different sites of writing, and the people administering them, were positioned within the institution and in relation to each other. The naming or categorizing of the sites of writing spoke to the respondents’ relationships with their campus’s culture of writing.
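
As an aside for readers reproducing the response rates above, the percentages follow directly from the raw counts; a minimal check in Python:

    # Reproducing the response rates from the counts reported above.
    four_year_rate = 680 / 1621   # about 0.42, i.e., a 42 percent response rate
    two_year_rate = 220 / 924     # about 0.24, i.e., a 24 percent response rate
    print(f"{four_year_rate:.0%}, {two_year_rate:.0%}")  # prints "42%, 24%"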

Naming Practices

Campuses create task force after task force and committee after committee to improve student writing; these committees often emerge after at least one stakeholder voices the complaint that students at the institution cannot write. The attempts to find a solution, however, are often based on the different conceptions of writing that various stakeholders bring to the process. This leads to repeated and frustrating initiatives that often do not justify the time and resources put into them.

A common step with such initiatives is to identify best practices at other institutions and replicate them. This impulse to examine best practices as potential models belongs in the review process; however, through our work with the NCW, we have found that stakeholders at an institution or within a department also need to be introspective about their practices to identify their culture of writing. To do this, they need to name all the sites of writing—explicit, embedded, and diffused—and the relationships among them. The NCW does not illuminate best practices; rather, it helps institutions and individuals name and contextualize their writing practices, which leads to a better understanding of the institution’s culture of writing and ultimately to a more efficient and sustainable process for changing that culture.

Multiple times during the data collection and processing stages, we encountered confusion from respondents about the sites of writing on a campus. Unless individuals had direct responsibility for a particular site, they were sometimes unaware that other sites of writing existed, or they lacked a basic understanding of how those sites functioned. For example, people may not have known whether there was a writing requirement, whether there were basic writing courses, or whether students had to take writing courses beyond the first year. Obviously, at most institutions it is impossible for an individual to know the ins and outs of every institutional unit; however, writing extends beyond and across multiple units. Depending on the institution, it is a shared responsibility among members of a department, among departments and writing centers, or among all faculty. A working knowledge of how different sites of writing interlock and support each other is the basis for improving an institution’s culture of writing, supporting student writers, and resolving curricular challenges.

Our work on the NCW left us with the question of why different stakeholders lacked knowledge about the various sites of writing at an institution. We found that sometimes there was a complete lack of awareness that different sites existed, but at other times this lack of acknowledgment could be traced to how the different stakeholders named or defined a site and how that naming aligned with the naming used in the census. This confusion was most apparent when we asked respondents, “Does your institution have an official writing program or department?” The NCW was originally named the Writing Program Administration (WPA) Census, but while collecting and processing survey results, we learned that respondents positioned themselves in relation to those terms and sometimes opted out of the project because, as they put it, “We don’t have a writing program, so we can’t answer your survey.” From our research on the cultures of writing at SLACs, we had deliberately designed questions to address the fact that some institutions have a less-defined institutional structure for their sites of writing, but we found that some people still responded negatively to the terms “program” and “administration.”

A similar instance of naming conventions revealing a tension within campus cultures of writing occurred when we asked respondents, “Does your institution have a writing across the curriculum (WAC) program or requirements beyond the first year?” Here two people at the same institution may have provided opposite responses, leaving us as researchers perplexed as to how to process the responses. In further conversations with respondents, we learned that with this question, respondents again contested the formality embedded in the term “program.” Some respondents wanted to name and own the fact that faculty do incorporate writing into their courses beyond the first year, while other respondents wanted to be clear that even if faculty embedded writing in their courses, it did not mean the institution had a formal or explicit (and thus funded) program or requirement.

We believe that some of the confusion surrounding respondents’ answers stems from debates within the larger field of writing studies. Questions about naming, best practices, and local contexts are not settled within the discipline, and respondents aware of different trends in the field may have submitted contradictory answers based on their relationships to and knowledge of shifting disciplinary discussions.

Emerging Conversations

The NCW bridges local and national conversations by sharing responses to over two hundred commonly asked questions about writing programs and centers. Despite its extent, the survey itself is not all-encompassing; instead, answering the NCW questions and exploring the database of responses may suggest additional avenues for examining writing instruction and support on a campus. For example, does responsibility for stewardship of the culture of writing fall to the writing program? If there is also a writing center that is institutionally housed outside the program, how does it work within the current culture? What happens if there is a writing center but no designated program? Such questions cannot be answered in a multiple-choice survey; the NCW raises them, rather than answers them, as it attempts to help institutions and the field name their practices.

Sharon Mitchler, a professor of English and humanities at Centralia College, explained her institution’s difficulty with definitions in a recent NCW blog post:

We struggled to answer the question as presented in the National Census questionnaire. Of course we are a writing program. We have curriculum review, we have outcomes, we have course outlines, we have professional training, and we have students in writing classes. We talk about how to best implement pedagogical changes. We document changes in students’ progress through our courses. We work with other composition professionals. We gambol and stretch and learn and struggle to empower our students so that they may move on to their next goals as prepared as possible to use writing proficiently and strategically.

However, we do all these things through multiple lenses and structures. We are not using an [institution-wide] writing program to coordinate our actions. So is our fractured structure a way to move most expediently to our goals? Would our structure be enhanced by having an explicit writing program that crossed all disciplines and campus hierarchies? What might we lose by changing to a monolithic program? How would it be administered? And would having a single writing program silence those who need writing instruction to occur in particular ways with students who need a finer grained skill set? Who would we be marginalizing? What ways would a single program both provide institutional power and limit the flexibility of the current organization? And lastly, how might this be financed and staffed (re: how the heck do we pay for a director or a partial director when we are all hands on deck to cover the courses we need to teach)?

The NCW does not attempt to answer these questions posed by Mitchler, nor does it suggest that all institutions should have a formal writing program or that there is only one way to structure a program. Instead, as the researchers behind the NCW, we argue that one way to define the culture of writing and its stewardship is to look at what currently gets defined as the writing program and whether this program includes all sites of writing. If not, why are some sites under the umbrella of “program” while others are separate entities, or not entities at all?

As Gladstein and Regaignon state, “the first step toward change is often that of identifying and claiming all the sites of writing on campus. Not only does this give both [writing program administrators] and institutions a full picture of the current relations shaping the local culture of writing, but it also helps bring to light the history of that school’s approach and how the current program fits into national conversations” (2012). By all sites, the authors were referring to the explicit, embedded, and diffused sites of writing. Explicit sites are those places the institution has labeled as administering, teaching, or supporting the goals of writing on a given campus (e.g., writing requirements, writing center, director of writing). On the other hand, embedded sites are places where the goals of writing are addressed as an embedded part of a larger unit (e.g., first-year seminar, learning center, chair of a department). Finally, there are diffused sites where writing takes place across the curriculum, but there is no formal entity in place to organize, direct, and assess what occurs.

When documenting spaces where students develop as writers, it may be tempting to focus on the explicit sites of writing and to sidestep the embedded and diffused sites. Bringing these sites to light can be difficult and time consuming and could surface issues whose resolution is painful or expensive. It may seem easier and more efficient to focus instead on the explicit sites of writing and align them with best practices; however, ignoring embedded and diffused sites creates situations in which students’ explicit instruction in writing (if any) may be contradicted or countermanded. To move past this, it is important to identify and document all sites of writing on campus. Naming all campus writing practices can lead to conversations that uncover the true culture of writing on a campus and show how and where resources may be better allocated to improve student learning.

While collecting and processing data, we heard from respondents that the NCW encouraged different stakeholders on campus to come together to discuss the posed questions. These questions provide language and direction for understanding how the various entities develop and support student writers. To ensure that its students can write (a goal of liberal education), a school should recognize its culture of writing, and to do so it needs to name and understand its sites of writing. The National Census of Writing helps in this naming process because it identifies the explicit, uncovers the embedded, and shares the results in a national database.

 

References

Charlton, Jonikka, and Shirley K. Rose. 2009. “Twenty More Years in the WPA’s Progress.” WPA: Writing Program Administration 33 (1/2): 114−45.

Gladstein, Jill M., and Dara Rossman Regaignon. 2012. Writing Program Administration at Small Liberal Arts Colleges. Anderson, SC: Parlor Press.

Peterson, Linda H. 1987. “The WPA’s Progress: A Survey, A Story, and Commentary on the Career Patterns of Writing Program Administrators.” WPA: Writing Program Administration 10 (3): 11−18.

Thaiss, Chris, and Tara Porter. 2010. “The State of WAC/WID in 2010: Methods and Results of the U.S. Survey of the International WAC/WID Mapping Project.” College Composition and Communication 61 (3): 534−70.

Writing Centers Research Project. 2014. International Writing Centers Association and Purdue University. https://owl.english.purdue.edu/research/survey.


Jill Gladstein, Associate Professor and Director of the Writing Associates Program, Swarthmore College; and Brandon Fralix, Associate Professor and Coordinator of the Writing and Analysis Program, Bloomfield College

 
