Peer Review

How Writing Contributes to Learning: New Findings from a National Study and Their Local Application

Writing ability is among the most valued outcomes of a college education. Always included in conceptions of a liberal education, writing is also one of the most highly desired skills across business and industry (see Hart 2015; Burning Glass Technologies 2015). Since the 1970s, writing specialists have intensified their theorizing, research, and advocacy of institution-level initiatives aimed at improving students’ writing abilities. These efforts have produced, among other things, writing-intensive (WI) courses and writing-across-the-curriculum (WAC) and writing-in-the-disciplines (WID) programs. In 2008, the Association of American Colleges and Universities (AAC&U) underscored the importance of effective writing pedagogy by including writing-intensive courses in its list of high-impact educational practices (Kuh 2008).

Despite the impressive gains these efforts have realized at many institutions, the national results have fallen short of expectations. In an AAC&U-sponsored survey, four hundred employers ranked writing among the skills they most desired in new college graduates, but only one in four said recent graduates were well prepared in writing (Hart 2015). While there is widespread agreement that a single writing course taken in the first year cannot adequately prepare students for the writing they will need to do after graduation, almost half (47 percent) of four-year colleges and universities in the United States do not have a WAC/WID program and/or a writing requirement beyond the first year (Gladstein and Fralix 2017). This article and the ones that follow highlight new developments in efforts to improve the writing ability of all students.

The Contribution of Writing to Learning

Writing specialists have long argued that writing enhances learning, believing that faculty across the disciplines who are persuaded on this point will be more likely to provide the additional instruction and practice that students need. In addition, a clearer understanding of how writing contributes to learning would help faculty who already incorporate writing increase the positive results of their efforts.

Unfortunately, research on writing’s relationship to learning has yielded mixed findings, reducing the persuasiveness of writing specialists’ claims. While two large-scale studies suggest that writing has a positive impact on students’ learning (Astin 1995; Light 2001), small-scale studies, often conducted in a single course at a single institution, have produced inconsistent results. Further, when the small-scale research is examined collectively, the variety of institutional missions, student populations, and definitions of “learning,” as well as the differing kinds and quality of the studies conducted, paints a less consistent and coherent picture of the relationship between writing and learning.

Over the last eight years, through a collaboration between the Council of Writing Program Administrators (CWPA) and the National Survey of Student Engagement (NSSE), we have gathered and analyzed data from tens of thousands of bachelor’s degree students to identify characteristics of writing assignments that increase their learning. Results show that well-designed assignments can, indeed, increase learning, including learning about writing. In addition, the quality of assignments is more powerful in advancing learning than the amount of writing assigned, a finding that can reduce the reluctance of faculty who believe that including writing in their courses and curricula requires unreasonable amounts of class and grading time (Anderson, Anson, Gonyea, and Paine 2015). Below, we summarize our study, and then three institutions describe how they are implementing our findings.

Our Research Strategy

Our research strategy was to ask undergraduates about the nature of writing assignments they received and then to see whether their experiences correlated with aspects of their learning—as well as their perceptions of the progress they made in college toward achieving particular educational goals—that the existing NSSE survey already captured.

To generate the additional questions about writing, we invited members of the CWPA to nominate best practices in designing writing assignments. Eighty CWPA members named 150 practices and then helped pare the list to twenty-seven practices, which we developed into questionnaire items with guidance from NSSE analysts and further validated for clarity with focus groups of students. The questions asked students to indicate whether they encountered each practice for “all,” “most,” “some,” “few,” or “no” writing assignments in their courses during the current academic year.

To gather data from a variety of institutions and students, we recruited a total of eighty institutions (all bachelor’s degree-granting) from among NSSE institutions planning to participate in 2010 and 2011. For these schools, we appended the twenty-seven writing questions to the core NSSE questionnaire. More than 70,000 students responded to the modified survey. While the participating institutions were not randomly selected, they closely mirrored the profile of four-year US institutions, lending credibility to implications for US higher education in general.

Finding Broad Strategies Easily Applied in Any Discipline

Examining the responses to the twenty-seven questions, we noticed that fifteen questions could be grouped into three clusters, each pointing to a general strategy faculty in any field could apply in the contexts of their own disciplines, courses, and students. Analysis confirmed three underlying constructs representing effective pedagogical practices—each measured by an interrelated subset of the writing questions (see figure 1):

  • Interactive Writing Processes, in which students communicate orally or in writing with others about an assignment at some point between receiving it and submitting the final draft.
  • Meaning-Making Writing Tasks, which require students to engage in some form of integrative, critical, or original thinking.
  • Clear Writing Expectations, which involve instructors communicating accurately what they want their students to do in an assignment and the criteria they will use to evaluate the students’ submissions.

An essential feature of these constructs is that each concerns instructor behavior. The instructors determined whether their assignments included meaning-making tasks, and they provided explanations that students found to be clear or unclear. While some students may have decided on their own to engage in interactive writing processes, the instructors decided whether to require peer review, personal conferences, visits to the writing center, or other interactive processes.
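
The article does not detail the statistical procedure used to confirm that the fifteen questions cluster into three underlying constructs; one common approach to that kind of problem is factor analysis. The sketch below is only a minimal illustration of the general technique, run on synthetic responses coded 1–5; the item names and simulated data are placeholders, not the actual NSSE variables or results.

```python
# A minimal factor-analysis sketch with synthetic data; item names and the
# 1-5 coding are assumptions for illustration, not the actual NSSE variables.
import numpy as np
import pandas as pd
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)

# Simulate 1,000 students answering 15 items on a 1 ("no assignments")
# to 5 ("all assignments") scale, with three underlying constructs.
n_students, n_items, n_constructs = 1000, 15, 3
true_scores = rng.normal(size=(n_students, n_constructs))   # latent construct scores
loadings = np.repeat(np.eye(n_constructs), 5, axis=0)       # each construct drives 5 items
responses = true_scores @ loadings.T + rng.normal(scale=0.5, size=(n_students, n_items))
responses = np.clip(np.round(responses + 3), 1, 5)          # map onto a 1-5 scale

items = pd.DataFrame(responses, columns=[f"item_{i + 1}" for i in range(n_items)])

# Extract three factors and inspect which items load on which factor.
fa = FactorAnalysis(n_components=3, rotation="varimax", random_state=0)
fa.fit(items)
loading_table = pd.DataFrame(fa.components_.T, index=items.columns,
                             columns=["factor_1", "factor_2", "factor_3"])
print(loading_table.round(2))
```

In a real analysis, items that load strongly on the same factor would be the candidates for a construct such as Interactive Writing Processes.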

Figure 1. Student Experiences Related to the Three Writing Constructs

Interactive Writing Processes
For how many writing assignments have you:

  • Talked with your instructor to develop your ideas before you started drafting your assignment
  • Talked with a classmate, friend, or family member to develop your ideas before you started drafting your assignment
  • Received feedback from your instructor about a draft before turning in your final assignment
  • Received feedback from a classmate, friend, or family member about a draft before turning in your final assignment
  • Visited a campus-based writing or tutoring center to get help with your writing assignment before turning it in

In how many of your writing assignments has your instructor:

  • Asked you to give feedback to a classmate about a draft or outline the classmate had written

Meaning-Making Writing Tasks
In how many of your writing assignments did you:

  • Summarize something you read, such as an article, book, or online publication
  • Analyze or evaluate something you read, researched, or observed
  • Describe your methods or findings related to data you collected in lab or fieldwork, a survey project, etc.
  • Argue a position using evidence and reasoning
  • Explain in writing the meaning of numerical or statistical data
  • Write in the style and format of a specific field (engineering, history, psychology, etc.)

Clear Writing Expectations
In how many of your writing assignments has your instructor:

  • Provided clear instructions describing what he or she wanted you to do
  • Explained in advance what he or she wanted you to learn
  • Explained in advance the criteria he or she would use to grade your assignment

Using the New Constructs to Create Writing Assignments that Promote Learning

Having established the three new constructs, we sought to determine whether and how they were associated with student learning. The 2010 and 2011 versions of the NSSE instrument included three well-established constructs—clusters of interrelated questions from the core NSSE—that assessed students’ participation in deep-learning activities:

  • Higher-Order Learning concerns how much students say their courses emphasize analyzing experiences and theories, synthesizing concepts and experiences into more complex relationships, and judging the value of information.
  • Integrative Learning concerns how often students combine ideas from various sources, such as including diverse perspectives in course work, using ideas from different courses in assignments or class discussions, and discussing course concepts with instructors or others outside of class.
  • Reflective Learning focuses on the students’ self-examination of their views on a topic, understanding the perspectives of others, and learning that changes the way the students understand an issue.

Statistical analysis demonstrated that students who experienced more assignments featuring the three writing constructs (Interactive Writing Processes, Meaning-Making Writing Tasks, and Clear Writing Expectations) reported that they were also more often engaged in higher-order, integrative, and reflective learning. Further, these relationships persisted after controlling for institutional type (Carnegie classification); ten student characteristics (age, sex, ethnicity, major, enrollment status, transfer status, living on campus, international student status, parent education, and self-reported grades); and eight other factors that could account for students’ engagement in deep-learning activities (amount of assigned reading, diversity experiences, group work, academic challenge, service-learning, internships, participation in a learning community, and doing research with faculty).
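
As a rough illustration of what “controlling for” such variables involves, the sketch below estimates the association between one writing construct and a deep-learning score while holding several covariates constant. The variable names and data are invented for illustration only; this is not the study’s actual model or dataset.

```python
# Schematic example of estimating an association while controlling for
# covariates; variable names and data are invented for illustration only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 5000

df = pd.DataFrame({
    "meaning_making": rng.normal(3, 1, n),          # hypothetical construct score
    "carnegie_class": rng.choice(["bacc", "masters", "doctoral"], n),
    "enrollment": rng.choice(["full_time", "part_time"], n),
    "grades": rng.normal(3.2, 0.4, n),
    "assigned_reading": rng.normal(0, 1, n),
})
# Simulate a deep-learning score that depends on the writing construct
# plus the controls and noise.
df["deep_learning"] = (0.4 * df["meaning_making"] + 0.2 * df["assigned_reading"]
                       + 0.3 * (df["enrollment"] == "full_time") + rng.normal(0, 1, n))

# The coefficient on meaning_making estimates its association with deep
# learning after holding the listed covariates constant.
model = smf.ols(
    "deep_learning ~ meaning_making + C(carnegie_class) + C(enrollment)"
    " + grades + assigned_reading",
    data=df,
).fit()
print(round(model.params["meaning_making"], 2))
```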

While these results are indirect measures of learning—as opposed to direct measures of outcomes—they demonstrate that writing assigned and carried out across the curriculum using the three constructs is associated with engagement in deep learning. Engagement has been shown to correlate with a variety of academic success outcomes (Kuh 2008), so these findings answer our original question: yes, well-designed writing assignments contribute to student learning.

Moreover, the three constructs can be used as heuristics by faculty in any discipline. For instance, the questions grouped under Meaning-Making Writing Tasks illustrate the broader notion of meaning making, and faculty in any field can likely think of other cognitively challenging tasks to incorporate into their writing assignments.

Assignment Quality versus Assignment Quantity

Our research also found that, with all other variables taken into account, the three constructs developed in this study had much higher correlations with engagement in deep learning than did the amount of writing. Faculty who already include substantial writing in their courses can increase student learning by applying the three constructs. Institutions with little or no writing beyond the first year can reap additional learning by adding modest numbers of writing assignments in advanced courses—as long as they pay attention to the constructs. By incorporating well-designed writing assignments in courses for their majors, departments can teach students the conventions and expectations of the fields they will enter after graduation.

Writing Increases Students’ Perceptions of Their Learning and Development

Statistical analysis also discovered a positive relationship between the three writing constructs—Interactive Writing Processes, Meaning-Making Writing Tasks, and Clear Writing Expectations—and three well-established constructs from the core NSSE that measure students’ perceptions of how much they had learned and developed while in college:

  • Practical Competence includes acquiring job- or work-related knowledge and skills as well as the ability to work effectively with others; use computing and information technology; analyze quantitative problems; and solve complex real-world problems.
  • Personal and Social Development includes learning independently, understanding oneself, understanding other people, developing a personal code of values and ethics, and contributing to the community.
  • General Education Learning includes communicating clearly and effectively and thinking critically and analytically.

The more students experienced writing assignments that featured the three writing constructs, the more they credited their educational experience at the institution with helping them become brighter, more socially adept, more tolerant, and more astute individuals than when they started college.

Creation of the NSSE and FSSE “Experiences with Writing” Modules

In 2013, NSSE created an Experiences with Writing module, an optional set of questions based on the fifteen writing questions used in this study. Institutions can use the module to

  • assess the extent to which their students encounter writing assignments featuring our research’s constructs;
  • benchmark their results against all schools (in aggregate) that administered the module within two years;
  • measure progress by comparing results across years;
  • target their writing initiatives to areas of greatest need by disaggregating results by academic unit, student characteristics, or other variables.

NSSE also created an Experiences with Writing module for its Faculty Survey of Student Engagement (FSSE). It asks faculty about their use of the same practices that students report on in the NSSE module. By comparing student and faculty responses, institutional leaders can create a robust picture of their school’s writing climate and culture.
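
For institutions that administer both modules, one simple way to juxtapose the two perspectives is to compare item-level means for students and faculty, optionally broken out by academic unit. The sketch below assumes hypothetical data files and column names; actual NSSE and FSSE data layouts will differ.

```python
# Hypothetical comparison of student (NSSE) and faculty (FSSE) module
# responses; file and column names are placeholders, not real NSSE exports.
import pandas as pd

students = pd.read_csv("nsse_writing_module.csv")   # one row per student
faculty = pd.read_csv("fsse_writing_module.csv")    # one row per instructor

items = ["instructor_feedback_on_draft", "clear_instructions", "argue_with_evidence"]

# Mean response per item and academic unit, for each group.
student_means = students.groupby("academic_unit")[items].mean().add_suffix("_students")
faculty_means = faculty.groupby("academic_unit")[items].mean().add_suffix("_faculty")

# Side-by-side table: large gaps flag items worth discussing locally.
comparison = student_means.join(faculty_means)
print(comparison.round(2))
```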

Examples of How This Research Has Been Used

By disseminating and promoting the use of our research results throughout their curricula, colleges and universities can maximize the additional learning that students in any field gain through well-designed writing assignments. The following examples illustrate ways that schools with different missions, students, and resources for faculty and curricular development have begun doing so.

Harvey Mudd College
Harvey Mudd College (HMC)—a small, private liberal arts institution in California with a focus on science, mathematics, and engineering—revised a half-semester writing course for first-year students called Writ 1. Designed by faculty from many departments, the resulting curriculum reflects the research findings.

Because Writ 1 instructors come from all departments, the course provides an excellent opportunity for disseminating the use of the three constructs throughout the school’s curriculum. Faculty new to the course participate in a five-day preparatory workshop; returning faculty take a two-day refresher. Course instructors meet weekly to clarify and rehearse their expectations and strategize ways to make those expectations clear to students and consistent across sections. Nearly half the faculty who taught Writ 1 indicated that the experience substantially (“very much” or “quite a bit”) influenced the way they taught their disciplinary courses. Fewer than one in ten felt it had “very little effect” on their teaching in other courses.

HMC has also used the research to inform instruction and assessment in other ways. For example, there was a disparity between HMC students’ and faculty members’ perceptions of the clarity of assignments: while nine in ten instructors believed their expectations were clear, only seven in ten first-year students thought so (with a slightly larger gap for seniors). These results sparked discussion among HMC faculty about the dimensions of transparency and how they could “make the invisible visible” to students, acknowledging that although they don’t want to guide students with step-by-step instructions, they could help students learn more if they explained the pedagogical methods behind their assignments.

Auburn University
In 2010, Auburn University, a midsize public university in Alabama, began an institution-wide writing initiative supported by a new Office of University Writing. This office was charged with helping each department develop plans for embedding writing assignments and instruction throughout its regular courses, a strategy believed to be much more effective than designating one or two courses as writing intensive.

A University Writing Committee established five requirements for department plans, drawn from the twenty-seven best practices developed early in the research project. The committee reviewed department plans, suggested improvements, and asked departments that had not satisfied all five criteria to revise and resubmit. It also featured some plans as models on the Office of University Writing’s website (http://wp.auburn.edu/writing/index/writing-plans/).

By 2011, all undergraduate majors had approved writing plans. Every three years the committee reviews implementation reports and suggests refinements to departments. Based on these reviews, it also recommends ways the Office of University Writing can better support the writing initiative. The office uses the research project’s three constructs in its programs, curricular assistance offered to departments, and online resources.

University of the Cumberlands
University of the Cumberlands (UC), a Christian liberal arts institution in Kentucky, provides an illustration of the ways some schools have woven the Experiences with Writing results into projects centered on other institutional objectives. As an accreditation requirement, the Southern Association of Colleges and Schools Commission on Colleges asks institutions to develop a targeted Quality Enhancement Plan (QEP) to improve student learning. UC originally used locally developed rubrics to assess progress in its QEP of Critical Thinking Across the Curriculum (CTAC) with emphasis on reading and writing in general education courses. At the suggestion of consultants from the Teagle Foundation-funded Center of Inquiry, UC adopted NSSE with the Experiences with Writing module in 2013 to assess the writing focus of CTAC.

The NSSE Experiences with Writing module has become a part of the University’s ongoing discussion of pedagogy. The data were central to the development of the current QEP, piloted in 2015, which focuses on enhancing student metacognition and performance through reading and writing. As part of a recent general education revision, faculty in disciplines serving this curriculum developed writing-intensive, cross-disciplinary, upper-level “integrated studies” courses. Under the current QEP, the University seeks to focus on general education as the liberal arts major shared by all undergraduates in preparation for lifelong learning, with faculty in integrated studies strengthening these courses as capstone general education experiences.

A longitudinal comparison of UC’s 2013 and 2016 NSSE data for seniors shows a slight increase in Interactive Writing Processes, from a mean of 2.8 to 2.9. The mean for Meaning-Making Writing Tasks was stable at 3.3, slightly above the NSSE-wide mean of 3.2. The most notable change was in Clear Writing Expectations, whose mean rose from 4.0 to 4.2, a statistically significant difference from the NSSE comparison mean of 4.0. Also noteworthy, first-year students who completed NSSE in 2016 found their courses to be more challenging than did their counterparts in 2013. These results support the value of curricular development grounded in collaborative faculty development. UC’s ongoing pedagogical enhancements and writing initiatives have had a measurable impact on student learning.
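
Readers who want to run similar year-to-year comparisons on their own module data could use a conventional two-sample test on construct scores, as in the sketch below. The scores here are synthetic and the test is only illustrative; institutions’ official NSSE reports may apply weighting and comparison procedures not shown here.

```python
# Illustrative check of whether a construct mean changed between two
# administrations; data are synthetic and the 1-5 scale is assumed.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Hypothetical Clear Writing Expectations scores for seniors in two years.
scores_2013 = np.clip(rng.normal(4.0, 0.8, 180), 1, 5)
scores_2016 = np.clip(rng.normal(4.2, 0.8, 170), 1, 5)

t_stat, p_value = stats.ttest_ind(scores_2016, scores_2013, equal_var=False)
print(f"2013 mean: {scores_2013.mean():.2f}, 2016 mean: {scores_2016.mean():.2f}")
print(f"Welch t = {t_stat:.2f}, p = {p_value:.3f}")
```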

Other Applications

Some institutions use their results internally and report them in accreditation and other documents directed to external readers. For example, faculty at Miami University (Ohio) and North Carolina State University highlighted the NSSE/FSSE research results in a successful joint proposal to the National Science Foundation for a three-year, multi-institutional project to improve student learning and writing in computer science and software engineering programs nationwide.

Conclusion

The research project described in this article and the examples of its local application underscore the value of helping faculty focus on the quality, not just the quantity, of the writing assignments they give students. The project also provides faculty across the disciplines with practical, adaptable guidance for enhancing what students learn from any writing assignment, including those described in—or derived from—the other articles in this issue of Peer Review.

 

Acknowledgments

The research team consisted of Paul Anderson, Chris M. Anson, Robert M. Gonyea, and Charles Paine. Wendy Menefee-Libey and Laura Palucki Blake wrote the section on Harvey Mudd College and provided indispensable help in preparing the entire article. Margaret Marshall wrote the section on Auburn University. Tom Fish and Susan Weaver wrote the section on University of the Cumberlands.

 

References

Anderson, Paul, Chris M. Anson, Robert M. Gonyea, and Charles Paine. 2015. “The Contributions of Writing to Learning and Development: Results from a Large-Scale Multi-Institutional Study.” Research in the Teaching of English 50: 199–235.

Burning Glass Technologies. 2015. The Human Factor: The Hard Time Employers Have Finding Soft Skills. Boston: Burning Glass Technologies. http://burning-glass.com/wp-content/uploads/Human_Factor_Baseline_Skills_FINAL.pdf.

Gladstein, Jill, and Brandon Fralix. 2017. “Supporting Data-Driven Conversations about Institutional Cultures of Writing.” Peer Review 19 (1): 21–24.

Hart Research Associates. 2015. Falling Short? College Learning and Career Success. Washington, DC: Association of American Colleges and Universities.

Kuh, George D. 2008. High-Impact Educational Practices: What They Are, Who Has Access to Them, and Why They Matter. Washington, DC: Association of American Colleges & Universities.


Paul Anderson, Professor Emeritus, Miami University (Ohio); Chris M. Anson, Distinguished University Professor, North Carolina State University; Tom Fish, Dean of Retention, Director of Quality Enhancement Plan, Professor of English, University of the Cumberlands; Robert M. Gonyea, Associate Director of Indiana University Center for Postsecondary Research; Margaret Marshall, Professor of English, Director of University Writing, Auburn University; Wendy Menefee-Libey, Director of Learning Programs, Harvey Mudd College; Charles Paine, Professor of Rhetoric and Writing, University of New Mexico; Laura Palucki Blake, Director of Institutional Research and Effectiveness, Harvey Mudd College; and Susan Weaver, Professor of Education, Assistant Director of Assessment, Former Director of the Quality Enhancement Plan, University of the Cumberlands

 
