Tool Kit Resources: Campus Models & Case Studies

Campus Experiences with the VALUE Approach to Assessment

In the early 2000s, faculty in Massachusetts had the same feelings about assessment as colleagues across the country.

They hated it. Assessment meant tests largely divorced from real work students did in the classroom.

“Standardized tests actually got in the way of showing whether students could demonstrate their learning,” said Robert Awkward, director of learning outcomes assessment for the Massachusetts Department of Higher Education (DHE).

Enter the VALUE rubrics. Developed from 2007 to 2009 by teams of faculty and higher education experts at more than one hundred institutions, AAC&U’s sixteen Valid Assessment of Learning in Undergraduate Education (VALUE) rubrics help educators assess learning outcomes such as critical thinking, quantitative reasoning, written communication, and civic engagement. Each rubric contains five or six rows, one for each dimension (i.e., criterion) of a learning outcome, and four columns representing levels of proficiency from benchmark (1) to capstone (4). See figure 1 for a breakdown of the Critical Thinking VALUE Rubric.

Massachusetts was an early adopter of the rubrics, and faculty found them useful for assessment and pedagogical needs in general education, cocurricular programs, capstones in the major, and institutional accreditation.

“Faculty here tend to react to those rubrics the same way I’ve heard that they react around the country,” said William Heineman, vice president of academic and student affairs at Northern Essex Community College. “This is the kind of stuff we would actually agree that we want our students to be able to do.”

Figure 1. Parts of a VALUE Rubric

But once institutions scored their students’ work with the rubrics, what did that information really mean?

“We had data that say how students did relative to those rubrics,” Awkward said. “The problem is, compared with what? Are we in the middle of the pack, are we leading, are we behind?”

In 2012, several Massachusetts public institutions initiated a pilot program to see whether the VALUE rubrics were a valid way to benchmark student learning across institutions. This pilot went nationwide the next year as the Multi-State Collaborative (MSC), with thirteen state systems and eighty-eight two- and four-year institutions eventually participating. Two smaller pilot initiatives—the Great Lakes Colleges Association and the Minnesota Collaborative—brought private four-year institutions into the project. Information about VALUE’s validity study, which included the three pilot projects, can be found in AAC&U’s 2019 report, We Have a Rubric for That.

“It’s very challenging to do authentic assessment and engage faculty across many institutions, and to do something that aspires to some level of standardization even though the prompts aren’t standardized,” said Martha L. A. Stassen, associate provost for assessment and educational effectiveness at the University of Massachusetts Amherst. The campuses and organizations involved were “really working to . . . build this bridge to something that’s not been done before at a national level.”

Since fall 2017, AAC&U’s VALUE Institute has made the benefits of nationwide, direct assessment available to any national or international higher education institution, department, program, state system, or consortium (see figure 2). Though VALUE rubrics can be downloaded for free, the VALUE Institute requires registration and provides independent validation of institutions’ student learning and campus-based assessment. Institutions participating in the VALUE Institute can compare their results with benchmarks from similar institutions across the country and over time. Registration for the 2019–20 institute opens October 15, 2019.

“You have independent scorers across the country who are looking at your data; who don’t know you, don’t know the state, and don’t know the students; who have all been consistently trained using the same rubrics,” Awkward said.

Assignment Design for Assessment and Equity

For the 2018–19 and 2019–20 academic years, Massachusetts DHE funded ten public institutions to participate in the VALUE Institute assessment of students’ critical thinking. This article features interviews with administrators at the DHE and three institutions representing different sectors of public higher education: Northern Essex Community College, a two-year institution; the University of Massachusetts Amherst, a research-intensive flagship institution; and Worcester State University, a regional comprehensive university. All found VALUE useful for their unique assessment needs.

To participate in the VALUE Institute, each institution collects one hundred “artifacts,” examples of student work from a real course assignment and ideally from students who are near graduation (with forty-five credits at two-year institutions or ninety at four-year institutions).

Assessment staff scrub any information that could be used to identify the student or institution before uploading artifacts to AAC&U’s online system.

This process can be challenging for some institutions. Many students do not have enough earned credits, or faculty sometimes don’t follow through after pledging to provide student work. But institutions recognize the importance of maintaining a consistent data set.

“The point of the assessment is trying to see, when a student’s almost about to graduate, if they demonstrate they have learned the institutional learning outcomes that we’ve chosen to focus on at their institution,” Awkward said.

But for some institutions, getting faculty to participate was less challenging than anticipated. Stassen described how many UMass faculty members initially equated assessment with standardized testing and were skeptical. But when they see that VALUE is about “actual student work and that it’s scored by faculty, they truly see the value in that,” Stassen said. “It has built pockets of truly intrinsic interest in the faculty-based, rubric-based authentic assessment that the VALUE Institute promotes.”

Another challenge faced by institutions is that assignments don’t always match the learning outcomes faculty hope students will demonstrate.

“We have very strong examples of how, when we didn’t have assignments that were aligned with the criteria, our results were more spotty,” Stassen said. Stassen and her colleagues used their VALUE experiences to design a campus-wide workshop on the use of rubrics in course design.

Influenced by the work of the Transparency in Learning and Teaching in Higher Education initiative, Sarah Strout, assistant vice president of assessment and planning at Worcester State, sees assignment design as an equity issue as well as an assessment best practice.

“Students come in with varying levels of hidden knowledge about college, and the more explicit you are in your assignment directions, the more you’re actually assessing students on their ability to do the assignment and not their knowledge of how to read between the lines,” Strout said.

Strout and her colleague Bonnie Orcutt, professor of economics, led a series of workshops at Worcester State to help faculty tailor assignments to align more closely with course objectives and make these objectives clear to students. They later led four regional assignment design workshops across Massachusetts to help institutions struggling with the same problems. Each institution sent several faculty, who returned to campus and shared best practices with colleagues.

“The professional development for faculty in assessment is a wonderful outcome of our participation in the institute,” Heineman said.

Benefits of Statewide Participation in VALUE

In Massachusetts, public institutions and the state higher education system have a “legal mandate” to have a “systematic way of assessing learning,” Awkward said. “You can’t keep asking people to pay more taxes [to fund your institutions] unless you’re demonstrating that there’s a return on those investments they’re making.”

Although Massachusetts participates in the VALUE Institute as a state, “this is all campus-based, allowing campuses to maintain independence” and use the data in ways that make sense for their students, Awkward said.

Disaggregated VALUE data about specific institutions are only sent to those campuses, and states only get an aggregated report of statewide data if they have enough participating institutions to ensure confidentiality in the results.

“Institutions would not feel very comfortable sharing that kind of data directly with us,” Awkward said.

But even aggregated data still have immense value, helping Awkward show stakeholders the strength of teaching and learning in Massachusetts compared with institutions nationally. He shares VALUE data with presidents of Massachusetts institutions, with members of the board of trustees for the state, and publicly on the DHE website. And because Massachusetts has participated in nationwide VALUE scoring initiatives since 2014, the data can be used across multiple years to see historical trends.

In “lean times” like these, Awkward said, VALUE also provides an alternative for states that don’t have the capacity for their own assessment at this scale. Awkward predicts he would need another full-time staff member and several interns to do something similar in-house.

But the most important benefit to institutions across the state is that the data help faculty to improve the quality of their teaching and student learning. As a result of VALUE data, Massachusetts is planning five regional workshops to help faculty “improve the delivery of quantitative literacy in their teaching,” Awkward said. “If you’re going to accept that we’ve moved forward in some areas like critical thinking, which we spent a lot of time on over the last several years, then you also have to accept the fact that, if we’re falling behind year to year in quantitative literacy within our own state and behind the national numbers, there’s something going on there that we need to address.”

Faculty Conversation across Silos at Northern Essex Community College

Data from the VALUE Institute include in-depth information about each of the five dimensions of the VALUE rubric, allowing institutions to see “what particular parts of the skill our students seem to be excelling and not excelling at, and also how we’re measuring up with other schools either in Massachusetts or in the other states,” Heineman said.

Faculty at NECC have seen in the past that “influence of context and assumptions” in critical thinking may be a weakness in students’ performance. “So, we had a fair amount of conversation here around the campus about what we are doing to help students understand that when they build a logical argument,” Heineman said.

In addition to participating in national VALUE initiatives for the last five years, NECC uses VALUE rubrics in its yearly internal assessment. Each year, NECC focuses on one of six core academic skills. The most recent assessment cycle, begun in 2017, used a rubric adapted from AAC&U’s Quantitative Literacy VALUE Rubric.

“There is wide agreement, not just here but among everybody I’ve talked to in the Massachusetts community college system, about how helpful those rubrics are as a benchmark to work from for different core academic skills that we assess here,” Heineman said.

But the real benefit is the conversation assessment creates among faculty on “specific concepts that students struggle with, over time and across institutions, and figuring out how to address those concepts in ways that seem to click with students better,” Heineman said. “Individual faculty members definitely end up taking action in their own classes based on that.”

Such discussions can cut across disciplines, as faculty see they have common understandings or approaches related to outcomes and assessment. And these discussions often focus on what the data really mean about their assignments and their students.

“It’s not necessarily a secret that faculty don’t come to learning outcomes assessment with a lot of excitement. They’re good faculty members, so they approach assessment data with a critical eye,” Heineman said. “But that conversation is really golden because that’s what assessment is all about. You’re really trying to answer the questions, ‘How do you know what students are learning, and how do you know you know?’”

University of Massachusetts Amherst

The University of Massachusetts Amherst (UMass) strategic plan emphasizes evidence-informed decision making.

“It’s a nice environment to be in because there’s already support for the importance of looking at various forms of evidence, both quantitative and qualitative, to inform department activity and other program activity,” Stassen said. In 2018, she and her colleague Anne J. Herrington wrote an article in Peer Review on UMass’s approach to using VALUE for faculty development.

While UMass does use national VALUE Institute data to compare its results with those of other four-year institutions, Stassen finds the data less useful for informing individual instructors about student learning in their own courses.

“Because of the small sample size from each course for the national assessment, it’s difficult to use those results as an indication of learning in any one class,” Stassen said. Instead, faculty use data to start “dialogue around the fit of the rubric to their course or to their assignment, the fit of their pedagogy to their broader goals for critical thinking, and what this evidence can help them think about with respect to those pedagogical questions.”

As at NECC, UMass faculty adapt VALUE rubrics for the university’s internal assessment. Though the adapted rubrics make it difficult to compare results between internal assessment and the VALUE Institute, adapting them “can energize faculty by showing them their perspectives and needs are important,” Stassen said. “It’s been a very nice, research-based way to communicate and to bolster a pedagogical argument with assessment evidence that’s both national and local.”

This pedagogical argument has spread across campus. Several UMass faculty have been trained as national VALUE scorers, and they come back to campus to hold calibration sessions for other faculty.

Stassen says these in-house calibration sessions have been one of the more valuable aspects of participating in the VALUE Institute. “You could just see their wheels turning as they’re hearing from faculty who have a different perspective than them because of disciplinary differences,” Stassen said. “It’s wonderfully intellectually charged and it’s fun.”

Some departments use VALUE rubrics for the university’s Educational Effectiveness Plan, which allows departments to define their own lines of assessment inquiry. And lessons learned from VALUE are transferring to other assessment work across the university, including an emphasis on assignment design in the ongoing assessment of the integrative experience capstone of the general education curriculum.

“We can bolster the rationale for our focus on assignments by drawing from the evidence-based experiences of our work on VALUE over the last five years,” Stassen said.

Worcester State University

As at NECC and UMass, many faculty and departments at Worcester State University use VALUE rubrics in their own departmental assessment. But unlike the other institutions, Worcester State uses national VALUE Institute data as an integral part of its internal institutional assessment.

“I think it allows for a lot of communication that might not happen if we didn’t have this validated standardized rubric and benchmarks to compare against,” Strout said.

Bonnie Orcutt, professor of economics at Worcester State, trains faculty across the country for the VALUE Institute, and she also holds training sessions on campus. This in-house scoring capacity allows Worcester State to assess work from entire courses rather than selecting only the few artifacts from each course that meet the VALUE Institute’s requirements.

“It was really interesting to be able to go back to show faculty how the students scored early in their career, where they scored later in their career, where the gaps might be,” Strout said. “They can really see the direct impact of how their students did on that particular assignment and then talk about how maybe the assignment wasn’t really asking them to do critical thinking or if the assignment directions were clear enough to students.”

At Worcester State, the most difficult part of VALUE assessment is trying to get artifacts from faculty across a range of disciplines, not just in certain subjects.

“I don’t want to have all of it from humanities or all of it from STEM fields because then that’s not a very good representation of the university as a whole, and the whole point of this is that we want to have some university-wide assessment,” Strout said.

This consistency between internal and national assessment allows faculty to compare their own ratings of students’ work with ratings from the VALUE Institute.

“The assessments that were done on campus were pretty consistent with the ones done outside of campus,” Strout said. “We have a responsibility as a state with our public education to show what our students are doing, and we can’t do that if we do that on our individual campuses alone. And I think it’s helpful to have university-wide assessment that you can benchmark to your peers.”