
Assessment Culture: From Ideal to Real—A Process of Tinkering

As a state university, California State University Monterey Bay (CSUMB) is experiencing a time of great stress, uncertainty, and fear. Budget cuts, furloughs, and the challenges inherent in educating increasing numbers of students during a time of shrinking resources are all at the forefront of the minds of our staff and faculty. Nonetheless, accreditation pressures, program reviews, and annual assessment plans are still realities and expectations. In its fifteen-year history, CSUMB has taken an outcomes-based approach to building its curriculum. Every requirement, course, and degree has carefully articulated student learning outcomes, and mechanisms have been built in to ensure sustained conversations about those outcomes. A recent self-study, conducted as part of the campus’s first reaccreditation, revealed, however, that the faculty have not moved as intentionally from outcomes to assessment of those outcomes. Perhaps “closing the loop” has not happened because the demands of institution building have diverted the necessary time and attention. The critical step of achieving closure will require building a culture of assessment on campus. Ideally, this culture will be steeped in an understanding and appreciation of how systematic assessment can inform what and how we teach, so that deep and meaningful student learning is more likely to occur. Building such a culture, with its attendant practices and perspectives, has become the centerpiece of program reviews and preparation for reaccreditation.

In terms of context, it is also important to realize that as a campus, we have been engaged in an ambitious general education revision that is framed by AAC&U’s Liberal Education and America’s Promise (LEAP) essential learning outcomes and is intended to bring coherence to the entire undergraduate curriculum, one of our academic priorities. Thus, the invitation to apply for the Engaging Departments Institute was perfectly timed. As we discovered once we arrived, three other academic priorities were also extremely well served by the conversations, workshops, and presentations at the institute: operating in a culture of evidence, enhancing active and engaged learning, and enhancing technology.

Impacts at the Administrative Level

Upon our return from the institute, we participated in a meeting of the Deans and Provost Council dedicated to reviewing departmental plans for assessment of learning in the majors. That meeting was a perfect opportunity to begin connecting those plans to the larger goal of curricular coherence that had become central to all of us during the institute. At that gathering, we shared the following insights:

  • First, we must consider the potential of the LEAP outcomes to contribute to curricular coherence. Our campus has used LEAP to develop models for redesign of our general education curriculum over the past academic year and will be adopting a new model and beginning implementation in the coming year. Using that same LEAP structure to reexamine learning outcomes in our majors is a goal we all have come to share.
  • Second, while all of our academic programs include capstones and we appreciate these courses as important sites for assessing student learning, we also must be more attentive to identifying milestones and to helping students recognize and assess their learning at those milestones and understand how those milestones contribute to their overall learning.
  • Third, we see e-portfolios as powerful learning tools to provide coherence and to support our students and ourselves in the teaching and learning process.
  • Finally, the evidence that is available to us through electronic portfolios, capstone courses, interaction with our students and faculty, and myriad other sites, must be thoughtfully analyzed and used to inform ongoing improvement of the learning opportunities we provide to our students.

The guidelines (fig. 1) have now been shared with all academic departments and have been used by them to revise their assessment plans for 2009–10. Departments have gained some insight into how to use their scholarly and creative training to design meaningful, sustained, and systematic assessment of student learning that is analyzed and mobilized to enhance curriculum and pedagogy. Although implementing these plans will require continued nurturing and support, faculty have expressed interest in revisiting the structure of our upper-division curriculum, a conversation that will provide us with opportunities to frame campus discussions around LEAP, e-portfolios, milestones, and evidence-based decisions about teaching and learning.

Figure 1. Assessment Plan 2009–2010

  1. What is the critical concern/question held by the department regarding student learning that will be assessed during 2009–2010?

  2. How is the critical concern/question related to the department’s latest program review and program improvement plan?

  3. Describe how/whether/when this critical concern has been previously assessed by your department. How will this new assessment build on the previous one(s)? How will this new critical concern/question generate new information for you?

  4. In what specific activities will the department engage in 2009–2010 to determine which evidence will best align with the critical concern listed above?

  5. How/when will the department gather evidence of student work? Who will be involved in this process? How will you assure that the evidence gathered is a random sample of student work?

Selected guidelines for departmental plans for assessment of learning in the majors from the 2009–2010 Assessment Plan (as adapted by Renee Curry, Dean, CSUMB College of Arts, Humanities and Social Sciences).

Impacts at the College Level: The School of Business

The institute came at an opportune time for the CSUMB School of Business (BUS) representatives. Over the preceding year, we had completed three major steps in our program review process: submitting our self-assessment study, receiving the external reviewers’ report, and receiving the university’s academic program review committee’s report. What we learned at the institute helped us define a clear path to program assessment and improvement, and helped us align our thinking about program improvement with that of assessment professionals from across the nation.

One major “aha” moment we experienced at the institute came when we realized that CSUMB students do not experience the curriculum as two separate parts: lower-division general education requirements and upper-division major learning requirements. Only administrators and faculty members view it this way. As a result of this realization, we established the goal of integrating the lower and upper divisions into a continuum of study for CSUMB BUS students—a step we also viewed as essential to better equipping our students for jobs in the twenty-first century. We returned to CSUMB with this new insight as a foundation piece for our program-improvement plan, and as a validation of the strategic decision we had made to align with AACSB accreditation standards.

Our second “aha” moment came when we were inspired to map our major learning outcomes (MLOs) and general knowledge and skills outcomes onto the learning outcomes espoused by AAC&U through LEAP (AAC&U 2007). The result was a two-dimensional outcome structure that is both functional and knowledge based. Figure 2, below, shows how we “nested” our outcomes within the larger LEAP framework to meet our goal of seamless integration of lower- and upper-division curricula.

Figure 2. Alignment of CSUMB BUS outcomes with LEAP Essential Learning Outcomes.

LEAP Essential Learning Outcomes | School of Business General Knowledge Outcomes | School of Business Major/Management-Specific Knowledge Outcomes

Knowledge of Human Cultures and the Physical and Natural World (through study in the sciences and mathematics, social sciences, humanities, histories, languages, and the arts [and the professions]) | General education | Leadership and management; marketing; finance; information technology; operations management; entrepreneurship

Intellectual and Practical Skills
  • Inquiry and analysis | Apply critical thinking and analysis (quantitative and qualitative decision making)
  • Critical and creative thinking | Apply critical thinking and analysis (quantitative and qualitative decision making)
  • Written and oral communication | Demonstrate professional written and oral communication
  • Quantitative literacy | Apply critical thinking and analysis (quantitative and qualitative decision making)
  • Information literacy | Demonstrate technical competence
  • Teamwork and problem solving | Function effectively in cross-functional teams

Personal and Social Responsibility
  • Civic knowledge and engagement—local and global | Demonstrate understanding of the implications of globalization and cultural diversity
  • Intercultural knowledge and competence | Demonstrate understanding of the implications of globalization and cultural diversity
  • Ethical reasoning and action | Demonstrate ethical and socially responsible reasoning and action
  • Foundations and skills for lifelong learning | All of the above

Integrative and Applied Learning
  • Synthesis and advanced accomplishment across general and specialized studies | All of the above | All of the above MLOs

Thanks to our learning at the institute, the program assessment and improvement activities we have undertaken since returning to campus have been more data-driven than our practice in prior semesters. Our first area of assessment is oral and written communication, an outcome that cuts across both lower- and upper-division segments of the School of Business curriculum. During fall planning week, the full-time faculty team crafted a research question designed to help us understand why students are unprepared to meet the professional writing standards of the senior capstone and, beyond the capstone level, of future employers (a finding affirmed by our own research and by published studies). It quickly became apparent that we, as full-time faculty, did not have common standards. This realization helped galvanize our commitment to seek more evidence regarding writing instruction in the School of Business. We are now assessing our writing outcomes by employing two pertinent concepts we learned at the institute: (1) designating milestone assignments for assessment within milestone courses; and (2) collecting both direct and indirect evidence to help us navigate the path to improved student learning.

Additionally, we have used a questionnaire to gather indirect evidence from students taking the business graduation writing assessment requirement course, and we have asked the instructors to introduce a short writing assignment to gauge the writing proficiency of students entering this course. We are using department faculty meetings to assess randomly selected pieces of student work against the rubric used by instructors in the course. Some early insights from this “norming” process are that (1) the rubrics need to be modified because they do not capture all elements of writing students should be mastering; (2) not all faculty members are facile in applying rubrics—it takes time; (3) we don’t yet have a single set of standards; and (4) writing mechanics in the evidence collected are consistently below our expectations.

Since our inception, CSUMB and the School of Business have committed to outcomes-based learning. Our challenge has been to embed a process that empowers us to assess student learning outcomes across our program—information not provided by course grades. We view our achievements as glimmers of the reality that curriculum assessment and improvement can take place even within the context of macro-level challenges we cannot control.

Impacts at the Departmental Level: Liberal Studies Department

Attending the AAC&U Engaging Departments Institute provided opportunities to actively participate in workshops led by experts on teaching, learning, and assessment; to debrief with teammates; and to use the allotted time to wrestle with new insights, tools, and strategies in the context of our department. Such opportunities were especially important to a team crafting a program-improvement plan based on the recently completed program review. A prominent question before us was how and what kind of assessment data should be collected so as to profitably inform instruction and document the extent to which learning outcomes are achieved. The self-study done for the program review revealed that, for the three MLOs under consideration, faculty tended to develop grading rubrics for their courses that primarily addressed course, rather than program, learning outcomes. There was also no coordination across courses within the major, and in the case of the capstone class, there were even differences in the nature of assignments among instructors. Hence, with respect to the MLOs, it was difficult to effectively map where particular outcomes were introduced, practiced, and assessed—let alone to ensure that what constitutes practice of a learning outcome in one course is paralleled in another. Additionally, as emphasized by the external reviewers, it was important for the department to develop and/or identify milestone assignments that would serve as consistent evidence of student learning across required liberal studies courses.

The exposure to three tools—the VALUE rubrics, e-portfolios, and Bloom’s Taxonomy—during the Engaging Departments Institute prompted our “aha” moments because the tools afforded us concrete strategies that could be used to achieve goals at both the department and university levels.

Since returning to campus, the liberal studies team’s challenge has been determining how to achieve buy-in among our colleagues. Obviously, adding these tools to existing departmental practices requires learning new skills and modifying courses to accommodate the use of those tools. Convincing colleagues that this is a worthwhile endeavor during a time of furloughs, when we are already doing more with less, is no easy feat. Our response is to begin leading by example. In the fall, two of the authors will teach sections of the major proseminar, which will provide a perfect opportunity to begin infusing these tools into existing departmental practices.

Finally, we, as a faculty, and our external reviewers have asked, “Why are our students entering the senior capstone unable to develop big ideas, address complex questions, and provide complex responses to the thematic focus (multiculturalism and social justice) in an analytical manner?” Thus, our immediate focus is on the development of critical thinking skills. We plan to use the “practical strategies for gathering ‘actionable’ evidence of student learning at the department level” introduced by Jo Beld (2009) at the AAC&U Institute.

In sum, the three tools mentioned are examples of how the participating liberal studies faculty were inspired to expand our assessment practices in ways that will nurture the type of teaching and learning we would like to see characterize our department as we enter the continuous cycle of improvement. We look forward to inspiring our departmental colleagues to join us on this journey.

Assessment Culture: From Ideal to Real—A Process of Tinkering

What conditions seem to be in place that will help this culture take root? First, the administration at CSUMB is clearly establishing structures that will promote the growth of an assessment culture. Examples include the requirement to conduct program reviews on a seven-year cycle and the expectation that departments will develop annual assessment plans that “close the loop,” using the results of meaningful, sustained, and systematic assessment to inform curriculum and pedagogy. These results should, in turn, lead to more effective practices. These efforts obviously require faculty to devote time and energy to assessment activities; with appropriate support and guidance, they should move us closer to the assessment culture we envision. To that end, the mandates are supported by on-campus workshops that help faculty develop effective assessment practices.

Second, the institution is willing to invest resources to establish a context that will support and nurture desired practices and perspectives. Considerable resources are being devoted to the important task of building curricular coherence to develop an environment that promotes deep and meaningful learning. Moreover, the financial resources used to send this team to the institute have afforded the development not only of faculty expertise but also, perhaps just as important, of positive faculty dispositions toward a robust culture of assessment. It is expected that, through our practices and leadership, participating faculty can begin spreading what we have learned from the institute to our departmental colleagues, who in turn will also begin to serve as sources of inspiration and expertise for their peers across campus.

A third factor that will help build a culture of assessment on campus is, as stated earlier, a senior leadership group committed to guiding principles that allow us to plan and implement assessment and improvements with realistic targets. These principles include the following: (1) assessment is not episodic, but continuous; (2) initiatives for change can be done in small steps; and (3) time horizons and plan achievements are governed by available resources.

By and large, we are finding faculty and administrators quite responsive to what we are introducing. We anticipate that campuswide discussions will come when, as we have said, the general education model is in place and we can pursue the next obvious step—tying together the upper and lower divisions through coherent outcomes. By that time, the School of Business and the Liberal Studies Department will be able to guide the discussion, sharing what they have been doing and learning.

References

Association of American Colleges and Universities. 2007. College learning for the new global century: A report from the National Leadership Council for Liberal Education and America’s Promise. Washington, DC: Association of American Colleges and Universities.

Beld, J. 2009. Practical strategies for gathering “actionable” evidence of student learning at the department level. Presented at the AAC&U Engaging Departments Summer Institute, Philadelphia, PA.

California State University Monterey Bay. Academic priorities and goals 2008–2013. csumb.edu/site/x24337.xml.

Lave, J. and E. Wenger. 1991. Situated learning: Legitimate peripheral participation. New York: Cambridge University Press.


Pat Tinsley is an associate professor of strategic management in the School of Business; Marylou Shockley is an associate professor and chair of management in the School of Business; Patricia Whang is a professor and chair of psychology foundations and liberal studies; Paoze Thao is a professor of linguistics and education and liberal studies; Becky Rosenberg is the director of the Center for Teaching, Learning, and Assessment; Brian Simmons is the dean of the College of Professional Studies—all of California State University Monterey Bay.
