Peer Review

Increasing Student Success in STEM

For the past twenty years, countless reports have called for reform of undergraduate education to improve student learning, persistence, and graduation rates in STEM; however, few of their recommendations have been widely implemented (Seymour 2002; Handelsman et al. 2004; Fairweather 2008; Borrego, Froyd, and Hall 2010). Aspirational student success goals in STEM were set most recently by the President’s Council of Advisors on Science and Technology in its 2012 report, Engage to Excel: Producing One Million Additional College Graduates with Degrees in Science, Technology, Engineering, and Mathematics. That report states that STEM graduation rates will have to increase by 34 percent annually to meet its goal. On most campuses, the persistence and graduation rates of underrepresented minority (URM) and first-generation students still lag behind those of their majority counterparts. Thus, in order to reach the aspirational graduation rates called for in national reports, a focus on URM and first-generation student success is imperative.

Participating Institutions in the Keck/PKAL Project

  • California State University–East Bay
  • California State University–Fullerton
  • California State University–Long Beach
  • California State University–Los Angeles
  • San Diego State University
  • San Francisco State University
  • W.M. Keck Science Department of Claremont McKenna, Pitzer, and Scripps Colleges
  • University of San Diego
  • University of La Verne
  • The California State University Chancellor’s Office
  • University of California–Davis

While many change efforts have been initiated, almost always at the departmental level, few have reached the scale of entire programs, departments, or colleges in the STEM disciplines, a scale described as necessary in these recent reports. There is growing recognition that reform in STEM is an institutional imperative rather than only a departmental one. Student advising, faculty professional development, student research mentoring, academic support programs, clear STEM-focused institutional articulation agreements, external partnerships with business and industry related to internships and other research experiences, and many other critical programs and areas identified as central to student success are often overlooked within reform efforts. New research demonstrates the importance of a broader vision of STEM reform for student success, one that moves from programs and departments to an institution-wide effort. For example, institution-wide implementation of high-impact practices has been shown to dramatically improve the graduation rates of URM students (Kuh and O’Donnell 2013). The Meyerhoff Scholars Program at the University of Maryland Baltimore County epitomizes this type of institution-wide effort, combining specific academic, social, and research support interventions that have resulted in dramatic improvements in graduation rates of minority STEM students (Lee and Harmon 2013).

The Keck/Project Kaleidoscope (PKAL) STEM Education Effectiveness Framework project, funded by the W. M. Keck Foundation, aimed to develop a comprehensive institutional model that helps campus leaders translate evidence-based reforms for improving student learning and success in STEM into scalable and sustainable actions. The project engaged eleven California-based colleges and universities (see box) to test evidence-based strategies that will lead to program, departmental, and, eventually, institutional transformation.

The systemic institutional change model that came out of this project, outlined in this article and brought to life by the case studies in this issue, is a valuable tool to help campuses work toward this broader vision. The model provides both a process and a content scaffold for campus leaders to plan, implement, assess, and evaluate change efforts in undergraduate STEM education in a way that goes beyond redesign of a single course or isolated program. Further details regarding the model are provided in a guidebook for campus leaders who have convened (or will convene) teams comprising faculty members, department-level leaders, student affairs staff, appropriate central administration officers, institutional researchers, and undergraduate studies officers. We have learned from our own work as both researchers and practitioners that institutional change is best executed by a cross-functional team working together. Institutional change depends on the support of leaders across campus—including grassroots faculty leadership, mid-level leadership among department chairs and deans, and support from senior leaders in the administration.

This issue of Peer Review, produced with support from the W. M. Keck Foundation as part of the PKAL project, provides an overview and case studies from six of the project campus teams describing their STEM reform journeys. Each case study highlights the elements of the Keck/PKAL model for effective institutional change to increase student success in STEM that was developed during this three-year project. We appreciate the efforts of our pioneering campuses to explore new territory, often going where few colleges have gone before. We are convinced that campuses that are open to a broader vision for student success and that allow themselves to engage in what can be a messy process of change can create sustained and scaled efforts at STEM reform.

The Keck/PKAL Model

The Keck/PKAL model for effective institutional change outlines both a process and content that will lead to increased student success in STEM. Although focused on STEM, it is applicable to any change process that is focused on improving student learning and success. The elements of the model are illustrated in figure 1 and described in table 1.

Table 1. Model Elements

Establish Vision: The vision represents the direction in which the campus is aimed in terms of altering its STEM experience to support student success. We encourage a vision that is clear, shared, and aligned with institutional priorities.

Examine Landscape and Conduct Capacity Analysis: A direction forward is typically best created through an analysis of the existing landscape (internal campus data as well as external reports on STEM reform) and a review of current capacity to engage in change generally—such as history of reform, leadership, and buy-in and ownership among faculty. This stage focuses on collecting the data and information needed to conduct the analysis.

Identify and Analyze Challenges and Opportunities: The landscape and capacity information needs to be analyzed in order to identify both challenges and opportunities for the campus. This phase often brings in politics and culture, which can be sources of both.

Choose Strategies/Interventions, Leverage Opportunities: Campuses need to familiarize themselves with a host of possible strategies or interventions to address the challenges identified and leverage the opportunities. They can examine these strategies in light of the capacity of the campus as well as the opportunities identified earlier.

Determine Readiness for Action: In addition to reviewing capacity and opportunities, campuses should review the key issues that emerge when implementing specific strategies—such as resources, workload, institutional commitment, facilities, and timeline—to ensure that the campus is ready to move forward with a particular strategy and can implement it effectively. Campuses can also take advantage of opportunities, such as a newly established special campus projects fund or a new faculty hire with appropriate expertise, that can be leveraged in support of effective implementation. This phase also involves further exploring campus politics and culture.

Begin Implementation: Implementation involves drafting a plan for putting the intervention or strategies in place. The plan builds on the readiness-for-action review, the capacity of the campus, and the opportunities identified, and it includes a process for understanding challenges as they emerge. In addition to creating a well-laid-out plan, campuses may decide to pilot an initiative first and then consider how to modify and scale it after an initial trial.

Measure Results: Campuses will also create an assessment plan to determine whether the intervention is working and how it can be improved over time.

Disseminate Results and Plan Next Steps: In order to prevent the continued “siloization” of work, it is important for campuses to think about dissemination opportunities both on campus and off campus, whether regionally, statewide, or nationally. Keeping the momentum going will also require deliberate planning for next steps.


The model is shown in the context of a river because a river’s flow represents the dynamic and powerful nature of change. The flow (the change process) encounters obstacles (challenges presented by certain aspects of the change process), represented by rocks in the figure, that may create an eddy in which the flow circles around the obstacle until it can break free. The eddy is an apt analogy for the circular swirl, or iterative process, that campus teams experience when they encounter resistance and challenges along their path toward reform: they must work through the issue, determine the nature of the challenge, and figure out how to get the flow going again in the desired direction. Travelers on the river may put in at various points or take out at certain locations to rest and regroup, and new travelers may join a party already on its journey downstream. Likewise, teams working on systemic change initiatives may start at different points, depending on their understanding of the problem, existing expertise, and campus leadership capacity; they may alter membership, or even stop for periods of time because other campus priorities emerge, team members take on other duties, or campus leadership changes. Teams can also paddle upstream or downstream, although ultimately the general flow is downstream toward action and success. The case studies in this issue highlight campus experiences using the model as they worked toward their own STEM education reform goals.

Figure 1. The Keck/PKAL Systemic Institutional Change Model


Information Gathering and Data Analysis

Our approach to change, and to this project, is based on practices of organizational learning. Within this approach, information gathering and data analysis play a central role in helping individuals identify directions and appropriate interventions for making strategic forward progress. Participants in any organizational learning and planning process foreground data, reflection, dialogue, and nonhierarchical teams that learn together and develop innovative approaches. This means having campus teams look at data related to student success in order to determine specific challenges and problems and to orient themselves toward a vision for change. But an organizational learning model also focuses on learning throughout the change process. The model is focused on facilitating organizational learning, but it also incorporates key ideas from other research on change, such as the need to address politics, develop buy-in and a shared vision, understand the power of organizational culture, and help campus leaders unearth underlying assumptions and values that might create resistance to change.

During our work with campuses, we discovered common challenges and barriers. The most common obstacle was that campus leaders wanted to start by immediately implementing a strategy they had read about in a report or publication. While news of a successful program may motivate change, it is important to check a strategy against the campus vision and landscape analysis before jumping into implementation; the latest published student success strategy may or may not fit your campus situation, student population, faculty expertise, or resources. Campuses that jumped right into a strategy found that, while they made some progress, they struggled with defining purpose, specifying outcomes, implementation, and measuring impact. They ended up going back to their vision, refining it, and doing more landscape analyses, which slowed progress but ultimately improved success in the long run. Another common barrier we identified was that campus team members held implicit theories of how change happens that were contradictory and often contrary to the project’s vision and goals (Kezar, Gehrke, and Elrod, forthcoming).

For example, a common assumption among STEM faculty is that meaningful change can only happen in departments. Faculty who hold this belief will resist examining potential levers outside the department that are important to address, such as mathematics preparation, success in a prerequisite course in another department, study skills, advising, or institutional support, which is critical for sustaining long-term change. These implicit theories can only be revealed through conversations about beliefs, values, and practices. Therefore, we encourage teams to make their first meeting a discussion about how change occurs in order to make their implicit theories explicit. What makes this process hard is that implicit theories are often unconsciously held; many people may not be able to articulate a theory of change or understand why the model is hard for them to work with. It can help just to pose a candid question to your team members: “What do you think it will take to start an undergraduate research program here?”

Other common barriers encountered were

  • faculty beliefs about their roles as “gatekeepers” or as the “sage on the stage” as opposed to “gateways” or as “guides on the side”;
  • the lack of faculty expertise in evidence-based STEM education teaching and assessment methods;
  • a misguided belief that all faculty and staff share the same vision;
  • failure to examine all the implicit assumptions about the problem, possible solutions, and approaches;
  • team members’ implicit theories of change that may prevent them from engaging in aspects of the work;
  • a lack of capacity for data collection and analysis in terms of support from centralized offices of institutional research;
  • inadequate incentives and rewards for faculty participation in STEM reform projects;
  • inadequate planning to secure appropriate buy-in, approval, or support from relevant units, committees, or administrators;
  • inadequate resource identification or realization;
  • unforeseen political challenges, such as tension regarding department “turf” or resource and faculty workload allocation;
  • shifts in upper-level leadership leading to stalled support or redirection of efforts to new campus initiatives (e.g., quarter to semester conversion);
  • changes in team membership because of sabbatical leaves or other assignments;
  • failure to connect STEM reform vision at the departmental level to institutional priorities to secure support and resources; and
  • lack of consideration about how students will be affected by and/or made aware of the changes, including the rationale for them. In order for students to fully participate, they need to understand how they will benefit from the changes or new opportunities.

More on the Systemic Institutional Change Model

The full details of the model will be published in a forthcoming PKAL report, Increasing Student Success in STEM: A Guide to Systemic Institutional Change. The guidebook includes detailed information about each element of the model, accompanied by an explanation, key questions to consider, highlights from campus case studies, challenge alerts (mistakes to avoid or pitfalls to be aware of), and timeline considerations. The guidebook also contains specific tools to help campus leaders and teams plan and manage change initiatives, such as

  • tools to help campus leaders and teams determine how to get started in the process;
  • a readiness survey to help teams determine whether they are prepared to move forward with implementation of their chosen strategies and interventions;
  • a rubric to help campus teams gauge their progress in the model phases;
  • examples of data analyses to conduct as well as examples of implementation strategies to address common challenges facing STEM programs; and
  • suggestions for how to build effective teams, develop leadership capacity, and sustain change.

These tools are also included in a practical workbook intended for use by teams to actively work through the elements of the model. This workbook and the full-length case studies are available on the project website: http://www.aacu.org/pkal/educationframework.

 

References

Borrego, Maura, Jeffrey E. Froyd, and T. Simin Hall. 2010. “Diffusion of Engineering Education Innovations: A Survey of Awareness and Adoption Rates in US Engineering Departments.” Journal of Engineering Education 99 (3):185–207.

Fairweather, James. 2008. “Linking Evidence and Promising Practices in Science, Technology, Engineering, and Mathematics (STEM) Undergraduate Education.” http://sites.nationalacademies.org/cs/groups/dbassesite/documents/webpage/dbasse_072637.pdf. Accessed June 12, 2011.

Handelsman, Jo, Diane Ebert-May, Robert Beichner, Peter Bruns, Amy Chang, Robert DeHaan, Jim Gentile, Sarah Lauffer, James Stewart, Shirley M. Tilghman, and William B. Wood. 2004. “Scientific Teaching.” Science 304 (5670): 521–522.

Kezar, Adrianna, Sean Gehrke, and Susan L. Elrod. Forthcoming. “Implicit Theories of Change as a Barrier to Change on College Campuses: An Examination of STEM Reform.” Review of Higher Education.

Lee, Diane M., and Keith Harmon. 2013. “The Meyerhoff Scholars Program: Changing Minds, Transforming a Campus.” Metropolitan Universities 24 (2): 55–70.

President’s Council of Advisors on Science and Technology. 2012. Engage to Excel: Producing One Million Additional College Graduates with Degrees in Science, Technology, Engineering, and Mathematics. https://www.whitehouse.gov/sites/default/files/microsites/ostp/pcast-engage-to-excel-final_2-25-12.pdf.

Seymour, Elaine. 2002. “Tracking the Processes of Change in US Undergraduate Education in Science, Mathematics, Engineering, and Technology.” Science Education 86 (1): 79–105.


Susan Elrod is interim provost and vice president for academic affairs at California State University–Chico. Adrianna Kezar is a professor in the Rossier School of Education and co-director of the Pullias Center for Higher Education at the University of Southern California.
