iAMSTEM: Increasing STEM Student Success at UC–Davis

The University of California–Davis (UCD) is a large research institution with over 26,000 undergraduate students. Nearly 60 percent of students are pursuing a major in STEM (science, technology, engineering, or mathematics). UCD has a long history of initiatives for improving retention of specific student populations and within specific STEM fields.

Examine Landscape and Conduct Capacity Analysis

The iAMSTEM (integrated agriculture, medicine, science, technology, engineering, and mathematics) Hub, established in 2012 by the provost and housed in the Office of Undergraduate Education, is the first unit dedicated to comprehensive undergraduate STEM student success across the campus.

The founding of iAMSTEM coincided with the launch of the UCD 2020 Plan, an effort to increase enrollment by 5,000 undergraduate students by 2020. One of iAMSTEM's first directives was to investigate STEM student factors that might impact or result from planned university growth (e.g., impaction, time to degree, and retention).

Identify and Analyze Challenges and Opportunities

Several concerns emerged from our analyses. For example, average time to degree was consistently greater for STEM students than for non-STEM students. Average retention in STEM majors was only 55 percent. Among underrepresented minority (URM) and first-generation (FG) STEM student populations, performance gaps emerged from the very first quarter and persisted until graduation or transfer out of STEM. Retention of URM and FG students in STEM majors averaged just 35 percent.

The pattern and magnitude of losses were distressing and suggested that at least some of our students were leaving STEM majors for reasons other than personal preference. Growth of the student population would likely further strain instructional capacity and resources, which in turn could threaten the goals of the 2020 Plan. iAMSTEM set out to better understand STEM student success and whether or how we could impact it.

Local conventional wisdom attributed STEM student losses primarily to students' lack of ability, interest, and/or willingness to work hard (i.e., innate traits over which we have little influence). Our examination of the data suggested a more complex story. Students from every academic level and from every demographic were leaving STEM in high numbers. Based on these data and the findings of decades of retention research, we hypothesized that quality of instruction was a critical yet underexamined part of our student success equation (Seymour 1995). If true, then by improving instruction we could improve student outcomes. Our mission was clear.

Determine Readiness for Action

Most of our students who left STEM did so by their fifth quarter (the middle of their second year) while still enrolled in large introductory courses (100–500 students per section). Large lecture classes dominate the early part of our STEM students' careers and pose unique challenges for student learning. Given the importance of introductory courses for later success, we focused our attention on understanding the outcomes and dynamics of our largest introductory STEM classrooms. UCD Institutional Research surveys and departmental program review studies offered insights into campus morale and the self-assessed teaching approaches of the average instructor. But little data was available about what was or was not actually happening day-to-day in classrooms. It quickly became apparent that if we wanted to better understand the role of instruction in student success, we first needed to improve our ability to see and measure it.

In response, we created UCON (Undergraduate Classroom Observation Network), a team of highly trained undergraduate observers who used an iAMSTEM-designed app based on the COPUS observation protocol (Smith et al. 2013) to quantify instructor and student behaviors in the classroom. With the permission of instructors, the UCON team enabled iAMSTEM to characterize instruction across all introductory STEM courses. To paraphrase Sir William Thomson, you cannot improve what you cannot measure. We could now measure instruction and relate it to student outcomes.
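
To make the measurement concrete: COPUS observers record which behavior codes occur during each two-minute interval of a class session, and instruction can then be summarized as the fraction of intervals in which each behavior appears. The sketch below illustrates that summary on made-up data; it is not the actual UCON app, and the sample codes simply follow the conventions of Smith et al. (2013).

```python
from collections import Counter

# Hypothetical COPUS-style observation: each two-minute interval records the
# set of behavior codes observed (e.g., "Lec" = instructor lecturing,
# "L" = students listening, "CG" = students working in groups on a
# clicker question, "MG" = instructor moving through class and guiding).
intervals = [
    {"Lec", "L"},
    {"Lec", "L"},
    {"PQ", "AnQ"},   # instructor poses a question; a student answers
    {"CG", "MG"},
    {"CG", "MG"},
    {"Lec", "L"},
]

def code_fractions(intervals):
    """Return the fraction of observed intervals in which each code appears."""
    counts = Counter(code for interval in intervals for code in interval)
    n = len(intervals)
    return {code: count / n for code, count in counts.items()}

for code, frac in sorted(code_fractions(intervals).items()):
    print(f"{code}: {frac:.0%} of intervals")
```

Profiles like these, collected across many sessions, are what allowed instruction to be compared between courses and related to student outcomes.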

Choose Strategies and Begin Implementation

While iAMSTEM was ramping up its ability to gather, analyze, and make sense of instructional data, we were also embedding ourselves in small-scale collaborative efforts on the ground. We looked for places where we could be useful. When we first came on the scene, only a few communities of STEM faculty were interested in or engaged in transforming their own instruction. However, we were repeatedly approached by graduate teaching assistants (TAs) eager to advance their teaching skills, as few comprehensive TA training programs or resources were available. One of iAMSTEM's first efforts on the ground was helping the instructors and coordinator running BIS2A—the first gateway course in the introductory biology series—systematize and improve the quality of their TA training program. Early successes with that team established trust and opened the door to discussing more ambitious and impactful instructional approaches for helping students learn. Full-scale implementation of the team's ideas would require resources, so we looked to outside organizations for support.

In late 2012, iAMSTEM became part of the Bay View Alliance, an international coalition of research universities focused on understanding cultural change related to instruction. In 2013, we submitted a successful proposal to the Association of American Universities (AAU) to support development of active learning practices across introductory STEM courses. That same year, we received additional support from the Bill & Melinda Gates Foundation for transformation of introductory biology and began our participation in the Keck/PKAL project. These awards and collaborations supplied much-needed resources and expertise, allowing us to secure the commitment of our new colleagues and collaborators to take things to the next level.

By the 2013–14 academic year, we were finally ready for a full-scale test of our hypothesis: intentional design of instruction, based on our best understanding of learning and teaching, can increase the success of all students. We piloted TA-led, highly structured discussion sections in BIS2A (supported by grants from AAU and the Bill & Melinda Gates Foundation). Students attended three one-hour large lectures and one two-hour discussion section per week. The same faculty instructor led all lecture sections. Control discussion sections were taught using the conventional curricula and activities. Treatment discussion sections integrated active learning instruction with adaptive online materials developed at Carnegie Mellon (Thille 2013). Active learning TAs were trained to use formative student data from the online activities to inform and adapt instruction and curricula each week to meet their students' specific needs. Weekly TA training meetings allowed active learning TAs to review student data together with peers and to brainstorm ways to address student weaknesses and misconceptions.

The team put enormous thought and effort into design and implementation, fueled by the passion and progress of the TAs. We knew, however, that expecting TAs to significantly improve student performance in a course, whatever their training, was asking a lot. Not surprisingly, detractors were numerous. It was a long shot, but a worthy one that, if successful, would demonstrate the merit of our hypothesis in a way that no one could ignore.

Measure Results

Classroom observation data collected from lecture and discussion treatments revealed that discussion sections incorporated significantly more active-learning instruction than lectures did, and that active learning TAs required significantly greater engagement and accountability from their students than either the lecturer or the control TAs. After controlling for covariates, Spring 2014 students in the active learning discussions were 66 percent more likely to pass the course (based on total exam scores) than students in control sections.
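
The article does not detail the statistical model behind the "66 percent more likely" figure. One common approach for a covariate-adjusted comparison like this is logistic regression with a treatment indicator, where the exponentiated treatment coefficient gives an adjusted odds ratio. The sketch below illustrates that approach on simulated data; the column names, covariates, and effect sizes are hypothetical, not the BIS2A data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data: one row per student, with a pass/fail outcome, a
# treatment indicator for active-learning sections, and example covariates.
rng = np.random.default_rng(0)
n = 400
df = pd.DataFrame({
    "treated": rng.integers(0, 2, n),
    "gpa_prior": rng.normal(3.0, 0.4, n),
    "first_gen": rng.integers(0, 2, n),
})
# Simulate passing with a built-in treatment effect on the log-odds scale.
logit_p = -0.5 + 0.5 * df["treated"] + 1.0 * (df["gpa_prior"] - 3.0)
df["passed"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

# Logistic regression of passing on treatment, controlling for covariates.
model = smf.logit("passed ~ treated + gpa_prior + first_gen", data=df).fit(disp=False)

# exp(coef) on `treated` is the adjusted odds ratio for passing; a value
# near 1.66 would correspond to "66 percent more likely to pass" in odds terms.
print(f"adjusted odds ratio: {np.exp(model.params['treated']):.2f}")
```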

The positive outcomes realized by the TAs encouraged three of the four BIS2A faculty to begin integrating active learning instruction and practices into their own lecture sections during the 2014–15 academic year. The impact of those changes was measured by comparing each faculty member's student performance on standardized pre- and post-exams and on exams from previous quarters, using statistical techniques such as propensity score matching and performance prediction modeling. Additional faculty from the BIS series learned of the BIS2A results and are now working with iAMSTEM to integrate active learning practices into their courses as well.
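
Propensity score matching is named but not detailed in the text. A minimal sketch, assuming nearest-neighbor matching on a logistic propensity model, might look like the following; the data frame, covariates, and score columns are simulated for illustration only.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical student records: treatment indicator, pre-treatment
# covariates, and an exam-score outcome with a built-in treatment effect.
rng = np.random.default_rng(1)
n = 300
df = pd.DataFrame({
    "treated": rng.integers(0, 2, n),
    "gpa_prior": rng.normal(3.0, 0.4, n),
    "units": rng.integers(12, 20, n).astype(float),
})
df["exam"] = 70 + 5 * df["treated"] + 10 * (df["gpa_prior"] - 3.0) + rng.normal(0, 5, n)

# 1. Estimate each student's propensity to be in a treated section.
X = df[["gpa_prior", "units"]]
df["pscore"] = LogisticRegression().fit(X, df["treated"]).predict_proba(X)[:, 1]

# 2. Match each treated student to the control student with the closest score.
treated = df[df["treated"] == 1]
control = df[df["treated"] == 0]
matched = control.loc[[(control["pscore"] - p).abs().idxmin() for p in treated["pscore"]]]

# 3. The difference in matched means estimates the effect on the treated.
print(f"estimated effect on exam score: {treated['exam'].mean() - matched['exam'].mean():.1f}")
```

Matching in this way compares each treated student only against observably similar controls, which is why it suits quarter-to-quarter comparisons where section populations are not randomized.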

Using similar strategies, iAMSTEM's chemistry team has developed and tested a variety of teaching methods and strategies, including online alternatives to textbooks, online homework software with instructor dashboards, active learning and group tasks in lecture and discussion sections, and implementation of pre–post learning assessments across most sections of the introductory chemistry course series. Outcomes from these efforts have secured the backing of the department chair and informed new discussions among chemistry faculty and lecturers about greater integration across the introductory series. iAMSTEM also helped to launch two course transformation committees, Chemistry Innovation and Chemistry for Life Science, that are actively engaged in instructional improvement and alignment of course and series student learning goals. More instructional experiments are planned for the 2015–16 academic year, along with development of formalized active learning TA training and professional development of faculty and lecturers.
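
The article does not specify how the pre–post assessments were analyzed. A common summary statistic in STEM education research, shown here purely as an illustration, is Hake's normalized gain: the fraction of possible improvement a section actually achieved.

```python
def normalized_gain(pre_pct, post_pct):
    """Hake's normalized gain g = (post - pre) / (100 - pre).

    pre_pct and post_pct are percent-correct scores (0-100); g is the
    share of the available headroom above the pre-test that was gained.
    """
    if pre_pct >= 100:
        return 0.0  # no room left to improve
    return (post_pct - pre_pct) / (100 - pre_pct)

# Example: a section averaging 45% on the pre-test and 72% on the post-test.
print(f"g = {normalized_gain(45, 72):.2f}")  # g = 0.49
```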

Disseminate Results and Plan Next Steps

We are still in the beginning stages of change. It takes time, footwork, and plenty of coffee and coaching sessions to figure out what people need and to help them fully realize solutions. As with great teaching, we've learned that great innovating requires more listening than telling. From students to instructors to the provost and chancellor, each person we work with has unique concerns, capabilities, and questions. Each sees value in what we do in direct proportion to our alignment with their goals. Understanding collaborators' needs (and ensuring that they know we understand their needs) requires real listening and empathy. We have found no shortcuts.

To scale the successes of our collaborators, we are working to bring together all campus communities focused on student learning to share ideas, results, and challenges and find ways to extend their efforts into the larger community.

National Dissemination

We are very fortunate to be part of multiple active communities working to promote evidence-based instructional strategies for STEM instruction. As previously mentioned, these include (1) the AAU STEM pilot initiative, (2) the Bay View Alliance, (3) the Bill & Melinda Gates Foundation Adaptive Learning Acceleration initiative, and (4) the Keck/PKAL project. Across these four projects, we are involved in ten or more multi-institutional gatherings each year. These community interactions provide tremendous opportunities for sharing findings and learning valuable lessons from others. Through the iAMSTEM Tools for Evidence-Based Actions project, we share iAMSTEM-developed apps (e.g., the student pathway ribbon tool and the GORP classroom observation platform) as well as tools and research from fellow community members. We're working on several articles for submission to journals and hope to share our experiences and findings at national education association and disciplinary group meetings and at universities across the country.

In late 2014, the chancellor and provost learned of our work through discussions with others outside our university, including the leadership of AAU and the Association of Public and Land-grant Universities, and through articles in the New York Times and blog posts in Inside Higher Ed. Sometimes the right voices from the outside can be compelling allies on the inside.

Reflections

Three years into the work, we still love the job and the amazing people we regularly interact with. As a group, we strongly believe in maximizing the effectiveness and quality of instruction to help students achieve their best. Some of our key lessons learned include:

  • Student success is the product of a system. To improve a system, it is important to understand all the parts and the relationships that connect them. Not everyone working in a system can see all the parts. For example, program- and department-level workers cannot always see or anticipate higher-level barriers that can disrupt or block innovation (financial issues, student flow, resource limitations, etc.). A big part of facilitating systemic change involves helping others understand how the system works and how they can be most impactful.
  • Student success is primarily a people issue. In our experience, (almost) no one is completely satisfied with their teaching or the quality of their students' learning.
  • Change is hard and often risky. Resistance to change is logical. Logic, data, and good ideas by themselves rarely convince people to take a risk and do something hard. People convince people. Trust convinces people.
  • Graduate teaching assistants matter. Despite their significant role in undergraduate education, TAs are vastly underutilized and underdeveloped as educators and mentors. Our work has demonstrated that TAs can have a significant impact on student learning and success if given proper training and support. Yet few are. TAs can also influence campus instructional culture through modeling, mentoring, and some friendly peer competition. It's time to value and cultivate their potential as educators and innovators.

 

References

Brewer, Carol, and Diane Smith, eds. 2011. Vision and Change in Undergraduate Biology: A Call to Action. Washington, DC: American Association for the Advancement of Science.

President’s Council of Advisors on Science and Technology. 2012. Engage to Excel: Producing One Million Additional College Graduates with Degrees in Science, Technology, Engineering, and Mathematics. Available at: https://www.whitehouse.gov/sites/default/files/microsites/ostp/pcast-engage-to-excel-final_2-25-12.pdf.

Seymour, Elaine. 1995. “Why Undergraduates Leave the Sciences.” American Journal of Physics 63: 199–202.

Singer, Susan R., Natalie R. Nielsen, and Heidi A. Schweingruber, eds. 2012. Discipline-Based Education Research: Understanding and Improving Learning in Undergraduate Science and Engineering. Washington, DC: The National Academies.

Smith, Michelle K., Francis H. M. Jones, Sarah L. Gilbert, and Carl E. Wieman. 2013. “The Classroom Observation Protocol for Undergraduate STEM (COPUS): A New Instrument to Characterize University STEM Classroom Practices.” CBE-Life Sciences Education 12 (4): 618–627.

Thille, Candace M. 2013. Surfing the Tsunami: Faculty Engagement with the Open Learning Initiative. PhD diss., University of Pennsylvania.

Tinto, Vincent. 2012. Completing College: Rethinking Institutional Action. Chicago: University of Chicago Press.

 


Chris Pagliarulo, associate director of assessment and instruction, iAMSTEM, Office of Undergraduate Education; and Marco Molinaro, assistant vice provost for undergraduate education, iAMSTEM director—both of University of California–Davis

 
