Peer Review

Evaluating Quality of Engagement in Hampshire College's First-Year Plan

In 2002, Hampshire College inaugurated a new first-year plan that incorporates small, adviser-taught tutorials, a required eight-course load, a five-course distribution requirement, a first-year portfolio, and seven first-year learning goals. The plan replaced an older curriculum that combined coursework with independent projects distributed across the curriculum. The old curriculum had no clearly articulated learning outcomes, no year-end profiling of student work, and little overlap between classroom experiences and advising.

Although the new plan represented a radical shift for Hampshire, we regard it more as a change in our methods than as a departure from our mission. An innovative college founded in the 1970s, Hampshire is a testing ground for progressive ideas in American liberal education. We are committed to interdisciplinary, inquiry-based education and to forward-looking approaches to pedagogy and curricula that are in tune with emerging areas of knowledge. Our academic structure maximizes student engagement: after completing the first year, each student's entire course of study is self-designed in consultation with a faculty committee. Students have enormous freedom to match coursework to their interests. We encourage them to delve into subjects they care about, and we assume they will be intensely self-motivated.

It became clear in the 1990s that we were not consistently achieving these goals for the first year of the Hampshire education. Persistence rates were unacceptably low, first-year students were insufficiently engaged with our academic and social expectations, and worries mounted about whether advising was well integrated into academic life. Members of a first-year task force boldly asked whether our first-year curriculum was working well. Their inquiries and subsequent proposals, developed over a two-year period, resulted in Hampshire's new first-year plan.

Systematic Assessment

A change of this magnitude demands a systematic assessment. We developed an “assessment grid,” in which first-year outcomes, program goals, implementations, measures, and targets were identified and linked. For example, one intended outcome was an improved graduation rate. An associated program goal was an increase in academic engagement. The eight-course requirement constituted one implementation intended to achieve this goal; the relevant measures included the average number of courses completed in the first year, and our target was an average of seven courses per student by the end of the inaugural year of the new plan.

These efforts led us to investigate how well we foster academic engagement among our students. We employed the College Student Expectations Questionnaire and the College Student Experiences Questionnaire combined with a homegrown first-year survey to measure student expectations and self-reported patterns of engagement. We supplemented these instruments with direct measures of engagement derived from transcript analysis, course evaluations, assessments of academic progress, and patterns of distribution. Our guiding insight, that deeper engagement in an integrated academic and social environment leads to higher achievement and ultimately to lower attrition, inclined us to base our evaluation paradigm on quantitative measures of this sort.

By these measures, our first-year plan was a substantial success. In its initial year, the percentage of students successfully completing eight courses more than doubled to 70 percent, and the percentage of students in academic difficulty at the end of the first year fell by over 20 percent. All but six continuing students began their concentrations “on time,” and the distribution of first-year students across the curriculum flattened out, with many more students successfully completing courses in the cognitive and natural sciences. Over the next few years, our six-year graduation rate improved by almost 20 percent.

Still, other considerations suggested a less rosy picture. Third-semester persistence was essentially unchanged, and although course completion was up, students’ self-reported progress on Hampshire’s learning goals was unchanged from previous years. More disturbingly, anecdotal evidence derived from hallway conversations and opinion pieces in the student newspaper indicated that while the quantity of student engagement was improving, the quality of that engagement was not.

Quality of engagement refers not to the extent to which students are engaged with their studies, but to how they experience that engagement. This type of evaluation requires supplementing standard quantitative methods of analysis with qualitative methods. Triangulating measures of course completion, good standing, and academic progress with data from student and faculty focus groups, ethnographic interviews, and analysis of other qualitative information (e.g., comments on course evaluations or open-ended questionnaires) fills in many important gaps in our analysis. In addition to looking at the numbers, we need to listen to the students. We illustrate this approach by briefly discussing the evaluation of two aspects of Hampshire's first-year program: distribution requirements and the first-year tutorial.

Distribution Requirements

Since 2002, first-year students have been required to complete one course in each of Hampshire’s five “schools” (cognitive sciences, humanities/arts/cultural studies, interdisciplinary arts, natural sciences, and social sciences). Prior to this requirement, course-taking patterns were heavily skewed across schools. Under the new program, these patterns have evened out, so that measures of student behavior indicate equal engagement across the curriculum. However, qualitative inquiries indicate that many students remain harshly critical of having to take courses in areas in which they lack strong interest.

Our work reveals a significant tension between Hampshire’s mantra, “learn what you love,” and the equally important goal of achieving a broad liberal arts education. Student interviews show that while some students find new interests by taking distribution courses, others remain disengaged, preferring to “learn what they already love.” Students disconcertingly describe such courses as “a waste of time,” or “totally irrelevant to me,” even though they are choosing from a great many alternative courses within each interdisciplinary school. Here is an illustrative interview excerpt in which a first-year student comments on his degree of effort in distribution courses outside his area of interest:

I don’t work as hard. I feel poorly about that. I don’t think it is fair to the professor, to the class, and to me because I do enjoy learning. I do enjoy the work if I can get into it. I think the entire student body feels that way. If it is a good class, if they are interested in it, and if it is what they want to study, then they will do it. If not, then—forget it. And it shows.

There may be an interesting developmental aspect of this phenomenon. Interviews with Hampshire alumni indicate that many recognize that what seemed at the time to be an unnecessary requirement turned out in hindsight to represent a powerful learning experience. Here is one such comment drawn from a recent interview:

I was afraid of math; I was never good at it in high school. I took a science course to fill Hampshire’s distribution requirement, and I did some math in that course. It was hard, but I came out of that experience a much stronger learner and more confident in my ability to do academic work. I’ve sometimes thought since, maybe I should have done more science. Now I teach math, and I enjoy it.

It is important for the college to address these aspects of student and alumni culture and to explicate our apparently contradictory messages (“be broadly educated” and “create your own education”). Hampshire is currently involved in a systematic study of the “open curriculum” funded by the Teagle Foundation to try to better understand the advantages and disadvantages of curricula without distribution requirements. We hope to report on this work at a future date.

First-Year Tutorials

Under the new first-year plan, every new student is assigned to a tutorial—a first-semester seminar taught by the student's adviser. Tutorials combine the general goals of a first-year seminar with the additional aim of integrating advising and teaching. In many ways, the adoption of the tutorial has been advantageous. Advisees meet advisers in a rich academic context, group advising promotes many efficiencies, and the tutorial cohort creates a peer network for first-year students. Interestingly, students rate tutorials ahead of other 100-level courses on course-evaluation measures of course excellence, the professor's excellence, and the extent of learning that occurs. Mean scores for first-year survey questions pertaining to advising have also shown small but consistent improvement.

Once again, qualitative data indicate a current of dissatisfaction that runs underneath these positive trends. The following excerpt from interviews with first-year students identifies one problem clearly:

Your adviser is really important. . . . While I absolutely love my adviser and think he is amazing and think that he is very supportive, I think that it would be much more beneficial to me if I had an adviser in the School of SS [Social Science] who knew a lot about law schools and knew a lot about exactly what courses I need to be taking.

Although the college assigns first-year students to general advisers who guide them through the general education requirements of the first-year program, our work reveals a trend at Hampshire in which students increasingly want to be assigned advisers in their presumed area of concentration at the earliest opportunity. Once this preference was revealed, the situation was easy to improve. By developing a tutorial registration algorithm that satisfies the stated preferences of as many students as possible, under-enrolling each tutorial by two students, and allowing tutorials to participate in the online add/drop system, we have greatly increased the odds that first-year students will enroll in the tutorial of their choice.
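For readers curious how such a registration scheme might work, the following is a minimal sketch, not the college's actual algorithm (which the article does not specify). It assumes each student submits a ranked list of tutorials; students are then placed, one at a time, into the highest-ranked tutorial with an open seat, with two seats per tutorial held back for later add/drop. All names and data are invented for illustration.

```python
def assign_tutorials(preferences, capacity, hold_back=2):
    """Greedily place each student in their highest-ranked open tutorial.

    preferences: dict mapping student -> ordered list of tutorial names
    capacity:    dict mapping tutorial -> nominal seat count
    hold_back:   seats left open in each tutorial for the add/drop period
    """
    # Under-enroll every tutorial by `hold_back` seats.
    open_seats = {t: max(c - hold_back, 0) for t, c in capacity.items()}
    assignment = {}
    for student, ranked in preferences.items():
        for tutorial in ranked:
            if open_seats.get(tutorial, 0) > 0:
                assignment[student] = tutorial
                open_seats[tutorial] -= 1
                break
        else:
            # No listed choice has space; the student picks during add/drop.
            assignment[student] = None
    return assignment

# Hypothetical example: three students, three tutorials of four seats each.
prefs = {
    "ana": ["law_and_society", "ecology"],
    "ben": ["law_and_society", "film"],
    "cam": ["law_and_society", "ecology"],
}
caps = {"law_and_society": 4, "ecology": 4, "film": 4}
result = assign_tutorials(prefs, caps)
```

With two seats held back, "law_and_society" has only two open seats, so the first two students who rank it receive it and the third falls through to a later choice. A greedy pass like this is order-dependent; an optimal version would treat the problem as an assignment problem and solve it with a matching algorithm.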

Lessons Learned

Our evaluation of Hampshire College’s first-year plan suggests three conclusions. First, quantitative methods that investigate student engagement should be supplemented by qualitative methods that reveal how students experience involvement in college life. Positive evidence encoded in many standard measures of success can mask important issues that contextualize these outcomes. Second, when we did this kind of mixed-method research, we sometimes found that the college is sending students inconsistent messages. Improving teaching and learning will require resolving these inconsistencies to develop a more coherent institutional position. Finally, we recognize that when students select an institution, they may not understand or agree with all of the principles that shape its curriculum and other aspects of student life. Communicating these principles clearly to students is an important institutional responsibility and is essential to providing a high-quality education.

Steven Weisler is the dean of academic development, and Carol Trosset is the director of institutional research—both of Hampshire College.
