
Institutional Uses of Rubrics and E-portfolios: Spelman College and Rose-Hulman Institute

Institutions are now turning to e-portfolios to demonstrate and communicate student achievement at the college or university level. Below, two very different institutions describe how they have engaged faculty in developing rubrics and e-portfolios that articulate expectations for student learning and that focus on the work students produce in response to assignments and activities drawn from the curriculum and cocurriculum at each institution. Rose-Hulman Institute of Technology developed its own e-portfolio system and outcome rubrics for program assessment to meet accreditation requirements; Spelman College uses a commercial e-portfolio platform and began by assessing a single outcome for a specific set of students. Each effort was driven by a desire to know more about student learning and to improve the curriculum. Together, the two institutions demonstrate different ways to begin using rubrics and e-portfolios intentionally, involving the entire institution within each campus's mission and culture.

Rose-Hulman Institute of Technology

Rose-Hulman Institute of Technology is a private, undergraduate college of 1,900 students located in Terre Haute, Indiana (www.rose-hulman.edu). It has a national reputation for educating undergraduates to pursue careers in the fields of mathematics, engineering, and science, and has a strong track record of creatively developing and rigorously assessing pedagogies for teaching in these fields.

Defining Student Learning Outcomes
The school's commitment to undergraduate education is reflected in its institute-wide assessment process, which includes a defined set of institutional learning outcomes and the Rose-Hulman electronic portfolio project, the RosE Portfolio System (REPS), winner of the 2007 Council for Higher Education Accreditation Award for Progress in Student Learning Outcomes. In 1997, school administrators began developing a set of institute-wide student learning outcomes: the skills all Rose-Hulman students should develop by the time of graduation. These outcomes were designed based on input from a wide variety of constituents: faculty, alumni, industry (those who hire Rose-Hulman graduates), graduate schools, and other sources. By the end of the 1997–98 academic year, a set of institute student learning outcomes was in place, each defined with specific performance criteria. These ten learning outcomes were adopted by the faculty of the institute and subsequently published in official Rose-Hulman documents, such as the course catalog and Web pages. In 2006, following the institution's program and institutional accreditation visits, the school reviewed the institute outcomes and revised them into the current set of six (available at www.rose-hulman.edu/REPS).

In addition to defining student learning outcomes, Rose-Hulman faculty needed an effective and efficient data-collection method. Their work on defining outcomes therefore occurred simultaneously with the design of an electronic portfolio system for collecting evidence of student learning. In 1997, no commercially available electronic portfolio system reflected Rose-Hulman's assessment model, so the institution developed its own. Because Rose-Hulman had instituted a laptop computer requirement for all students in 1995 (one of the first colleges to do so), an electronic portfolio was a natural choice: students were already required to use an institute-specified laptop with a preinstalled software suite. This made the portfolio assessment process both effective and efficient, since every dimension of the process, from student submission to portfolio evaluation, occurred within an electronic system.

In summer 1998, REPS was piloted to evaluate a set of student submissions. Every year since then, REPS has been used to collect, evaluate, and report achievement in student learning outcomes to students, faculty, employers, graduate schools, and various accrediting agencies. In 2005, we began developing the functionality of the RosE Portfolio System inside our institution's course management system, and we have since adopted the RosEvaluation Tool as our means of evaluating submitted student work.

Portfolio Assessment
The core of the RosE Portfolio System is the assessment process, which begins with faculty identifying the outcomes addressed in their courses. All faculty members submit quarterly curriculum maps showing which of the institutional learning outcomes their courses address. A review of these curriculum maps shows which courses give students opportunities to develop the skills named in the learning outcomes. After the curriculum maps are analyzed, faculty members determine which assignments in their courses will provide the best evidence of student achievement in each outcome. Faculty members teaching courses in technical communication, for instance, identify specific assignments that can show evidence of improvement in their students' communication skills. Once the assignments have been identified, faculty members direct students to submit them to drop boxes in the course management system; these drop boxes are mapped to the institute student learning outcomes through the RosEvaluation Tool.
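To make the mapping concrete, the sketch below shows one way such a curriculum map might be represented and then inverted to see which courses supply evidence for each outcome. This is a minimal, hypothetical illustration in Python; the course titles, outcome names, and data structures are invented for this example and are not part of REPS or the RosEvaluation Tool.

```python
# Hypothetical sketch of a curriculum map: courses mapped to the learning
# outcomes they address. All names below are invented for illustration.
curriculum_map = {
    "Technical Communication": ["Communication"],
    "Senior Capstone Design": ["Communication", "Design"],
    "Chemistry Laboratory": ["Experimentation"],
}

# Invert the map to see which courses can supply evidence for each outcome,
# and therefore where assessment drop boxes might be placed.
coverage: dict[str, list[str]] = {}
for course, outcomes in curriculum_map.items():
    for outcome in outcomes:
        coverage.setdefault(outcome, []).append(course)

for outcome, courses in sorted(coverage.items()):
    print(f"{outcome}: {', '.join(courses)}")
```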

We collect evidence of student learning for all six institute learning outcomes every year. At the end of the academic year, a team of faculty portfolio raters is trained; the team then rates all submissions to the RosE Portfolio System over a two-day rating session, using predefined evaluation rubrics. Once the ratings are completed, the results are compiled and analyzed by the Office of Institutional Research, Planning and Assessment. Each department then receives a report containing detailed portfolio results for all of its student majors (from freshmen through seniors). Departments use these data to improve their curricula and address any deficiencies in student achievement.

Rating of submissions to the RosE portfolio has followed the same basic methodology since the system was initiated in 1998. Rose-Hulman faculty members (usually up to fourteen each year) are hired as portfolio raters. We attempt to involve faculty from many different departments to ensure objectivity in rating as well as broad-based familiarity with, and participation in, the process. Raters work together for two days in a computer laboratory. The rating session coordinator facilitates the process and assigns pairs of raters to rate student submissions for a particular outcome; for example, a mechanical engineering faculty member and a chemistry faculty member may work as a rating pair assessing the files submitted for the Communication Outcome.

The rating process consists of four steps.

  1. Faculty portfolio raters review the rating rubric associated with the learning outcome. The rubrics were developed by faculty members who serve on the Commission for the Assessment of Student Outcomes (CASO), the institute-wide committee charged with maintaining the outcomes assessment process. Each year, raters review the rubric along with the comments made by the raters who evaluated the same outcome in previous years. As part of their training, the rating team discusses the rubric while comparing it to student documents rated during previous sessions. The purpose of this work is calibration, both between the two current raters and between the current raters and previous rater teams; such calibration helps ensure consistency in rating from year to year.
  2. REPS requires each rater team to rate a set of three shared documents against the preestablished rating rubric. Raters answer a single rating question: "Does this document meet the standard expected of a student who will graduate from Rose-Hulman?" Student achievement is recorded as either "Yes/Pass" or "No/Fail," and raters may also mark a document "Yes/Pass/Exemplary" to designate submissions that represent superior achievement for a particular outcome. To ensure consistency between raters, REPS uses an interrater reliability (IRR) process: when the raters read and evaluate the three shared documents, their ratings must agree. If the ratings are not identical, REPS prohibits the raters from continuing. The raters then discuss their ratings, checking their evaluations against the rubric for the outcome, and come to agreement on how they will evaluate the shared document set. IRR is a key component of REPS; it ensures that raters look for the same qualities and features in a document, calibrating their ratings against each other.
  3. If the raters agree on the shared set, the system allows them to proceed, each rater reading and rating a different set of ten documents. REPS records their rating for each document and introduces a shared file after every ten documents to check that the raters have maintained their interrater reliability. Failure to rate the shared document identically causes the system to stop the raters so that they can recalibrate before moving on to another document set; thus, IRR continues to validate rating throughout the process. (A minimal sketch of this gating logic appears after this list.)
  4. Raters can provide comments about the rating session or about individual student submissions in comment boxes. In addition to rating, raters record insights gained during the session and collect sample documents to provide next year's raters with material for calibration. They may also suggest changes to rating rubrics or to learning outcomes, although revisions must be reviewed and approved by CASO before they are implemented in REPS.
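The IRR gate described in steps 2 and 3 amounts to a simple control flow: calibrate on shared documents, rate in blocks of ten, and halt whenever a shared check document is rated differently. The Python sketch below is a minimal, hypothetical model of that flow; the function names, data structures, and halting behavior are our own illustration, not the actual REPS implementation.

```python
# Minimal, hypothetical sketch of the REPS interrater-reliability (IRR) gate.
# All names and structures are illustrative; this is not the actual REPS code.
from itertools import islice

CHECK_INTERVAL = 10  # a shared check document follows every ten individual ratings
RATINGS = ("No/Fail", "Yes/Pass", "Yes/Pass/Exemplary")

def irr_gate(rating_a, rating_b):
    """The gate opens only when the two raters' ratings are identical."""
    return rating_a == rating_b

def rating_session(rate_a, rate_b, shared_docs, docs_a, docs_b):
    """Simulate one rater pair's session.

    rate_a, rate_b -- callables mapping a document to one of RATINGS
    shared_docs    -- shared check documents, consumed as needed
    docs_a, docs_b -- each rater's individual queue of documents
    """
    shared = iter(shared_docs)

    # Step 2: calibration on three shared documents; disagreement halts
    # the session until the pair recalibrates against the rubric.
    for doc in islice(shared, 3):
        if not irr_gate(rate_a(doc), rate_b(doc)):
            raise RuntimeError(f"Calibration failed on {doc!r}: discuss rubric and re-rate")

    # Step 3: rate in blocks of ten, with a shared check document between blocks.
    results = {}
    while docs_a or docs_b:
        for doc in docs_a[:CHECK_INTERVAL]:
            results[doc] = rate_a(doc)
        for doc in docs_b[:CHECK_INTERVAL]:
            results[doc] = rate_b(doc)
        docs_a, docs_b = docs_a[CHECK_INTERVAL:], docs_b[CHECK_INTERVAL:]

        check = next(shared, None)
        if check is not None and not irr_gate(rate_a(check), rate_b(check)):
            raise RuntimeError(f"IRR check failed on {check!r}: recalibrate before continuing")
    return results
```

In this sketch a disagreement simply raises an error; in the actual process the raters stop, discuss the rubric, and re-rate before the tool allows them to continue.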

Spelman College

Spelman College is a 127-year-old, historically black, liberal arts college for women in Atlanta, Georgia. About three years ago, the provost began a curricular transformation project for general education that included greater emphasis on interdisciplinarity and connected learning across courses. Faculty workgroups began to rethink general education goals, and adopted new ones based on a theme of Free-Thinking Women of African Descent. Inspired by our signature first-year course, African Diaspora and the World, the new general education curriculum expands Diaspora studies into interdisciplinary courses beyond the first year.

Instituting E-Portfolios
About the same time that the curricular transformation began, the college was contemplating a campuswide electronic portfolio system for student, faculty, and administrative use. After some piloting of electronic portfolios in courses during fall 2007, the college initiated required electronic portfolios for all first-year students. Initially, electronic portfolios were housed in a cost-effective computer-based platform, while faculty and administrators evaluated other commercially available products that might serve as a campus standard. An Internet-based software product was eventually selected and implemented for use throughout the college in fall 2008. Beginning last semester, all first-year students (approximately 565) developed electronic portfolios as part of a revised first-year experience course. The Spelman Electronic Portfolio project has come to be known, in brief, as SpEl.Folio.

The SpEl.Folio emerged from a longstanding practice of evaluating entering students’ writing ability through a portfolio submitted near the end of the first year of college. For over fifteen years, the Comprehensive Writing Center has coordinated assignments, collection, and scoring of first-year writing portfolios, using trained evaluators (faculty and graduate students) to evaluate the compositions. Students who do not receive a passing score are tutored and allowed to resubmit the portfolio the following semester. Any student who does not receive a passing score on her revised portfolio is automatically enrolled in a two-credit course on grammar and style, to help develop her writing skills.

Other college departments (psychology, education, and art, for example) also required students to submit portfolios. However, it was largely the first-year writing portfolio that inspired a comprehensive portfolio to document, in addition to writing, many other aspects of the student learning experience at Spelman. With the introduction of SpEl.Folio, all first-year students now submit, in digital format, reflections on the required community service experience, reports on information literacy exercises, and reflections on the first year of college, in addition to their writing portfolios. As the SpEl.Folio mission statement indicates, one aim of creating an electronic portfolio is to enable students to "think critically about the connections among their intellectual, professional, and personal lives." We plan for SpEl.Folio to expand further and to provide multimedia documentation of each student's achievement throughout her time at Spelman. As a bonus, the SpEl.Folio is portable and may be downloaded and shared with prospective employers and graduate schools.

The faculty has identified four institutional goals that embody the college mission in academic affairs, student affairs, and cocurricular activities. Those goals are associated with seven student learning outcomes for general education; together, they form the standards by which student performance is evaluated in SpEl.Folio. With this structure, the college can evaluate student learning from several perspectives: on the institutional goals and general education learning outcomes by course or by student classification, on specific elements of the general education learning outcomes (such as writing and critical thinking) that are common across courses, and, quite importantly, on student development over time.

Because SpEl.Folio development and implementation have occurred contemporaneously with general education curricular reforms throughout the college, the project has stimulated discussions among the faculty on how we might best use such a tool to engage students meaningfully while evaluating their performance and growth. Like many other liberal arts colleges, we regard the electronic portfolio as the preferred method of demonstrating to internal and external constituencies the impact of the college experience on student development. While the college plans continued use of some standardized tests for specific purposes, our goal is to systematically increase the number of courses using SpEl.Folio, enabling the college to evaluate students' learning using genuine artifacts from general education courses, work in the major, and cocurricular experiences.

More Authentic Assessment
One particular benefit of electronic portfolios is that many faculty regard them as a more authentic approach to assessing student learning and development than standardized testing alone. Criticisms of high-stakes, multiple-choice testing are voiced most loudly, but some faculty members are skeptical even of the more recent problem-solving tests, like the Collegiate Learning Assessment (CLA). Although acknowledged as a considerable improvement over multiple-choice exams, the CLA is described by our more vocal faculty as an engaging high-stakes test that nonetheless shares the faults of many others: time-limited performance pressure on subjects, divorced from the learning experience.

Spelman College joined AAC&U's VALUE project out of concern about the validity of student assessment and a desire to participate in developing rubrics for use in undergraduate education. Now that SpEl.Folio is underway, the college recognizes a need to expand faculty development in electronic portfolio use beyond the group of Vanguard Faculty who implemented SpEl.Folio in the first-year experience. The two groups of faculty with the most immediate training needs are the departmental faculty, who will guide the sophomore-year experience linked to the student's major, and the general education faculty, who are revising the student learning objectives that will be evaluated in SpEl.Folio. Both groups will determine how to structure courses and SpEl.Folio assignments to achieve the learning objectives of the revised curriculum and how to document them appropriately in the electronic portfolio.

The SpEl.Folio project was created with several goals in mind, including an intention to use digital technology to enhance and document the student learning experience. With the full-scale launch of required electronic portfolio use for all first-year students—now in a uniform platform for all users—Spelman College has achieved its goal of implementing an electronic means of capturing significant elements of the learning experience. Having done so, we are now well on our way to enriching the student learning experience via multiple curricular and cocurricular interconnections to the SpEl.Folio. Students and faculty alike are able to access the Web-based portfolio to compose and review authentic learning artifacts derived from classes, service learning activities, and personal reflections.

For students, who are often well-versed in digital networking via the Internet, the electronic approach to submitting and revising assignments feels familiar and provides creative flexibility that they can appreciate. For faculty participating in development and implementation of the electronic portfolio, it is a means of collecting authentic artifacts of student learning and evaluating those artifacts using standardized methods (scoring rubrics). Faculty also can supplement face-to-face communications with students through the electronic portfolio.

Through use of the electronic portfolio, the college is attempting to increase student engagement in the learning process—a critical factor in promoting achievement and persistence to graduation.


Myra N. Burnett is a vice provost and associate professor of psychology at Spelman College; Julia M. Williams is a professor of English and the executive director of institutional research, planning, and assessment at Rose-Hulman Institute of Technology.
