Peer Review, Summer/Fall 2005, Vol. 7, No. 3/4

Integrative Learning and Assessment

Integrative learning is an ambitious student learning goal, long espoused in higher education and in the world at large. It is also a goal whose achievement has too often depended upon serendipity rather than planning, and one frequently left out of assessments. But if a college or university is committed to integrative learning as an expected outcome, it must create intentional approaches both to providing integrative experiences and to assessing the quality of students' integrative achievement.

For learning in virtually all disciplinary and skill areas, as high levels of achievement are reached, discrimination of levels of quality becomes increasingly difficult. What is good writing or a good musical performance according to one expert is, according to another, average or poor. Such differences in assessment may derive from tacit differences in standards or the elements considered during the assessment--differences that must be resolved for more consistent judgments to be made.

Evaluation experts pursue reliability in measurement through clear definitions, training of evaluators, and well-designed problems that elicit evidence of learning. Approaching the intentional achievement and assessment of integrative learning (or any other complex learning outcome) requires similar care. Those fostering the learning should agree upon clear definitions and desired outcomes and share their expectations with learners; create engaging, authentic assignments ripe with integrative possibilities to gather evidence of student accomplishment; and hone their skills of discrimination and explanation to provide meaningful formative and summative feedback to students. As with any complex learning, repeated experiences over time, with expert formative feedback, are likely needed to foster integrative learning. (Teachers will also benefit from repeated experiences in assessment, which over time will improve the validity and reliability of integrative learning assessments.)

The development and use of rubrics for scoring complex student work are gaining acceptance. Grant P. Wiggins suggests that rubrics used for any purpose acquire meaning for students when they see the rubric applied to actual examples of work (1993, 53). If work is assigned with integrative outcomes as an expectation, instructors must have thought through what those outcomes will "look like" in enough detail to separate high-quality work from lesser work and to explain their judgments in ways that help students improve. Leading students through a sample scoring of an actual piece of work will contribute to their understanding and success.

Clear Definitions, Shared Expectations

The term "integrative learning" represents many different behaviors that can range from the simple and commonplace to the complex and original. "Making connections" among learning experiences begins in early childhood and continues throughout life. During college-level study, integrative learning can involve

  • usefully blending knowledge and skills from different disciplinary areas, as in a learning community;
  • putting theory into practice, as in a student teaching semester or nursing clinical practice;
  • considering multiple perspectives to advance collaborative problem solving, as in a senior capstone project completed by a team of students from different majors;
  • adapting the skills learned in one situation to problems encountered in another, as when a business student conducts market research to help a community agency estimate the potential client load for a new branch office;
  • reflecting upon connections made over time among academic, cocurricular, and preprofessional experiences, as when a student writes reflective essays in a multiyear portfolio;
  • "Across-the-curriculum" integration of skills with learning in disciplinary or interdisciplinary settings, as when writing and quantitative skills are used in history or women's studies.

Given the variety of behaviors represented by the concept of integrative learning, a first step toward assessment of student outcomes must be to define what a particular campus or program actually expects students to do as integrative learners. A professional program might commit to "putting theory into practice," while a science program might focus on connections among science disciplines. Institutions might commit to one kind of integrative learning for all students, while programs might have additional, different integrative goals specified for their own graduates. Defining goals for integrative learning is a vital first step toward planning and implementing intentional learning and assessment.

Assessment Tools for Different Kinds of Integrative Learning

A few examples of assessments and conceptual frameworks used by different campuses illustrate how some institutions are defining and fostering integrative learning. Because each campus or program will likely define for itself what integrative learning means, these assessments are offered as potential models to adapt rather than simply adopt. To ensure reasonable validity, local assessments must be aligned with the educational experiences students actually have.

Modest Beginnings

Checking for the presence of integrative thinking or action in student work and rating its quality is a simple tactic for assessment. In this case, assessment of integration becomes one element within a longer assessment rubric. The assessment checklist for the introductory essay of a portfolio created in a learning community at New Century College at George Mason University includes a check box for "connections across" course experiences as one element among six assessed. The portfolio assessor, in reviewing the essay, would check one of the following statements to match his or her assessment of the quality of student work:

  • Excellent: consistently makes insightful connections across course experiences
  • Satisfactory: makes insightful connections across course experiences
  • Adequate: makes connection between/among ideas/experiences
  • Unsatisfactory: connection among readings, experiences, etc., rather general (Oates and Leavitt 2003, 24–25)
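
For programs that tally such checklist ratings across many portfolios, the four quality statements above can be captured in a small data structure. The Python sketch below is purely illustrative: the level labels come from the checklist, but the function name and the sample ratings are assumptions, not part of the New Century College process.

    from collections import Counter

    # Quality levels for the "connections across course experiences"
    # checklist element (Oates and Leavitt 2003, 24-25).
    CONNECTION_LEVELS = ["Excellent", "Satisfactory", "Adequate", "Unsatisfactory"]

    def summarize_ratings(ratings):
        """Tally how many portfolio essays received each quality level."""
        for rating in ratings:
            if rating not in CONNECTION_LEVELS:
                raise ValueError(f"Unknown quality level: {rating}")
        counts = Counter(ratings)
        return {level: counts[level] for level in CONNECTION_LEVELS}

    # Hypothetical ratings for five introductory essays.
    print(summarize_ratings(
        ["Excellent", "Adequate", "Satisfactory", "Adequate", "Unsatisfactory"]
    ))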

Multi-Definition Rubric

Bowling Green State University provides faculty and students with rubrics to be used (or adapted) for assessment of university learning outcomes (see figure 1). "Connection" itself is not specified as a learning outcome--it is viewed as an important means of achieving specified outcomes. The "connection" rubric begins with a definition:

"Connecting" is the essence of creative problem solving, shown in synthesizing knowledge within and across courses, integrating theory and practice, linking academic and life experiences, and relating one's self and culture to diverse cultures within the U.S. and globally. (See www.bgsu.edu/offices/provost/Assessment/Connect.htm)

The rubric presents four levels of achievement with descriptive statements for each level that cite elements of the definition (although not verbatim). The rubric also allows multiple kinds of integrative behavior to contribute toward a particular level. Levels 1 and 4 are shown in figure 1; the full rubric also includes levels 2 (novice) and 3 (proficient). For a more analytic approach, one could alter the rubric and scoring instructions so that the assessor indicates both the kind(s) and the quality of integration observed. Such an assessment could then guide formative conversations and subsequent work on improving specific kinds of integrative behavior.
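
To picture that analytic variation concretely, each scored piece of work could record both the kind(s) of connection observed (drawn from the BGSU definition quoted above) and the level assigned. The following Python sketch illustrates the record-keeping idea only; it is not part of the BGSU rubric, and the class name, fields, and sample score are assumptions.

    from dataclasses import dataclass, field

    # Kinds of connection named in the BGSU definition of "connecting."
    KINDS = {
        "synthesizing knowledge within and across courses",
        "integrating theory and practice",
        "linking academic and life experiences",
        "relating self and culture to diverse cultures",
    }

    @dataclass
    class ConnectionScore:
        """Which kind(s) of integration were observed, and at what level
        (1 = beginner, 2 = novice, 3 = proficient, 4 = advanced)."""
        kinds: set = field(default_factory=set)
        level: int = 1
        comment: str = ""  # formative feedback for the student

        def __post_init__(self):
            unknown = self.kinds - KINDS
            if unknown:
                raise ValueError(f"Unrecognized kind(s): {unknown}")
            if not 1 <= self.level <= 4:
                raise ValueError("Level must be between 1 and 4")

    # Hypothetical score for a capstone paper that links theory and practice.
    score = ConnectionScore(
        kinds={"integrating theory and practice"},
        level=3,
        comment="Applies course frameworks to the field placement; could "
                "engage conflicting perspectives more directly.",
    )
    print(score)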

Integration During Performance

Observing students during field placements often results in seeing them integrate theory with practice. Student teaching assessment forms may list a variety of desired teaching behaviors, many of which are integrative. Following are some examples of how different institutions describe these behaviors:

  • Connects lessons to learning standards (State University of New York at Stony Brook)
  • Articulates connection among concepts, procedures, and applications (Pennsylvania State University)
  • Demonstrates the ability to integrate content across the curriculum (University of Delaware)
  • Lessons incorporate insights from other disciplines (State University of New York at Stony Brook)

Observation forms often contain Likert-style rating scales along with spaces for written comments that guide a coaching conversation following the observation.

Authenticity, Analysis, and Synthesis

In an insightful analysis of students' interdisciplinary work, Veronica Boix Mansilla suggests using three factors to assess the quality of integration (2005, 18–21). Working from a definition of "interdisciplinary understanding" as "the capacity to integrate knowledge and modes of thinking drawn from two or more disciplines to produce a cognitive advancement . . . in ways that would have been unlikely through single disciplinary means," she selects three dimensions as the foundation for assessment:

  1. Disciplinary grounding (Have appropriate disciplines been selected for the work and are the concepts used in accurate ways?)
  2. Integrative leverage (Has a new understanding been generated that would not have been possible using a single discipline?)
  3. Critical stance (Is the goal of the work significant and does the integration withstand critique?)

Mansilla argues that a student's thinking must be "made visible" for assessment of integration to be possible, suggesting that writing about the knowledge produced and reflecting on the work are two possibilities. Given the generic nature of the areas suggested for assessment, this model could be developed for many different kinds of integrative work. And while Mansilla suggests that "the goal of quality interdisciplinary student work is to produce a cognitive advancement," the affective and aesthetic outcomes of integrative learning can also reinforce learning and motivate students to persist in, or even increase, their efforts; these outcomes should not be ignored.

More on Writing

Christopher R. Wolfe and Carolyn Haynes (2003, 126–169) developed the "Interdisciplinary Writing Assessment Profiles" to delve deeply into the quality of interdisciplinary student work. They view this tool as having potential both to guide students in planning interdisciplinary writing and to provide data for program assessment. The detailed procedure includes four dimensions, two of which could be adapted to the assessment of integrative learning: multidisciplinary perspectives and interdisciplinary integration. Scoring statements for the three categories assessed under interdisciplinary integration appear in figure 2.

Clear scoring instructions guide the details of the assessment process developed by Wolfe and Haynes. The profiles, along with scoring instructions and validity and reliability information, can be found at www.units.muohio.edu/aisorg/pubs/reports/InterdisWritingProfile.pdf.

Toward Intentional Learning and Assessment

A well-written assessment tool represents a substantial amount of analytic and strategic thinking, all of which, when shared thoughtfully among students and faculty, can contribute to improved learning and teaching. The examples and conceptual frameworks presented here offer promising possibilities for creating assessment tools that serve individual campus needs across many kinds of integrative learning. While developing assessments is difficult analytical work, that work can be greatly leveraged to improve teaching and learning: assessments can alert students at the start of an assignment to the precise expectations for their work and to the elements on which it will be judged, and they can provide formative advice as students develop their projects. Finally, campuses can use assessments to inform students and faculty of the achievements to be celebrated and the deficiencies to be addressed.

Figure 1. Levels of achievement with descriptive statements

Level 1 Connection (Beginner)

  • Describe similarities and differences in a collection or set of items
  • Categorize items or observations into groups
  • Recognize simple links among topics or concepts in a course
  • Offer accurate definitions of terms and concepts
  • Describe the setting (e.g., context, environment, culture, domain) in which connections are being made

Level 4 Connection (Advanced)

  • Identify ways to reconcile diverse or conflicting priorities, viewpoints, or options
  • Call attention to something that has not been adequately noticed by others (e.g., a subtle or deep relationship, novel findings or interpretations, the context or frame of reference)
  • Apply frameworks from multiple domains of knowledge and practice to create something (e.g., business plan, musical composition, thesis, capstone paper, research project)
  • Integrate diverse elements into a product, performance or artifact that fits its context coherently

See www.bgsu.edu/offices/assessment/Connect.htm

 

Figure 2. Excerpted Scoring Instructions from “Interdisciplinary Writing Assessment Profiles” (Wolfe and Haynes 2003)

INTERDISCIPLINARY INTEGRATION
Creating Common Ground (Category 1)

  • Presents a clear rationale for taking an interdisciplinary approach.
  • Assumptions from more than one discipline are made explicit and compared.
  • Compares and/or contrasts disciplinary perspectives.
  • The problem is explicitly defined in neutral terms that encourage contributions from more than one discipline.
  • Creates a common vocabulary that can be applied to the object of study.

New Holistic Understanding (Category 2)

  • One or more novel metaphors are presented.
  • A preexisting metaphor is used or applied in a novel way.
  • One or more novel models are presented.
  • A preexisting model is used or applied in a novel way.
  • A new theoretical interpretation or understanding is presented which explicitly draws on more than one discipline.

Application of the New Holistic Understanding (Category 3)

  • The new metaphor, interpretation, or model is applied to a new situation or phenomenon.
  • The new metaphor, interpretation, or model is applied in a novel way to an established “text,” situation, or phenomenon.
  • The new metaphor, interpretation, or model is explicitly tested through observation, data collection, or lived experience and reflection.
  • The new metaphor, interpretation, or model is used in a significant way to guide inquiry.
  • The new metaphor, interpretation, or model is tested by using it to solve a problem.
  • Interdisciplinary theory is used to assess the approach taken.

Note: If credit was not given for any category 2 [above] items, then credit is possible only for the last point (Interdisciplinary Theory).
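
The note above amounts to a small scoring rule: the first five category 3 items presuppose the new holistic understanding scored in category 2, so without category 2 credit only the final interdisciplinary-theory item can count. The Python sketch below illustrates that dependency as a minimal check; it is not the Wolfe and Haynes scoring procedure, and the abbreviated item labels are not the authors' wording.

    # Abbreviated labels for the category 3 items in figure 2; only the last
    # item does not presuppose a new holistic understanding from category 2.
    CATEGORY_3_ITEMS = [
        "applied to a new situation or phenomenon",
        "applied in a novel way to an established text, situation, or phenomenon",
        "explicitly tested through observation, data collection, or reflection",
        "used in a significant way to guide inquiry",
        "tested by using it to solve a problem",
        "interdisciplinary theory used to assess the approach",
    ]

    def category_3_credit(category_2_credited, items_checked):
        """Return the category 3 items that may actually receive credit.

        If no category 2 item earned credit, only the final
        interdisciplinary-theory item remains eligible.
        """
        if category_2_credited:
            return set(items_checked)
        return set(items_checked) & {CATEGORY_3_ITEMS[-1]}

    # Hypothetical example: two items were checked, but no category 2 credit
    # was given, so only the interdisciplinary-theory item survives.
    print(category_3_credit(
        category_2_credited=False,
        items_checked={
            "used in a significant way to guide inquiry",
            "interdisciplinary theory used to assess the approach",
        },
    ))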


References

Mansilla, V. B. 2005. Assessing student work at disciplinary crossroads. Change 37 (January/February): 14–21.

Oates, K. K., and L. H. Leavitt. 2003. Service-learning and learning communities: Tools for integration and assessment. Washington, DC: Association of American Colleges and Universities.

Wiggins, G. P. 1993. Assessing student performance. San Francisco: Jossey-Bass.

Wolfe, C. R., and C. Haynes. 2003. Interdisciplinary writing assessment profiles. Issues in Integrative Studies 21: 126–169.


Ross Miller is the director of programs in the Office of Education and Quality Initiatives for the Association of American Colleges and Universities.
