A Test Can Do That?

By Kadriye Ercikan

For some time now I’ve wondered: Can assessments do more for us? Can they help us understand not just how a student answers a question, but the thought process the student uses to get to the answer? Knowing the strategies students use during testing can help us better understand what students know and don’t know, how test takers interact with test questions, and how we might build even better tests. A lot goes on when a student takes a test that traditional testing simply hasn’t been able to capture, until now.

The move from paper-based to digitally based assessments is expanding opportunities to more fully understand the approaches students take when responding to the questions on a test by capturing “process data.” Process data, also called “observable data,” reflect student behaviors during test taking. This information can be derived from the keystrokes used to answer a question, the time spent on a test question or task, eye movement when reading a test passage, how test takers interact with test stimulus materials, and how test takers use resources and tools. Such data hold the promise of a better understanding of students’ thought processes and strategies during testing and, in turn, of better tests.
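To make the idea of “process data” concrete, here is a minimal sketch, in Python, of how a digital testing platform might log such behaviors as a stream of timestamped events. The event names and fields (keystrokes, tool use, answer changes) are hypothetical illustrations, not an actual ETS logging schema.

```python
from dataclasses import dataclass, field
import time

@dataclass
class ProcessEvent:
    """One observable action a test taker performs during testing.

    The event types and fields are illustrative only, not an
    actual assessment platform's logging schema.
    """
    student_id: str
    item_id: str      # which test question or task
    event_type: str   # e.g., "keystroke", "tool_open", "answer_change"
    timestamp: float  # seconds since the start of the session
    detail: dict = field(default_factory=dict)

class ProcessLog:
    """Collects timestamped events for one test session."""

    def __init__(self) -> None:
        self._start = time.monotonic()
        self.events: list[ProcessEvent] = []

    def record(self, student_id: str, item_id: str,
               event_type: str, **detail) -> None:
        self.events.append(ProcessEvent(
            student_id=student_id,
            item_id=item_id,
            event_type=event_type,
            timestamp=time.monotonic() - self._start,
            detail=detail,
        ))

# Example: a student opens the on-screen calculator, types, and answers.
log = ProcessLog()
log.record("s001", "math_07", "tool_open", tool="calculator")
log.record("s001", "math_07", "keystroke", key="7")
log.record("s001", "math_07", "answer_change", value="42")
```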

Emerging research at Educational Testing Service (ETS) on process data is providing more insight into what a test is measuring and how students interact with the test questions. Traditionally, the answers students give to test questions have been the only source of information about what students know or don’t know. While students’ final answers show whether they answered correctly or incorrectly, the answers alone do not fully explain how students arrived at them. Interpretations of students’ performance based only on how many correct responses are given may be misleading for students who use a flawed strategy or for students who rush through the test but are “lucky guessers.” By capturing process data as students work through the test questions, use the tools and aids provided to them, and record their observations and conclusions, we might create profiles of performance that demonstrate different solution strategies, not just test scores. Other information could also be captured, such as the percentage of students who used a calculator on a particular mathematics question, the percentage of students who used text-to-speech capabilities on the test, or students’ fluency rate (i.e., the ability to write an essay accurately, quickly and appropriately). In the future, this type of information could make the reporting of test results even more useful to educators.
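As a rough illustration of the kinds of summaries described above, the sketch below derives two of them from a list of logged events: each student’s time on a question, and the percentage of students who opened the calculator on it. It continues the hypothetical event schema from the earlier sketch; none of this reflects an actual ETS analysis pipeline.

```python
def time_on_item(events, item_id):
    """Seconds between each student's first and last event on one item."""
    first, last = {}, {}
    for e in events:
        if e.item_id != item_id:
            continue
        first.setdefault(e.student_id, e.timestamp)
        last[e.student_id] = e.timestamp
    return {s: last[s] - first[s] for s in first}

def percent_used_tool(events, item_id, tool):
    """Percentage of students who opened a given tool on one item."""
    students, users = set(), set()
    for e in events:
        if e.item_id != item_id:
            continue
        students.add(e.student_id)
        if e.event_type == "tool_open" and e.detail.get("tool") == tool:
            users.add(e.student_id)
    return 100.0 * len(users) / len(students) if students else 0.0

# Example, continuing the log from the earlier sketch:
#   time_on_item(log.events, "math_07")                    -> {"s001": ...}
#   percent_used_tool(log.events, "math_07", "calculator") -> 100.0
```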

A recently published book that I co-edited with my colleague James Pellegrino, Distinguished Professor at the University of Illinois at Chicago, Validation of Score Meaning for the Next Generation of Assessments: The Use of Response Processes, explores many of the issues surrounding capturing, interpreting, and using process data. As we point out in the book, an emphasis on using process data to help us understand how students approach testing is not new; the desire to assess what a student is “doing” was highlighted in the first edition of Educational Measurement in 1951. But over the last decade I have seen an increase in the design and development of tests that are both informed by and provide information about students’ processes in learning and in testing.

In addition to providing a better understanding of the thought processes and strategies students use, process data show promise in informing the future design of tests. Such data can provide information on new types of questions that require students to do more than simply select from a list of possible answers: questions that ask students to come up with answers after performing a science experiment on their computer, to create a mathematical equation or graph, or to highlight sentences in a passage that support their answer. Do these types of questions pose potential barriers for students who are unfamiliar with the technology involved? Are the questions clear and concise, or does something in the way they are presented confuse students? Process data may well provide answers to these and other important questions.

I admit that research on process data is not without its challenges. How can we appropriately interpret such complex behavior? What do different behaviors imply for individual students, or for groups and subgroups of students? Will gathering process data change the very behavior we are attempting to capture? But such data also open up a world of possibilities to better understand how students go about answering the questions on a test, to provide educators with more relevant information about their students and to guide the creation of the next generation of assessments.

Creating tests that are useful, appropriate and fair for all students is the goal that we at ETS are continually striving towards. The promise that process data hold takes us into the future of what an assessment is capable of and how it can benefit all.
