Conversation-Based Assessment

By G. Tanner Jackson and Diego Zapata-Rivera

Imagine a student working with a tutor for the first time. To better understand what the student knows, the tutor may give problems to solve and then review the student's responses. If a response is incomplete or indicative of a misunderstanding, the tutor may ask additional questions and follow up with multiple turns of questions and answers. In some instances, the additional questions may reveal that the student understood the concept deeply but, for whatever reason, had failed to provide a complete answer initially. Such an interactive conversation helps reveal what the student knows and can do, as well as areas where more learning is needed. This adaptive method allows the student to fully express his or her knowledge and provides the tutor with more diagnostic information than a non-interactive approach.

These types of open-ended, human-to-human conversations can provide great insight and evidence for assessment purposes and may be fairly easy to develop and administer on a small scale. However, scoring them requires human raters, since current artificial intelligence (AI) technologies cannot yet handle such open-ended conversations. Using human raters is costly and requires significant training and monitoring to maintain acceptable levels of rater agreement. Thus, although human-to-human conversations provide valuable assessment evidence, they are neither practical nor financially viable to deploy on a large scale.

Could the same type of interaction take place between a student and a computer? That is the idea behind conversation-based assessment (CBA) systems, which involve innovative, interactive tasks framed in engaging and meaningful contexts. Such realistic interactions exemplify the new assessment types being developed in response to emerging educational standards and the demands of a modern, global economy.

To read the complete report, visit:
