Semi-supervised Learning to Perceive Children’s Affective States in a Tablet Tutor

Published in the Tenth Symposium on Educational Advances in Artificial Intelligence (EAAI-20), AAAI, 2020

Recommended citation: Mansi Agarwal and Jack Mostow. Semi-supervised Learning to Perceive Children’s Affective States in a Tablet Tutor. In the Tenth Symposium on Educational Advances in Artificial Intelligence (EAAI-20), AAAI, 2020.

[PDF] [DOI]

Abstract

Like good human tutors, intelligent tutoring systems should detect and respond to students’ affective states. However, the accuracy of automatic affect detection has been limited by the time and expense of manually labeling training data for supervised learning. To overcome this limitation, we use semi-supervised learning to train an affective state detector on a sparsely labeled, culturally novel, authentic data set: screen-capture videos from a Swahili literacy and numeracy tablet tutor in Tanzania that show the face of the child using it. We achieved 88% leave-1-child-out cross-validated accuracy in distinguishing pleasant, unpleasant, and neutral affective states, compared to only 61% for the best supervised learning method we tested. This work contributes toward using automated affect detection both offline, to improve the design of intelligent tutors, and at runtime, to respond to student affect based on input from a user-facing tablet camera or webcam.
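To make the evaluation setup concrete, the sketch below shows one common way to combine semi-supervised training with leave-one-child-out cross-validation. It is a minimal illustration only, assuming scikit-learn's `SelfTrainingClassifier` with a logistic-regression base model and the convention that unlabeled frames are marked with `-1`; the feature representation, model choice, and function names are hypothetical and are not the pipeline described in the paper.

```python
# Hypothetical sketch: leave-one-child-out evaluation of a semi-supervised
# affect classifier (pleasant / unpleasant / neutral). The feature matrix,
# base model, and threshold are illustrative assumptions, not the authors'
# actual method.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneGroupOut
from sklearn.semi_supervised import SelfTrainingClassifier

def leave_one_child_out_accuracy(X, y, child_ids):
    """X: per-frame feature vectors; y: labels in {0, 1, 2}, or -1 for
    unlabeled frames; child_ids: which child each frame came from."""
    logo = LeaveOneGroupOut()
    accuracies = []
    for train_idx, test_idx in logo.split(X, y, groups=child_ids):
        # Train on all other children, using labeled and unlabeled frames:
        # self-training iteratively pseudo-labels confident unlabeled frames.
        model = SelfTrainingClassifier(
            LogisticRegression(max_iter=1000), threshold=0.8)
        model.fit(X[train_idx], y[train_idx])

        # Score only the held-out child's labeled frames.
        labeled = y[test_idx] != -1
        if labeled.any():
            preds = model.predict(X[test_idx][labeled])
            accuracies.append(np.mean(preds == y[test_idx][labeled]))
    return float(np.mean(accuracies))
```

Holding out all frames from one child at a time, rather than random frames, prevents the classifier from being tested on a child it has already seen, which is what the leave-1-child-out figure in the abstract measures.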