Toward Continuous Social Phenotyping: Analyzing Gaze Patterns in an Emotion Recognition Task for Children With Autism Through Wearable Smart Glasses

Authors
  • Nag, Anish (1)
  • Haber, Nick (2)
  • Voss, Catalin (2)
  • Tamura, Serena (3)
  • Daniels, Jena (4)
  • Ma, Jeffrey (2)
  • Chiang, Bryan (2)
  • Ramachandran, Shasta (2)
  • Schwartz, Jessey (2)
  • Winograd, Terry (2)
  • Feinstein, Carl (2)
  • Wall, Dennis P (2)
  • (1) University of California, Berkeley, CA, United States
  • (2) Stanford University, Stanford, CA, United States
  • (3) UC San Francisco, San Francisco, CA, United States
  • (4) Medable, Inc, Palo Alto, CA, United States
Type
Published Article
Journal
Journal of Medical Internet Research
Publisher
JMIR Publications Inc.
Publication Date
Apr 22, 2020
Volume
22
Issue
4
Identifiers
DOI: 10.2196/13810
PMID: 32319961
PMCID: PMC7203617
Source
PubMed Central
License
Green

Abstract

Background: Several studies have shown that facial attention differs in children with autism. Measuring eye gaze and emotion recognition in children with autism is challenging, as standard clinical assessments must be delivered in clinical settings by a trained clinician. Wearable technologies may be able to bring eye gaze and emotion recognition into natural social interactions and settings.

Objective: This study aimed to test (1) the feasibility of tracking gaze using wearable smart glasses during a facial expression recognition task and (2) the ability of these gaze-tracking data, together with facial expression recognition responses, to distinguish children with autism from neurotypical controls (NCs).

Methods: We compared the eye gaze and emotion recognition patterns of 16 children with autism spectrum disorder (ASD) and 17 children without ASD via wearable smart glasses fitted with a custom eye tracker. Children identified static facial expressions of images presented on a computer screen along with nonsocial distractors while wearing Google Glass and the eye tracker. Faces were presented in three trials, during one of which children received feedback in the form of the correct classification. We employed hybrid human-labeling and computer vision–enabled methods for pupil tracking and world-gaze translation calibration. We analyzed the impact of gaze and emotion recognition features in a prediction task aiming to distinguish children with ASD from NC participants.

Results: Gaze and emotion recognition patterns enabled the training of a classifier that distinguished the ASD and NC groups. However, this classifier was unable to significantly outperform other classifiers that used only age and gender features, suggesting that further work is necessary to disentangle these effects.

Conclusions: Although wearable smart glasses show promise in identifying subtle differences in gaze tracking and emotion recognition patterns in children with and without ASD, the present form factor and data do not allow these differences to be reliably exploited by machine learning systems. Resolving these challenges will be an important step toward continuous tracking of the ASD phenotype.
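The comparison reported in the Results can be illustrated with a short sketch. The snippet below is not the authors' code: the feature set, data, and sample sizes beyond the reported 16 ASD and 17 NC participants are hypothetical placeholders. It contrasts a cross-validated classifier given gaze and emotion-recognition features plus demographics against a baseline given age and gender alone, the comparison the abstract reports as inconclusive.

```python
# Minimal sketch of the ASD-vs-NC prediction comparison described in the
# abstract. All features and values are illustrative placeholders, not
# the study's data or the authors' modeling pipeline.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 33  # 16 children with ASD + 17 NC participants, as in the study

# Hypothetical per-child features: e.g., fraction of gaze time on faces,
# emotion-recognition accuracy, mean response time.
X_gaze = rng.random((n, 3))
# Demographic baseline features: age (years) and binary-coded gender.
X_demo = np.column_stack([rng.integers(6, 13, n),
                          rng.integers(0, 2, n)])
y = np.array([1] * 16 + [0] * 17)  # 1 = ASD, 0 = neurotypical control

cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))

# Full model: gaze/emotion features plus demographics.
full = cross_val_score(clf, np.hstack([X_gaze, X_demo]), y, cv=cv)
# Baseline: age and gender only.
base = cross_val_score(clf, X_demo, y, cv=cv)
print(f"gaze + demographics: {full.mean():.2f}")
print(f"demographics only:   {base.mean():.2f}")
```

With a cohort this small, cross-validated accuracy estimates carry wide variance, which is consistent with the paper's finding that the full model could not significantly outperform the age-and-gender baseline.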
