
Classification of Alzheimer’s Disease Leveraging Multi-task Machine Learning Analysis of Speech and Eye-Movement Data

Authors
  • Jang, Hyeju1
  • Soroski, Thomas2
  • Rizzo, Matteo1
  • Barral, Oswald1
  • Harisinghani, Anuj1
  • Newton-Mason, Sally2
  • Granby, Saffrin1
  • Stutz da Cunha Vasco, Thiago Monnerat3
  • Lewis, Caitlin2
  • Tutt, Pavan2
  • Carenini, Giuseppe1
  • Conati, Cristina1
  • Field, Thalia S.2
  • 1 Department of Computer Science, University of British Columbia, Vancouver, BC, Canada
  • 2 Vancouver Stroke Program and Division of Neurology, Faculty of Medicine, University of British Columbia, Vancouver, BC, Canada
  • 3 Department of Statistics, University of British Columbia, Vancouver, BC, Canada
Type
Published Article
Journal
Frontiers in Human Neuroscience
Publisher
Frontiers Media SA
Publication Date
Sep 20, 2021
Volume
15
Identifiers
DOI: 10.3389/fnhum.2021.716670
PMID: 34616282
PMCID: PMC8488259
Source
PubMed Central
Disciplines
  • Human Neuroscience
  • Original Research
License
Unknown

Abstract

Alzheimer’s disease (AD) is a progressive neurodegenerative condition that results in impaired performance in multiple cognitive domains. Preclinical changes in eye movements and language can occur with the disease, and progress alongside worsening cognition. In this article, we present the results from a machine learning analysis of a novel multimodal dataset for AD classification. The cohort includes data from two novel tasks not previously assessed in classification models for AD (pupil fixation and description of a pleasant past experience), as well as two established tasks (picture description and paragraph reading). Our dataset includes language and eye movement data from 79 memory clinic patients with diagnoses of mild-moderate AD, mild cognitive impairment (MCI), or subjective memory complaints (SMC), and 83 older adult controls. The analysis of the individual novel tasks showed similar classification accuracy when compared to established tasks, demonstrating their discriminative ability for memory clinic patients. Fusing the multimodal data across tasks yielded the highest overall AUC of 0.83 ± 0.01, indicating that the data from novel tasks are complementary to established tasks.
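The multimodal fusion described in the abstract can be illustrated with a minimal late-fusion sketch: per-task classifier scores are averaged and the combined score is evaluated by AUC. This is an illustration only, not the authors' pipeline; the score variables, noise model, and fusion rule (simple score averaging) are assumptions for the example.

```python
import numpy as np

def auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney U statistic."""
    scores, labels = np.asarray(scores, float), np.asarray(labels)
    pos, neg = scores[labels == 1], scores[labels == 0]
    # Fraction of (positive, negative) pairs ranked correctly; ties count 0.5.
    diff = pos[:, None] - neg[None, :]
    return (diff > 0).mean() + 0.5 * (diff == 0).mean()

rng = np.random.default_rng(0)
n = 160
y = rng.integers(0, 2, n)  # simulated patient (1) vs. control (0) labels

# Simulated per-task classifier scores: each modality carries the same weak
# signal plus independent noise. The names are illustrative only.
speech_scores = y + rng.normal(scale=1.5, size=n)
eye_scores = y + rng.normal(scale=1.5, size=n)

# Late fusion by averaging the per-task scores; independent noise partially
# cancels, which is why fused performance can exceed either task alone.
fused_scores = (speech_scores + eye_scores) / 2

for name, s in [("speech", speech_scores), ("eye", eye_scores), ("fused", fused_scores)]:
    print(name, round(auc(s, y), 3))
```

Because the two simulated tasks share the signal but not the noise, the fused AUC will usually exceed the single-task AUCs, mirroring the complementarity the abstract reports across tasks.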
