
Learning from time-dependent streaming data with online stochastic algorithms

Authors
  • Godichon-Baggioni, Antoine
  • Werge, Nicklas
  • Wintenberger, Olivier
Publication Date
May 24, 2022
Source
HAL
Language
English
License
Unknown

Abstract

We study stochastic algorithms in a streaming framework, trained on samples coming from a dependent data source. In this streaming framework, we analyze the convergence of Stochastic Gradient (SG) methods in a non-asymptotic manner; this includes various SG methods such as the well-known stochastic gradient descent (i.e., the Robbins-Monro algorithm) and mini-batch SG methods, together with their averaged estimates (i.e., Polyak-Ruppert averaging). Our results form a heuristic linking the level of dependency and convexity to the rest of the model parameters. This heuristic provides new insights into choosing the optimal learning rate, which can help increase the stability of SG-based methods; these investigations suggest large streaming batches with slowly decaying learning rates for highly dependent data sources.
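
As an illustrative sketch of the setting described in the abstract (not the authors' exact algorithm or experimental setup), the snippet below runs mini-batch SGD with Polyak-Ruppert averaging on a toy dependent data stream. The least-squares objective, the AR(1) feature process, the learning-rate form lr_const / t**lr_exponent, and all parameter values are assumptions chosen for illustration only.

```python
import numpy as np

def streaming_sgd(stream, dim, lr_const=1.0, lr_exponent=0.66):
    """Mini-batch SGD with Polyak-Ruppert averaging on a data stream.

    Sketch only: the quadratic loss and the learning-rate schedule
    lr_const / t**lr_exponent are illustrative assumptions.
    """
    theta = np.zeros(dim)      # current SGD iterate
    theta_bar = np.zeros(dim)  # Polyak-Ruppert average of the iterates
    for t, (X, y) in enumerate(stream, start=1):
        # gradient of the mini-batch least-squares loss 0.5*||X theta - y||^2 / n
        grad = X.T @ (X @ theta - y) / len(y)
        gamma_t = lr_const / t ** lr_exponent   # slowly decaying learning rate
        theta -= gamma_t * grad
        theta_bar += (theta - theta_bar) / t    # running average of iterates
    return theta, theta_bar

def ar1_stream(n_batches, dim, batch_size=32, rho=0.9, seed=0):
    """Toy dependent data source: AR(1) features, dependence level rho."""
    rng = np.random.default_rng(seed)
    theta_star = rng.normal(size=dim)
    x = rng.normal(size=dim)
    for _ in range(n_batches):
        X = np.empty((batch_size, dim))
        for i in range(batch_size):
            x = rho * x + np.sqrt(1 - rho ** 2) * rng.normal(size=dim)
            X[i] = x
        y = X @ theta_star + 0.1 * rng.normal(size=batch_size)
        yield X, y

# Usage: larger streaming batches and a slower learning-rate decay
# (smaller lr_exponent) can be tried as rho grows, echoing the heuristic.
theta, theta_bar = streaming_sgd(ar1_stream(2000, dim=5), dim=5)
```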
