Machine learning over spaces of measures: invariant deep networks and quantile regression
- Publication Date: Nov 30, 2020
- Source: HAL
- Language: English
- License: Unknown
Abstract
This thesis proposes theoretical and numerical contributions to machine learning and statistics over the space of probability distributions. In the first part, we introduce a new class of neural network architectures that process probability measures in their Lagrangian form (obtained by sampling) as both inputs and outputs, and that enjoy robustness and universal approximation properties. We show that this framework can be adapted to perform regression on probability-measure inputs, under customized invariance requirements, while preserving its robustness and approximation capabilities. This method proves useful for designing expressive, adaptable summaries of datasets, referred to as “meta-features”, in the context of automated machine learning. In the second part, we show that entropic regularization eases the computation of conditional multivariate quantiles. We introduce the regularized vector quantile regression problem, provide a scalable algorithm to compute multivariate quantiles, and show that it enjoys desirable asymptotic properties.
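Since measures are handled in Lagrangian form, the natural architectural ingredient is a pointwise feature map pooled by integration against the empirical measure, which makes the output invariant to the ordering of samples and insensitive to their number. Below is a minimal NumPy sketch of such an invariant map in the spirit of DeepSets-style mean pooling; the class name `InvariantNet` and all dimensions are illustrative assumptions, not the thesis's actual architecture.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

class InvariantNet:
    """Toy permutation-invariant network on a sampled measure.

    A probability measure is represented in Lagrangian form by an
    (n, d) array of samples: each sample is mapped pointwise by a
    feature map phi, the features are averaged (an empirical integral
    against the measure), and an output map rho is applied.
    Illustrative sketch only, not the architecture from the thesis.
    """

    def __init__(self, d_in, d_hidden, d_out, seed=0):
        rng = np.random.default_rng(seed)
        self.W_phi = rng.normal(0.0, 1.0 / np.sqrt(d_in), (d_in, d_hidden))
        self.W_rho = rng.normal(0.0, 1.0 / np.sqrt(d_hidden), (d_hidden, d_out))

    def __call__(self, samples):
        h = relu(samples @ self.W_phi)   # pointwise feature map phi
        pooled = h.mean(axis=0)          # integrate against the empirical measure
        return pooled @ self.W_rho       # output map rho

# Permutation-invariance check on a random point cloud.
net = InvariantNet(d_in=2, d_hidden=16, d_out=3)
x = np.random.default_rng(1).normal(size=(100, 2))
assert np.allclose(net(x), net(x[np.random.permutation(100)]))
```

Mean pooling, rather than sum pooling, is what makes the output depend on the empirical measure itself rather than on the sample count, which is the property needed to treat sampled measures of varying sizes uniformly.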
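For the second part, the standard computational engine behind entropy-regularized optimal transport is the Sinkhorn algorithm. The toy sketch below uses it to transport uniform quantile levels onto one-dimensional data, recovering a smoothed quantile function via barycentric projection of the regularized coupling; it ignores the regression covariates and the scalable solver developed in the thesis, and the function name `sinkhorn` and all parameter values are assumptions for illustration.

```python
import numpy as np

def sinkhorn(a, b, C, eps=0.1, n_iter=500):
    """Entropic optimal transport between two discrete measures.

    a, b : weight vectors of the source and target measures;
    C    : pairwise cost matrix. Returns the entropy-regularized
    coupling computed by Sinkhorn's matrix-scaling iterations.
    """
    K = np.exp(-C / eps)
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)
        u = a / (K @ v)
    return u[:, None] * K * v[None, :]

# Transport a uniform grid on [0, 1] (quantile levels) onto 1-D data:
# the barycentric projection of the coupling yields a smooth
# approximation of the empirical quantile function.
rng = np.random.default_rng(0)
y = np.sort(rng.normal(size=50))             # data samples
t = np.linspace(0.01, 0.99, 50)              # quantile levels
C = (t[:, None] - y[None, :]) ** 2           # squared cost
P = sinkhorn(np.full(50, 1 / 50), np.full(50, 1 / 50), C)
quantiles = (P * y[None, :]).sum(axis=1) / P.sum(axis=1)  # barycentric projection
```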