Stochastic Optimal Control as Non-equilibrium Statistical Mechanics: Calculus of Variations over Density and Current

Authors
Type: Preprint
Publication Date
Submission Date
Identifiers: DOI 10.1088/1751-8113/47/2/022001
Source: arXiv
License: Yellow

Abstract

In Stochastic Optimal Control (SOC) one minimizes the average cost-to-go, which consists of the cost-of-control (amount of effort), the cost-of-space (where one wants the system to be), and the target cost (where one wants the system to arrive), for a system undergoing forced and controlled Langevin dynamics. We extend the SOC problem by introducing an additional cost-of-dynamics, characterized by a vector potential. We derive the generalized gauge-invariant Hamilton-Jacobi-Bellman equation by variation over density and current, suggest a hydrodynamic interpretation, and discuss examples, e.g., ergodic control of a particle-within-a-circle, illustrating non-equilibrium space-time complexity.
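For orientation, a minimal sketch of the standard SOC setup that the abstract generalizes is given below; the notation (control u, force f, noise strength κ, potentials V and Φ, cost-to-go S) is chosen here for illustration and is not taken from the paper.

% Controlled Langevin dynamics: drift = force f plus control u, noise strength \kappa.
\[
  dx_t \;=\; \bigl(f(x_t,t) + u(x_t,t)\bigr)\,dt \;+\; \sqrt{2\kappa}\,dW_t .
\]

% Average cost-to-go: cost-of-control plus cost-of-space along the path,
% plus the target cost at the final time T.
\[
  J[u](x,t) \;=\; \mathbb{E}\!\left[\,
      \int_t^T \Bigl( \tfrac{1}{2}\,|u(x_s,s)|^2 + V(x_s,s) \Bigr)\,ds
      \;+\; \Phi(x_T) \;\Big|\; x_t = x \right].
\]

% The optimal cost-to-go S(x,t) = \min_u J[u](x,t) obeys the Hamilton-Jacobi-Bellman
% equation; minimizing over u pointwise gives u^* = -\nabla S.
\[
  -\partial_t S \;=\; V + f\cdot\nabla S - \tfrac{1}{2}\,|\nabla S|^2 + \kappa\,\Delta S ,
  \qquad S(x,T) = \Phi(x).
\]

The cost-of-dynamics announced in the abstract adds a term coupling a vector potential to the probability current; working in the density and current (hydrodynamic) variables is what allows the generalized HJB equation to be written in gauge-invariant form, and the precise form of that term is given in the paper rather than assumed here.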
