The output distance function is a key concept in production economics, but its empirical estimation often violates the properties dictated by neoclassical production theory. In this paper we introduce the neural distance function (NDF), a neural network (NN) specification that provides a global approximation to any arbitrary production technology with multiple outputs. The NDF imposes all theoretical properties, namely monotonicity, curvature, and homogeneity, for all economically admissible values of outputs and inputs. Fitted to a large data set covering all US commercial banks (1989-2000), the NDF explains a very high proportion of the variance of output while keeping the number of parameters to a minimum and satisfying the relevant theoretical properties. Standard measures such as total factor productivity (TFP) and technical efficiency (TE) are computed routinely. Finally, the NDF is compared with the popular translog specification and is found to provide very satisfactory results: it possesses the properties regarded as desirable in neoclassical production theory to a degree not matched by the competing specification.
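To make the homogeneity property concrete: an output distance function must be linearly homogeneous in outputs, i.e. D(x, t·y) = t·D(x, y) for t > 0, and this can be imposed on a network by construction rather than checked after estimation. The sketch below illustrates one standard normalization, ln D = ln y₁ + g(x, y₂/y₁, …), applied to a one-hidden-layer network with hypothetical random parameters; it is a minimal illustration of the homogeneity restriction only, not the paper's actual NDF architecture, and it does not impose monotonicity or curvature.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters of a one-hidden-layer network:
# 3 features (2 log-inputs + 1 log output ratio) -> 5 hidden units -> scalar.
W = rng.normal(size=(5, 3))
b = rng.normal(size=5)
v = rng.normal(size=5)

def neural_distance(x, y):
    """Distance function with linear homogeneity in outputs imposed
    by the normalization ln D = ln y1 + g(x, ln(y2/y1), ...)."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    # Output ratios y[1:]/y[0] are invariant to scaling y, so all of the
    # scaling enters only through the leading factor y[0].
    z = np.concatenate([np.log(x), np.log(y[1:] / y[0])])
    g = v @ np.tanh(W @ z + b)      # flexible NN core g(x, y-ratios)
    return y[0] * np.exp(g)

x = np.array([2.0, 1.5])            # input vector
y = np.array([1.0, 3.0])            # output vector
t = 2.5
# Homogeneity of degree +1 in outputs holds by construction:
assert np.isclose(neural_distance(x, t * y), t * neural_distance(x, y))
```

Because the network core only ever sees output ratios, scaling all outputs by t multiplies the distance value by exactly t, for any weights; the restriction therefore holds globally, not just at the sample points.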