Abstract This survey is concerned with variants of nonlinear optimization methods designed for implementation on parallel computers. First, we consider a variety of methods for unconstrained minimization, concentrating on a particular type of parallelism (simultaneous function and gradient evaluations) and on the main sources of inspiration: conjugate directions, homogeneous functions, variable-metric updates, and multi-dimensional searches. The computational process for solving small and medium-size constrained optimization problems is usually based on unconstrained optimization, which provides a straightforward opportunity for introducing parallelism. In the present survey, however, we focus on promising approaches for solving large, well-structured constrained problems: dualization of problems with separable objective and constraint functions, and decomposition of hierarchical problems with linking variables (typical of Benders decomposition in the linear case). Finally, we outline the key issues in future computational studies of parallel nonlinear optimization algorithms.
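To illustrate the type of parallelism mentioned above (simultaneous function evaluations), the following is a minimal sketch, not an algorithm from the survey: several trial points along a search direction are evaluated concurrently, as in a parallel line search. The test function, the thread-based executor, and the fixed step grid are all illustrative assumptions.

```python
from concurrent.futures import ThreadPoolExecutor

def f(x):
    # Rosenbrock test function (an illustrative choice, not from the survey)
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

def grad(x):
    # Analytic gradient of the test function above
    return [-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
            200 * (x[1] - x[0]**2)]

def parallel_line_search(x, d, steps):
    """Evaluate f at the trial points x + t*d for all step lengths t
    simultaneously, and return the best step and its function value."""
    trials = [[xi + t * di for xi, di in zip(x, d)] for t in steps]
    with ThreadPoolExecutor() as pool:
        values = list(pool.map(f, trials))   # simultaneous evaluations
    best = min(range(len(steps)), key=values.__getitem__)
    return steps[best], values[best]

x0 = [0.0, 0.0]
d0 = [-g for g in grad(x0)]                  # steepest-descent direction
t, fv = parallel_line_search(x0, d0, [0.01, 0.1, 0.5, 1.0])
```

On a true parallel machine the executor would dispatch each evaluation to a separate processor; the point is that one synchronization step replaces a sequence of individual function calls.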