Subspace-based algorithms that exploit the orthogonality between a sample subspace and a parameter-dependent subspace have proved very useful in many signal processing applications. The purpose of this paper is to extend theoretical results, already available on the asymptotic (in the number of measurements) performance of subspace-based estimators derived in the Gaussian context, to real elliptically symmetric (RES), circular complex elliptically symmetric (C-CES), and non-circular CES (NC-CES) distributed observations in the same framework. First, the asymptotic distribution of M-estimates of the orthogonal projection matrix is derived from that of the M-estimates of the covariance matrix. This allows us to characterize the asymptotically minimum variance (AMV) estimator based on estimates of orthogonal projectors associated with different M-estimates of the covariance matrix. A closed-form expression is then given for the AMV bound on the parameter of interest characterized by the column subspace of the mixing matrix of general linear mixture models. We also specify the conditions under which the AMV bound based on Tyler's M-estimate attains the stochastic Cramér-Rao bound (CRB) for the complex Student t and complex generalized Gaussian distributions. Finally, we prove that the AMV bound attains the stochastic CRB in the case of the maximum likelihood (ML) M-estimate of the covariance matrix for RES, C-CES, and NC-CES distributed observations, and that this bound equals the recently introduced semiparametric CRB (SCRB).