A random variable X is said to have a univariate elliptical distribution (or an elliptical density) with parameters μ and γ > 0 if it has a density of the form f_h(x; μ, γ) = γ^(−1/2) h((x − μ)²/γ) for some function h(·). If μ = 0 and γ = 1, then X is said to have a spherical density corresponding to the radial function h(·). Here we derive a Chernoff-type inequality and a natural identity for the univariate elliptical distribution. As an application of the identity, we discuss the problem of estimating the mean vector μ = (μ1, ..., μp) of a random vector X = (X1, ..., Xp) with independent components. The risk of an improved estimator that, for p ≥ 3, dominates the unbiased estimator X of μ under squared error loss is derived. Locally asymptotically minimax estimation of a function g(θ) of θ = (μ, σ), with σ² = γ, is discussed. The special case g(θ) = μ + cσ is treated in detail, and as a further special case a locally asymptotically minimax estimator of μ + cσ is derived for the normal distribution with parameters (μ, σ²). Finally, a Chernoff-type inequality and an identity are derived for a multivariate random vector X = (X1, ..., Xp) whose components Xi, 1 ≤ i ≤ p, are independent univariate elliptical random variables.
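As a sanity check on the density form f_h(x; μ, γ) = γ^(−1/2) h((x − μ)²/γ), note that choosing the radial function h(t) = (2π)^(−1/2) exp(−t/2) with γ = σ² recovers the N(μ, σ²) density. A minimal sketch (function names are ours, not the paper's notation):

```python
import math

def elliptical_density(x, mu, gamma, h):
    """Univariate elliptical density: gamma^(-1/2) * h((x - mu)^2 / gamma)."""
    return gamma ** -0.5 * h((x - mu) ** 2 / gamma)

def h_normal(t):
    """Radial function of the standard normal: (2*pi)^(-1/2) * exp(-t/2)."""
    return math.exp(-t / 2) / math.sqrt(2 * math.pi)

def normal_pdf(x, mu, sigma):
    """Ordinary N(mu, sigma^2) density, for comparison."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))
```

With γ = σ², elliptical_density(x, mu, sigma**2, h_normal) agrees with normal_pdf(x, mu, sigma) for every x; other choices of h (e.g. heavier-tailed radial functions) give other members of the elliptical family.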
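The improved estimator dominating X for p ≥ 3 is a shrinkage-type construction. For orientation only, here is the classical positive-part James–Stein estimator for the Gaussian case with identity covariance, which exhibits the same dominance phenomenon; the paper's estimator for independent elliptical components may use a different shrinkage constant, so this is an illustrative analogue, not the paper's formula:

```python
import numpy as np

def james_stein_positive_part(x):
    """Positive-part James-Stein estimator shrinking X toward the origin.

    Dominates the unbiased estimator X of the mean under squared error
    loss when p >= 3 and X ~ N(mu, I_p). Illustrative analogue only.
    """
    p = x.shape[0]
    if p < 3:
        return x  # no dominance result below dimension 3
    norm2 = float(np.dot(x, x))
    factor = max(0.0, 1.0 - (p - 2) / norm2)  # shrink, but never past zero
    return factor * x
```

For example, with x = (3, 0, 0, 0) the shrinkage factor is 1 − 2/9 = 7/9, so the estimate moves the observation toward the origin while leaving its direction unchanged.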