# On Bahadur Efficiency of Power Divergence Statistics

Authors:
Type: Preprint
Publication Date: Feb 07, 2010
Submission Date: Feb 07, 2010
Source: arXiv
It is proved that the information divergence statistic is infinitely more Bahadur efficient than the power divergence statistics of orders $\alpha > 1$, provided that the sequence of alternatives is contiguous with respect to the sequence of null hypotheses and the number of observations per bin increases to infinity not too slowly. This improves the earlier result of Harremo\"es and Vajda (2008), where the sequence of null hypotheses was assumed to be uniform and the restrictions on the numbers of observations per bin were sharper. Moreover, this paper also evaluates the Bahadur efficiency of the power divergence statistics of the remaining positive orders $0 < \alpha \leq 1.$ The statistics of these orders are mutually Bahadur-comparable, and all of them are more Bahadur efficient than the statistics of orders $\alpha > 1.$ A detailed discussion of the technical definitions and conditions is given, some unclear points are resolved, and the results are illustrated by examples.
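For readers unfamiliar with the family under discussion: the power divergence statistics are commonly parametrized in the Cressie–Read form, a standard definition that is assumed here rather than quoted from the paper. For observed bin counts $X_1, \dots, X_k$ with $n = \sum_j X_j$ and null-hypothesis probabilities $q_1, \dots, q_k$, the statistic of order $\alpha$ may be written as

```latex
% Power divergence statistic of order \alpha (Cressie--Read form; standard
% parametrization, assumed for illustration):
D_\alpha = \frac{2}{\alpha(\alpha - 1)}
  \sum_{j=1}^{k} X_j
  \left[ \left( \frac{X_j}{n q_j} \right)^{\alpha - 1} - 1 \right],
  \qquad \alpha \neq 0, 1.
% The limit \alpha \to 1 recovers the information divergence
% (likelihood-ratio) statistic referred to in the abstract:
D_1 = 2 \sum_{j=1}^{k} X_j \ln \frac{X_j}{n q_j},
% and \alpha = 2 gives the classical Pearson chi-squared statistic.
```

Under this parametrization, the abstract's claim concerns $D_1$ (information divergence) versus $D_\alpha$ for $\alpha > 1$, with the orders $0 < \alpha \leq 1$ forming the mutually Bahadur-comparable group.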