Characteristic Function-Based Lower Bounds for Fisher Information Under Arbitrary Parametrization



DÜLEK B.

IEEE TRANSACTIONS ON AEROSPACE AND ELECTRONIC SYSTEMS, vol. 53, no. 1, pp. 501-512, 2017 (SCI-Expanded)

Abstract

Fisher information is widely used in statistical signal processing to measure the amount of information that a random variable (RV) carries about an unknown parameter of the probability distribution that models the RV. For some RVs, the exact sampling distribution is either intractable or admits no closed analytical form, which renders the computation of the Fisher information a difficult task. In those cases, it would be useful to have lower bounds that can be computed easily. In this correspondence, lower bounds for the Fisher information of a general parametric probability distribution are derived based on its characteristic function using the Fisher information inequality. The necessary and sufficient condition required to achieve these bounds is discussed. It is shown that the relations derived previously for the Fisher information of a location parameter can be obtained as special cases of the general parametric scenario investigated in this correspondence. Examples from common continuous and discrete distributions as well as mixture densities, symmetric α-stable distribution, and K-distribution are provided to corroborate the applicability and efficiency of the proposed bounds.
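To make the flavor of these bounds concrete, the sketch below evaluates one previously known location-parameter relation of this type, I(θ) ≥ sup_ω ω²|φ(ω)|², which follows from a Cauchy–Schwarz argument on the score function and is among the relations recovered as special cases of the general parametric bounds. The specific bound form, the helper cf_location_bound, and the test distributions are illustrative assumptions, not code or results from the paper.

import numpy as np
from scipy.optimize import minimize_scalar

def cf_location_bound(phi, omega_max=50.0):
    # Lower bound on the Fisher information of a location parameter:
    #   I(theta) >= sup_{omega} omega^2 * |phi(omega)|^2,
    # where phi is the characteristic function of the underlying noise.
    # (Illustrative special case; the paper treats arbitrary parametrizations.)
    res = minimize_scalar(
        lambda w: -(w ** 2) * np.abs(phi(w)) ** 2,
        bounds=(1e-6, omega_max),
        method="bounded",
    )
    return -res.fun  # flip the sign back to obtain the supremum

# Sanity check with Gaussian noise of variance sigma^2:
# phi(w) = exp(-sigma^2 w^2 / 2), the exact Fisher information is 1/sigma^2,
# and this particular bound attains exp(-1)/sigma^2.
sigma = 2.0
print(cf_location_bound(lambda w: np.exp(-0.5 * (sigma * w) ** 2)))  # ~0.0920
print(1.0 / sigma ** 2)                                              # 0.25 exact

# Symmetric alpha-stable noise (alpha = 1.5, scale gamma = 1.0): the density
# has no closed form, but phi(w) = exp(-|gamma*w|^alpha) is simple, so the
# bound remains cheap to evaluate -- exactly the situation the paper targets.
alpha, gamma = 1.5, 1.0
print(cf_location_bound(lambda w: np.exp(-np.abs(gamma * w) ** alpha)))

The same pattern applies to any distribution whose characteristic function is available in closed form (mixtures, the K-distribution, discrete laws), replacing an intractable density integral with a one-dimensional maximization.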
