| Title |
IMPROVED ESTIMATORS OF BREGMAN DIVERGENCE FOR MODEL SELECTION IN SMALL SAMPLES |
| Authors |
Papa Dario [1],
Nkurunziza Jean de Dieu [2],
Ogouyandjou Carlos [3]
| Journal: |
Far East Journal of Theoretical Statistics |
| Journal Category: |
International
| Impact factor: |
| Journal Volume: |
54 |
| DOI: |
10.17654/TS054020189 |
| Abstract |
Recently, in [1, 2], Bromideh introduced the Kullback-Leibler divergence (KLD) test statistic for discriminating between two models. It was found that the ratio of minimized Kullback-Leibler divergences (RMKLD) works better than the ratio of maximized likelihoods (RML) for small sample sizes. The aim of this paper is to generalize the work of Ali-Akbar Bromideh by proposing a hypothesis test based on the Bregman divergence (BD) in order to improve the model selection process. We investigate the problem of model choice and propose a unified method for model selection and estimation with the desired theoretical properties and computational convenience. After observing n data points of unknown density f, we first measure the closeness between the bias-reduced kernel density estimator and the first estimated candidate model, and then between the bias-reduced kernel density estimator and the second estimated candidate model. In both cases we use the BD together with the bias-reduced kernel estimator of [3], which improves the convergence rates of kernel density estimators. We establish the asymptotic properties of the BD estimator and deduce approximations of the power functions. A multi-step MLE process is used to estimate the parameters of the models. We illustrate the applicability of the BD on a real data set and on a data-generating process (DGP). Monte Carlo simulation and numerical analysis are used to interpret the results.
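
To make the selection rule concrete, here is a minimal sketch; it is not the authors' implementation. It assumes the standard Bregman divergence D_phi(f, g) = integral of [phi(f(x)) - phi(g(x)) - phi'(g(x))(f(x) - g(x))] dx for a strictly convex phi, uses an ordinary Gaussian KDE as a stand-in for the bias-reduced kernel estimator of [3], plain MLE in place of the paper's multi-step MLE, and a hypothetical gamma DGP with gamma and log-normal candidate models. With phi(t) = t log t, the BD reduces to the Kullback-Leibler divergence, recovering the KLD setting of [1, 2].

import numpy as np
from scipy import stats

def bregman_divergence(f, g, x, phi, dphi):
    # Numerical D_phi(f || g): integrate phi(f) - phi(g) - phi'(g)(f - g)
    # over the grid x with the trapezoidal rule.
    integrand = phi(f) - phi(g) - dphi(g) * (f - g)
    return np.trapz(integrand, x)

# phi(t) = t*log(t) recovers the KLD; eps guards against log(0).
eps = 1e-12
phi = lambda t: t * np.log(t + eps)
dphi = lambda t: np.log(t + eps) + 1.0

# Hypothetical data-generating process standing in for the paper's DGP.
rng = np.random.default_rng(0)
data = rng.gamma(shape=2.0, scale=1.5, size=50)  # deliberately small n

# Pilot density: ordinary Gaussian KDE, a stand-in for the
# bias-reduced kernel estimator of [3].
kde = stats.gaussian_kde(data)

# Candidate models fitted by plain MLE (the paper uses a multi-step MLE).
cand1 = stats.gamma(*stats.gamma.fit(data, floc=0))
cand2 = stats.lognorm(*stats.lognorm.fit(data, floc=0))

x = np.linspace(1e-3, data.max() * 2, 2000)
f_hat = kde(x)
d1 = bregman_divergence(f_hat, cand1.pdf(x), x, phi, dphi)
d2 = bregman_divergence(f_hat, cand2.pdf(x), x, phi, dphi)
print("BD(KDE, gamma)      =", round(d1, 4))
print("BD(KDE, log-normal) =", round(d2, 4))
print("selected model:", "gamma" if d1 < d2 else "log-normal")

Choosing a different strictly convex phi, for example phi(t) = t^2, which yields the squared L2 distance, selects another member of the Bregman family without changing the decision rule: the candidate closer to the kernel estimate is retained.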
| Keywords |
Bias-reduced kernel estimator, Bregman divergence, hypothesis test
| Pages |
189 - 224 |
| File |
(PDF) |