Minimum-entropy estimation in semi-parametric models

Research output: Contribution to journal › Article › peer-review

Abstract

In regression problems where the density f of the errors is not known, maximum likelihood is inapplicable, and the use of alternative techniques like least squares or robust M-estimation generally implies inefficient estimation of the parameters. The search for adaptive estimators, that is, estimators that remain asymptotically efficient independently of the knowledge of f, has received a lot of attention; see in particular (Proceedings of the Third Berkeley Symposium on Mathematical Statistics and Probability, vol. 1, 1956, pp. 187; Ann. Stat. 3(2) (1975) 267; Ann. Stat. 10 (1982) 647) and the review paper (Econometric Rev. 3(2) (1984) 145). This paper considers a minimum-entropy parametric estimator that minimizes an estimate of the entropy of the distribution of the residuals. A first construction connects the method with the Stone-Bickel approach, where the estimation is decomposed into two steps. Then we consider a direct approach that does not involve any preliminary √n-consistent estimator. Some results are given that illustrate the good performance of minimum-entropy estimation for reasonable sample sizes when compared to standard methods, in particular concerning robustness in the presence of outliers.
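The core idea of the abstract can be sketched in a few lines: fit the regression parameter by minimizing an estimate of the differential entropy of the residuals rather than their squared norm. The sketch below is illustrative only and is not the paper's estimator: it assumes a simple scalar-slope linear model, a Gaussian-kernel plug-in (resubstitution) entropy estimate with Silverman's bandwidth rule, and a grid search in place of a proper optimizer; all function names and parameter choices here are hypothetical.

```python
import numpy as np

def entropy_estimate(r, bandwidth=None):
    # Plug-in (resubstitution) estimate of the differential entropy of the
    # residuals r, using a Gaussian kernel density estimate. Bandwidth
    # defaults to Silverman's rule of thumb (an assumption, not the paper's).
    n = r.size
    if bandwidth is None:
        bandwidth = 1.06 * r.std() * n ** (-1 / 5)
    d = (r[:, None] - r[None, :]) / bandwidth
    dens = np.exp(-0.5 * d**2).sum(axis=1) / (n * bandwidth * np.sqrt(2 * np.pi))
    return -np.log(dens).mean()

def min_entropy_slope(x, y, grid):
    # Minimum-entropy estimator for a scalar slope: pick the candidate
    # value whose residuals have the smallest estimated entropy.
    entropies = [entropy_estimate(y - t * x) for t in grid]
    return grid[int(np.argmin(entropies))]

rng = np.random.default_rng(0)
x = rng.normal(size=200)
theta_true = 2.0
# Heavy-tailed Student-t errors, a setting where least squares is
# inefficient and sensitive to outliers.
e = rng.standard_t(df=2, size=200)
y = theta_true * x + e
grid = np.linspace(1.0, 3.0, 201)
theta_hat = min_entropy_slope(x, y, grid)
```

With heavy-tailed errors, moving the slope away from its true value mixes an extra x-dependent component into the residuals and inflates their entropy, which is why the entropy criterion has a minimum near the true parameter without requiring f to be known.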

Original language: English
Pages (from-to): 937-949
Journal: Signal Processing
Volume: 85
Issue number: 5
DOIs
Publication status: Published - 2005
Externally published: Yes
Event: Proceedings - IEEE International Conference on Acoustics, Speech, and Signal Processing - Montreal, Que., Canada
Duration: 17 May 2004 - 21 May 2004

Keywords

  • Adaptive estimation
  • Efficiency
  • Entropy
  • Outliers
  • Parameter estimation
  • Robustness
  • Semi-parametric models
