
Kybernetika 30(2):187-198, 1994.

Second-Order Approximation of the Entropy in Nonlinear Least-Squares Estimation

Luc Pronzato and Andrej Pázman


Abstract:

Measures of variability of the least-squares estimator $\hat{\theta}$ are essential to assess the quality of the estimation. In nonlinear regression, an accurate approximation of the covariance matrix of $\hat{\theta}$ is difficult to obtain (Clarke 80). In this paper, a second-order approximation of the entropy of the distribution of $\hat{\theta}$ is proposed, which is only slightly more complicated than the widely used bias approximation of Box (Box 71). It is based on the "flat" or "saddle-point" approximation of the density of $\hat{\theta}$. The neglected terms are of order ${\cal O}(\sigma^4)$, whereas the classical first-order approximation neglects terms of order ${\cal O}(\sigma^2)$. Various illustrative examples are presented, including the use of the approximate entropy as a criterion for experimental design.
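As background for the abstract, the classical first-order approach it improves upon treats $\hat{\theta}$ as Gaussian with covariance $\sigma^2 (J^\top J)^{-1}$, where $J$ is the Jacobian of the model response, and takes the entropy of that Gaussian. The sketch below illustrates this baseline (not the paper's second-order method) for a hypothetical one-parameter exponential regression model; the design points and parameter values are invented for illustration.

```python
import numpy as np

def gaussian_entropy(V):
    """Entropy (in nats) of a Gaussian with covariance matrix V:
    H = 0.5 * log((2*pi*e)^p * det(V)) for p-dimensional V."""
    p = V.shape[0]
    _, logdet = np.linalg.slogdet(V)
    return 0.5 * (p * np.log(2 * np.pi * np.e) + logdet)

# Hypothetical nonlinear model: eta_i(theta) = exp(-theta * x_i),
# observed with i.i.d. N(0, sigma^2) errors.
x = np.array([0.5, 1.0, 1.5, 2.0])   # design points (illustrative)
theta0, sigma = 1.0, 0.1             # true parameter and error std (illustrative)

# Jacobian of eta w.r.t. theta, evaluated at theta0 (p = 1 here).
J = (-x * np.exp(-theta0 * x)).reshape(-1, 1)

# First-order (asymptotic) covariance approximation of the LS estimator.
V = sigma**2 * np.linalg.inv(J.T @ J)

H1 = gaussian_entropy(V)
print(H1)
```

In this first-order picture the neglected terms are ${\cal O}(\sigma^2)$; the paper's second-order approximation adds curvature corrections so that only ${\cal O}(\sigma^4)$ terms are dropped. Minimizing such an entropy over the design points `x` is the experimental-design use mentioned at the end of the abstract.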




BibTeX

@article{kyb:1994:2:187-198,
  author    = {Pronzato, Luc and P\'{a}zman, Andrej},
  title     = {Second-Order Approximation of the Entropy in Nonlinear Least-Squares Estimation},
  journal   = {Kybernetika},
  volume    = {30},
  year      = {1994},
  number    = {2},
  pages     = {187--198},
  publisher = {{\'U}TIA, AV {\v C}R, Prague},
}

