Kybernetika 34(4):485-494, 1998.

About the Maximum Information and Maximum Likelihood Principles.

Igor Vajda and Jiří Grim


Abstract:

Neural networks with radial basis functions are considered, together with the Shannon information in their output about the input. The role of information-preserving input transformations is discussed when the network is specified by the maximum information principle and by the maximum likelihood principle. A transformation is found which simplifies the input structure in the sense that it minimizes the entropy within the class of all information-preserving transformations. Such a transformation need not be unique; under some assumptions it may be any minimal sufficient statistic.
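The claim in the abstract can be illustrated on a toy discrete example (a hypothetical sketch, not taken from the paper): merging input states with identical conditional class distributions — a sufficient statistic for the class variable — preserves the Shannon information about the class while reducing the entropy of the input.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability vector."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(pxy):
    """Mutual information I(X;Y) from a joint distribution matrix."""
    px = pxy.sum(axis=1)
    py = pxy.sum(axis=0)
    return entropy(px) + entropy(py) - entropy(pxy.ravel())

# Joint distribution of X (4 states, rows) and Y (2 classes, columns).
# States 0 and 1 have the same conditional distribution P(Y|X), so
# merging them is an information-preserving transformation T.
pxy = np.array([[0.10, 0.10],
                [0.05, 0.05],
                [0.30, 0.05],
                [0.05, 0.30]])

# T merges states 0 and 1 into one state of T(X).
ptY = np.vstack([pxy[0] + pxy[1], pxy[2], pxy[3]])

print(mutual_information(pxy))   # I(X;Y)
print(mutual_information(ptY))   # I(T(X);Y) -- unchanged
print(entropy(pxy.sum(axis=1)))  # H(X)
print(entropy(ptY.sum(axis=1)))  # H(T(X)) < H(X)
```

In this sketch the transformation preserves I(X;Y) exactly while strictly lowering the input entropy, which is the sense in which the paper's entropy-minimizing transformation "simplifies the input structure".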


Keywords: EM algorithm; radial basis functions; information preservation; entropy minimization; infomax principle; neural networks




BIB TeX

@article{kyb:1998:4:485-494,

author = {Vajda, Igor and Grim, Ji\v{r}\'{\i}},

title = {About the Maximum Information and Maximum Likelihood Principles.},

journal = {Kybernetika},

volume = {34},

year = {1998},

number = {4},

pages = {485-494},

publisher = {{\'U}TIA, AV {\v C}R, Prague},

}
