Neural networks with radial basis functions are considered, together with the Shannon information in their output about the input. The role of information-preserving input transformations is discussed when the network is specified by the maximum information principle and by the maximum likelihood principle. A transformation is found which simplifies the input structure in the sense that it minimizes the entropy within the class of all information-preserving transformations. Such a transformation need not be unique: under some assumptions it may be any minimal sufficient statistic.
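As a rough formalization of the two principles mentioned above (the notation X for the input, Y for the network output, I for Shannon mutual information and H for entropy is ours, introduced here only for illustration and not taken from the paper), the setting can be sketched as:

% Infomax principle: choose the network parameters theta so that the
% output carries maximal Shannon information about the input.
\theta^{*} \in \arg\max_{\theta} \, I\bigl(X;\, Y_{\theta}\bigr)

% Information-preserving input transformations and the entropy-minimizing
% choice among them (a sketch; under suitable assumptions such a minimizer
% coincides with a minimal sufficient statistic).
\mathcal{T} = \bigl\{\, T : I\bigl(T(X);\, Y\bigr) = I\bigl(X;\, Y\bigr) \,\bigr\},
\qquad
T^{*} \in \arg\min_{T \in \mathcal{T}} H\bigl(T(X)\bigr)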
Keywords: EM algorithm; radial basis functions; information preservation; entropy minimization; infomax principle; neural networks