The paper solves the problem of minimizing the Kullback divergence between a partially known and a completely known probability distribution. It considers two probability distributions of a random vector $(u_1, x_1, \dots, u_T, x_T)$ on a sample space of $2T$ dimensions. One of the distributions is completely known, while the other is known only partially: only the conditional probability distributions of $x_\tau$ given $u_1, x_1, \dots, u_{\tau-1}, x_{\tau-1}, u_\tau$ are given for $\tau = 1, \dots, T$. The objective is to determine the remaining conditional probability distributions of $u_\tau$ given $u_1, x_1, \dots, u_{\tau-1}, x_{\tau-1}$ such that the Kullback divergence of the partially known distribution with respect to the completely known distribution is minimal. An explicit solution of this problem was found previously for Markovian systems by Kárný. This paper gives the general solution.
Keywords: Kullback divergence; minimization; stochastic controller
AMS: 60G35; 93E20; 90D60; 49N35
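To make the setup concrete, the following is a minimal LaTeX sketch of the optimization problem described in the abstract. The symbols $f$ (partially known distribution), $g$ (completely known distribution) and the history shorthand $d(\tau)$ are notational assumptions introduced here and need not match the paper's own notation.

% Sketch of the minimization problem from the abstract (notation assumed here).
\documentclass{article}
\usepackage{amsmath}
\begin{document}

Write $d(\tau) = (u_1, x_1, \dots, u_\tau, x_\tau)$ for the history up to time
$\tau$, with $d(0)$ empty. By the chain rule, the partially known distribution
factorizes as
\begin{equation*}
  f\bigl(d(T)\bigr)
  = \prod_{\tau=1}^{T}
      f\bigl(x_\tau \mid d(\tau-1), u_\tau\bigr)\,
      f\bigl(u_\tau \mid d(\tau-1)\bigr),
\end{equation*}
where the factors $f(x_\tau \mid d(\tau-1), u_\tau)$ are given and the factors
$f(u_\tau \mid d(\tau-1))$ are free. The task is to choose the free factors so
that the Kullback divergence with respect to the completely known
distribution $g$ is minimal:
\begin{equation*}
  \min_{\{f(u_\tau \mid d(\tau-1))\}_{\tau=1}^{T}}
  D\bigl(f \,\Vert\, g\bigr)
  = \min \int f\bigl(d(T)\bigr)
      \ln \frac{f\bigl(d(T)\bigr)}{g\bigl(d(T)\bigr)} \, \mathrm{d}\,d(T).
\end{equation*}

\end{document}

The chain-rule factorization only makes explicit which conditional factors are fixed and which are to be chosen; it does not presuppose any Markov structure, in line with the general (non-Markovian) solution announced in the abstract.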