The paper solves the problem of minimizing the Kullback divergence between a partially known and a completely known probability distribution. It considers two probability distributions of a random vector $(u_1, x_1, \ldots, u_T, x_T)$ on a sample space of $2T$ dimensions. One of the distributions is known completely, the other only partially: only the conditional probability distributions of $x_\tau$ given $u_1, x_1, \ldots, u_{\tau-1}, x_{\tau-1}, u_{\tau}$ are known for $\tau = 1, \ldots, T$. The objective is to determine the remaining conditional probability distributions of $u_\tau$ given $u_1, x_1, \ldots, u_{\tau-1}, x_{\tau-1}$ so that the Kullback divergence of the partially known distribution with respect to the completely known distribution is minimal. An explicit solution of this problem was found previously for Markovian systems by Karn\'{y} \cite{Karny:96a}. The general solution is given in this paper.
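A minimal sketch of the problem formulation, in notation introduced here only for illustration (the symbols $f$, $g$ and $\mathsf{D}$ are assumptions, not taken from the paper): let $f$ denote the partially known joint distribution and $g$ the completely known one. By the chain rule,
\[
f(u_1, x_1, \ldots, u_T, x_T) = \prod_{\tau=1}^{T} f(u_\tau \mid u_1, x_1, \ldots, x_{\tau-1})\, f(x_\tau \mid u_1, x_1, \ldots, x_{\tau-1}, u_\tau),
\]
where the factors $f(x_\tau \mid \cdot)$ are given and the factors $f(u_\tau \mid \cdot)$ are to be chosen so as to minimize
\[
\mathsf{D}(f \,\|\, g) = \int f(u_1, x_1, \ldots, u_T, x_T)\, \ln \frac{f(u_1, x_1, \ldots, u_T, x_T)}{g(u_1, x_1, \ldots, u_T, x_T)}\, \mathrm{d}(u_1, x_1, \ldots, u_T, x_T).
\]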
Keywords: Kullback divergence; minimization; stochastic controller
AMS: 60G35; 93E20; 90D60; 49N35