
Kybernetika 43(5):731-746, 2007.

Optimality Conditions for Maximizers of the Information Divergence from an Exponential Family

František Matúš


Abstract:

The information divergence of a probability measure $P$ from an exponential family $\mathcal{E}$ over a finite set is defined as the infimum of the divergences of $P$ from $Q$, subject to $Q\in\mathcal{E}$. All directional derivatives of the divergence from $\mathcal{E}$ are explicitly found. To this end, the behaviour of the conjugate of a log-Laplace transform on the boundary of its domain is analysed. First-order conditions for $P$ to be a maximizer of the divergence from $\mathcal{E}$ are presented, including new ones for the case when $P$ is not projectable to $\mathcal{E}$.
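To make the infimum defining the divergence from $\mathcal{E}$ concrete, here is a minimal numerical sketch (not from the paper): a one-parameter exponential family on the finite set $\{0,1,2,3\}$ with sufficient statistic $f(x)=x$, so $Q_\theta(x)\propto e^{\theta x}$, and the divergence $D(P\Vert\mathcal{E})=\inf_\theta D(P\Vert Q_\theta)$ found by numerical minimization. The chosen $P$ and statistic are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

# Finite set X = {0,1,2,3}; sufficient statistic f(x) = x gives the
# one-parameter exponential family Q_theta(x) proportional to exp(theta*x).
f = np.arange(4, dtype=float)

def Q(theta):
    """Member of the exponential family with natural parameter theta."""
    w = np.exp(theta * f)
    return w / w.sum()

def kl(P, Qv):
    """Kullback-Leibler divergence D(P || Q) on a finite set."""
    mask = P > 0
    return float(np.sum(P[mask] * np.log(P[mask] / Qv[mask])))

# An illustrative measure concentrated on the endpoints of X.
P = np.array([0.5, 0.0, 0.0, 0.5])

# D(P || E) = inf over theta of D(P || Q_theta).
res = minimize(lambda t: kl(P, Q(t[0])), x0=[0.0])
print(res.fun)  # ~ log 2: the minimizer is theta = 0 (uniform Q),
                # since E_Q[f] must match E_P[f] = 1.5
```

The first-order condition for the inner minimization is the familiar moment-matching equation $E_{Q_\theta}[f]=E_P[f]$; here symmetry forces $\theta=0$, i.e. the uniform distribution, giving $D(P\Vert\mathcal{E})=\log 2$.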


Keywords: Kullback--Leibler divergence; relative entropy; exponential family; information projection; log-Laplace transform; cumulant generating function; directional derivatives; first order optimality conditions; convex functions; polytopes


AMS: 94A17; 62B10; 60A10; 52A20




BibTeX

@article{kyb:2007:5:731-746,
  author = {Mat\'{u}\v{s}, Franti\v{s}ek},
  title = {Optimality Conditions for Maximizers of the Information Divergence from an Exponential Family},
  journal = {Kybernetika},
  volume = {43},
  year = {2007},
  number = {5},
  pages = {731-746},
  publisher = {{\'U}TIA, AV {\v C}R, Prague},
}

