
Kybernetika 43(5):747-764, 2007.

Exploiting Tensor Rank-one Decomposition in Probabilistic Inference

Petr Savicky and Jiří Vomlel


Abstract:

We propose a new additive decomposition of probability tables -- tensor rank-one decomposition. The basic idea is to decompose a probability table into a series of tables whose sum equals the original table. Each table in the series has the same domain as the original table but can be expressed as a product of one-dimensional tables. Entries in the tables are allowed to be any real numbers, i.e., they can also be negative. The possibility of negative numbers, in contrast to a multiplicative decomposition, opens new possibilities for a compact representation of probability tables. We show that tensor rank-one decomposition can be used to reduce the space and time requirements of probabilistic inference. We provide a closed-form solution for the minimal tensor rank-one decomposition of some special tables and propose a numerical algorithm for cases where a closed-form solution is not known.
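As a minimal illustration of the idea (not the authors' algorithm), for a two-dimensional table the singular value decomposition yields exactly such an additive series of rank-one tables, each a product of one-dimensional tables with possibly negative entries; numpy and the toy table values are assumptions for the sketch:

```python
import numpy as np

# A toy 2x2 "probability table" P(X, Y); the values are illustrative.
P = np.array([[0.4, 0.1],
              [0.1, 0.4]])

# SVD writes P as an additive series of rank-one tables:
#   P = sum_k s_k * outer(u_k, v_k)
# Each term is a product of one-dimensional tables, and its entries
# may be negative, unlike in a multiplicative decomposition.
U, s, Vt = np.linalg.svd(P)
terms = [s[k] * np.outer(U[:, k], Vt[k, :]) for k in range(len(s))]

# The table that is the sum of the series equals the original table.
assert np.allclose(sum(terms), P)
```

For higher-dimensional tables no such closed-form decomposition is available in general, which is why the paper develops special-case solutions and a numerical algorithm.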


Keywords: graphical probabilistic models; probabilistic inference; tensor rank;


AMS: 68T37; 62E15; 15A69;




BibTeX

@article{kyb:2007:5:747-764,

author = {Savicky, Petr and Vomlel, Ji\v{r}\'{\i}},

title = {Exploiting Tensor Rank-one Decomposition in Probabilistic Inference},

journal = {Kybernetika},

volume = {43},

year = {2007},

number = {5},

pages = {747--764},

publisher = {{\'U}TIA, AV {\v C}R, Prague},

}

