BACK to VOLUME 34 NO.5
Abstract:
Perceptron approximations of general Bayes decision rules based on sufficient-statistic inputs are considered, with particular attention to Bayes discrimination and classification. For exponentially distributed data with a known model, it is shown that a perceptron with one hidden layer suffices and that learning is restricted to the synaptic weights of the output neuron. If only the dimension of the exponential model is known, the number of hidden layers increases by one, and the synaptic weights of the neurons in both hidden layers must also be learned.
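As an illustration of the known-model case described above (a sketch, not code from the paper), the construction can be mimicked in NumPy for a two-class Gaussian model with unit variance, where the sufficient statistic is T(x) = x: the hidden layer is fixed to compute the sufficient statistics, and only the output neuron's synaptic weights are learned.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two-class data from a known exponential model: Gaussians with unit
# variance and means -1 and +1.  The sufficient statistic is T(x) = x,
# so the Bayes discriminant is linear in T(x).
n = 2000
y = rng.integers(0, 2, n)
x = rng.normal(2.0 * y - 1.0, 1.0)

def hidden(x):
    # "Hidden layer": fixed units computing the sufficient statistic
    # plus a bias unit.  Nothing is learned here when the model is known.
    return np.column_stack([x, np.ones_like(x)])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Output neuron: a sigmoid unit whose synaptic weights are the only
# learned parameters, trained by gradient descent on the logistic loss.
H = hidden(x)
w = np.zeros(H.shape[1])
for _ in range(500):
    p = sigmoid(H @ w)
    w -= 0.1 * H.T @ (p - y) / n

# The learned rule approximates the Bayes rule (classify by sign of x).
pred = (sigmoid(hidden(x) @ w) > 0.5).astype(int)
accuracy = (pred == y).mean()
```

For this model the Bayes accuracy is about 0.84, and the single learned output neuron approaches it because the discriminant is linear in the sufficient statistic. In the unknown-model case the abstract describes, the statistic-computing layer itself would also have to be learned.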