Upper Bound of Bayesian Generalization Error in Nonnegative Matrix Factorization

Nonnegative matrix factorization (NMF) is a knowledge discovery method used in text mining, signal processing, bioinformatics, and consumer analysis. However, its basic properties as a learning machine have not yet been clarified, because it is not a regular statistical model; as a result, a theoretical optimization method for NMF has not yet been established. In this paper, we study the real log canonical threshold of NMF and give an upper bound of the generalization error in Bayesian learning. The results show that the generalization error of matrix factorization can be made smaller than that of regular statistical models if Bayesian learning is applied.
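For readers unfamiliar with NMF itself, the sketch below shows the basic factorization task the paper studies: approximating a nonnegative matrix X as a product W H with nonnegative factors. It uses the standard Lee-Seung multiplicative updates for the Frobenius loss, which is a common point-estimation algorithm and not the Bayesian learning analyzed in the paper; the function name and parameters are illustrative.

```python
import numpy as np

def nmf_multiplicative(X, k, n_iter=500, eps=1e-9, seed=0):
    """Approximate a nonnegative matrix X (m x n) as W @ H,
    with W (m x k) and H (k x n) both nonnegative.

    Uses Lee-Seung multiplicative updates that decrease the
    Frobenius-norm objective ||X - W H||_F^2. This is a standard
    maximum-likelihood-style NMF algorithm, shown only to illustrate
    the model; it is not the Bayesian method studied in the paper.
    """
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, k)) + eps
    H = rng.random((k, n)) + eps
    for _ in range(n_iter):
        # Multiplicative updates keep W and H entrywise nonnegative.
        H *= (W.T @ X) / (W.T @ W @ H + eps)
        W *= (X @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Example: factor a matrix that is exactly rank 2 and nonnegative.
rng = np.random.default_rng(1)
X = rng.random((6, 2)) @ rng.random((2, 5))
W, H = nmf_multiplicative(X, k=2)
print(np.linalg.norm(X - W @ H))  # reconstruction error (small)
```

Because NMF is not a regular statistical model, the posterior over (W, H) is singular at rank-deficient points; the paper's result bounds how this singularity affects the Bayesian generalization error.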
by Naoki Hayashi, Sumio Watanabe
https://arxiv.org/pdf/1612.04112v5.pdf