# Cost Functions for Nonnegative Matrix Factorization: Simplicity Wins

Nonnegative Matrix Factorization (NMF, [1,2]) has become one of the most popular models in data mining, with strong empirical performance in unsupervised learning. The model can be cast as a nonlinear programming problem: one uncovers the latent structure behind the data by optimizing a cost function, and different choices of cost function lead to different numerical results. Recently, a variety of cost functions, or families of cost functions, have been proposed in the literature, including the least squares error [1,2], the K-L divergence [1,2], the phi-divergence [3], the alpha-divergence [4,5], the Bregman divergence [6], the beta-divergence [4], and the IS divergence [7]. However, the literature still lacks a systematic comparison showing which cost functions suit which concrete applications. We carried out an empirical study of this question. Our experimental results show that the two cost functions with the simplest form, the least squares error and the K-L divergence, perform best numerically, which truly confirms the old saying: simplicity wins.
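As an illustration (not part of the original post), the two best-performing cost functions both admit the classic Lee-Seung multiplicative updates [1,2]. The sketch below is a minimal NumPy implementation under random nonnegative initialization; the function names, iteration count, and smoothing constant `eps` are my own choices, not from the post:

```python
import numpy as np

def nmf_ls(V, r, n_iter=200, eps=1e-10, seed=0):
    """NMF minimizing the least-squares error ||V - WH||_F^2
    via the Lee-Seung multiplicative updates."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, r)) + eps  # random nonnegative init
    H = rng.random((r, n)) + eps
    for _ in range(n_iter):
        # each update multiplies by a nonnegative ratio, so W, H stay >= 0
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

def nmf_kl(V, r, n_iter=200, eps=1e-10, seed=0):
    """NMF minimizing the (generalized) K-L divergence D(V || WH)
    via the Lee-Seung multiplicative updates."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, r)) + eps
    H = rng.random((r, n)) + eps
    for _ in range(n_iter):
        H *= (W.T @ (V / (W @ H + eps))) / (W.sum(axis=0)[:, None] + eps)
        W *= ((V / (W @ H + eps)) @ H.T) / (H.sum(axis=1)[None, :] + eps)
    return W, H
```

Because every update multiplies the current factor by a nonnegative ratio, nonnegativity is preserved automatically without any projection step, which is a large part of why these two cost functions are so easy to work with in practice.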

1 Lee, D., Seung, H., Learning the parts of objects by nonnegative matrix factorization, Nature, 1999, 401: 788-791
2 Lee, D., Seung, H., Algorithms for nonnegative matrix factorization, NIPS, 2001: 556-562

3 Cichocki, A., Zdunek, R., Amari, S., Csiszár's divergences for non-negative matrix factorization: family of new algorithms, Berlin: Springer, 2006: 32-39

4 Cichocki, A., Zdunek, R., Choi, S., Plemmons, R., Amari, S., Nonnegative tensor factorization using alpha and beta divergences, Proc. IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP07), 2007: 1393-1396

5 Cichocki, A., Lee, H., Kim, Y. D., Choi, S., Non-negative matrix factorization with alpha-divergence, Pattern Recognition Letters, 2008, 29: 1433-1440

6 Dhillon, I. S., Sra, S., Generalized nonnegative matrix approximations with Bregman divergences, NIPS, 2005: 283-290

7 Fevotte, C., Bertin, N., Durrieu, J. L., Nonnegative matrix factorization with the Itakura-Saito divergence with application to music analysis, Neural Comput., 2009, 21: 793-830

https://blog.sciencenet.cn/blog-297051-504965.html
