
[Repost] Matrix Factorization, Algorithms, Applications, and Avaliab



Some diligent folks in the US have collected nearly all of the matrix factorization algorithms and applications out there. Since the original page sits beyond a certain mysterious substance (i.e., it is blocked here), it is reposted below. Original address:

Matrix decomposition has a long history and generally centers around a set of known factorizations such as LU, QR, SVD and eigendecompositions. More recent factorizations have seen the light of day, starting with the advent of NMF, k-means and related algorithms [1]. However, with the advent of new methods based on random projections and convex optimization that started in part in the compressive sensing literature, we are seeing another surge of very diverse algorithms dedicated to many different kinds of matrix factorizations with new constraints based on rank and/or positivity and/or sparsity, … As a result of this large increase in interest, I have decided to keep a list of them here, following the success of the big picture in compressive sensing.

The sources for this list include the following most excellent sites: Stephen Becker's page, Raghunandan H. Keshavan's page, Nuclear Norm and Matrix Recovery through SDP by Christoph Helmberg, and Arvind Ganesh's Low-Rank Matrix Recovery and Completion via Convex Optimization, which provide more in-depth information. Additional codes were also featured on Nuit Blanche. The following people provided additional inputs: Olivier Grisel, Matthieu Puigt.

Most of the algorithms listed below rely on the nuclear norm as a proxy for the rank functional, which may not be optimal. Currently, CVX (Michael Grant and Stephen Boyd) conveniently allows one to explore other proxies for the rank functional, such as the log-det heuristic of Maryam Fazel, Haitham Hindi and Stephen Boyd. ** is used to mark algorithms that use a heuristic other than the nuclear norm.

In terms of notation, A refers to a matrix, L to a low-rank matrix, S to a sparse one, and N to a noisy one. This page lists the different codes that implement the following matrix factorizations: Matrix Completion, Robust PCA, Noisy Robust PCA, Sparse PCA, NMF, Dictionary Learning, MMV, Randomized Algorithms and other factorizations. Some of these toolboxes implement several of these decompositions and are listed accordingly. Before I list an algorithm here, I generally feature it on Nuit Blanche under the MF tag: http://nuit-blanche.blogspot.com/search/label/MF, or you can subscribe to the Nuit Blanche feed.

Matrix Completion: A = H.*L with H a known mask and L unknown; solve for the L of lowest possible rank.

The idea of this approach is to complete the unknown coefficients of a matrix based on the fact that the matrix is low rank:
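To make the idea concrete, here is a minimal numpy sketch of singular value thresholding (in the spirit of Cai, Candès and Shen), one common way to attack the nuclear-norm version of this problem; the function name svt_complete and the parameter defaults are illustrative choices, not taken from any package listed here:

```python
import numpy as np

def svt_complete(A, mask, tau=5.0, step=1.2, iters=200):
    """Singular value thresholding sketch: fill in the entries of A
    where mask == 0 with a low-rank estimate L (mask == 1 marks the
    observed entries)."""
    Y = np.zeros_like(A)
    for _ in range(iters):
        # shrink the singular values: proximal step on the nuclear norm
        U, s, Vt = np.linalg.svd(Y, full_matrices=False)
        L = (U * np.maximum(s - tau, 0.0)) @ Vt
        # dual ascent on the observed entries only
        Y += step * mask * (A - L)
    return L

# toy usage: recover a rank-2 matrix from half of its entries
rng = np.random.default_rng(0)
L_true = rng.standard_normal((50, 2)) @ rng.standard_normal((2, 50))
mask = (rng.random((50, 50)) < 0.5).astype(float)
L_hat = svt_complete(mask * L_true, mask)
```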

Noisy Robust PCA: A = L + S + N with L, S, N unknown; solve for L low rank, S sparse, N noise.

Robust PCA: A = L + S with L, S unknown; solve for L low rank, S sparse.
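For a feel of how such a split is computed, here is a compact numpy sketch of Principal Component Pursuit solved with an inexact augmented Lagrangian scheme; the step-size heuristic and stopping rule are common conventions, and the names rpca_pcp and shrink are for this demo only:

```python
import numpy as np

def shrink(X, tau):
    """Entrywise soft thresholding (the prox of the l1 norm)."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def rpca_pcp(A, iters=500, tol=1e-7):
    """Principal Component Pursuit via an inexact augmented
    Lagrangian: split A into L (low rank) + S (sparse)."""
    m, n = A.shape
    lam = 1.0 / np.sqrt(max(m, n))        # standard PCP weight
    mu = 0.25 * m * n / np.abs(A).sum()   # common step-size heuristic
    Y = np.zeros_like(A)                  # dual variable
    S = np.zeros_like(A)
    for _ in range(iters):
        # L-update: singular value thresholding
        U, s, Vt = np.linalg.svd(A - S + Y / mu, full_matrices=False)
        L = (U * np.maximum(s - 1.0 / mu, 0.0)) @ Vt
        # S-update: entrywise soft thresholding
        S = shrink(A - L + Y / mu, lam / mu)
        resid = A - L - S
        Y += mu * resid
        if np.linalg.norm(resid) <= tol * np.linalg.norm(A):
            break
    return L, S
```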

Sparse PCA: A = DX with unknown D and X; solve for sparse D (a usage sketch follows the list below).

Sparse PCA on wikipedia

  • R. Jenatton, G. Obozinski, F. Bach. Structured Sparse Principal Component Analysis. International Conference on Artificial Intelligence and Statistics (AISTATS). [pdf] [code]

  • SPAMS

  • DSPCA: Sparse PCA using SDP. Code is here.

  • PathPCA: A fast greedy algorithm for Sparse PCA. The code is here.
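For a quick hands-on impression (using scikit-learn's SparsePCA rather than any of the codes listed above), a toy run looks like this; the data sizes and the alpha value are arbitrary:

```python
import numpy as np
from sklearn.decomposition import SparsePCA

# toy data: 100 samples in 20 dimensions
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))

# alpha controls how sparse the learned components are
spca = SparsePCA(n_components=5, alpha=1.0, random_state=0)
codes = spca.fit_transform(X)    # per-sample loadings
D = spca.components_             # the sparse components (rows)
print(D.shape, np.mean(D == 0))  # fraction of exactly-zero entries
```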

Dictionary Learning: A = DX with unknown D and X; solve for sparse X.

Some implementations of dictionary learning also implement NMF.
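A small usage sketch with scikit-learn's DictionaryLearning (again, not one of the dedicated packages discussed here); the sizes and the sparsity target are illustrative:

```python
import numpy as np
from sklearn.decomposition import DictionaryLearning

rng = np.random.default_rng(0)
A = rng.standard_normal((200, 30))   # each row is a signal

# learn an overcomplete dictionary; the codes are forced to be
# sparse via OMP with at most 5 nonzeros per signal
dl = DictionaryLearning(n_components=50, transform_algorithm='omp',
                        transform_n_nonzero_coefs=5, random_state=0)
X = dl.fit_transform(A)   # sparse codes (sklearn convention: A ~ X @ D)
D = dl.components_        # dictionary atoms as rows
print(np.mean(X != 0))    # average fraction of active atoms
```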

NMF: A = DX with unknown D and X; solve for D and X with all elements nonnegative.

Non-negative Matrix Factorization (NMF) on wikipedia
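The classic Lee-Seung multiplicative updates make for a self-contained numpy sketch of A ≈ DX with nonnegative factors; nmf_mu and its defaults are illustrative rather than a reference implementation:

```python
import numpy as np

def nmf_mu(A, r, iters=500, eps=1e-9):
    """Lee-Seung multiplicative updates for A ~ D @ X with D, X >= 0,
    minimizing the Frobenius error. A is assumed entrywise
    nonnegative; eps guards against division by zero."""
    m, n = A.shape
    rng = np.random.default_rng(0)
    D = rng.random((m, r)) + eps
    X = rng.random((r, n)) + eps
    for _ in range(iters):
        X *= (D.T @ A) / (D.T @ D @ X + eps)   # update codes
        D *= (A @ X.T) / (D @ X @ X.T + eps)   # update basis
    return D, X
```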

Multiple Measurement Vector (MMV): Y = AX with X unknown and the rows of X sparse (all columns of X share the same sparse row support).
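One standard greedy strategy for MMV is Simultaneous Orthogonal Matching Pursuit, which exploits that shared row support; this numpy sketch (somp is a made-up name for the demo) shows the idea:

```python
import numpy as np

def somp(Y, A, k):
    """Simultaneous OMP: greedily pick the k columns (atoms) of A
    that jointly explain every measurement vector in Y, so the
    recovered X is row-sparse with at most k nonzero rows."""
    R = Y.copy()
    support = []
    for _ in range(k):
        # aggregate correlation of each atom with the current residual
        scores = np.linalg.norm(A.T @ R, axis=1)
        scores[support] = -np.inf            # never reuse an atom
        support.append(int(np.argmax(scores)))
        # joint least-squares fit on the selected atoms
        X_s, *_ = np.linalg.lstsq(A[:, support], Y, rcond=None)
        R = Y - A[:, support] @ X_s
    X = np.zeros((A.shape[1], Y.shape[1]))
    X[support] = X_s
    return X
```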

Blind Source Separation (BSS): Y = AX with A and X unknown, assuming statistical independence between the columns of X, or between subspaces of columns of X.

This includes Independent Component Analysis (ICA), Independent Subspace Analysis (ISA), and Sparse Component Analysis (SCA). Many codes are available for ICA and some for SCA. Here is a non-exhaustive list of some famous ones (which are not limited to linear instantaneous mixtures). TBC

ICA:
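As a minimal working example (scikit-learn's FastICA stands in here for the dedicated toolboxes), two artificially mixed sources can be unmixed as follows; the sources and mixing matrix are toy choices:

```python
import numpy as np
from sklearn.decomposition import FastICA

# two independent toy sources, mixed by an "unknown" matrix
rng = np.random.default_rng(0)
t = np.linspace(0, 8, 2000)
S = np.c_[np.sin(3 * t), np.sign(np.sin(5 * t))]  # sources as columns
A_mix = np.array([[1.0, 0.5],
                  [0.4, 1.2]])
Y = S @ A_mix.T                                   # observed mixtures

ica = FastICA(n_components=2, random_state=0)
S_hat = ica.fit_transform(Y)   # recovered sources (up to order/scale)
A_hat = ica.mixing_            # estimated mixing matrix
```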

SCA:

Randomized Algorithms

These algorithms generally use random projections to shrink very large problems into smaller ones that are amenable to traditional matrix factorization methods.

Resources:

  • Randomized algorithms for matrices and data by Michael W. Mahoney
  • Randomized Algorithms for Low-Rank Matrix Decomposition
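To make this concrete, here is a numpy sketch of the randomized SVD in the spirit of Halko, Martinsson and Tropp; the oversampling and power-iteration defaults are conventional but arbitrary:

```python
import numpy as np

def randomized_svd(A, k, oversample=10, n_iter=2, seed=0):
    """Randomized SVD: sample the range of A with a Gaussian test
    matrix, optionally sharpen it with power iterations, then do an
    exact SVD on the small projected matrix."""
    rng = np.random.default_rng(seed)
    n = A.shape[1]
    # 1. randomized range finder
    Q, _ = np.linalg.qr(A @ rng.standard_normal((n, k + oversample)))
    # 2. power iterations help when the spectrum decays slowly
    for _ in range(n_iter):
        Q, _ = np.linalg.qr(A.T @ Q)
        Q, _ = np.linalg.qr(A @ Q)
    # 3. exact SVD of the small (k + oversample)-by-n matrix Q^T A
    U_small, s, Vt = np.linalg.svd(Q.T @ A, full_matrices=False)
    return (Q @ U_small)[:, :k], s[:k], Vt[:k]
```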

Other factorizations

D(T(·)) = L + E with L, E and the transformation T unknown; solve for the transformation T, the low-rank L and the noise E.

Frameworks featuring advanced Matrix factorizations

For the time being, few have integrated the most recent factorizations.

GraphLab / Hadoop

Books

Examples of use

Sources

Arvind Ganesh’s Low-Rank Matrix Recovery and Completion via Convex Optimization

Relevant links

Reference:

[1] A Unified View of Matrix Factorization Models, by Ajit P. Singh and Geoffrey J. Gordon.



