Personal Information

  • Master's Supervisor (Professor)
  • Name (Pinyin):

    shijiarong
  • School/Department:

    School of Science
  • Education Level:

    Doctorate (with Certificate of Graduation for Doctoral Study)
  • Gender:

    Male
  • Degree:

    Doctoral Degree in Engineering
  • Professional Title:

    Professor
  • Status:

    Employed
  • Alma Mater:

    Xidian University
  • Discipline:

    Mathematics

Other Contact Information

  • Email:

Robust Generalized Low Rank Approximations of Matrices

  • Release time: 2024-08-09
  • Affiliation of Author(s):

    School of Science
  • Journal:

    PloS one
  • Key Words:

    Generalized Low Rank Approximations of Matrices (Chinese keyword: 广义低秩逼近)
  • Abstract:

    In recent years, the intrinsic low rank structure of some datasets has been extensively exploited to reduce dimensionality, remove noise and complete missing entries. As a well-known technique for dimensionality reduction and data compression, Generalized Low Rank Approximations of Matrices (GLRAM) claims superiority over the SVD in computation time and compression ratio. However, GLRAM is very sensitive to sparse large noise or outliers, and its robust version has not yet been explored or solved. To address this problem, this paper proposes a robust method for GLRAM, named Robust GLRAM (RGLRAM). We first formulate RGLRAM as an l1-norm optimization problem which minimizes the l1-norm of the approximation errors. Secondly, we apply the technique of Augmented Lagrange Multipliers (ALM) to solve this l1-norm minimization problem and derive a corresponding iterative scheme. Then the weak convergence of the proposed algorithm is discussed under mild conditions. Next, we investigate a special case of RGLRAM and extend RGLRAM to a general tensor case. Finally, extensive experiments on synthetic data show that RGLRAM can exactly recover both the low rank and the sparse components where previous state-of-the-art algorithms may fail. We also discuss three issues concerning RGLRAM: the sensitivity to initialization, the generalization ability, and the relationship between the running time and the size/number of matrices. Moreover, experimental results on images of faces with large corruptions illustrate that RGLRAM achieves better denoising and compression performance than other methods.
  • Note:

    SCI
  • First Author:

    zhengxiuyun, yangwei, shijiarong
  • Indexed by:

    Journal paper
  • Volume:

    10
  • Issue:

    9
  • Page Number:

  • Translation or Not:

    no
  • Date of Publication:

    2015-09-01
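
As a rough illustration of the approach described in the abstract, an inexact ALM iteration for RGLRAM can be sketched as below. Each matrix A_i is decomposed as L M_i Rᵀ + E_i with L, R column-orthonormal and E_i sparse, minimizing the l1-norm of the errors. This is a minimal sketch under standard inexact-ALM conventions (soft-thresholding for the l1 term, eigen-updates for the two transforms); the function name `rglram`, the initialization, and the penalty schedule are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def rglram(As, l1, l2, iters=100, rho=1.5):
    """Sketch of Robust GLRAM via an inexact ALM scheme.

    Decomposes each r x c matrix A_i as L @ M_i @ R.T + E_i, with
    L (r x l1) and R (c x l2) column-orthonormal and E_i sparse.
    """
    r, c = As[0].shape
    n = len(As)
    L = np.eye(r, l1)                          # initial left transform
    R = np.eye(c, l2)                          # initial right transform
    Es = [np.zeros((r, c)) for _ in range(n)]  # sparse error terms
    Ys = [np.zeros((r, c)) for _ in range(n)]  # Lagrange multipliers
    mu = 1.0 / max(np.linalg.norm(A, 2) for A in As)  # penalty parameter
    for _ in range(iters):
        # data with current error estimate removed, multiplier folded in
        Ds = [A - E + Y / mu for A, E, Y in zip(As, Es, Ys)]
        # update L: top-l1 eigenvectors of sum_i D_i R R^T D_i^T
        SL = sum(D @ R @ R.T @ D.T for D in Ds)
        L = np.linalg.eigh(SL)[1][:, -l1:]
        # update R: top-l2 eigenvectors of sum_i D_i^T L L^T D_i
        SR = sum(D.T @ L @ L.T @ D for D in Ds)
        R = np.linalg.eigh(SR)[1][:, -l2:]
        Ms = [L.T @ D @ R for D in Ds]
        for i in range(n):
            # soft-threshold the residual to get the sparse errors
            T = As[i] - L @ Ms[i] @ R.T + Ys[i] / mu
            Es[i] = np.sign(T) * np.maximum(np.abs(T) - 1.0 / mu, 0.0)
            # dual (multiplier) update on the constraint violation
            Ys[i] = Ys[i] + mu * (As[i] - L @ Ms[i] @ R.T - Es[i])
        mu *= rho  # grow the penalty parameter
    return L, Ms, R, Es
```

On synthetic data built as a shared low rank structure plus sparse corruption, the constraint A_i = L M_i Rᵀ + E_i is driven to near-exact satisfaction as the penalty grows, mirroring the exact-recovery experiments the abstract reports.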