Shi Jiarong (史加荣)

Professor    Supervisor of Master's Students

Personal Information
  • Name (pinyin): shijiarong
  • Department: School of Science
  • Education: Doctoral graduate
  • Gender: Male
  • Degree: Doctor of Engineering
  • Employment status: In service

Other Contact Information

Email:

Publications


Alternating direction method of multipliers for generalized low-rank tensor recovery

Posted: 2024-08-09
Affiliation: School of Science
Journal: Algorithms
Keywords: low-rank tensor recovery; low-rank matrix recovery; nuclear norm minimization
Abstract: Low-Rank Tensor Recovery (LRTR), the higher-order generalization of Low-Rank Matrix Recovery (LRMR), is especially suitable for analyzing multi-linear data with gross corruptions, outliers and missing values, and it attracts broad attention in the fields of computer vision, machine learning and data mining. This paper considers a generalized model of LRTR and attempts to recover simultaneously the low-rank, the sparse, and the small-disturbance components from partial entries of a given data tensor. Specifically, we first describe generalized LRTR as a tensor nuclear norm optimization problem that minimizes a weighted combination of the tensor nuclear norm, the l1-norm and the Frobenius norm under linear constraints. Then, the technique of the Alternating Direction Method of Multipliers (ADMM) is employed to solve the proposed minimization problem. Next, we discuss the weak convergence of the proposed iterative algorithm. Finally, experimental results on synthetic and real-world datasets validate the efficiency and effectiveness of the proposed method.
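As a rough sketch of the model described in the abstract (the notation and weights here are assumed, not quoted from the paper), the generalized LRTR problem can be written with a low-rank tensor L, a sparse tensor S, a small-disturbance tensor E, the observed data tensor X, the projection P_Ω onto the observed entries, and trade-off weights λ and γ:

    \min_{\mathcal{L},\,\mathcal{S},\,\mathcal{E}} \ \|\mathcal{L}\|_{*} + \lambda\,\|\mathcal{S}\|_{1} + \tfrac{\gamma}{2}\,\|\mathcal{E}\|_{F}^{2}
    \quad \text{s.t.} \quad \mathcal{P}_{\Omega}(\mathcal{L} + \mathcal{S} + \mathcal{E}) = \mathcal{P}_{\Omega}(\mathcal{X}).

In an ADMM treatment of such a model, the linear constraint is handled through an augmented Lagrangian, and the three variables and the multiplier are updated alternately; the nuclear-norm and l1-norm subproblems typically admit closed-form solutions via singular-value thresholding and soft thresholding, respectively.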
Notes: Shi Jiarong
First author: Yang Wei, Zheng Xiuyun, Yin Qingyan, Shi Jiarong
Paper type: Journal article
Volume: 9
Issue: 2
Pages: 28
Translated paper:
Publication date: 2016-04-01