Peer-reviewed article

On removing potential redundant constraints for SVOR learning

2021; Elsevier BV; Volume: 102; Language: English

10.1016/j.asoc.2020.106941

ISSN

1872-9681

Authors

Fa Zhu, Ning Ye, Xingchi Chen, Yongbin Zhao, Yining Gang,

Topic(s)

Rough Sets and Fuzzy Logic

Abstract

As an extension of the support vector machine to ordinal regression, support vector ordinal regression (SVOR) finds (r−1) parallel hyperplanes by solving a quadratic program with n(r−1) constraints, where n and r denote the number of training samples and the number of ranks, respectively. Consequently, training SVOR costs much more time than training SVC or SVR. Fortunately, the solution of SVOR is determined only by the small subset of constraints associated with non-zero Lagrange multipliers; constraints with zero Lagrange multipliers have no influence on the solution. Because each training sample is associated with (r−1) constraints, retaining only the potential support vectors may still leave many redundant constraints. In this paper, we attempt to remove these potential redundant constraints before SVOR learning. For the jth parallel hyperplane, the constraints with non-zero Lagrange multipliers are associated with the samples near that hyperplane, and these samples can be identified by a chain near the jth parallel hyperplane. The remaining constraints for the jth parallel hyperplane can then be discarded before learning. In our experiments, the number of constraints is reduced to less than 17 percent of the original. Moreover, the procedure for finding the potential critical constraints runs only once, which also makes parameter tuning easier after the potential redundant constraints have been removed. Experimental results on several datasets demonstrate that SVOR becomes much faster after removing potential redundant constraints while its performance does not degrade seriously.
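The abstract describes the per-threshold pruning only at a high level. Below is a minimal, hypothetical sketch of that idea: for each of the (r−1) boundaries, keep only the samples lying in a band around it and drop the other constraints before the expensive SVOR training step. The function name reduce_constraints, the cheap ridge-regression surrogate used to order the samples, and the band parameter are illustrative assumptions, not the authors' chain-based construction.

```python
# Hypothetical sketch of constraint reduction for SVOR training (not the
# authors' method): approximate each of the (r-1) boundaries with a cheap
# surrogate score and keep only nearby samples' constraints.
import numpy as np
from sklearn.linear_model import Ridge

def reduce_constraints(X, y, band=1.0):
    """For each threshold j, return the indices of samples whose constraints
    are kept for the j-th parallel hyperplane; all other constraints are
    treated as potentially redundant and discarded before training."""
    ranks = np.unique(y)
    # Cheap surrogate projection: regress the ordinal label on the features
    # and use the predicted score as a rough 1-D ordering of the samples.
    score = Ridge(alpha=1.0).fit(X, y).predict(X)
    kept = []
    for j in range(len(ranks) - 1):
        # Approximate the j-th boundary as the midpoint between the mean
        # scores of adjacent ranks, then keep samples inside a band around it.
        b_j = 0.5 * (score[y == ranks[j]].mean() + score[y == ranks[j + 1]].mean())
        kept.append(np.where(np.abs(score - b_j) <= band)[0])
    return kept

# Example on synthetic ordinal data with 5 ranks.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))
y = np.digitize(X @ rng.normal(size=10), bins=[-2, -0.5, 0.5, 2])
for j, idx in enumerate(reduce_constraints(X, y)):
    print(f"hyperplane {j}: kept {len(idx)} of {len(y)} candidate constraints")
```

In this toy setup, each hyperplane typically retains only a fraction of the n candidate constraints, which is the effect the paper exploits to speed up the quadratic program.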
