Peer-Reviewed Article

Constraint-weighted support vector ordinal regression to resist constraint noise

2023; Elsevier BV; Volume: 649; Language: English

10.1016/j.ins.2023.119644

ISSN

1872-6291

Authors

Fa Zhu, Xingchi Chen, Xizhan Gao, Weidu Ye, Hai Zhao, Athanasios V. Vasilakos

Topic(s)

Domain Adaptation and Few-Shot Learning

Abstract

Ordinal regression (OR) is a crucial task in machine learning. The usual assumption is that all training instances are labeled correctly. However, when this assumption does not hold, performance degrades significantly. As a widely used ordinal regression model, support vector ordinal regression (SVOR) identifies r-1 parallel hyperplanes to separate r ranks, and each instance is associated with r-1 constraints, one per hyperplane. Unlike the traditional classification problem, an instance with an incorrect label may have no influence on certain parallel hyperplanes during SVOR learning. If a constraint induces a deviation of the parallel hyperplane(s), it is termed a constraint noise. To address constraint noises, this paper proposes constraint-weighted support vector ordinal regression (CWSVOR), which introduces a constraint weight vector of length r-1 to control the influence of each instance's r-1 constraints on the parallel hyperplanes. When an instance is labeled with an incorrect rank, the weight-vector elements corresponding to the noisy constraints are close to 0, while the others remain close to 1. The proposed constraint-weighting strategy mitigates the detrimental effects of constraint noises while retaining the useful constraints during SVOR learning. Experiments on several datasets demonstrate that CWSVOR outperforms KDLOR, ELMOP, NNOP, SVOR and NPSVOR when the training set is corrupted by noise, and that it performs comparably to pin-SVOR.
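To make the constraint structure concrete, the following is a minimal sketch of a constraint-weighted SVOR objective, assuming the explicit-constraint style of SVOR (each instance contributes one hinge term per threshold hyperplane). The function name, signature, and the weight matrix `S` are illustrative assumptions, not the paper's actual formulation or notation; how CWSVOR learns the weights is not shown here.

```python
import numpy as np

def weighted_svor_loss(w, b, X, y, S, C=1.0):
    """Sketch of a constraint-weighted SVOR objective (illustrative, not the
    paper's exact model).

    w : (d,) weight vector of the shared direction.
    b : (r-1,) ordered thresholds defining r-1 parallel hyperplanes.
    X : (n, d) training instances; y : (n,) ranks in {1, ..., r}.
    S : (n, r-1) per-instance constraint weights in [0, 1]; entries near 0
        suppress (noisy) constraints, entries near 1 keep useful ones.
    """
    scores = X @ w                       # projection of each instance, (n,)
    loss = 0.5 * w @ w                   # margin regularizer
    for j in range(len(b)):
        # Hyperplane j separates ranks {1..j+1} from ranks {j+2..r}.
        lower = y <= j + 1
        # Signed margin: lower-side instances want scores <= b[j] - 1,
        # upper-side instances want scores >= b[j] + 1.
        margin = np.where(lower, b[j] - scores, scores - b[j])
        # Hinge loss per constraint, scaled by its weight S[:, j].
        loss += C * np.sum(S[:, j] * np.maximum(0.0, 1.0 - margin))
    return loss
```

Setting a row of `S` to all ones recovers an ordinary (unweighted) SVOR-style loss for that instance; driving the entries of the constraints violated by a mislabeled instance toward 0 removes their pull on the corresponding hyperplanes, which is the effect the abstract describes.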
