Sparse solution of least-squares twin multi-class support vector machine using ℓ0 and ℓp-norm for classification and feature selection

Neural Networks. 2023 Sep;166:471-486. Epub 2023 Aug 1.

Language: English; Country: United States; Medium: print-electronic

Document type: Journal Article

Persistent link: https://www.medvik.cz/link/pmid37574621
Links

PubMed: 37574621
DOI: 10.1016/j.neunet.2023.07.039
PII: S0893-6080(23)00398-2

In the realm of multi-class classification, the twin K-class support vector classification (Twin-KSVC) generates ternary outputs {-1, 0, +1} by evaluating all training data in a "1-versus-1-versus-rest" structure. Recently, inspired by Twin-KSVC and its least-squares version, a new multi-class classifier called improvements on least-squares twin multi-class classification support vector machine (ILSTKSVC) has been proposed. In this method, structural risk minimization is achieved by incorporating a regularization term in addition to the minimization of empirical risk. Twin-KSVC and its improvements have an influence on classification accuracy. Another aspect influencing classification accuracy is feature selection, which is a critical stage in machine learning, especially when working with high-dimensional datasets. However, most prior studies have not addressed this crucial aspect. In this study, motivated by ILSTKSVC and the cardinality-constrained optimization problem, we propose the ℓp-norm least-squares twin multi-class support vector machine (PLSTKSVC) with 0 < p < 1, which performs classification and feature selection at the same time.
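
To make the "1-versus-1-versus-rest" decomposition concrete, the following is a minimal Python sketch under stated assumptions. It is not the authors' PLSTKSVC (or ILSTKSVC) formulation: the per-pair model is a plain ridge-regularized least-squares hyperplane used only as a stand-in for the twin hyperplanes, and the function names (fit_pair, fit_ovo_rest, predict) and the voting margin threshold are illustrative choices. It only shows how each pair of classes yields a ternary sub-problem with targets {-1, 0, +1} (the two focal classes plus the "rest") and how the pairwise outputs are combined by voting.

```python
# Minimal sketch (assumptions noted above): "1-versus-1-versus-rest" ternary
# decomposition with a ridge least-squares stand-in for the per-pair model.
import itertools
import numpy as np


def fit_pair(X, y, class_pos, class_neg, reg=1e-2):
    """Fit one ternary sub-problem: class_pos -> +1, class_neg -> -1, rest -> 0."""
    t = np.zeros(len(y))
    t[y == class_pos] = +1.0
    t[y == class_neg] = -1.0
    A = np.hstack([X, np.ones((X.shape[0], 1))])        # augment with bias column
    # Ridge-regularized least squares: (A^T A + reg*I) w = A^T t
    return np.linalg.solve(A.T @ A + reg * np.eye(A.shape[1]), A.T @ t)


def fit_ovo_rest(X, y, reg=1e-2):
    """Train one ternary sub-problem per unordered pair of classes."""
    classes = np.unique(y)
    models = {(ci, cj): fit_pair(X, y, ci, cj, reg)
              for ci, cj in itertools.combinations(classes, 2)}
    return classes, models


def predict(X, classes, models, margin=0.5):
    """Combine pairwise ternary outputs by voting; scores near 0 mean 'rest'."""
    A = np.hstack([X, np.ones((X.shape[0], 1))])
    votes = np.zeros((X.shape[0], len(classes)))
    index = {c: k for k, c in enumerate(classes)}
    for (ci, cj), w in models.items():
        s = A @ w
        votes[s > margin, index[ci]] += 1     # confidently positive -> vote ci
        votes[s < -margin, index[cj]] += 1    # confidently negative -> vote cj
        # scores in between correspond to the "rest" output and cast no vote
    return classes[np.argmax(votes, axis=1)]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(m, 0.3, size=(30, 2)) for m in ((0, 0), (2, 0), (0, 2))])
    y = np.repeat([0, 1, 2], 30)
    classes, models = fit_ovo_rest(X, y)
    print("train accuracy:", np.mean(predict(X, classes, models) == y))
```

In the method described above, each per-pair sub-problem would additionally carry the regularization term used for structural risk minimization and the ℓ0/ℓp-norm penalty that drives feature selection; the sketch omits both.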

