Sparse solution of least-squares twin multi-class support vector machine using ℓ0 and ℓp-norm for classification and feature selection
Language: English
Country: United States
Media: print-electronic
Document type: Journal Article
PubMed: 37574621
DOI: 10.1016/j.neunet.2023.07.039
PII: S0893-6080(23)00398-2
- Keywords
- ℓp-norm, Cardinality-constrained optimization problem, Feature selection, Least-squares, Multi-class classification, Twin K-class support vector classification
- MeSH
- Least-Squares Analysis
- Machine Learning*
- Support Vector Machine*
- Publication type
- Journal Article
In the realm of multi-class classification, the twin K-class support vector classification (Twin-KSVC) generates ternary outputs {-1, 0, +1} by evaluating all training data in a "1-versus-1-versus-rest" structure. Recently, inspired by Twin-KSVC and its least-squares version, a new multi-class classifier called improvements on least-squares twin multi-class classification support vector machine (ILSTKSVC) has been proposed. In this method, structural risk minimization is achieved by adding a regularization term to the minimization of empirical risk. Twin-KSVC and its improvements affect classification accuracy. Another factor that influences classification accuracy is feature selection, a critical stage in machine learning, especially when working with high-dimensional datasets; however, most prior studies have not addressed this aspect. In this study, motivated by ILSTKSVC and the cardinality-constrained optimization problem, we propose the ℓp-norm least-squares twin multi-class support vector machine (PLSTKSVC) with 0 < p < 1 to perform classification and feature selection at the same time.
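For orientation, the following is a minimal sketch of the kind of objective such a formulation typically involves: the standard least-squares Twin-KSVC pairwise problem augmented with an ℓp penalty on the weight vector to induce sparsity and thereby select features. The matrices A, B, C, the all-ones vectors e1, e2, e3, and the parameters c1, c2, λ, ε are illustrative assumptions based on the usual LST-KSVC setup, not the paper's exact formulation.

% Illustrative sketch (assumed notation, not the paper's definitive model):
% A, B, C collect the samples of the focused class, the opposing class, and
% the remaining ("rest") classes for one 1-versus-1-versus-rest pair;
% e_1, e_2, e_3 are all-ones vectors, c_1, c_2, \lambda > 0, 0 < \varepsilon < 1.
\begin{equation*}
\min_{w_1,\, b_1}\;
\tfrac{1}{2}\,\lVert A w_1 + e_1 b_1 \rVert_2^{2}
+ \tfrac{c_1}{2}\,\lVert B w_1 + e_2 b_1 + e_2 \rVert_2^{2}
+ \tfrac{c_2}{2}\,\lVert C w_1 + e_3 b_1 + (1-\varepsilon)\, e_3 \rVert_2^{2}
+ \lambda\,\lVert w_1 \rVert_p^{p},
\qquad 0 < p < 1,
\end{equation*}
where $\lVert w_1 \rVert_p^{p} = \sum_{j} \lvert w_{1j} \rvert^{p}$ drives many components of $w_1$ to zero, so the surviving nonzero entries indicate the selected features; an analogous problem would be solved for the second hyperplane $(w_2, b_2)$.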