Integrating joint feature selection into subspace learning: A formulation of 2DPCA for outliers robust feature selection.

Abstract
Principal component analysis (PCA) and its variants are sensitive to outliers, which limits their performance and applicability in real-world settings, and several variants have been proposed to improve robustness. However, most existing methods remain sensitive to outliers and are unable to select useful features. To address PCA's sensitivity to outliers, this paper introduces two-dimensional outliers-robust principal component analysis (ORPCA), which imposes joint constraints on the objective function. ORPCA relaxes the orthogonality constraint and penalizes the regression coefficients; it therefore selects important features while discarding features that are redundant across principal components. Since the squared Frobenius norm is known to be sensitive to outliers, we devise an alternative way to derive the objective function. Experimental results on four publicly available benchmark datasets demonstrate the effectiveness of joint feature selection and show better performance than state-of-the-art dimensionality-reduction methods.
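The abstract's central point, that the squared Frobenius norm lets outliers dominate the reconstruction error, can be illustrated with a minimal sketch. The code below is not the authors' ORPCA algorithm (the page does not give it); it is an assumed, generic outlier-robust PCA variant that replaces the squared Frobenius objective with a sum of per-sample residual norms (an L2,1-style objective) and solves it by iteratively reweighted least squares, so that samples with large residuals are down-weighted.

```python
import numpy as np

def robust_pca_l21(X, k, n_iter=50, eps=1e-8):
    """Hypothetical sketch: outlier-robust PCA via an L2,1-style objective.

    Instead of minimizing the squared Frobenius norm of the reconstruction
    error (where outlier rows dominate), minimize the sum of per-sample
    residual norms, approximated by iteratively reweighted least squares.
    """
    X = X - X.mean(axis=0)          # center the data
    n = X.shape[0]
    w = np.ones(n)                  # per-sample weights, updated each iteration
    W = None
    for _ in range(n_iter):
        # Weighted covariance X^T diag(w) X; its top-k eigenvectors
        # span the current estimate of the subspace.
        C = (X * w[:, None]).T @ X
        _, vecs = np.linalg.eigh(C)
        W = vecs[:, -k:]            # d x k orthonormal projection matrix
        # Residual of each sample after projection onto the subspace.
        R = X - X @ W @ W.T
        r = np.linalg.norm(R, axis=1)
        # IRLS weights 1/(2*r_i) approximate the L2,1 objective:
        # samples with large residuals (likely outliers) get small weight.
        w = 1.0 / (2.0 * np.maximum(r, eps))
    return W
```

Under this formulation an outlying row contributes linearly rather than quadratically to the objective, which is the robustness motivation the abstract attributes to moving away from the squared Frobenius norm.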
Reference Key: razzak2019integratingneural
Authors: Imran Razzak, Raghib Abu Saris, Michael Blumenstein, Guandong Xu
Journal: Neural Networks: The Official Journal of the International Neural Network Society
Year: 2019
DOI: S0893-6080(19)30257-6
