
Table 3 Well known DR methods from the literature that were compared with the proposed Vec2im-Siam

From: A novel feature extraction methodology using Siamese convolutional neural networks for intrusion detection

| DR approach | Acronym | Description |
| --- | --- | --- |
| Infinite Latent Feature Selection (Roffo et al. 2017) | ILFS | A probabilistic latent feature selection approach that performs the ranking step by considering all possible subsets of features, thereby bypassing the combinatorial problem. |
| Unsupervised graph-based filter (Roffo et al. 2015) | Inf-FS | In Inf-FS, each feature is a node in a graph, a path is a selection of features, and the higher the centrality score, the more important the feature. It assigns an importance score to each feature by taking into account all possible feature subsets as paths on the graph. |
| Relief-F (Liu and Motoda 2007) | Relief-F | An iterative, randomized, supervised approach that estimates the quality of features according to how well their values differentiate data samples that are near each other; it does not discriminate among redundant features, and its performance degrades when few data are available. |
| Laplacian Score (He et al. 2005) | LS | The importance of a feature is evaluated by its locality-preserving power. To model the local geometric structure, the method constructs a nearest neighbor graph; the LS algorithm seeks the features that best respect this graph structure. |
| Fisher filter feature selection (Gu et al. 2011; Xue-qin et al. 2006) | Fisher | Computes a score for each feature as the ratio of inter-class separation to intra-class variance; features are evaluated independently, and the final selection aggregates the m top-ranked ones (sketched below). |
| Correlation-based Feature Selection (Shahbaz et al. 2016) | CFS | Sorts features according to pairwise correlations. |
| Unsupervised Feature Selection with Ordinal Locality (Guo et al. 2017) | UFSOL | A clustering-based approach that preserves relative neighborhood proximities and thereby contributes to distance-based clustering. |
| Least Absolute Shrinkage and Selection Operator (Hagos et al. 2017) | Lasso | Applies a regularization process that penalizes the coefficients of the regression variables, setting the less relevant ones to zero so that the constraint on their sum is respected. FS follows from this process: the variables that retain non-zero coefficients are selected to be part of the model (sketched below). |
| Chi-square feature selection (Thaseen and Kumar 2017; Thaseen et al. 2018) | Chi2 | Ranks features with a statistical significance test and retains only those features that are dependent on the class label (sketched below). |
| Minimum redundancy maximum relevance (Nguyen et al. 2010) | mRMR | A FS algorithm that performs variable selection systematically, achieving a reasonable trade-off between relevance and redundancy (sketched below). |
| Fuzzy Complementarity Criterion (Moustakidis et al. 2012; Moustakidis and Theocharis 2010) | FuzCoC | FS is driven by a fuzzy complementary criterion which ensures that features are introduced iteratively, each providing the maximum additional contribution with respect to the information content of the previously selected features. |
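To make the Fisher filter row concrete, here is a minimal NumPy sketch of the score it describes: for each feature, the ratio of between-class separation to within-class variance, with the m top-ranked features retained. The function and parameter names (fisher_scores, top_m) are illustrative and not taken from the paper or the cited works.

```python
import numpy as np

def fisher_scores(X, y):
    """Fisher score per feature: ratio of between-class separation
    to within-class variance (each feature is scored independently)."""
    classes = np.unique(y)
    overall_mean = X.mean(axis=0)
    numerator = np.zeros(X.shape[1])
    denominator = np.zeros(X.shape[1])
    for c in classes:
        Xc = X[y == c]
        n_c = Xc.shape[0]
        numerator += n_c * (Xc.mean(axis=0) - overall_mean) ** 2
        denominator += n_c * Xc.var(axis=0)
    return numerator / (denominator + 1e-12)  # guard against zero variance

def top_m(X, y, m):
    """Indices of the m top-ranked features by Fisher score."""
    return np.argsort(fisher_scores(X, y))[::-1][:m]
```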
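The Lasso row can be sketched with scikit-learn: fit an L1-penalized linear model (treating the class label as a numeric target, as is common when Lasso is used as a filter) and keep the variables whose coefficients are not shrunk to zero. The penalty strength alpha below is an arbitrary illustrative value, not one reported in the paper.

```python
import numpy as np
from sklearn.linear_model import Lasso

def lasso_selected_features(X, y, alpha=0.01):
    """Fit an L1-penalized model and return the indices of the
    features whose coefficients remain non-zero."""
    model = Lasso(alpha=alpha, max_iter=10000)
    model.fit(X, y)
    return np.flatnonzero(model.coef_)
```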
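Similarly, the Chi2 row corresponds closely to scikit-learn's chi-square scorer, which tests the dependence of each (non-negative) feature on the class label; keeping the k best features, and the value of k, are assumptions made here for illustration.

```python
from sklearn.feature_selection import SelectKBest, chi2
from sklearn.preprocessing import MinMaxScaler

def chi2_selected_features(X, y, k=20):
    """Scale features to [0, 1] (chi2 requires non-negative inputs),
    score each against the class label, and keep the k best."""
    X_scaled = MinMaxScaler().fit_transform(X)
    selector = SelectKBest(chi2, k=k).fit(X_scaled, y)
    return selector.get_support(indices=True)
```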
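Finally, a greedy reading of the mRMR row: at each step, pick the feature that maximizes relevance to the label minus its mean redundancy with the features already chosen, using mutual information as the dependence measure. This is only one common instantiation of mRMR, not necessarily the variant used in the cited work.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif, mutual_info_regression

def mrmr_select(X, y, n_features=10):
    """Greedy mRMR: maximize MI(feature, label) minus the mean
    MI(feature, already-selected features)."""
    n_total = X.shape[1]
    relevance = mutual_info_classif(X, y)      # MI with the class label
    selected = [int(np.argmax(relevance))]     # start with the most relevant feature
    while len(selected) < min(n_features, n_total):
        best_score, best_j = -np.inf, None
        for j in range(n_total):
            if j in selected:
                continue
            # Redundancy: mean MI between candidate j and the selected features
            redundancy = np.mean(
                [mutual_info_regression(X[:, [s]], X[:, j])[0] for s in selected]
            )
            score = relevance[j] - redundancy
            if score > best_score:
                best_score, best_j = score, j
        selected.append(best_j)
    return selected
```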