A swarm intelligence based multi-label feature selection method hybridized with a local search strategy
Tabriz Journal of Electrical Engineering
Volume 51, Issue 4 (Serial Number 98), Dey 1400 (December 2021), Pages 443-454 - Full-text PDF (1.23 MB)
Article Type: Research Paper (Scientific-Research)
Authors
Azar Rafiei 1; Parham Moradi* 2; Abdulbaghi Ghaderzadeh 3
1 Department of Computer Engineering, Islamic Azad University, Sanandaj Branch, Sanandaj, Iran
2 Department of Computer Engineering, University of Kurdistan, Sanandaj, Iran
3 Department of Computer Engineering, Islamic Azad University, Sanandaj Branch, Sanandaj, Iran
Abstract
Multi-label classification aims at assigning more than one label to each instance. Many real-world multi-label classification tasks are high-dimensional, which degrades the performance of traditional classifiers. Feature selection is a common approach to tackle this issue by choosing a set of prominent features. Multi-label feature selection is an NP-hard problem, and so far several swarm intelligence-based strategies have been proposed to find a near-optimal solution within a reasonable time. In this paper, a hybrid swarm intelligence algorithm based on binary particle swarm optimization (BPSO) and a novel local search strategy is proposed to select a set of prominent features. To increase the convergence speed, the local search strategy divides the features into two groups according to their relevance to the class labels and their redundancy with other features: the first group contains features that are highly similar to the class labels and dissimilar to the other features, while the second group contains redundant and less relevant features. Accordingly, a local operator is added to the particle swarm optimization algorithm to prune redundant features from each solution and keep the relevant ones. This operator enhances the convergence speed of the proposed algorithm compared with other algorithms presented in this field. Evaluations and statistical tests show that the proposed approach improves several multi-label classification criteria and outperforms the other methods in most cases. Moreover, when achieving higher accuracy is more important than runtime, this method is the more appropriate choice.
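A minimal sketch of the general scheme described above, not the authors' implementation: binary PSO explores feature subsets, and after each position update a local pruning step removes a fraction of the most redundant, least relevant selected features. The correlation-based relevance/redundancy scores, the filter-style fitness, and all parameter values are assumptions made for illustration.

```python
# Illustrative sketch only (not the authors' code): binary PSO for multi-label
# feature selection with a relevance/redundancy-based local pruning step.
# Scores, fitness, and parameter values below are assumptions for the example.
import numpy as np


def relevance(X, Y):
    """Mean |Pearson correlation| of every feature with every label column."""
    Xz = (X - X.mean(0)) / (X.std(0) + 1e-12)
    Yz = (Y - Y.mean(0)) / (Y.std(0) + 1e-12)
    return np.abs(Xz.T @ Yz / len(X)).mean(axis=1)           # shape (n_features,)


def redundancy(X, mask):
    """Mean |correlation| of each selected feature with the other selected ones."""
    idx = np.flatnonzero(mask)
    if len(idx) < 2:
        return np.zeros(len(idx))
    C = np.abs(np.corrcoef(X[:, idx], rowvar=False))
    np.fill_diagonal(C, 0.0)
    return C.mean(axis=1)


def fitness(X, Y, mask, rel):
    """Filter-style objective: favor relevant, non-redundant, small subsets."""
    if mask.sum() == 0:
        return -np.inf
    return rel[mask.astype(bool)].mean() - redundancy(X, mask).mean() - 0.01 * mask.mean()


def local_search(mask, rel, X, drop_frac=0.3):
    """Split selected features into a 'keep' group (relevant, non-redundant)
    and a 'redundant' group, then drop a fraction of the worst ones."""
    idx = np.flatnonzero(mask)
    if len(idx) < 2:
        return mask
    score = rel[idx] - redundancy(X, mask)        # high = relevant and non-redundant
    drop = idx[np.argsort(score)[: max(1, int(drop_frac * len(idx)))]]
    pruned = mask.copy()
    pruned[drop] = 0
    return pruned


def bpso_feature_selection(X, Y, n_particles=30, n_iter=50, seed=0):
    rng = np.random.default_rng(seed)
    n = X.shape[1]
    rel = relevance(X, Y)
    pos = (rng.random((n_particles, n)) < 0.5).astype(int)   # binary positions
    vel = rng.uniform(-1.0, 1.0, (n_particles, n))
    pbest = pos.copy()
    pbest_fit = np.array([fitness(X, Y, p, rel) for p in pos])
    gbest = pbest[pbest_fit.argmax()].copy()
    w, c1, c2 = 0.7, 1.5, 1.5                                 # assumed BPSO constants
    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, n))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = (rng.random((n_particles, n)) < 1.0 / (1.0 + np.exp(-vel))).astype(int)
        for i in range(n_particles):
            pos[i] = local_search(pos[i], rel, X)             # prune redundant features
            f = fitness(X, Y, pos[i], rel)
            if f > pbest_fit[i]:
                pbest_fit[i], pbest[i] = f, pos[i].copy()
        gbest = pbest[pbest_fit.argmax()].copy()
    return np.flatnonzero(gbest)                              # indices of selected features


# Toy usage with random data (100 samples, 40 features, 5 binary labels).
X = np.random.rand(100, 40)
Y = (np.random.rand(100, 5) > 0.5).astype(float)
print(bpso_feature_selection(X, Y, n_iter=20))
```

In a wrapper setting, the fitness would instead be a multi-label classifier's validation score (e.g., ML-kNN), which is the more common choice in this literature; the filter-style objective is used here only to keep the sketch self-contained.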
Keywords
Feature selection; Multi-label classification; Local search strategy; Swarm intelligence; Particle swarm optimization
References
[1] Y. Lin, Q. Hu, J. Liu, J. Chen, and J. Duan, "Multi-label feature selection based on neighborhood mutual information," Applied Soft Computing, vol. 38, pp. 244-256, 2016.
[2] O. Reyes, C. Morell, and S. Ventura, "Scalable extensions of the ReliefF algorithm for weighting and selecting features on the multi-label learning context," Neurocomputing, vol. 161, pp. 168-182, 2015.
[3] L. Li, H. Liu, Z. Ma, Y. Mo, Z. Duan, J. Zhou, et al., "Multi-label feature selection via information gain," in International Conference on Advanced Data Mining and Applications, 2014, pp. 345-355.
[4] Y. Lin, Q. Hu, J. Liu, and J. Duan, "Multi-label feature selection based on max-dependency and min-redundancy," Neurocomputing, vol. 168, pp. 92-103, 2015.
[5] S. Tabakhi and P. Moradi, "Relevance–redundancy feature selection based on ant colony optimization," Pattern recognition, vol. 48, pp. 2798-2811, 2015.
[6] P. Moradi and M. Rostami, "Integration of graph clustering with ant colony optimization for feature selection," Knowledge-Based Systems, vol. 84, pp. 144-161, 2015.
[7] M. Rahmaninia, P. Moradi, and M. Jalili, "A multi-objective feature selection approach based on conditional mutual information and Pareto set theory," Tabriz Journal of Electrical Engineering, vol. 50, no. 3, pp. 1225-1237, Fall 2020.
[8] J. Lee and D.-W. Kim, "Memetic feature selection algorithm for multi-label classification," Information Sciences, vol. 293, pp. 80-96, 2015.
[9] Y. Yu and Y. Wang, "Feature selection for multi-label learning using mutual information and GA," in International Conference on Rough Sets and Knowledge Technology, 2014, pp. 454-463.
[10] Y. Zhang, D.-w. Gong, X.-y. Sun, and Y.-n. Guo, "A PSO-based multi-objective multi-label feature selection method in classification," Scientific reports, vol. 7, p. 376, 2017.
[11] M.-L. Zhang, J. M. Peña, and V. Robles, "Feature selection for multi-label naive Bayes classification," Information Sciences, vol. 179, pp. 3218-3229, 2009.
[12] M. A. Khan, A. Ekbal, E. L. Mencía, and J. Fürnkranz, "Multi-objective optimisation-based feature selection for multi-label classification," in International Conference on Applications of Natural Language to Information Systems, 2017, pp. 38-41.
[13] M. You, J. Liu, G.-Z. Li, and Y. Chen, "Embedded feature selection for multi-label classification of music emotions," International Journal of Computational Intelligence Systems, vol. 5, pp. 668-678, 2012.
[14] P. Zhu, Q. Xu, Q. Hu, C. Zhang, and H. Zhao, "Multi-label feature selection with missing labels," Pattern Recognition, vol. 74, pp. 488-502, 2018.
[15] S. Kashef and H. Nezamabadi-pour, "A hybrid method for finding an effective feature subset in multi-label data," Tabriz Journal of Electrical Engineering, vol. 48, no. 3, pp. 1327-1338, Fall 2018.
[16] I. A. Gheyas and L. S. Smith, "Feature subset selection in large dimensionality domains," Pattern Recognition, vol. 43, pp. 5-13, 2010.
[17] Y. Saeys, I. Inza, and P. Larrañaga, "A review of feature selection techniques in bioinformatics," bioinformatics, vol. 23, pp. 2507-2517, 2007.
[18] H. Liu and L. Yu, "Toward integrating feature selection algorithms for classification and clustering," IEEE Transactions on Knowledge & Data Engineering, pp. 491-502, 2005.
[19] S. Tabakhi, A. Najafi, R. Ranjbar, and P. Moradi, "Gene selection for microarray data classification using a novel ant colony optimization," Neurocomputing, vol. 168, pp. 1024-1036, 2015.
[20] R. K. Sivagaminathan and S. Ramakrishnan, "A hybrid approach for feature subset selection using neural networks and ant colony optimization," Expert systems with applications, vol. 33, pp. 49-60, 2007.
[21] M. H. Aghdam, N. Ghasem-Aghaee, and M. E. Basiri, "Text feature selection using ant colony optimization," Expert systems with applications, vol. 36, pp. 6843-6853, 2009.
[22] J. Yang and V. Honavar, "Feature subset selection using a genetic algorithm," in Feature extraction, construction and selection, ed: Springer, 1998, pp. 117-136.
[23] M. Rostami and P. Moradi, "A clustering based genetic algorithm for feature selection," in 2014 6th Conference on Information and Knowledge Technology (IKT), 2014, pp. 112-116.
[24] T. M. Hamdani, J.-M. Won, A. M. Alimi, and F. Karray, "Hierarchical genetic algorithm with new evaluation function and bi-coded representation for the selection of features considering their confidence rate," Applied Soft Computing, vol. 11, pp. 2501-2509, 2011.
[25] S.-W. Lin, T.-Y. Tseng, S.-Y. Chou, and S.-C. Chen, "A simulated-annealing-based approach for simultaneous parameter optimization and feature selection of back-propagation networks," Expert Systems with Applications, vol. 34, pp. 1491-1499, 2008.
[26] S.-W. Lin, Z.-J. Lee, S.-C. Chen, and T.-Y. Tseng, "Parameter determination of support vector machine and feature selection using simulated annealing approach," Applied soft computing, vol. 8, pp. 1505-1512, 2008.
[27] L.-Y. Chuang, S.-W. Tsai, and C.-H. Yang, "Improved binary particle swarm optimization using catfish effect for feature selection," Expert Systems with Applications, vol. 38, pp. 12699-12707, 2011.
[28] Y. Liu, G. Wang, H. Chen, H. Dong, X. Zhu, and S. Wang, "An improved particle swarm optimization for feature selection," Journal of Bionic Engineering, vol. 8, pp. 191-200, 2011.
[29] B. Xue, M. Zhang, and W. N. Browne, "Particle swarm optimisation for feature selection in classification: Novel initialisation and updating mechanisms," Applied soft computing, vol. 18, pp. 261-276, 2014.
[30] H. M. Abdelsalam and A. M. Mohamed, "Optimal sequencing of design projects' activities using discrete particle swarm optimisation," International Journal of Bio-Inspired Computation, vol. 4, pp. 100-110, 2012.
[31] S. Heydari Moghaddam Bejestani, S. Sherbaf Tabrizi, and A. Ghazikhani, "A new feature selection method based on particle swarm optimization using fuzzy updating," Tabriz Journal of Electrical Engineering, vol. 50, no. 4, pp. 1557-1567, Winter 2021.
[32] J. Lee and D.-W. Kim, "Mutual information-based multi-label feature selection using interaction information," Expert Systems with Applications, vol. 42, pp. 2013-2025, 2015.
[33] W. Chen, J. Yan, B. Zhang, Z. Chen, and Q. Yang, "Document transformation for multi-label feature selection in text categorization," in Seventh IEEE International Conference on Data Mining (ICDM 2007), 2007, pp. 451-456.
[34] N. Spolaôr, E. A. Cherman, M. C. Monard, and H. D. Lee, "A comparison of multi-label feature selection methods using the problem transformation approach," Electronic Notes in Theoretical Computer Science, vol. 292, pp. 135-151, 2013.
[35] G. Doquire and M. Verleysen, "Feature selection for multi-label classification problems," in International work-conference on artificial neural networks, 2011, pp. 9-16.
[36] J. Read, B. Pfahringer, and G. Holmes, "Multi-label classification using ensembles of pruned sets," in 8th IEEE international conference on data mining, 2008, pp. 995-1000.
[37] G. Doquire and M. Verleysen, "Mutual information-based feature selection for multilabel classification," Neurocomputing, vol. 122, pp. 148-155, 2013.
[38] J. Lee and D.-W. Kim, "Fast multi-label feature selection based on information-theoretic feature ranking," Pattern Recognition, vol. 48, pp. 2761-2771, 2015.
[39] J. Yin, T. Tao, and J. Xu, "A multi-label feature selection algorithm based on multi-objective optimization," in 2015 International Joint Conference on Neural Networks (IJCNN), 2015, pp. 1-7.
[40] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of ICNN'95-international conference on neural networks, 1995, pp. 1942-1948.
[41] R. B. Pereira, A. P. d. Carvalho, B. Zadrozny, and L. H. d. C. Merschmann, "Information gain feature selection for multi-label classification," 2015.
[42] M. M. Kabir, M. Shahjahan, and K. Murase, "A new local search based hybrid genetic algorithm for feature selection," Neurocomputing, vol. 74, pp. 2914-2928, 2011.
[43] D. P. Muni, N. R. Pal, and J. Das, "Genetic programming for simultaneous feature selection and classifier design," 2006.
[44] M. M. Kabir, M. M. Islam, and K. Murase, "A new wrapper feature selection approach using neural network," Neurocomputing, vol. 73, pp. 3273-3283, 2010.
[45] P. Resnick, N. Iacovou, M. Suchak, P. Bergstrom, and J. Riedl, "GroupLens: an open architecture for collaborative filtering of netnews," in Proceedings of the 1994 ACM conference on Computer supported cooperative work, 1994, pp. 175-186.
[46] X. He, D. Cai, and P. Niyogi, "Laplacian score for feature selection," in Advances in neural information processing systems, 2006, pp. 507-514.
[47] M.-L. Zhang and Z.-H. Zhou, "ML-KNN: A lazy learning approach to multi-label learning," Pattern recognition, vol. 40, pp. 2038-2048, 2007.
[48] D. J. Sheskin, Handbook of Parametric and Nonparametric Statistical Procedures.
[49] S. Kashef and H. Nezamabadi-pour, "A label-specific multi-label feature selection algorithm based on the Pareto dominance concept," Pattern Recognition, vol. 88, pp. 654-667, 2019.
[50] H. Bayati, M. B. Dowlatshahi, and M. Paniri, "Multi-label feature selection based on competitive swarm optimization," Journal of Soft Computing and Information Technology, vol. 9, pp. 56-69, 2020.
[51] J. Lee and D.-W. Kim, "Feature selection for multi-label classification using multivariate mutual information," Pattern Recognition Letters, vol. 34, pp. 349-357, 2013.