An Intelligent Mutation Method Based on the PSO Algorithm for Solving the Feature Selection Problem
Subject area: Electrical and Computer Engineering
Mahmoud Parandeh 1, Mina Zolfy Lighvan 2 *, Jafar Tanha 3
1 - University of Tabriz, Faculty of Electrical and Computer Engineering
2 - University of Tabriz, Faculty of Electrical and Computer Engineering
3 - University of Tabriz, Faculty of Electrical and Computer Engineering
Keywords: feature selection, multi-objective optimization, PSO algorithm, adaptive weighted sum, intelligent mutation, elitism
Abstract:
Today, with the growth in the volume of data produced, interest in machine learning algorithms for extracting knowledge from raw data has increased. Raw data usually contain redundant or irrelevant features that degrade the performance of learning algorithms. Feature selection algorithms are therefore used to improve the efficiency and reduce the computational cost of machine learning algorithms, and a variety of feature selection methods have been proposed. Among them are evolutionary algorithms, which have attracted attention because of their global optimization power. Many evolutionary algorithms have been proposed for the feature selection problem, most of which focus on the objective space; the problem space, however, can also provide vital information for solving it. Since evolutionary algorithms suffer from premature convergence to local optima, an effective mechanism for escaping local optima is essential. This paper uses the PSO evolutionary algorithm with a multi-objective function for feature selection, in which a new mutation method that uses the particles' feature scores is proposed along with elitism to escape local optima. The proposed algorithm is tested on several datasets and compared with existing algorithms. The simulation results show that, compared with the recent RFPSOFS method, the proposed method reduces the error by 20%, 11%, 85%, and 7% on the Isolet, Musk, Madelon, and Arrhythmia datasets, respectively.
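To make the approach concrete, the following is a minimal Python sketch of binary-PSO feature selection with a score-guided mutation and elitism. It is an illustrative reconstruction under stated assumptions, not the authors' implementation: the fixed-weight fitness (the paper's weighted sum is adaptive), the leave-one-out 1-NN wrapper error, and the exact form of the score-based mutation rule are all simplifications introduced here.

import numpy as np

rng = np.random.default_rng(0)

def knn_error(Xs, y):
    # Leave-one-out 1-NN error as a cheap wrapper criterion (an assumption;
    # the paper may use a different classifier).
    d = ((Xs[:, None, :] - Xs[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d, np.inf)
    return float((y[d.argmin(1)] != y).mean())

def fitness(mask, X, y, alpha=0.9):
    # Weighted sum of classification error and fraction of selected features.
    # The paper uses an *adaptive* weighted sum; a fixed alpha is a simplification.
    if not mask.any():
        return 1.0  # selecting no features is the worst case
    return alpha * knn_error(X[:, mask], y) + (1 - alpha) * mask.mean()

def pso_feature_selection(X, y, n_particles=20, iters=50, p_mut=0.2):
    n_feat = X.shape[1]
    pos = rng.random((n_particles, n_feat)) > 0.5           # binary positions
    vel = rng.normal(0.0, 1.0, (n_particles, n_feat))
    pbest = pos.copy()
    pbest_fit = np.array([fitness(p, X, y) for p in pos])
    g = pbest_fit.argmin()
    gbest, gbest_fit = pbest[g].copy(), pbest_fit[g]        # elite solution
    score = np.ones(n_feat)                                 # feature scores
    for _ in range(iters):
        # Standard binary-PSO velocity/position update.
        r1, r2 = rng.random((2, n_particles, n_feat))
        vel = (0.7 * vel
               + 1.5 * r1 * (pbest.astype(float) - pos)
               + 1.5 * r2 * (gbest.astype(float) - pos))
        pos = rng.random((n_particles, n_feat)) < 1.0 / (1.0 + np.exp(-vel))
        # Score-guided ("intelligent") mutation: features that recur in the
        # personal bests accumulate score; mutation switches a high-score
        # feature on and a low-score feature off (assumed interpretation of
        # mutation driven by particle feature scores).
        score += pbest.sum(0)
        p_hi = score / score.sum()
        p_lo = score.max() - score + 1e-9
        p_lo /= p_lo.sum()
        for i in range(n_particles):
            if rng.random() < p_mut:
                pos[i, rng.choice(n_feat, p=p_hi)] = True
                pos[i, rng.choice(n_feat, p=p_lo)] = False
        fit = np.array([fitness(p, X, y) for p in pos])
        better = fit < pbest_fit
        pbest[better], pbest_fit[better] = pos[better], fit[better]
        if pbest_fit.min() < gbest_fit:                     # elitism: keep best-ever
            g = pbest_fit.argmin()
            gbest, gbest_fit = pbest[g].copy(), pbest_fit[g]
    return gbest

# Toy usage: only features 0 and 3 are informative.
X = rng.normal(size=(80, 12))
y = (X[:, 0] + X[:, 3] > 0).astype(int)
print("selected features:", np.flatnonzero(pso_feature_selection(X, y)))

The mutation here biases flips by accumulated feature scores rather than flipping bits uniformly at random, which is the general idea of exploiting problem-space information to escape local optima, while elitism ensures the best-ever solution is never lost.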