Feature Selection Using Meta-Heuristic Algorithms
DOI: https://doi.org/10.37366/jpcs.v2i2.2289

Keywords: Feature Selection, Machine Learning, Meta-Heuristic

Abstract
Machine learning requires data to make predictions, and that data can contain a large number of features. A large feature set can cause models to overfit, increase model complexity, and raise computational cost. Feature selection is one method for optimizing machine learning models: it reduces the number of features used in the learning process. This research proposes a feature selection method using meta-heuristic algorithms, in which the machine learning model serves as the objective function for the meta-heuristic. The objective function is evaluated at each iteration to identify the features with the greatest influence on the model. The machine learning models used are Random Forest, k-Nearest Neighbors, and Support Vector Machine; the meta-heuristic algorithms used are Differential Evolution, Flower Pollination, Grey Wolf, and Whale Optimization. The results show that meta-heuristic algorithms can improve the accuracy of machine learning models while using fewer features. The Support Vector Machine – Differential Evolution scheme achieves the highest accuracy with the fewest features.
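The wrapper approach described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the synthetic dataset, the nearest-centroid classifier (standing in for the SVM/kNN/Random Forest models), and all Differential Evolution parameter values are assumptions chosen for brevity. Candidate solutions are real-valued vectors thresholded at 0.5 to form binary feature masks, and the classifier's accuracy on the selected features is the objective maximized at each iteration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data (assumption): 2 informative features, 8 pure-noise features.
n = 200
y = rng.integers(0, 2, n)
X = rng.normal(size=(n, 10))
X[:, 0] += 3 * y  # informative feature
X[:, 1] -= 3 * y  # informative feature

def accuracy(mask):
    """Objective function: accuracy of a nearest-centroid classifier
    restricted to the features selected by the boolean mask."""
    if not mask.any():
        return 0.0
    Xs = X[:, mask]
    c0, c1 = Xs[y == 0].mean(axis=0), Xs[y == 1].mean(axis=0)
    pred = np.linalg.norm(Xs - c1, axis=1) < np.linalg.norm(Xs - c0, axis=1)
    return (pred == y).mean()

def de_feature_select(dim=10, pop=20, iters=40, F=0.8, CR=0.9):
    """Binary feature selection via Differential Evolution:
    real-valued candidates are thresholded at 0.5 to get feature masks."""
    P = rng.random((pop, dim))
    fit = np.array([accuracy(p > 0.5) for p in P])
    for _ in range(iters):
        for i in range(pop):
            # Mutation: combine three distinct population members (DE/rand/1).
            others = [j for j in range(pop) if j != i]
            a, b, c = P[rng.choice(others, 3, replace=False)]
            # Binomial crossover between mutant and current candidate.
            trial = np.where(rng.random(dim) < CR, a + F * (b - c), P[i])
            f = accuracy(trial > 0.5)
            if f >= fit[i]:  # greedy selection keeps the better candidate
                P[i], fit[i] = trial, f
    best = P[fit.argmax()] > 0.5
    return best, fit.max()

mask, acc = de_feature_select()
print("selected features:", np.flatnonzero(mask), "accuracy:", acc)
```

In this sketch the search typically converges on a small mask containing the informative features; swapping in a cross-validated SVM, kNN, or Random Forest as `accuracy` yields the model/optimizer pairings evaluated in the paper.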
References
Cervantes, J., Garcia-Lamont, F., Rodriguez-Mazahua, L., & Lopez, A. (2020). A Comprehensive Survey on Support Vector Machine Classification: Applications, Challenges and Trends. Neurocomputing, 189-215. https://doi.org/10.1016/j.neucom.2019.10.118.
Diaz-Uriarte, R., & de Andres, R. A. (2006). Gene Selection and Classification of Microarray Data Using Random Forest. BMC Bioinformatics, 7, 3. https://doi.org/10.1186/1471-2105-7-3.
Gupta, S., & Deep, K. (2019). A Novel Random Walk Grey Wolf Optimizer. Swarm and Evolutionary Computation, 44, 101-112. https://doi.org/10.1016/j.swevo.2018.01.001.
Mirjalili, S., & Lewis, A. (2016). The Whale Optimization Algorithm. Advances in Engineering Software, 51-67. https://doi.org/10.1016/j.advengsoft.2016.01.008.
Nababan, A. A., Sitompul, O. S., & Tulus. (2018). Attribute Weighting Based K-Nearest Neighbor Using Gain Ratio. Journal of Physics: Conference Series.
Tanabe, R., & Fukunaga, A. (2013). Success-History Based Parameter Adaptation for Differential Evolution. IEEE Congress on Evolutionary Computation. https://doi.org/10.1109/CEC.2013.6557555.
Yang, X. S. (2012). Flower Pollination Algorithm for Global Optimization. International Conference on Unconventional Computation and Natural Computation (pp. 240-249). https://doi.org/10.1007/978-3-642-32894-7_27.
License
Copyright (c) 2023 Journal of Practical Computer Science
This work is licensed under a Creative Commons Attribution 4.0 International License.