Comparison of Prediction Results Using Hyperparameter Tuning Between Random Search and Grid Search
Abstract — Suspicious financial transactions that can harm financial institutions and the wider public have increased significantly. Money laundering and financial fraud are serious threats that are difficult for traditional systems to detect, since such systems often cannot keep pace with the growing sophistication of criminal methods. The central problem of this research is how to improve the accuracy and efficiency of detecting suspicious transactions using Machine Learning. This study develops a suspicious-transaction detection model using the XGBoost, Decision Tree, and Logistic Regression algorithms and compares Random Search and Grid Search as strategies for finding the best hyperparameters, assessing which tuning method produces the higher-quality predictions.
Keywords — fraud, xgboost, decision tree, logistic regression, random search, grid search, hyperparameter tuning
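As a concrete illustration of the two tuning strategies compared in this study, the sketch below shows how Grid Search and Random Search can be applied to the same XGBoost classifier with scikit-learn. It is a minimal sketch under stated assumptions: the synthetic data, parameter ranges, iteration budget, and F1 scoring are illustrative choices, not the actual experimental configuration of this research.

# Minimal sketch (illustrative, not the study's actual pipeline):
# comparing Grid Search and Random Search for tuning XGBoost.
from scipy.stats import randint, uniform
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV
from xgboost import XGBClassifier

# Synthetic, imbalanced data standing in for a suspicious-transaction dataset.
X, y = make_classification(n_samples=2000, n_features=20,
                           weights=[0.95, 0.05], random_state=42)

model = XGBClassifier(eval_metric="logloss", random_state=42)

# Grid Search: exhaustively evaluates every combination in the grid (3*3*2 = 18).
grid_params = {"max_depth": [3, 5, 7],
               "learning_rate": [0.01, 0.1, 0.3],
               "n_estimators": [100, 300]}
grid = GridSearchCV(model, grid_params, scoring="f1", cv=5)
grid.fit(X, y)

# Random Search: samples a fixed number of combinations from distributions.
rand_params = {"max_depth": randint(3, 10),
               "learning_rate": uniform(0.01, 0.3),
               "n_estimators": randint(100, 500)}
rand = RandomizedSearchCV(model, rand_params, n_iter=18, scoring="f1",
                          cv=5, random_state=42)
rand.fit(X, y)

print("Grid Search best F1:  ", grid.best_score_, grid.best_params_)
print("Random Search best F1:", rand.best_score_, rand.best_params_)

With n_iter set to 18, Random Search evaluates the same number of candidate configurations as the exhaustive grid, so the comparison reduces to which strategy finds the better cross-validated score for the same tuning budget.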