Question Answering System on Health Data Using a Pre-trained BERT Model

Authors

  • Bagas Millen Alhafidz, Telkom University
  • Ema Rachmawati, Telkom University
  • Prasti Eko Yunanto, Telkom University

Abstract

After the COVID-19 pandemic, health has become
something that must be given attention. Most people use
search engines as their tool for finding health information,
but the results a search engine returns for a query are still
general. A Question Answering system is a system that gives
users exactly the specific information they need. In this study,
a Question Answering system is built using the Bidirectional
Encoder Representations from Transformers (BERT) method.
BERT is a pre-trained model built on the transformer
architecture, and it can handle the Question Answering task.
With a pre-trained model, the system does not need to train a
model from scratch; it only needs to use a model that someone
else has already trained for the required task, which saves
time and resources. To measure performance, the Exact Match
(EM) and F1-score metrics are used. The best scores obtained
in this study are an Exact Match of 75% and an F1-score
of 76%.
Keywords: question answering, BERT, pre-trained model,
health
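
As an illustration of the pre-trained-model workflow the abstract describes, here is a minimal sketch of extractive question answering with an off-the-shelf BERT checkpoint. The Hugging Face transformers library and the public SQuAD-fine-tuned checkpoint named below are assumptions made for illustration; the paper does not name its toolkit or checkpoint, and a system for Indonesian health data would substitute its own fine-tuned model.

    # Minimal sketch of extractive QA with a pre-trained BERT model.
    # Assumption: Hugging Face transformers and a public SQuAD-fine-tuned
    # BERT checkpoint stand in for the paper's (unnamed) model.
    from transformers import pipeline

    qa = pipeline(
        "question-answering",
        model="bert-large-uncased-whole-word-masking-finetuned-squad",
    )

    # Toy health-domain passage; the model extracts an answer span from it.
    context = (
        "Dehydration occurs when the body loses more fluid than it takes in. "
        "Common symptoms include thirst, dark-colored urine, and dizziness."
    )
    result = qa(
        question="What are common symptoms of dehydration?",
        context=context,
    )

    # The pipeline returns the answer span, its character offsets in the
    # context, and a confidence score.
    print(result["answer"])  # e.g. "thirst, dark-colored urine, and dizziness"
    print(result["score"], result["start"], result["end"])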

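The Exact Match and F1-score figures reported in the abstract are the standard extractive-QA metrics. The sketch below shows one common, SQuAD-style way to compute them at the token level; the normalization steps (lowercasing, stripping punctuation and English articles) follow the usual SQuAD recipe and are an assumption, since the paper does not spell out its exact procedure.

    # SQuAD-style Exact Match (EM) and token-level F1 for answer spans.
    # Assumption: the common SQuAD normalization (lowercase, drop
    # punctuation and English articles, squeeze whitespace).
    import re
    import string
    from collections import Counter

    def normalize(text: str) -> str:
        """Lowercase, strip punctuation and articles, squeeze whitespace."""
        text = text.lower()
        text = "".join(ch for ch in text if ch not in string.punctuation)
        text = re.sub(r"\b(a|an|the)\b", " ", text)
        return " ".join(text.split())

    def exact_match(prediction: str, truth: str) -> float:
        """1.0 if the normalized strings are identical, else 0.0."""
        return float(normalize(prediction) == normalize(truth))

    def f1_score(prediction: str, truth: str) -> float:
        """Harmonic mean of token precision and recall over the overlap."""
        pred_tokens = normalize(prediction).split()
        truth_tokens = normalize(truth).split()
        common = Counter(pred_tokens) & Counter(truth_tokens)
        overlap = sum(common.values())
        if overlap == 0:
            return 0.0
        precision = overlap / len(pred_tokens)
        recall = overlap / len(truth_tokens)
        return 2 * precision * recall / (precision + recall)

    print(exact_match("the flu virus", "flu virus"))  # 1.0 after normalization
    print(round(f1_score("a viral infection", "viral respiratory infection"), 2))  # 0.8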

Published

2025-06-23

Issue

Section

Informatics Undergraduate Program (Prodi S1 Informatika)