Membandingkan Nilai Akurasi BERT dan DistilBERT pada Dataset Twitter [Comparing the Accuracy of BERT and DistilBERT on a Twitter Dataset]

Bibliographic Details
Main Authors: Faisal Fajri, Bambang Tutuko, Sukemi Sukemi
Format: Article
Language: Indonesian
Published: Program Studi Sistem Informasi, Universitas Islam Negeri Raden Fatah Palembang, 2022-12-01
Series: Jurnal Sistem Informasi
Online Access: https://jurnal.radenfatah.ac.id/index.php/jusifo/article/view/13885
Description
Summary: The growth of digital media has been extremely rapid, making it challenging to consume information, and machine-learning-assisted processing of social media content has proved very helpful in the digital era. Sentiment analysis is a fundamental task in Natural Language Processing (NLP). As the number of social media users increases, the volume of data stored on social media platforms also grows rapidly, and many researchers are therefore conducting studies that make use of this data. Opinion mining (OM), or Sentiment Analysis (SA), is one method for analyzing the information contained in social media text, and several prior studies have approached such prediction tasks with data mining (DM) techniques. The objective of this research is to compare the accuracy of BERT and DistilBERT. DistilBERT is a distilled model derived from BERT that trades a small amount of model capacity for faster classification. The findings indicate that DistilBERT achieved an accuracy of 97%, precision of 99%, recall of 99%, and F1-score of 99%, higher than BERT, which yielded an accuracy of 87%, precision of 91%, recall of 91%, and F1-score of 89%.
ISSN: 2460-092X, 2623-1662
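
As an illustration of the comparison described in the abstract, the sketch below scores a fine-tuned DistilBERT checkpoint on labelled tweets using the Hugging Face transformers pipeline and scikit-learn, printing the same metrics the article reports (accuracy, precision, recall, F1-score). This is a minimal sketch, not the authors' code: the checkpoint name and the tweets/labels placeholders are assumptions, and substituting a fine-tuned BERT checkpoint for the model name would give the BERT side of the comparison.

# Minimal sketch (assumed setup, not the paper's code): evaluate a
# fine-tuned DistilBERT sentiment classifier on labelled tweets.
from transformers import pipeline
from sklearn.metrics import classification_report

tweets = ["I love this phone!", "Worst service I have ever had..."]  # placeholder tweets
labels = ["POSITIVE", "NEGATIVE"]                                    # placeholder gold labels

clf = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",  # assumed checkpoint
)
preds = [p["label"] for p in clf(tweets)]

# classification_report prints per-class precision, recall, F1, and accuracy.
print(classification_report(labels, preds))

On a real dataset, the tweets and gold labels would come from the Twitter corpus described in the article, and the report's accuracy, precision, recall, and F1 columns correspond directly to the figures quoted in the summary.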