Sentiment analysis with hotel customer reviews using FNet
Shovan Bhowmik, Rifat Sadik, Wahiduzzaman Akanda, Juboraj Roy Pavel
Abstract
Recent research has focused on opinion mining from public sentiments using natural language processing (NLP) and machine learning (ML) techniques. Transformer-based models, such as bidirectional encoder representations from transformers (BERT), excel at extracting semantic information but are resource-intensive. Google Research's FNet, introduced as "mixing tokens with Fourier transforms," replaces BERT's attention mechanism with a non-parameterized Fourier transform, aiming to reduce training time without compromising performance. This study fine-tuned the FNet model on a publicly available Kaggle hotel review dataset and compared its performance on this dataset against BERT as well as conventional machine learning models such as long short-term memory (LSTM) and support vector machine (SVM). Results revealed that FNet reduces training time by almost 20% and memory utilization by nearly 60% compared to BERT. The highest test accuracy achieved by FNet in this experiment was 80.27%, which is nearly 97.85% of BERT's accuracy under identical parameters.
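To make the architectural difference concrete, the sketch below illustrates the token-mixing idea the abstract describes: FNet swaps the self-attention sublayer for a parameter-free two-dimensional discrete Fourier transform (an FFT over the hidden dimension, then over the sequence dimension, keeping the real part), as described in the original FNet paper. This is a minimal PyTorch illustration only, not the authors' implementation; the hidden and feed-forward sizes are arbitrary placeholders.

```python
import torch
import torch.nn as nn

class FourierMixing(nn.Module):
    """Parameter-free replacement for the self-attention sublayer."""
    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, hidden). FFT over the hidden dimension,
        # then over the sequence dimension; keep only the real part.
        return torch.fft.fft(torch.fft.fft(x, dim=-1), dim=-2).real

class FNetEncoderBlock(nn.Module):
    # hidden/ff_dim are illustrative placeholders, not values from the paper
    def __init__(self, hidden: int = 256, ff_dim: int = 1024):
        super().__init__()
        self.mixing = FourierMixing()
        self.norm1 = nn.LayerNorm(hidden)
        self.ff = nn.Sequential(
            nn.Linear(hidden, ff_dim), nn.GELU(), nn.Linear(ff_dim, hidden)
        )
        self.norm2 = nn.LayerNorm(hidden)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.norm1(x + self.mixing(x))  # residual + norm, as in BERT
        return self.norm2(x + self.ff(x))   # standard feed-forward sublayer
```

In the full model, a stack of such blocks replaces BERT's attention-based encoder blocks; because the mixing step has no learnable weights, it is the source of the training-time and memory savings reported above.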
Keywords
Bidirectional encoder representations from transformers; FNet; Fourier transform; Hotel reviews; Sentiment analysis; Transformer-based models
DOI:
https://doi.org/10.11591/eei.v13i2.6301
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
Bulletin of Electrical Engineering and Informatics (BEEI), ISSN: 2089-3191, e-ISSN: 2302-9285. This journal is published by the Institute of Advanced Engineering and Science (IAES) in collaboration with Intelektual Pustaka Media Utama (IPMU).