Efficient transformer architecture for sarcasm detection: a study on compression and performance

Parul Dubey, Amit Mishra, Aruna Singh, Murtuza Murtuza, Akshita Chanchlani, Pushkar Dubey

Abstract


Sarcasm detection is a crucial subtask in natural language processing (NLP), particularly for sentiment analysis and conversational AI. Its complexity lies in interpreting context, tone, and intent beyond literal meanings. Traditional models often struggle to capture such nuances, especially in informal and diverse language settings. Moreover, existing approaches lack computational efficiency and fail to adapt well across different domains. This study evaluates three benchmark datasets: News Headlines, MUStARD, and Reddit (SARC), representing structured, scripted, and conversational sarcasm, respectively. Each dataset poses unique linguistic and contextual challenges. The proposed methodology integrates transformer-based models (RoBERTa and DistilBERT) with context summarization using BART and metadata embedding. A comparative analysis is conducted on both linguistic accuracy and computational efficiency. The novelty lies in aligning sarcasm detection performance with architectural optimization for real-time deployment. Evaluation is conducted using accuracy, F1-score, Jaccard coefficient, precision, and recall. Results show that RoBERTa delivers the strongest performance, while DistilBERT achieves a 1.74× speedup with competitive results, making it suitable for scalable and efficient sarcasm detection.
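The sketch below is a minimal illustration, not the authors' released implementation, of the pipeline the abstract describes: BART summarizes the surrounding context, and a DistilBERT classifier labels the target utterance as sarcastic or not. The checkpoint names (facebook/bart-large-cnn, distilbert-base-uncased), the detect_sarcasm helper, and all hyperparameters are illustrative assumptions; in practice the classifier would be fine-tuned on one of the sarcasm corpora named above.

# Minimal sketch (assumed pipeline, not the paper's code): BART context summarization
# followed by DistilBERT sarcasm classification via the Hugging Face transformers library.
import torch
from transformers import pipeline, AutoTokenizer, AutoModelForSequenceClassification

# BART compresses long conversational context into a short summary.
summarizer = pipeline("summarization", model="facebook/bart-large-cnn")

# DistilBERT acts as the lightweight binary sarcasm classifier.
# NOTE: the classification head below is randomly initialized; a checkpoint
# fine-tuned on a sarcasm dataset would be needed for meaningful predictions.
clf_name = "distilbert-base-uncased"  # assumed base checkpoint
tokenizer = AutoTokenizer.from_pretrained(clf_name)
classifier = AutoModelForSequenceClassification.from_pretrained(clf_name, num_labels=2)
classifier.eval()

def detect_sarcasm(utterance: str, context: str) -> float:
    """Return the model's probability that `utterance` is sarcastic, given its context."""
    # Summarize the context so it fits the classifier's input budget.
    summary = summarizer(context, max_length=60, min_length=10, do_sample=False)[0]["summary_text"]
    # Encode the summarized context and the target utterance as a sentence pair.
    inputs = tokenizer(summary, utterance, return_tensors="pt", truncation=True, max_length=256)
    with torch.no_grad():
        logits = classifier(**inputs).logits
    return torch.softmax(logits, dim=-1)[0, 1].item()

print(detect_sarcasm("Oh great, another Monday.", "The speaker has complained about work all week."))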

Keywords


Context summarization; DistilBERT; Natural language processing; Sarcasm detection; Transformer model

DOI: https://doi.org/10.11591/eei.v15i1.11102



This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.


Bulletin of Electrical Engineering and Informatics (BEEI)
ISSN: 2089-3191, e-ISSN: 2302-9285
This journal is published by the Institute of Advanced Engineering and Science (IAES) in collaboration with Intelektual Pustaka Media Utama (IPMU).