Continual learning on audio scene classification using representative data and memory replay GANs
Ibnu Daqiqil ID, Masanobu Abe, Sunao Hara
Abstract
This paper proposes a methodology for mitigating the catastrophic forgetting problem by selecting a limited portion of the historical dataset to serve as a representative memory. The method harnesses generative adversarial networks (GANs) to generate samples that augment the representative memory. Its main advantage is that it not only prevents catastrophic forgetting but also improves backward transfer while keeping the memory footprint small and relatively stable. The experimental results show that combining real representative data with GAN-generated data yields better outcomes and counteracts the negative effects of catastrophic forgetting more effectively than relying solely on GAN-generated data: the mixed approach creates a richer training distribution that helps retain previous knowledge. Additionally, when comparing selection methods as the proportion of GAN-generated data increases, the low-probability and mean-cluster methods performed best. These methods are resilient and consistent because they select more informative samples, thereby improving overall performance.
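The abstract's replay scheme can be sketched in two pieces: selecting a per-class representative memory (the low-probability method keeps the samples whose true-class confidence is lowest; the mean-cluster method keeps the samples nearest the per-class feature mean) and building mixed training batches from real memory plus GAN output. The function names, signatures, and the `gan_fraction` mixing parameter below are hypothetical illustrations, not the authors' implementation:

```python
import numpy as np

def select_low_probability(probs, labels, per_class):
    """Low-probability selection: per class, keep the samples whose
    predicted probability for their true class is lowest (hypothetical sketch)."""
    keep = []
    for c in np.unique(labels):
        idx = np.where(labels == c)[0]
        # ascending true-class confidence -> least-confident samples first
        keep.extend(idx[np.argsort(probs[idx, c])][:per_class].tolist())
    return np.array(keep)

def select_mean_cluster(features, labels, per_class):
    """Mean-cluster selection: per class, keep the samples closest to the
    class's feature centroid (hypothetical sketch)."""
    keep = []
    for c in np.unique(labels):
        idx = np.where(labels == c)[0]
        mu = features[idx].mean(axis=0)
        dist = np.linalg.norm(features[idx] - mu, axis=1)
        keep.extend(idx[np.argsort(dist)][:per_class].tolist())
    return np.array(keep)

def build_replay_batch(memory_x, memory_y, gan_sample_fn, gan_fraction,
                       batch_size, rng):
    """Mix real representative-memory samples with GAN-generated ones.
    `gan_sample_fn(n)` stands in for a trained generator returning (x, y)."""
    n_gan = int(round(gan_fraction * batch_size))
    real_idx = rng.choice(len(memory_x), size=batch_size - n_gan, replace=True)
    gx, gy = gan_sample_fn(n_gan)
    x = np.concatenate([memory_x[real_idx], gx])
    y = np.concatenate([memory_y[real_idx], gy])
    return x, y
```

A trainer would interleave such replay batches with batches from the new task, so that gradients on old classes come partly from real exemplars and partly from synthetic data, matching the paper's finding that the mixture outperforms purely GAN-generated replay.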
Keywords
Audio scene classification; Continual learning; Generative adversarial model; Memory replay; Representative memory
DOI:
https://doi.org/10.11591/eei.v14i1.8127
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
Bulletin of Electrical Engineering and Informatics (BEEI), ISSN: 2089-3191, e-ISSN: 2302-9285. This journal is published by the Institute of Advanced Engineering and Science (IAES) in collaboration with Intelektual Pustaka Media Utama (IPMU).