Performance evaluation of generative adversarial networks for generating mugshot images from text description

Nur Nabilah Bahrum, Samsul Setumin, Nor Azlan Othman, Mohd Ikmal Fitri Maruzuki, Mohd Firdaus Abdullah, Adi Izhar Che Ani


Photo identification from a sketch has been explored by many researchers, and the performance of the identification process is nearly perfect, particularly for viewed sketches. Suspect identification based on sketches is one application in forensic science. To identify a suspect with these methods, a face sketch is required; hence, a skilled artist must sketch the suspect from descriptions provided by eyewitnesses. However, artists' skills differ from one another, resulting in differently rendered sketches. Therefore, this work proposes a new identification method based only on written forensic face descriptions. To investigate its feasibility, this study evaluated the performance of several text-to-photo generators on both viewed and forensic datasets using three GAN models: SAGAN, DFGAN, and DCGAN. The generated images were then compared to the real photos contained in those datasets to evaluate how well the proposed method recognizes faces. The results show that the recognition rate for photos generated by the DCGAN model is better than that of the other two models, achieving a 38.3% recognition rate at rank-10 for mugshot identification.
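The rank-10 recognition rate reported above can be computed by ranking every gallery mugshot by its similarity to a generated probe image and counting a probe as correctly identified when its true identity appears among the top 10 matches. The sketch below is a minimal, hypothetical illustration of that scoring step; the function name, the plain similarity matrix, and the example identities are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical rank-k identification-rate scorer. The similarity matrix
# would come from a face matcher comparing GAN-generated probes against
# real gallery mugshots; here it is illustrative data only.

def rank_k_rate(similarity, probe_ids, gallery_ids, k):
    """Fraction of probes whose true identity appears among the k
    gallery entries with the highest similarity scores.

    similarity[i][j] = similarity between probe i and gallery entry j.
    """
    hits = 0
    for i, row in enumerate(similarity):
        # Sort gallery indices by descending similarity to probe i.
        ranked = sorted(range(len(row)), key=lambda j: row[j], reverse=True)
        top_k_ids = {gallery_ids[j] for j in ranked[:k]}
        if probe_ids[i] in top_k_ids:
            hits += 1
    return hits / len(similarity)


# Toy example: two generated probes scored against a three-mugshot gallery.
sim = [
    [0.9, 0.2, 0.1],  # probe 0 matches gallery entry 0 best
    [0.3, 0.1, 0.8],  # probe 1 matches gallery entry 2 best (wrongly)
]
print(rank_k_rate(sim, ["A", "B"], ["A", "B", "C"], 1))  # probe 1 misses at rank-1
```

Reporting the rate at increasing k (rank-1, rank-5, rank-10) yields the cumulative match characteristic curve commonly used in face-identification evaluations.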


Face recognition; Face sketch recognition; Face sketch synthesis; Forensic sketch; Generative adversarial network; Text to photo


This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

Bulletin of Electrical Engineering and Informatics (BEEI)
ISSN: 2089-3191, e-ISSN: 2302-9285
This journal is published by the Institute of Advanced Engineering and Science (IAES) in collaboration with Intelektual Pustaka Media Utama (IPMU).