A precursor to the successful automatic recognition of child exploitation material is the ability to automatically identify pornography (a largely solved problem) involving children (a largely unsolved one). Identifying children's faces in images previously labelled as pornographic can provide a solution. Automatic child face detection plays an important role in online environments: it helps Law Enforcement Agencies (LEAs) track online child abuse, bullying, and sexual assault, and it can also be used to detect cybercriminals who target children in order to groom them for later molestation. Previous studies have approached this problem by attempting to identify children's faces in a pool of adult faces, extracting basic low- and high-level features, i.e., colour, texture, skin tone, shape, and facial structure, from child and adult faces. Typically, this is a machine learning architecture that performs a categorisation task: given a set of child and adult faces, it identifies child faces using a classifier trained on features extracted from the training images. In this paper, we present a deep learning methodology in which the machine learns the features directly from the training images, without any human-supplied feature engineering, to identify children's faces. Compared with the results published in recent works, our proposed approach yields the highest precision, recall, and overall recognition accuracy.
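The end-to-end feature learning described above can be sketched as a small convolutional classifier. The architecture, layer sizes, and input resolution below are illustrative assumptions for exposition only, not the chapter's exact model.

```python
import torch
import torch.nn as nn

class ChildFaceCNN(nn.Module):
    """Illustrative binary classifier (child vs. adult face).

    The convolutional layers learn features directly from pixel data,
    replacing hand-crafted colour/texture/shape descriptors.
    Layer sizes are assumptions, not the published architecture.
    """
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),   # low-level features
            nn.ReLU(),
            nn.MaxPool2d(2),                              # 64x64 -> 32x32
            nn.Conv2d(16, 32, kernel_size=3, padding=1),  # higher-level features
            nn.ReLU(),
            nn.MaxPool2d(2),                              # 32x32 -> 16x16
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, 2),                   # child vs. adult logits
        )

    def forward(self, x):
        return self.classifier(self.features(x))

# A batch of four 64x64 RGB face crops (random tensors stand in for images)
model = ChildFaceCNN()
logits = model(torch.randn(4, 3, 64, 64))
print(logits.shape)
```

In practice such a network would be trained with a cross-entropy loss over labelled child and adult face crops; the point of the sketch is only that no manually engineered features appear anywhere in the pipeline.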
Title of host publication: Deep Learning Applications for Cyber Security
Editors: Mamoun Alazab, MingJian Tang
Number of pages: 9
Publication status: Published - 2019
Series: Advanced Sciences and Technologies for Security Applications
Islam, M., Mahmood, A. N., Watters, P., & Alazab, M. (2019). Forensic detection of child exploitation material using deep learning. In M. Alazab, & M. Tang (Eds.), Deep Learning Applications for Cyber Security (pp. 211-219). (Advanced Sciences and Technologies for Security Applications). Springer Nature. https://doi.org/10.1007/978-3-030-13057-2_10