Human Gender Prediction on Facial Mobile Images using Convolutional Neural Networks

Mehmet Hacibeyoglu, Mohammed Hussein Ibrahim


Interest in automatic gender classification has grown rapidly, especially with the spread of online social networking platforms, social media applications, and commercial applications. Most of the images shared on these platforms are taken with mobile phones and vary in expression, angle, and resolution. In recent years, convolutional neural networks have become the most powerful method for image classification. Many researchers have shown that convolutional neural networks can achieve better performance by modifying the layers of the network architecture. Moreover, the choice of activation function, optimizer, and loss function directly affects the performance of a convolutional neural network. In this study, we propose a gender classification system for facial images taken with mobile phones using convolutional neural networks. The proposed network has a simple architecture with appropriate parameters and can be used when rapid training is needed with a limited amount of training data. In the experimental study, the Adience benchmark dataset, containing 17,492 images of subjects of different genders and ages, was used. Classification was carried out with 10-fold cross validation. According to the experimental results, the proposed convolutional neural network predicted gender with 98.8% accuracy on the training set and 89.1% on the test set.
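The evaluation protocol described above (10-fold cross validation over 17,492 images) can be sketched in plain Python. The helper below is a generic illustration of index-based k-fold splitting, not the authors' code; the function name and the contiguous (unshuffled) fold assignment are assumptions for the sake of a minimal example.

```python
def k_fold_indices(n_samples, k=10):
    """Split sample indices into k near-equal folds; each fold serves once
    as the test set while the remaining folds form the training set.
    Yields (train_indices, test_indices) pairs, one per fold."""
    indices = list(range(n_samples))
    # Distribute any remainder so fold sizes differ by at most one.
    fold_sizes = [n_samples // k + (1 if i < n_samples % k else 0)
                  for i in range(k)]
    folds, start = [], 0
    for size in fold_sizes:
        folds.append(indices[start:start + size])
        start += size
    for i in range(k):
        test = folds[i]
        train = [idx for j, fold in enumerate(folds) if j != i
                 for idx in fold]
        yield train, test

# Example: 10-fold split over the 17,492 Adience images.
splits = list(k_fold_indices(17492, k=10))
# Every image appears in exactly one test fold across the 10 splits.
```

In practice one would shuffle (or stratify by gender) before splitting; the sketch keeps contiguous folds only for readability.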


Convolutional neural networks; deep learning; facial mobile images; gender classification


Submitted: 2018-08-02 11:45:34
Published: 2018-09-26 07:04:22





Copyright (c) 2018 International Journal of Intelligent Systems and Applications in Engineering

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
© Prof. Dr. Ismail SARITAS 2013-2018 - Address: Selcuk University, Faculty of Technology, 42031 Selcuklu, Konya/TURKEY.