AESN: A Database of Human Eyes, Noses, and Mouths for Person Identification or Facial Expression Recognition
DOI: https://doi.org/10.56714/bjrs.49.2.13

Keywords: Facial Expression, Human Eyes, Image Database, Person Identification

Abstract
In this paper, we present AESN, a database of human face regions: eyes, noses, and mouths cropped from facial photographs. AESN was constructed to support both person identification and facial expression recognition. The images were collected from several sources, some from the Internet and others captured with our own cameras, and were obtained from multiple media types, either directly from still pictures or by extracting frames from videos. The need for an eye dataset arose with the COVID-19 pandemic, which obliged people to wear masks. We required AESN while developing a monitoring system capable of identifying people from the eye region alone, a considerable challenge given that a large part of the face is hidden. We achieved promising results when using AESN images in our studies. Beyond the eyes, the remaining parts of the face (nose and mouth) are also of interest: the diverse shapes of the nose mainly serve medical applications, while gestures of the mouth reflect emotional states and help clarify a person's emotions, as well as helping researchers find biometric features for recognition systems.
Copyright (c) 2023 J. Basrah Res. (Sci.)

This work is licensed under a Creative Commons Attribution 4.0 International License.