Journal of Applied Science and Engineering

Published by Tamkang University Press

Emotion Recognition in Art Creation Using Visual Image Analysis Techniques

Zhen Xu

College of Creative Design, Hunan Vocational College for Nationalities, Yueyang, 414000, China

Received: September 9, 2025
Accepted: November 5, 2025
Publication Date: March 8, 2026

Figure: The emotional transmission of art

 Copyright The Author(s). This is an open access article distributed under the terms of the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are cited.

DOI: http://dx.doi.org/10.6180/jase.202607_30.023

Emotion drives artistic creativity and expression. Artists begin with emotional intent and convey it through visual language, which in turn evokes emotional responses in viewers. To support this transformation, educators need effective analytical methods. This study examines emotion recognition in artworks using visual image analysis. A dataset of 500 art images representing six emotions—affection, friendship, love, homesickness, patriotism, and sadness—was used. Images were collected from public sources and independently rated by three art experts, achieving strong agreement (Cohen’s κ = 0.87). The dataset was split into 70% training (350 images) and 30% testing (150 images), with balanced emotion categories. All images were resized to 256×256, converted to grayscale, and normalized before feature extraction. Among the tested methods, PCA performed best, achieving 94.5% exactness, 95.9% recall, 95.9% accuracy, and 97.6% precision, followed by LDA, stepwise regression, and a deep learning model. PCA also showed the highest average accuracy (0.8567), with LDA close behind, while stepwise regression and the deep learning model reached 0.803 and 0.823, respectively. Both PCA and LDA produced low error rates (under 0.1). Overall, PCA and LDA effectively identify emotional patterns in artworks and support a deeper understanding of how visual structure and composition convey emotion.
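The preprocessing and PCA pipeline described in the abstract can be sketched as follows. This is a minimal illustration only: the synthetic 16×16 "images", the choice of 10 principal components, and the nearest-centroid classifier are assumptions for demonstration, not the paper's actual 256×256 dataset or model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative stand-in for the study's 500 grayscale art images: tiny
# synthetic 16x16 "images" for the six emotion categories named in the paper.
EMOTIONS = ["affection", "friendship", "love", "homesickness", "patriotism", "sadness"]
side, n_per_class = 16, 20
X, y = [], []
for label in range(len(EMOTIONS)):
    base = rng.normal(size=side * side)  # class-specific visual pattern
    for _ in range(n_per_class):
        X.append(base + rng.normal(scale=0.3, size=side * side))  # per-image noise
        y.append(label)
X, y = np.asarray(X), np.asarray(y)

# Normalize each image to [0, 1], mirroring the paper's preprocessing step.
X = (X - X.min(axis=1, keepdims=True)) / (np.ptp(X, axis=1, keepdims=True) + 1e-12)

# Balanced 70/30 train/test split per class (the study used 350/150 images
# with balanced emotion categories).
train, test = [], []
for c in range(len(EMOTIONS)):
    cls_idx = rng.permutation(np.where(y == c)[0])
    cut = int(0.7 * len(cls_idx))
    train.extend(cls_idx[:cut])
    test.extend(cls_idx[cut:])
train, test = np.asarray(train), np.asarray(test)

def pca_fit(data, n_components):
    """Fit PCA via SVD of the centered data; rows of Vt are principal axes."""
    mean = data.mean(axis=0)
    _, _, Vt = np.linalg.svd(data - mean, full_matrices=False)
    return mean, Vt[:n_components]

def pca_transform(data, mean, components):
    return (data - mean) @ components.T

mean, comps = pca_fit(X[train], n_components=10)  # component count is an assumption
Z_train = pca_transform(X[train], mean, comps)
Z_test = pca_transform(X[test], mean, comps)

# Nearest-centroid classification in PCA space: a deliberately simple stand-in
# for whatever classifier the study paired with its PCA features.
centroids = np.stack([Z_train[y[train] == c].mean(axis=0)
                      for c in range(len(EMOTIONS))])
pred = np.argmin(((Z_test[:, None, :] - centroids[None]) ** 2).sum(axis=-1), axis=1)
acc = (pred == y[test]).mean()
print(f"test accuracy: {acc:.3f}")
```

On this easy synthetic data the pipeline separates the classes almost perfectly; the reported figures in the abstract (e.g. 94.5% exactness for PCA) come from the study's real art-image dataset and should not be expected from this toy setup.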

Keywords: Visual image analysis; Artistic creation; Emotional factor recognition method
