Journal of Applied Science and Engineering

Published by Tamkang University Press

Impact Factor: 1.30 | CiteScore: 2.10

Hongxia Sun1 and Xiong Wang1

1Luxun Academy of Fine Art, Dalian 116000, China



Received: July 12, 2022
Accepted: August 11, 2022
Publication Date: August 23, 2022

Copyright © The Author(s). This is an open access article distributed under the terms of the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are cited.


Download Citation: https://doi.org/10.6180/jase.202305_26(5).0015


ABSTRACT


In the era of new technology, as interactive applications have spread through museum exhibition design, the emotional dimension of the visiting experience has drawn increasing attention. Emotional-demand design is realized through image emotional semantic analysis and retrieval technology, combined with virtual reality and artificial intelligence techniques applied to exhibition design, which opens a new direction for the digital development of the field. By analyzing the development of museum technology and the demand for emotional value, this paper clarifies the necessity of image emotional semantic analysis for museum construction and the feasibility of its application, and proposes an image semantic analysis strategy together with a method for applying image emotional semantic analysis in museums. The approach provides a way to realize exhibitions that integrate space with emotional experience (blending scene with feeling, object with self, and emotion with reason) and to meet the new requirements of the times. Finally, we conduct experiments that demonstrate the effectiveness of the image emotion semantics.
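To make the retrieval idea in the abstract concrete, the following is a minimal, hypothetical sketch (not the authors' actual method): each exhibit image is assumed to carry a pre-computed emotion feature vector over a small set of assumed emotion categories, and retrieval ranks images by cosine similarity to a query mood vector. All names, vectors, and categories here are illustrative assumptions.

```python
import numpy as np

# Assumed emotion categories for the feature vectors (illustrative only).
EMOTIONS = ["calm", "awe", "joy", "sorrow"]

def cosine_similarity(a, b):
    """Cosine similarity between two emotion feature vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def retrieve(query_vec, gallery, top_k=2):
    """Rank gallery images by emotional similarity to the query vector."""
    scored = [(name, cosine_similarity(query_vec, vec))
              for name, vec in gallery.items()]
    return sorted(scored, key=lambda t: t[1], reverse=True)[:top_k]

# Hypothetical gallery: image name -> emotion feature vector.
gallery = {
    "bronze_hall.jpg":  np.array([0.8, 0.6, 0.1, 0.1]),
    "war_exhibit.jpg":  np.array([0.1, 0.3, 0.0, 0.9]),
    "garden_mural.jpg": np.array([0.7, 0.2, 0.8, 0.0]),
}

# A visitor mood leaning toward "calm" and "awe".
query = np.array([0.9, 0.5, 0.2, 0.0])
print(retrieve(query, gallery))
```

In a real system the feature vectors would come from a trained emotion-classification model rather than being hand-assigned, but the ranking step would look much the same.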


Keywords: image emotional semantic analysis; museum exhibition; retrieval technology; redesign


  22. [22] M. Kirtay, L. Vannucci, U. Albanese, C. Laschi, E. Oztop, and E. Falotico, (2021) “Emotion as an emergent phenomenon of the neurocomputational energy regulation mechanism of a cognitive agent in a decision-making task" Adaptive Behavior 29(1): 55–71. DOI: 10.1177/1059712319880649.