
Intelligent Interactive Space Art Based on the Needs of Smart Cities under Internet Technology


DOI: 10.23977/artpl.2025.060102

Author(s)

Yuanlong Tian 1, Shengnan Wang 2

Affiliation(s)

1 School of Arts, Weifang University of Science and Technology, Shouguang City, Shandong Province, China
2 Department of Integrated Arts, Silla University, Busan Metropolitan City, South Korea

Corresponding Author

Shengnan Wang

ABSTRACT

With the development of smart cities, intelligent interactive space art has received extensive attention, and traditional industries have been pushed toward intelligent transformation. How to raise the level of intelligence in spatial interaction has therefore become an important question, and studying intelligent spatial interaction methods is of real significance. Voice interaction technology is being applied to spatial interaction on a growing scale, and its performance advantages are central to solving the problems of intelligent transformation. This article studies intelligent interactive space art based on the needs of smart cities under Internet technology, and analyzes voice signal processing, voice interaction technology, and the construction of the interaction system. The results indicate that the interactive space art embedded in this system scores 27.57% higher in user experience satisfaction than traditional interactive art. The system can therefore meet the needs of intelligent interactive space art, and both its level of intelligence and user satisfaction are greatly improved.
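The abstract mentions voice signal processing as part of the interaction system but does not reproduce the authors' pipeline on this page. The following Python sketch only illustrates a generic speech front-end of the kind such a voice-interaction system could use: pre-emphasis, framing, Hamming windowing, and a simple energy-based voice activity check. All function names, parameter values, and the synthetic test signal are illustrative assumptions, not the authors' code.

# Minimal sketch of a generic voice-interaction front-end (assumed 16 kHz mono input).
import numpy as np

def preemphasis(signal, alpha=0.97):
    # Boost high frequencies: y[n] = x[n] - alpha * x[n-1]
    return np.append(signal[0], signal[1:] - alpha * signal[:-1])

def frame_signal(signal, sample_rate=16000, frame_ms=25, hop_ms=10):
    # Split the signal into overlapping, Hamming-windowed frames.
    frame_len = int(sample_rate * frame_ms / 1000)
    hop_len = int(sample_rate * hop_ms / 1000)
    num_frames = 1 + max(0, (len(signal) - frame_len) // hop_len)
    window = np.hamming(frame_len)
    return np.stack([
        signal[i * hop_len : i * hop_len + frame_len] * window
        for i in range(num_frames)
    ])

def detect_speech(frames, threshold_ratio=0.1):
    # Crude voice activity detection: flag frames whose short-time energy
    # exceeds a fraction of the maximum frame energy.
    energy = np.sum(frames ** 2, axis=1)
    return energy > threshold_ratio * energy.max()

if __name__ == "__main__":
    sr = 16000
    t = np.linspace(0, 0.5, sr // 2, endpoint=False)
    # Synthetic test signal: half a second of silence, then a 440 Hz tone standing in for speech.
    sig = np.concatenate([np.zeros(sr // 2), 0.5 * np.sin(2 * np.pi * 440 * t)])
    frames = frame_signal(preemphasis(sig), sample_rate=sr)
    flags = detect_speech(frames)
    print(f"{flags.sum()} of {len(flags)} frames flagged as speech")

In a full system, frames flagged as speech would then be passed to feature extraction (e.g., MFCCs) and a recognition model; that stage is omitted here because the paper's abstract does not specify it.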

KEYWORDS

Intelligent Space Voice Interaction System, Internet Technology, Voice Interaction, Smart City

CITE THIS PAPER

Yuanlong Tian, Shengnan Wang, Intelligent Interactive Space Art Based on the Needs of Smart Cities under Internet Technology. Art and Performance Letters (2025) Vol. 6: 7-16. DOI: http://dx.doi.org/10.23977/artpl.2025.060102.


All published work is licensed under a Creative Commons Attribution 4.0 International License.
