Title: VirSen1.0: toward sensor configuration recommendation in an interactive optical sensor simulator for human gesture recognition
Authors: Kana Matsuo; Chengshuo Xia; Yuta Sugiura
Addresses: Department of Information and Computer Science, Faculty of Science and Technology, Keio University, 3-14-1 Hiyoshi, Kohoku-ku, Yokohama 223-8522, Japan (all authors)
Abstract: Research is underway on using sensor simulation to generate sensor data for designing real-world human gesture recognition systems. The overall development process suffers from poor interactivity because developers lack an efficient tool to support the sensor configuration, result checking, and trial-and-error involved in designing a machine learning system. Hence, we have developed VirSen1.0, a virtual environment with a user interface that supports the process of designing a sensor-based human gesture recognition system. In this environment, a simulator produces light-intensity data and combines it with an avatar's motion to train a classifier. The interface then visualises the importance of the features used by the model via permutation feature importance, giving feedback on each sensor's contribution to the classifier. This paper proposes a complete development process, from acquisition of training data to creation of a trained model, within a single software tool. Additionally, a user study confirmed that, by visualising the importance of the features used in the model, users can create models that achieve a certain level of accuracy.
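As a brief illustration of the feedback mechanism described in the abstract, the sketch below ranks simulated sensor channels by permutation feature importance using scikit-learn. The synthetic data, the number of virtual sensors, and the choice of classifier are assumptions made for illustration only; this is not the authors' implementation.

```python
# Minimal sketch: ranking simulated optical-sensor channels by
# permutation feature importance (illustrative, not the paper's code).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical stand-in for simulator output: 200 gesture samples,
# each summarised as readings from 6 virtual optical sensors.
X = rng.normal(size=(200, 6))
y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(int)  # two gesture classes

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = RandomForestClassifier(random_state=0).fit(X_train, y_train)

# Shuffle each sensor channel in turn and measure the accuracy drop;
# larger drops indicate sensors the classifier relies on more.
result = permutation_importance(clf, X_test, y_test,
                                n_repeats=30, random_state=0)

for i in np.argsort(result.importances_mean)[::-1]:
    print(f"sensor {i}: {result.importances_mean[i]:.3f} "
          f"+/- {result.importances_std[i]:.3f}")
```

In a tool like the one described, the same per-sensor scores could be rendered in the interface so that a developer can see which sensor placements contribute to the classifier and adjust the configuration accordingly.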
Keywords: sensor simulator; interactive system; optical sensor; machine learning; graphical user interface.
International Journal of the Digital Human, 2023 Vol.2 No.3, pp.223 - 241
Received: 03 Feb 2023
Accepted: 17 May 2023
Published online: 25 Aug 2023