Virtual guitar: using real-time finger tracking for musical instruments
Online publication date: Mon, 15-Apr-2019
by Noorkholis Luthfil Hakim; Shih-Wei Sun; Mu-Hsen Hsu; Timothy K. Shih; Shih-Jung Wu
International Journal of Computational Science and Engineering (IJCSE), Vol. 18, No. 4, 2019
Abstract: Kinect, a 3D sensing device from Microsoft, has spurred an evolution in human-computer interaction research. Kinect has been applied in many areas, including music. One such application is the virtual musical instrument (VMI), a system that uses natural gestures to produce synthetic sounds resembling those of real musical instruments. Related work has found that relying on large joints such as the hand, arm and leg is inconvenient and limits the ways a VMI can be played. This study therefore proposes a fast and reliable finger tracking algorithm suited to playing virtual musical instruments. In addition, a virtual guitar system is developed as an application of the proposed algorithm. Experimental results show that the proposed method can be used to play a variety of songs with acceptable quality. Furthermore, the proposed application can be used by beginners with no experience in music or in playing a real musical instrument.
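To illustrate the general idea of driving a virtual guitar from tracked fingertips, the sketch below maps normalised fingertip coordinates onto a string/fret grid and converts each cell into a MIDI pitch. This is a minimal, hypothetical illustration and not the paper's finger tracking algorithm; the `Fingertip` type, grid quantisation, and synthetic coordinates are assumptions standing in for the output of a depth-sensor finger tracker such as Kinect.

```python
# Hypothetical sketch: mapping tracked fingertip positions to virtual guitar notes.
# Fingertip coordinates are assumed to come from a depth-sensor finger tracker
# (e.g., Kinect); here synthetic values stand in for one tracked frame.

from dataclasses import dataclass

# Open-string MIDI pitches for standard guitar tuning (E2 A2 D3 G3 B3 E4).
OPEN_STRING_PITCHES = [40, 45, 50, 55, 59, 64]


@dataclass
class Fingertip:
    x: float  # normalised horizontal position along the fretboard, 0.0-1.0
    y: float  # normalised vertical position across the strings, 0.0-1.0


def fingertip_to_note(tip: Fingertip, num_frets: int = 12) -> int:
    """Quantise a fingertip position into a (string, fret) cell and
    return the resulting MIDI pitch."""
    string_index = min(int(tip.y * len(OPEN_STRING_PITCHES)),
                       len(OPEN_STRING_PITCHES) - 1)
    fret = min(int(tip.x * num_frets), num_frets - 1)
    return OPEN_STRING_PITCHES[string_index] + fret


if __name__ == "__main__":
    # Synthetic fingertips standing in for one frame of tracker output.
    frame = [Fingertip(x=0.25, y=0.10), Fingertip(x=0.40, y=0.55)]
    for tip in frame:
        print(f"fingertip ({tip.x:.2f}, {tip.y:.2f}) "
              f"-> MIDI note {fingertip_to_note(tip)}")
```

In a complete system, the resulting note events would be passed to a synthesiser, and the grid layout would be tuned to the sensor's field of view; both are outside the scope of this sketch.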