SVM-based Relevance Feedback for semantic video retrieval Online publication date: Tue, 29-Jun-2010
by Hadi Sadoghi Yazdi, Malihe Javidi, Hamid Reza Pourreza
International Journal of Signal and Imaging Systems Engineering (IJSISE), Vol. 2, No. 3, 2009
Abstract: This paper presents a novel method for efficient key frame extraction for video shot representation and employs Support-Vector-Machine-based Relevance Feedback (SVM-RF) to bridge the semantic gap between the low-level features and high-level concepts of shots. We introduce a new approach for key frame extraction based on hierarchical clustering; using this representation, the most representative key frame is selected for each shot. Furthermore, our system involves the user in judging the retrieval results and labelling the retrieved shots as relevant or irrelevant. An SVM classifier is then trained on the mean features of the relevant and irrelevant shots, after which the video database is classified into relevant and irrelevant shots. A suitable Graphical User Interface (GUI) captures the user's relevance feedback, and this process continues until the user is satisfied with the results. The proposed system is evaluated on shots collected from the TRECVID 2001 database and home videos, comprising 800 shots of different concepts (10 semantic groups). Experimental results demonstrate the effectiveness of the proposed method.
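The relevance-feedback loop described in the abstract (user labels shots, an SVM is trained on the mean features of the labelled shots, and the database is re-ranked) can be sketched roughly as follows. This is a minimal illustrative sketch, not the authors' implementation: the feature representation, kernel choice, and names such as `svm_rf_round` and `shot_features` are assumptions for the example.

```python
import numpy as np
from sklearn.svm import SVC

def svm_rf_round(shot_features, relevant_idx, irrelevant_idx):
    """One round of SVM-based relevance feedback (illustrative sketch).

    shot_features: (n_shots, n_dims) array of mean low-level features, one row per shot.
    relevant_idx / irrelevant_idx: indices of shots the user labelled in this round.
    Returns shot indices ranked by decreasing predicted relevance.
    """
    # Build a two-class training set from the user's feedback.
    X = np.vstack([shot_features[relevant_idx], shot_features[irrelevant_idx]])
    y = np.array([1] * len(relevant_idx) + [0] * len(irrelevant_idx))
    clf = SVC(kernel="rbf", gamma="scale").fit(X, y)
    # Signed distance to the decision boundary serves as a relevance score.
    scores = clf.decision_function(shot_features)
    return np.argsort(-scores)

# Toy example: six shots in a 2-D feature space; the user marks
# shots 0 and 1 relevant, shots 2 and 3 irrelevant.
feats = np.array([[0.9, 0.8], [0.8, 0.9], [0.1, 0.2],
                  [0.2, 0.1], [0.85, 0.75], [0.15, 0.25]])
ranking = svm_rf_round(feats, relevant_idx=[0, 1], irrelevant_idx=[2, 3])
print(ranking[:3])  # unlabelled shot 4 should rank with the relevant ones
```

In an interactive system this function would be called once per feedback round, with the user relabelling shots from the newly ranked list until satisfied.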