Incorporating label-wise thresholding and class-imbalanced strategy into binary relevance for online multi-label classification
by Kunyong Hu; Tingting Zhai
International Journal of Intelligent Information and Database Systems (IJIIDS), Vol. 16, No. 4, 2024

Abstract: Binary relevance (BR) is widely used to solve multi-label classification problems. Typically, all binary classifiers in BR share a single fixed global threshold to convert predicted scores into binary classification results. However, several studies have shown that tuning a separate threshold for each label outperforms a fixed global threshold. Motivated by this finding, in this paper, we adaptively train a thresholding model alongside the scoring model of each binary classifier in BR. By solving an online convex optimisation problem that minimises a logistic loss function, both models can be updated simultaneously. Furthermore, each binary classifier may suffer from the class-imbalance problem. To address this, we design three cost-sensitive strategies that adjust the misclassification costs of relevant and irrelevant labels for each binary classifier. An efficient closed-form update can be obtained by solving our formulated problem. Extensive experiments on multiple datasets demonstrate that our methods outperform other state-of-the-art methods.

Online publication date: Tue, 01-Oct-2024
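
The abstract describes per-label adaptive thresholds and cost-sensitive logistic updates within binary relevance. The sketch below is a minimal illustration of that general idea only: the class and parameter names (OnlineCostSensitiveBR, cost_pos, cost_neg, lr) are hypothetical, and the update rule shown is plain cost-weighted online gradient descent, not the paper's closed-form solution or its three specific cost strategies.

```python
import numpy as np


class OnlineCostSensitiveBR:
    """Illustrative binary relevance learner with a learnable per-label
    threshold and cost-sensitive logistic loss, trained online.

    This is a sketch under assumed names, not the authors' exact method.
    """

    def __init__(self, n_features, n_labels, lr=0.1, cost_pos=1.0, cost_neg=1.0):
        self.W = np.zeros((n_labels, n_features))  # one linear scoring model per label
        self.theta = np.zeros(n_labels)            # one learnable threshold per label
        self.lr = lr
        self.cost_pos = cost_pos                   # cost of misclassifying a relevant label
        self.cost_neg = cost_neg                   # cost of misclassifying an irrelevant label

    def _margin(self, x):
        # Score minus label-wise threshold; a positive margin means "relevant".
        return self.W @ x - self.theta

    def predict(self, x):
        return (self._margin(x) > 0).astype(int)

    def partial_fit(self, x, y):
        """One online update for a single instance x with binary label vector y."""
        m = self._margin(x)                        # shape (n_labels,)
        s = 1.0 / (1.0 + np.exp(-m))               # sigmoid of the margin
        cost = np.where(y == 1, self.cost_pos, self.cost_neg)
        grad = cost * (s - y)                      # gradient of the weighted logistic loss w.r.t. the margin
        # Scoring models and thresholds are updated simultaneously from the same signal.
        self.W -= self.lr * np.outer(grad, x)
        self.theta += self.lr * grad               # d(margin)/d(theta) = -1


# Example usage with synthetic data (illustrative only).
rng = np.random.default_rng(0)
model = OnlineCostSensitiveBR(n_features=10, n_labels=4, cost_pos=2.0)
for _ in range(100):
    x = rng.normal(size=10)
    y = (rng.random(4) < 0.2).astype(int)          # sparse relevant labels, i.e. class imbalance
    model.partial_fit(x, y)
print(model.predict(rng.normal(size=10)))
```

Because the threshold enters the margin linearly, it can be learned with the same gradient step as the weights, which mirrors the abstract's point that the scoring and thresholding models are updated simultaneously; setting cost_pos above cost_neg is one simple way to penalise missed relevant labels more heavily on imbalanced labels.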
