Boosting speech recognition performance: a robust and accurate ensemble method based on HMMs
by Samira Hazmoune; Fateh Bougamouza; Smaine Mazouzi; Mohamed Benmohammed
International Journal of Intelligent Systems Technologies and Applications (IJISTA), Vol. 22, No. 1, 2024

Abstract: In this paper, we propose an ensemble method based on hidden Markov models (HMMs) for speech recognition. Our objective is to reduce the impact of the initial setting of the training parameters on the final model while improving accuracy and robustness, particularly in speaker-independent systems. The main idea is to exploit the sensitivity of HMMs to the initial setting of the training parameters, thereby creating diversity among the ensemble members. Additionally, we perform an experimental study to investigate the potential relationship between the initial training parameters and ten diversity measures from the literature. The proposed method is assessed on a standard dataset from the UCI machine learning repository. The results demonstrate its effectiveness in terms of accuracy and robustness to intra-class variability, surpassing basic classifiers (HMM, KNN, NN, SVM) and several previous works in the literature, including those using deep learning algorithms such as convolutional neural networks (CNNs) and long short-term memory (LSTM) networks.

Online publication date: Mon, 05-Feb-2024
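To illustrate the central idea described in the abstract, the sketch below trains several HMM classifiers that differ only in the random initialisation of their training parameters and combines their decisions. It is a minimal illustration, not the authors' implementation: the abstract does not specify the emission model, the number of states, or the combination rule, so the use of hmmlearn's GaussianHMM, one HMM per class, and a majority vote are assumptions made here for the example.

    # Minimal sketch: an HMM ensemble whose diversity comes from different
    # random initialisations of the training parameters.
    # Assumptions: hmmlearn's GaussianHMM, one HMM per class per member,
    # majority-vote combination of the members' decisions.
    from hmmlearn.hmm import GaussianHMM

    def train_member(class_sequences, n_states, seed):
        """Train one ensemble member: one HMM per class, initialised with `seed`."""
        member = {}
        for label, (X, lengths) in class_sequences.items():
            hmm = GaussianHMM(n_components=n_states, covariance_type="diag",
                              n_iter=50, random_state=seed)
            hmm.fit(X, lengths)   # Baum-Welch starting from a seed-dependent initialisation
            member[label] = hmm
        return member

    def member_predict(member, X):
        """Classify one observation sequence by maximum log-likelihood."""
        scores = {label: hmm.score(X) for label, hmm in member.items()}
        return max(scores, key=scores.get)

    def ensemble_predict(members, X):
        """Combine the individual members' decisions by majority vote."""
        votes = [member_predict(m, X) for m in members]
        return max(set(votes), key=votes.count)

    # class_sequences maps each class label to (stacked_feature_frames, sequence_lengths),
    # the input format expected by hmmlearn; differing seeds make the members diverge.
    # members = [train_member(class_sequences, n_states=5, seed=s) for s in range(10)]
    # y_hat = ensemble_predict(members, test_sequence)

Because Baum-Welch converges to a local optimum that depends on the starting point, members trained from different seeds make partly uncorrelated errors, which is the source of diversity the paper exploits.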
