Fusion features for robust speaker identification
by Ines Ben Fredj; Youssef Zouhir; Kaïs Ouni
International Journal of Signal and Imaging Systems Engineering (IJSISE), Vol. 11, No. 2, 2018

Abstract: Speaker identification systems aim to determine a speaker's identity from a set of speech parameters, so a relevant speech representation is required. For this purpose, we propose combining spectral parameters, namely the Mel-frequency cepstral coefficients (MFCC) and the perceptual linear predictive (PLP) coefficients, with a prosodic parameter, the fundamental frequency (F0) of the signal. F0 estimation methods fall into two main classes, temporal and spectral. We employ the sawtooth waveform inspired pitch estimator (SWIPE) algorithm, which estimates the pitch in the frequency domain. In addition, we use the Gaussian mixture model-universal background model (GMM-UBM) for speaker modelling. Experiments are conducted on the TIMIT database. The identification rates are promising and show that combining MFCC and PLP outperforms using each feature separately, especially on noisy data.
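The fusion described in the abstract can be sketched as frame-wise concatenation of the spectral (MFCC, PLP) and prosodic (F0) feature streams, followed by GMM modelling. A minimal illustration, using random placeholder features in place of real MFCC/PLP/SWIPE outputs and scikit-learn's `GaussianMixture` as a stand-in for the UBM (the paper's actual feature dimensions and GMM configuration are not given in the abstract):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
n_frames = 200

# Placeholder per-frame features; in practice these would come from
# MFCC/PLP extraction and the SWIPE pitch estimator.
mfcc = rng.normal(size=(n_frames, 13))            # 13 MFCC coefficients per frame
plp = rng.normal(size=(n_frames, 13))             # 13 PLP coefficients per frame
f0 = rng.uniform(80.0, 300.0, size=(n_frames, 1)) # one F0 value (Hz) per frame

# Frame-wise fusion: concatenate the three streams into one feature vector.
fused = np.hstack([mfcc, plp, f0])                # shape: (n_frames, 27)

# Fit a small diagonal-covariance GMM on the fused features (UBM stand-in).
ubm = GaussianMixture(n_components=8, covariance_type="diag", random_state=0)
ubm.fit(fused)

# Per-frame log-likelihoods under the model; in a GMM-UBM system these
# scores (speaker model vs. UBM) drive the identification decision.
scores = ubm.score_samples(fused)
```

In a full GMM-UBM pipeline, the UBM is trained on many speakers and then adapted (typically via MAP adaptation) to each enrolled speaker; a test utterance is assigned to the speaker whose adapted model yields the highest likelihood.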

Online publication date: Sun, 20-May-2018
