LMS assessment: using IRT analysis to detect defective multiple-choice test items
Online publication date: Fri, 24-Apr-2015
by Panagiotis Fotaris; Theodoros Mastoras
International Journal of Technology Enhanced Learning (IJTEL), Vol. 6, No. 4, 2014
Abstract: With the computerisation of assessment tests, the use of Item Response Theory (IRT) has become commonplace in educational assessment development, evaluation and refinement. When used appropriately by a Learning Management System (LMS), IRT can improve assessment quality, increase the efficiency of the testing process and provide in-depth descriptions of item properties. This paper introduces a methodological and architectural framework that embeds an IRT analysis tool in an LMS so as to extend its functionality with assessment optimisation support. By applying a set of validity rules to the statistical indices produced by the IRT analysis, the enhanced LMS can detect defective items in an item pool, which are then reported for content review. Assessment refinement is achieved by repeating this process until all flawed items are eliminated.
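The abstract describes applying validity rules to IRT statistical indices in order to flag defective items. A minimal sketch of that idea is shown below; the field names, indices (discrimination and difficulty parameters) and threshold values are illustrative assumptions, not taken from the paper's actual rule set:

```python
# Hedged sketch: flag defective items by applying simple validity rules
# to IRT item indices. Thresholds and keys are hypothetical examples.

def flag_defective_items(items,
                         min_discrimination=0.4,
                         difficulty_range=(-3.0, 3.0)):
    """Return (item_id, reasons) pairs for items violating validity rules.

    Each item is a dict with hypothetical keys:
      'id'             - item identifier
      'discrimination' - IRT a-parameter estimate
      'difficulty'     - IRT b-parameter estimate
    """
    flagged = []
    for item in items:
        reasons = []
        # Rule 1: a weak discriminator cannot separate high/low ability.
        if item['discrimination'] < min_discrimination:
            reasons.append('low discrimination')
        # Rule 2: extreme difficulty suggests a miskeyed or flawed item.
        lo, hi = difficulty_range
        if not lo <= item['difficulty'] <= hi:
            reasons.append('extreme difficulty')
        if reasons:
            flagged.append((item['id'], reasons))
    return flagged

pool = [
    {'id': 'Q1', 'discrimination': 0.9, 'difficulty': 0.2},
    {'id': 'Q2', 'discrimination': 0.1, 'difficulty': 1.0},  # poor discriminator
    {'id': 'Q3', 'discrimination': 0.7, 'difficulty': 4.5},  # far too hard
]
print(flag_defective_items(pool))
# → [('Q2', ['low discrimination']), ('Q3', ['extreme difficulty'])]
```

In the framework the paper proposes, such flagged items would be reported for content review, and the analysis rerun after revision until no items are flagged.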