UPSI Digital Repository (UDRep)
Abstract:
Item response theory (IRT) offers several advantages over classical test theory and has been widely used to analyze dichotomously scored data in educational testing. This study aims to determine the most appropriate IRT model for analyzing the dichotomous items of an Anatomy and Physiology course. The study involved 971 nursing students studying in Ministry of Health Malaysia training colleges. Exploratory factor analysis was performed on data from the final examination paper, which contained 40 multiple-choice items; the results showed that the unidimensionality and local independence assumptions were met. Data calibration was performed using Xcalibre, an IRT-based software package, and competing models were compared using the minus twice the log-likelihood statistic (-2LL). The results showed that the 3PL model is the most appropriate model for the data of this study. The study concludes that the 3PL model should be given priority when analyzing dichotomously scored items that involve a guessing element.
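For context, the three-parameter logistic (3PL) model named in the abstract is conventionally written as follows (this formulation is standard IRT background, not taken from the record itself):

P_i(\theta) = c_i + (1 - c_i)\,\frac{1}{1 + e^{-a_i(\theta - b_i)}}

where a_i is the item discrimination, b_i the item difficulty, and c_i the pseudo-guessing (lower-asymptote) parameter; constraining c_i = 0 yields the 2PL model, and additionally fixing a_i to a common value yields the 1PL (Rasch-type) model. The -2LL comparison mentioned in the abstract is typically a nested-model likelihood-ratio test, in which the difference in -2LL between two nested models is referred to a chi-square distribution with degrees of freedom equal to the difference in the number of estimated parameters.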