UPSI Digital Repository (UDRep)
Abstract:
This study is drawn from a larger study on the effectiveness of the Cultural Arts Guidance Program (PBSB) in Malaysian schools, a joint programme between the Department of National Culture & Arts (JKKN) and the Ministry of Education Malaysia. The PBSB effectiveness study was conducted in 2013 to help JKKN improve the programme's implementation and set the future direction of PBSB. The three most popular areas of cultural arts, namely dance, music and theatre, were studied, and several assessment instruments were developed based on the objectives of PBSB and the modules used in the programme. This study focuses only on the development and re-validation of the basic music knowledge test used in the PBSB effectiveness study. The present article discusses the background of PBSB, some important findings from the PBSB effectiveness study, and the psychometric characteristics of the test items from the perspective of Item Response Theory (IRT). The multiple-choice test was administered to 437 PBSB students in primary and secondary schools selected through a stratified random sampling technique. Data from the study were re-analysed using IRT to further establish the reliability and validity of the test. Overall, the test was found to possess sound psychometric characteristics, as reflected by the model fit, the item-person map, the reliability and validity of the ability estimates, and the difficulty, discrimination and guessing parameters. The test can be used to complement the existing assessment systems in PBSB, but different tests should be developed for each module.
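The difficulty, discrimination and guessing parameters mentioned above are those of a three-parameter logistic (3PL) IRT model. The sketch below is illustrative only; it is not the authors' analysis code, and the parameter values shown are hypothetical.

    # Illustrative 3PL item response function (not the authors' code):
    # P(correct | theta) = c + (1 - c) / (1 + exp(-a * (theta - b)))
    # where a = discrimination, b = difficulty, c = guessing.
    import numpy as np

    def p_correct_3pl(theta, a, b, c):
        """Probability of a correct response under the 3PL model."""
        return c + (1.0 - c) / (1.0 + np.exp(-a * (theta - b)))

    # Hypothetical item: moderate discrimination, average difficulty,
    # guessing floor of 0.2 (plausible for multiple-choice items).
    theta = np.linspace(-3, 3, 7)  # examinee ability levels
    print(p_correct_3pl(theta, a=1.2, b=0.0, c=0.2))

Under such a model, an item's curve rises from the guessing floor c towards 1 as ability increases, which is what the item-person map and parameter estimates in the study summarise across items.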
References
1. Chong, H. Y. (2013). A simple guide to the Item Response Theory (IRT) and Rasch modeling. Retrieved 2 March 2015 from http://www.creative-wisdom.com/computer/sas/IRT.pdf
2. de Ayala, R. J., & Hertzog, M. A. (1991). The assessment of dimensionality for use in item response theory. Multivariate Behavioral Research, 26, 765-792.
3. DeMars, C. (2010). Item Response Theory (Understanding Statistics: Measurement). New York, NY: Oxford University Press.
4. Dimitrov, D. M., & Shelestak, D. (2003). Psychometric analysis of performance on categories of client needs and nursing process with the NLN Diagnostic. Journal of Nursing Measurement, 11(3), 207-223.
5. Hambleton, R. K., Swaminathan, H., & Rogers, H. J. (1991). Fundamentals of Item Response Theory. USA: Sage Publications.
6. Hambleton, R. K., & Swaminathan, H. (1985). Item Response Theory: Principles and applications. Boston: Kluwer-Nijhoff.
7. Lord, F. M. (1980). Applications of Item Response Theory to practical testing problems. Hillsdale, NJ: Lawrence Erlbaum Associates.
8. Lord, F. M., & Novick, M. R. (1968). Statistical theories of mental test scores. Reading, MA: Addison-Wesley.
9. Meyer, J. P., & Zhu, S. (2013). Fair and equitable measurement of student learning in MOOCs: An introduction to Item Response Theory, scale linking, and score equating. Research & Practice in Assessment, 8, 26-39.
10. Reckase, M. D. (1979). Unifactor latent trait models applied to multi-factor tests: Results and implications. Journal of Educational Statistics, 4, 207-230.
11. Siti Eshah Mokshein et al. (2015). Penilaian keberkesanan program bimbingan seni budaya (PBSB) di sekolah-sekolah Malaysia [Evaluation of the effectiveness of the Cultural Arts Guidance Program (PBSB) in Malaysian schools]. Malaysia: UPSI Publisher.
12. Thissen, D. (1991). MULTILOG user's guide: Multiple, categorical item analysis and test scoring using item response theory. Chicago: Scientific Software.
This material may be protected under the Copyright Act, which governs the making of photocopies or reproductions of copyrighted materials. You may use the digitized material for private study, scholarship, or research.