UPSI Digital Repository (UDRep)
Type: article
Subject: QA Mathematics
Main Author: Ayanwale, Musa Adekunle
Title: Calibration of polytomous response Mathematics achievement test using generalized partial credit model of item response theory
Place of Production: Tanjong Malim
Publisher: Fakulti Sains dan Matematik
Year of Publication: 2021
Corporate Name: Universiti Pendidikan Sultan Idris

References

[1] Oladipupo-Abodunwa, T. O., Adeleke, J. O. & Ayanwale, M. A. (2019). Student Mathematics Engagement: Development and Validation of a Measurement Instrument. African J. Behav. Scale Dev. Res., vol. 1, no. 2, pp. 17-23.

[2] Ayanwale, M. A. & Adeleke, J. O. (2020). Efficacy of Item Response Theory in the Validation and Score Ranking of Dichotomous Response Mathematics Achievement Test. Bulg. J. Sci. Educ. Policy, vol. 14, no. 2, pp. 260-285. Accessed: May 31, 2021. Available: https://www.academia.edu/45182779/

[3] Akinsola, M. K. (1994). Comparative effects of mastery learning and enhanced mastery learning strategies on learners' achievement and self-concept in Mathematics. Unpublished PhD Thesis. Faculty of Education. University of Ibadan. xvii+205pp.

[4] Umameh, M. A. (2011). A Survey of Factors Responsible for Learners’ Poor Performance in Mathematics in Senior Secondary School Certificate Examination (SSCE) in Idah Local Government Area of Kogi State, Nigeria. Unpublished M.Ed Dissertation. Faculty of Education. University of Ibadan.

[5] Awopeju, O. A. & Afolabi, E. R. I. (2016). Comparative Analysis of Classical Test Theory and Item Response Theory Based Item Parameter Estimates of Senior School Certificate Mathematics Examination. European Scientific Journal 12: 263-284.

[6] Asikhia, O. A. (2010). Learners' and teachers' perception of the causes of poor academic performance in Ogun State secondary schools: Implications for counselling for national development. European Journal of Social Sciences 13.2: 28-36.

[7] Adegoke, B. A. (2011). Effect of direct teacher influence on dependent-prone learners' learning outcomes in secondary school mathematics. Electronic Journal of Research in Educational Psychology 9: 283-308.

[8] Abina, D. B. (2014). Influence of teacher characteristics, availability and utilization of instructional materials on learners’ performance in mathematics. Unpublished PhD Thesis. Faculty of Education. University of Ibadan. xiv+193pp.

[9] Adewale, J. G., Adegoke, B. A., Adeleke, J. O. & Metibemu, M. A. (2017). A Training Manual on Item Response Theory, 1st ed. Ibadan: Institute of Education, University of Ibadan in Collaboration with National Examinations Council, Minna, Niger State.

[10] National Examinations Council (2012). Chief Examiners Report in Mathematics. Retrieved on August 6, 2019 from http://www.mynecoexams.com/examiners report.html

[11] National Examinations Council (2013). Chief Examiners Report in Mathematics. Retrieved on August 6, 2019 from http://www.mynecoexams.com/examiners report.html

[12] Rupp, A. A. (2009). Item Response Theory modeling with Bilog-MG and Multilog for Windows. International Journal of Testing 3.4: 365-384.

[13] Ostini, R. & Nering, M. L. (2006). Polytomous Item Response Theory Models. Thousand Oaks: Sage Publications.

[14] Bejar, I. I. (1977). An application of the continuous response level model to personality measurement. Applied Psychological Measurement 1: 509-521.

[15] Masters, G. N. (1988). The analysis of partial credit scoring. Applied Measurement in Education 1: 279-297.

[16] Masters, G. N. (1982). A Rasch Model for Partial Credit Scoring. Psychometrika, 47: 149-174.

[17] Ayanwale, M.A. (2019). Efficacy of Item Response Theory in the Validation and Score Ranking of Dichotomous and Polytomous Response Mathematics Achievement Tests in Osun State, Nigeria. doi: 10.13140/RG.2.2.17461.22247.

[18] Samejima, F. (1969). Estimation of Latent Ability using a Response Pattern of Graded Scores. Psychometrika, Monograph Supplement No. 17.

[19] Muraki, E. (1992). A Generalized Partial Credit Model: Application of an EM algorithm. Applied Psychological Measurement 16: 159-176.

[20] Yen, W. M. (1992). Item response theory. Encyclopedia of Educational Research, 6th ed. NY: Macmillan. 657-667.

[21] Muraki, E. (1990). Fitting a polytomous item response model to Likert-type data. Applied Psychological Measurement 14: 59-71.

[22] Adedoyin, C. (2010). Investigating the Invariance of Person Parameter Estimates based on Classical Test and Item Response Theories. An International Journal on Education Science 2: 107-113.

[23] Adegoke, B. A. (2013). Comparison of item statistics of physics achievement test using Classical test theory and item response theory frameworks. Journal of Education and Practice 4.22: 87-96.

[24] Adegoke, B. A. (2014). Effects of Item-pattern scoring method on Senior Secondary School Learners' Ability Scores in Physics Achievement Test. West African Journal of Education Vol. XXIV: 181-190.

[25] Ayanwale, M. A., Adeleke, J. O. & Mamadelo, T. I. (2018). An assessment of item statistics estimates of Basic Education Certificate Examination through Classical Test Theory and Item Response Theory approach. International Journal of Educational Research Review, 3(4), 55-67. doi: 10.24331/ijere.452555.

[26] Enu, V. O. (2015). Using item response theory for the validation and calibration of mathematics and geography items of Joint Command Schools Promotion Examination in Nigeria. Unpublished Doctoral Thesis. Institute of Education. University of Ibadan.

[27] Fakayode, O. (2018). Comparing CTT and IRT measurement frameworks in the estimation of item parameters, scoring and test equating of West African Examinations Council Mathematics Objective Test for June and November, 2015. Unpublished PhD thesis. Institute of Education, University of Ibadan.

[28] Ogbebor, U. C. (2017). Construct of Mock Economics Test for Senior Secondary School Learners in Delta State, Nigeria using Classical Test and Item response theories. Unpublished PhD thesis. Institute of Education. University of Ibadan.

[29] Ojerinde D. (2013). Classical Test Theory (CTT) vs Item Response Theory (IRT): An Evaluation of the Comparability of Item Analysis Results. Paper Presentation at the Institute of Education. University of Ibadan. May 23, 2013.

[30] Umobong, M. E. & Jacob, S. S. (2016). A Comparison of Classical and Item Response Theory Person/Item Parameters of Physics Achievement Test for Technical Schools. African Journal of Theory and Practice of Educational Assessment, Vol. 4, 115-131.

[31] DeMars, C. (2010). Item Response Theory (Understanding Statistics: Measurement). New York: Oxford University Press.

[32] Embretson, S. E. & Reise, S. P. (2000). Item response theory for psychologists. Mahwah, New Jersey: Lawrence Erlbaum Associates.

[33] Grima, A. M. & Weichun, W. M. (2002). Test Scoring: Multiple-Choice and Constructed-Response Items. Paper presented at the annual meeting of the American Educational Research Association, New Orleans.

[34] Watkins, M. W. (2006). Determining Parallel Analysis Criteria. Journal of Modern Applied Statistical Methods, 5(2), 344-346.

[35] Ledesma, R. D. & Valero-Mora, P. (2007). Determining the Number of Factors to Retain in EFA: an easy-to-use computer program for carrying out Parallel Analysis. Practical Assessment, Research & Evaluation, 12(2), 1-11.

 


