UPSI Digital Repository (UDRep)

Type: article
Subject: LA History of education
Main Author: Kamarul Ariffin Ahmad
Additional Authors: Nora Liza Abdul Kadir
Mohd Zahren Zakaria
Muhamad Lothfi Zamri
Siti Najihah Abd Rahman
Nurhafza Mohamad Mustafa
Title: Distractor efficiency of an intermediate English proficiency course final examination paper
Place of Production: Tanjong Malim
Publisher: Fakulti Pembangunan Manusia
Year of Publication: 2021
Corporate Name: Universiti Pendidikan Sultan Idris

Abstract:
Language proficiency tests have adopted various item types, including multiple-choice questions (MCQ). Although MCQs are perceived as easy to administer and mark, they take longer to design. Once administration and marking are completed, the test needs to be analysed to determine the facility index, discrimination power, and distractor efficiency. This study aimed to scrutinise those indices and determine whether there is an association between the indices and the functionality of the distractors in an intermediate English Proficiency course final examination paper. Using QUEST software, the data were analysed to obtain the difficulty and discrimination indices and the distractor efficiency. The findings showed no clear association between the facility and discrimination indices and the distractor efficiency. Certain items had acceptable indices despite having poor or non-functional distractors, while other items had poor indices even though the distractors functioned as intended. The study concluded that distractor efficiency may have only a trivial association with item facility and discrimination power. A deeper investigation of the language used to construct the stems and the options is therefore suggested.
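
For readers unfamiliar with the indices named in the abstract, the sketch below illustrates how they are commonly computed under classical test theory. It is an illustration under assumed conventions, not the authors' QUEST/Rasch procedure: the upper-lower 27% rule for the discrimination index and the 5% cut-off for a functional distractor are widely used conventions assumed here, and all names in the code are hypothetical.

from collections import Counter

def item_analysis(choices, key, total_scores, options=("A", "B", "C", "D")):
    # choices: the option each examinee picked for one item.
    # key: the correct option. total_scores: each examinee's whole-test
    # score, used to form the upper and lower 27% scoring groups.
    n = len(choices)

    # Facility (difficulty) index: proportion of examinees answering correctly.
    facility = sum(c == key for c in choices) / n

    # Discrimination index: proportion correct among the top 27% of scorers
    # minus the proportion correct among the bottom 27%.
    ranked = sorted(range(n), key=lambda i: total_scores[i])
    k = max(1, round(0.27 * n))
    lower, upper = ranked[:k], ranked[-k:]
    discrimination = (sum(choices[i] == key for i in upper)
                      - sum(choices[i] == key for i in lower)) / k

    # Distractor efficiency: share of distractors chosen by at least 5%
    # of examinees (an assumed, commonly used threshold for "functional").
    counts = Counter(choices)
    distractors = [o for o in options if o != key]
    functional = sum(counts[o] / n >= 0.05 for o in distractors)
    efficiency = functional / len(distractors)

    return facility, discrimination, efficiency

# Example: 10 examinees, one item keyed "B".
choices = ["B", "B", "A", "B", "C", "B", "B", "D", "B", "A"]
scores = [38, 35, 12, 30, 18, 33, 29, 10, 40, 15]
print(item_analysis(choices, "B", scores))

As the abstract notes, an item can score well on facility and discrimination while one or more of its distractors attract almost no responses, which is why the three quantities are reported separately rather than combined.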



This material may be protected under the Copyright Act, which governs the making of photocopies or reproductions of copyrighted materials.
You may use the digitized material for private study, scholarship, or research.

