UPSI Digital Repository (UDRep)
Abstract (Universiti Pendidikan Sultan Idris)
This study was carried out to develop and evaluate the practicality of a lab report
scoring checklist for assessing practical skills indirectly in an undergraduate optics course
(UOC). The study employed a quantitative approach supported by qualitative data.
The scoring checklist was developed for the context identified in the needs analysis,
following the ADDIE instructional design model. The checklist was validated by six experts
from the fields of physics and physics education. Face validity was analysed using
descriptive statistics, while content validity was analysed using the Content
Validity Index (CVI). A pilot test was conducted in three cycles to establish
the reliability of the developed checklist; it involved three raters from different
educational levels who evaluated 35 UOC lab reports. The first stage involved the
analysis of inter-rater agreement using Fleiss' kappa coefficient. Next, Cohen's
kappa analysis was employed to determine the reliability of inter-rater agreement for
each cycle. After that, the practicality of the checklist was evaluated by two lecturers while
marking 32 UOC lab reports. Instrument practicality was analysed using
descriptive statistics. Findings indicated that the developed scoring checklist had
satisfactory face validity, content validity (S-CVI/Ave = 0.98, S-CVI/UA = 0.87), inter-rater
agreement, and test-retest reliability. The checklist also achieved a satisfactory
practicality level among course lecturers. In conclusion, a lab report scoring checklist
with satisfactory validity, reliability, and practicality for the indirect assessment of practical
skills in the UOC was successfully developed in this study. The study implies that the
developed checklist could reduce the practical-skill assessment load faced by
UOC lecturers, guide students in writing professional lab reports, and demonstrate that
indirect assessment can be used to assess practical skills in the physics laboratory.
References
Abrahams, I., & Reiss, M. J. (2015). Assessment of practical skills. School Science Review, 357(June), 40–44.
Abrahams, I., Reiss, M. J., & Sharpe, R. M. (2013). The assessment of practical work in school science. Studies in Science Education, 49(2), 209–251. https://doi.org/10.1080/03057267.2013.858496
Adams, C. J. (2020). A constructively aligned first-year laboratory course. Journal of Chemical Education, 97(7), 1863–1873. https://doi.org/10.1021/acs.jchemed.0c00166
Al-Elq, A. H. (2007). Medicine and clinical skills laboratories. Journal of Family & Community Medicine, 14(2), 59–63. http://www.ncbi.nlm.nih.gov/pubmed/23012147
Atkinson, R. C., & Shiffrin, R. M. (1968). Human memory: A proposed system and its control processes. Psychology of Learning and Motivation - Advances in Research and Theory, 2(C), 89–195. https://doi.org/10.1016/S0079-7421(08)60422-3
Ausubel, D. P. (1968). Educational psychology: A cognitive view. Holt, Rinehart and Winston.
Beagles, A., Beck, S., Cross, L., Garrard, A., & Rowson, J. (2016). Guidance for writing lab reports. Sheffield University.
Berchtold, A. (2016). Test–retest: Agreement or reliability? Methodological Innovations, 9. https://doi.org/10.1177/2059799116672875
Berita Harian. (2018, May 31). Tawaran masuk IPT meningkat 22 peratus. Berita Harian. https://www.bharian.com.my/berita/nasional/2018/05/432396/tawaran-masuk-ipt-meningkat-22-peratus
Branan, D., & Morgan, M. (2010). Mini-lab activities: Inquiry-based lab activities for formative assessment. Journal of Chemical Education, 87(1). https://doi.org/10.1021/ed8000073
Branch, R. M. (2009). Instructional design: The ADDIE approach (1st ed.). Springer New York. https://doi.org/10.1007/978-0-387-09506-6
Branson, R. (1978). The interservice procedures for instructional systems development. Educational Technology, 18(3), 11–14.
Brookhart, S. M. (2018). Appropriate criteria: key to effective rubrics. Frontiers in Education, 3(April), 22. https://doi.org/10.3389/feduc.2018.00022
Brown, H. D. (2004). Language assessment: principles and classroom practice. Longman.
Burrows, N. L., Ouellet, J., Joji, J., & Man, J. (2021). Alternative assessment to lab reports: A phenomenology study of undergraduate biochemistry students' perceptions of interview assessment. Journal of Chemical Education, 98(5), 1518–1528. https://doi.org/10.1021/acs.jchemed.1c00150
Caballero, C. L., & Walker, A. (2010). Work readiness in graduate recruitment and selection: A review of current assessment methods. Journal of Teaching and Learning for Graduate Employability, 1(1). https://doi.org/10.21153/jtlge2010vol1no1art546
Chabeli, M. M. (2006). Higher order thinking skills competencies required by outcomes-based education from learners. Curationis, 29(3), 78–86. https://doi.org/10.4102/curationis.v29i3.1107
Chen, B., Demara, R. F., Salehi, S., & Hartshorne, R. (2018). Elevating learner achievement using formative electronic lab assessments in the engineering laboratory: A viable alternative to weekly lab reports. IEEE Transactions on Education, 61(1). https://doi.org/10.1109/TE.2017.2706667
Chojnowski, M. (2017). Infrared thermal imaging in connective tissue diseases. Reumatologia, 55(1). https://doi.org/10.5114/reum.2017.66686
Clark, R. C., & Mayer, R. E. (2012). E-learning and the science of instruction: Proven guidelines for consumers and designers of multimedia learning (3rd ed.). Pfeiffer. https://doi.org/10.1002/9781118255971
Cook, D. A., & Beckman, T. J. (2006). Current concepts in validity and reliability for psychometric instruments: Theory and application. American Journal of Medicine, 119(2), 166.e7-166.e16. https://doi.org/10.1016/j.amjmed.2005.10.036
Creswell, J. W., & Creswell, J. D. (2018). Research design (5th ed.). SAGE Publications.
Cuschieri, S., Grech, V., & Savona-Ventura, C. (2019). WASP (Write a Scientific Paper): Structuring a scientific paper. Early Human Development, 128, 114–117. https://doi.org/10.1016/j.earlhumdev.2018.09.011
Davis, M. H. (2003). Outcome-Based Education. Journal of Veterinary Medical Education, 30(3), 258–263. https://doi.org/10.3138/jvme.30.3.258
de Jong, T. (2010). Cognitive load theory, educational research, and instructional design: Some food for thought. Instructional Science, 38(2), 105–134. https://doi.org/10.1007/s11251-009-9110-0
Dixson, D. D., & Worrell, F. C. (2016). Formative and summative assessment in the classroom. Theory into Practice, 55(2), 153–159. https://doi.org/10.1080/00405841.2016.1148989
Fadzil, H. M., & Saat, R. M. (2018). Development of instrument in assessing students’ science manipulative skills. Malaysian Online Journal of Educational Sciences, 7(1), 47–57.
Fah, L. Y., & Hoon, K. C. (2014). Pengenalan kepada analisis data dengan AMOS 18 dalam penyelidikan pendidikan. Universiti Malaysia Sabah.
Falotico, R., & Quatto, P. (2015). Fleiss’ kappa statistic without paradoxes. Quality and Quantity, 49(2), 463–470. https://doi.org/10.1007/s11135-014-0003-1
Fisher, M. J., & Marshall, A. P. (2009). Understanding descriptive statistics. Australian Critical Care, 22(2), 93–97. https://doi.org/10.1016/j.aucc.2008.11.003
Ghofur, A., & Youhanita, E. (2020). Interactive media development to improve student motivation. IJECA (International Journal of Education and Curriculum Application), 3(1), 1. https://doi.org/10.31764/ijeca.v3i1.2026
Giessen-Hood, C. (1999). Teachers' attitudes towards the implementation of outcomes-based education (OBE) in South Africa.
Gkioka, O. (2019). Learning how to teach experiments in the school physics laboratory. Journal of Physics: Conference Series, 1286(1). https://doi.org/10.1088/1742-6596/1286/1/012016
Guo, W. Y., & Yan, Z. (2019). Formative and summative assessment in Hong Kong primary schools: students’ attitudes matter. Assessment in Education: Principles, Policy and Practice, 26(6). https://doi.org/10.1080/0969594X.2019.1571993
Hamid, R., Shokri, S. N. E. S. M., Baharom, S., & Khatimin, N. (2016). Assessing students’ performance on material technology course through direct and indirect methods. Pertanika Journal of Social Sciences and Humanities, 24(April), 185–196.
Hammerman, E. (2008). Formative assessment strategies for enhanced learning in science, k-8. Corwin Press.
Hancock, L. M., & Hollamby, M. J. (2020). Assessing the practical skills of undergraduates: the evolution of a station-based practical exam. Journal of Chemical Education, 97(4), 972–979. https://doi.org/10.1021/acs.jchemed.9b00733
Harwood, C. J., Hewett, S., & Towns, M. H. (2020). Rubrics for assessing hands-on laboratory skills. Journal of Chemical Education, 97(7), 2033–2035. https://doi.org/10.1021/acs.jchemed.0c00200
Hinampas, R. T., Murillo, C. R., Tan, D. A., & Layosa, R. U. (2018). Blended learning approach: effect on students’ academic achievement and practical skills in science laboratories. International Journal of Scientific and Technology Research, 7(11), 63–69.
Hurley, K. F., Giffin, N. A., Stewart, S. A., & Bullock, G. B. (2015). Probing the effect of OSCE checklist length on inter-observer reliability and observer accuracy. Medical Education Online, 20(1), 29242. https://doi.org/10.3402/meo.v20.29242
Hussey, T., & Smith, P. (2008). Learning outcomes: A conceptual analysis. Teaching in Higher Education, 13(1), 107–115. https://doi.org/10.1080/13562510701794159
Ishak, M. R. (2014). Kajian Keberkesanan Program Pentaksiran Kerja Amali Sains (PEKA): Satu Penilaian di Sekolah Rendah (Study of Evaluation Program of Practical Skill Assessment (PEKA): Assessment in Primary School). Jurnal Pendidikan Malaysia, 39(2), 83–93. https://doi.org/10.17576/JPEN-2014-%x
Jamieson, S. (2004). Likert scales: How to (ab)use them. Medical Education, 38(12), 1217–1218. https://doi.org/10.1111/j.1365-2929.2004.02012.x
Jonsson, A., & Svingby, G. (2007). The use of scoring rubrics: reliability, validity and educational consequences. Educational Research Review, 2(2), 130–144. https://doi.org/10.1016/j.edurev.2007.05.002
Kaliannan, M., & Chandran, S. D. (2006). Empowering students through outcome-based education (OBE). Research in Education, 87(1), 50–63.
Kamarudin, N., & Halim, L. (2013). Konsep pengurusan alatan dan bahan untuk pembelajaran sains di makmal. Jurnal Teknologi (Sciences and Engineering), 60, 65–70. https://doi.org/10.11113/jt.v60.1449
Kamarudin, N., & Halim, L. (2014). Tahap pengurusan pelajar dan pengurusan masa dalam pengajaran amali Fizik. Sains Humanika, 2(4), 155–161.
Katawazai, R. (2021). Implementing outcome-based education and student-centered learning in Afghan public universities: the current practices and challenges. Heliyon, 7(5), e07076. https://doi.org/10.1016/j.heliyon.2021.e07076
Kemmis, S., McTaggart, R., & Nixon, R. (2014). The action research planner (1st ed.). Springer Singapore. https://doi.org/10.1007/978-981-4560-67-2
Killpack, T. L., & Fulmer, S. M. (2018). Development of a tool to assess interrelated experimental design in introductory biology. Journal of Microbiology & Biology Education, 19(3), 1–10.
Kolivand, M., Esfandyari, M., & Heydarpour, S. (2020). Examining validity and reliability of objective structured clinical examination for evaluation of clinical skills of midwifery undergraduate students: a descriptive study. BMC Medical Education, 20(1), 1–7. https://doi.org/10.1186/s12909-020-02017-4
Landis, J. R., & Koch, G. G. (1977). The measurement of observer agreement for categorical data. Biometrics, 33(1). https://doi.org/10.2307/2529310
Leshe, S. (2016). Developing and implementing assessment moderation procedures to evaluate written laboratory reports. African Journal of Chemical Education, 6(1), 31–46.
Liew, S. S., Lim, H. L., Saleh, S., & Ong, S. L. (2019). Development of scoring rubrics to assess physics practical skills. Eurasia Journal of Mathematics, Science and Technology Education, 15(4), em1691. https://doi.org/10.29333/ejmste/103074
Lok, W. F., & Yau, P. W. (2020). A case study of direct assessment of students’ manipulative skills in chemistry practical: perspective of lecturers. Asian Journal of Assessment in Teaching and Learning, 10(2), 10–17.
Lynn, M. R. (1986). Determination and quantification of content validity index. Nursing Research, 35, 382–386. https://doi.org/10.1097/00006199-198611000-00017
Malaysian Qualifications Agency. (2017). Malaysian Qualifications Framework (MQF) (2nd ed.). https://www.mqa.gov.my/pv4/document/mqf/2019/Oct/updated MQF Ed 2 24102019.pdf
Maziyyah, N., & Krisridwany, A. (2020). Developing OSCE for pharmacy students in the pandemic era. Jurnal Farmasi Indonesia, 17(2), 178–187. https://doi.org/10.31001/jfi.v17i2.1075
McHarg, I. L. (1971). Design with nature. Doubleday.
McLeod, S. (2019). What are independent and dependent variables. Simply Psychology. www.simplypsychology.org/variables.html
Mohamadirizi, S., Mardanian, F., & Torabi, F. (2020). The effect of direct observation of procedural skills method on learning clinical skills of midwifery students of medical sciences. Journal of Education and Health Promotion, 9(1). https://doi.org/10.4103/jehp.jehp_672_19
Moidunny, K. (2009). The effectiveness of the National Professional Qualification for Educational Leaders (NPQEL) [Unpublished doctoral dissertation].
Moreno, R., & Mayer, R. E. (1999). Cognitive principles of multimedia learning: The role of modality and contiguity. Journal of Educational Psychology, 91(2), 358–368. https://doi.org/10.1037/0022-0663.91.2.358
Moskal, B. M., & Leydens, J. A. (2001). Scoring rubric development: Validity and reliability. Practical Assessment, Research and Evaluation, 7(10).
Nurdin, E., Saputri, I. Y., & Kurniati, A. (2020). Development of comic mathematics learning media based on contextual approaches. JIPM (Jurnal Ilmiah Pendidikan Matematika), 8(2), 85. https://doi.org/10.25273/jipm.v8i2.5145
Nutbeam, D., Harris, E., & Wise, W. (2004). Theory in a nutshell: A practical guide to health promotion theories (2nd ed.). McGraw-Hill.
Oxford University Press. (2022). Oxford learner’s dictionary. Oxford University Press. https://www.oxfordlearnersdictionaries.com/definition/english/practical_1?q=practical
Popova, M., Bretz, S. L., & Hartley, C. S. (2016). Visualizing molecular chirality in the organic chemistry laboratory using cholesteric liquid crystals. Journal of Chemical Education, 93(6), 1096–1099. https://doi.org/10.1021/acs.jchemed.5b00704
Rao, N. J. (2020). Outcome-based education: an outline. Higher Education for the Future, 7(1), 5–21. https://doi.org/10.1177/2347631119886418
Salim, K. R., Puteh, M., & Daud, S. M. (2012). Assessing students’ practical skills in basic electronic laboratory based on psychomotor domain model. Procedia - Social and Behavioral Sciences, 56, 546–555. https://doi.org/10.1016/j.sbspro.2012.09.687
Schüler, I. M., Heinrich-Weltzien, R., & Eiselt, M. (2018). Effect of individual structured and qualified feedback on improving clinical performance of dental students in clinical courses-randomised controlled study. European Journal of Dental Education, 22(3), e458–e467. https://doi.org/10.1111/eje.12325
Simpson, E. J. (1971). Educational objectives in the psychomotor domain. Behavioral Objectives in Curriculum Development: Selected Readings and Bibliography, 60(2), 1–35. https://files.eric.ed.gov/fulltext/ED010368.pdf
Stanger, L. R., Wilkes, T. C., Boone, N. A., McGonigle, A. J. S., & Willmott, J. R. (2018). Thermal imaging metrology with a smartphone sensor. Sensors (Switzerland), 18(7). https://doi.org/10.3390/s18072169
Still, C., Powell, R., Aubrecht, D., Kim, Y., Helliker, B., Roberts, D., Richardson, A. D., & Goulden, M. (2019). Thermal imaging in plant and ecosystem ecology: applications and challenges. Ecosphere, 10(6). https://doi.org/10.1002/ecs2.2768
Suchman, E. A. (1967). Evaluative research: Principles and practice in public service and social action programs. Russell Sage Foundation.
Sweller, J. (1988). Cognitive load during problem solving: Effects on learning. Cognitive Science, 12(2), 257–285. https://doi.org/10.1016/0364-0213(88)90023-7
Sweller, J., Van Merrienboer, J. J. G., & Paas, F. G. W. C. (1998). Cognitive architecture and instructional design. Educational Psychology Review, 10(3), 251–296. https://doi.org/10.1023/A:1022193728205
Talha, M., Elmarzouqi, N., & Abou El Kalam, A. (2020). Towards a powerful solution for data accuracy assessment in the big data context. International Journal of Advanced Computer Science and Applications, 11(2), 419–429. https://doi.org/10.14569/ijacsa.2020.0110254
Tegou, L., Polatidis, H., & Haralambopoulos, D. (2007). Distributed Generation with Renewable Energy Systems: the Spatial Dimension for an Autonomous Grid. 47th Conference of the European Regional Science Association ‘Local Governance and Sustainable Development, Thematic Stream M: Environment, Natural Resources and Sustainability,’ September.
Turbek, S. P., Chock, T. M., Donahue, K., Havrilla, C. A., Oliverio, A. M., Polutchko, S. K., Shoemaker, L. G., & Vimercati, L. (2016). Scientific writing made easy: A step-by-step guide to undergraduate writing in the biological sciences. Bulletin of the Ecological Society of America, 97(4), 417–426. https://doi.org/10.1002/bes2.1258
Veale, C. G. L., Jeena, V., & Sithebe, S. (2020). Prioritizing the development of experimental skills and scientific reasoning: a model for authentic evaluation of laboratory performance in large organic chemistry classes. Journal of Chemical Education, 97(3), 675–680. https://doi.org/10.1021/acs.jchemed.9b00703
Viera, A. J., & Garrett, J. M. (2005). Understanding interobserver agreement: the kappa statistic. Family Medicine, 37(5), 360–363. http://www1.cs.columbia.edu/~julia/courses/CS6998/Interrater_agreement.Kappa_statistic.pdf
Wilcox, B. R., & Lewandowski, H. J. (2017). Developing skills versus reinforcing concepts in physics labs: Insight from a survey of students’ beliefs about experimental physics. Physical Review Physics Education Research, 13(1), 1–9. https://doi.org/10.1103/PhysRevPhysEducRes.13.010108
Wiseman, E., Carroll, D. J., Fowler, S. R., & Guisbert, E. (2020). Iteration in an inquiry-based undergraduate laboratory strengthens student engagement and incorporation of scientific skills. Journal of the Scholarship of Teaching and Learning, 20(2), 99–112. https://search.ebscohost.com/login.aspx?direct=true&db=eric&AN=EJ1275056&site=ehost-live
Wood, B. K., & Blevins, B. K. (2019). Substituting the practical teaching of physics with simulations for the assessment of practical skills: an experimental study. Physics Education, 54(3), 035004. https://doi.org/10.1088/1361-6552/ab0192
Wright, J. S., Read, D., Hughes, O., & Hyde, J. (2018). Tracking and assessing practical chemistry skills development: practical skills portfolios. New Directions in the Teaching of Physical Sciences, 13(1). https://doi.org/10.29311/ndtps.v0i13.2905
Yenigun, K., & Ecer, R. (2013). Overlay mapping trend analysis technique and its application in Euphrates Basin, Turkey. Meteorological Applications, 20(4). https://doi.org/10.1002/met.1304
Yu, C. H. (2005). Test-retest reliability. Encyclopedia of Social Measurement, 3, 777–784.
Yu, X., Yu, Z., Liu, Y., & Shi, H. (2017). CI-Rank: Collective importance ranking for keyword search in databases. Information Sciences, 384, 1–20. https://doi.org/10.1016/j.ins.2016.12.022
Zezekwa, N., & Nkopodi, N. (2020). Physics teachers’ views and practices on the assessment of students’ practical work skills. Eurasia Journal of Mathematics, Science and Technology Education, 16(8). https://doi.org/10.29333/EJMSTE/8289
Zhang, H., & Wink, D. J. (2021). Examining an acid-base laboratory practical assessment from the perspective of evidence-centered design. Journal of Chemical Education, 98(6), 1898–1909. https://doi.org/10.1021/acs.jchemed.0c01405
Zhang, M. J., Newton, C., Grove, J., Pritzker, M., & Ioannidis, M. (2020). Design and assessment of a hybrid chemical engineering laboratory course with the incorporation of student-centred experiential learning. Education for Chemical Engineers, 30, 1–8. https://doi.org/10.1016/j.ece.2019.09.003
Zulkifli, H., Razak, K. A., & Mahmood, M. R. (2018). The usage of ADDIE model in the development of a philosophical inquiry approach in moral education module for secondary school students. Creative Education, 09(14), 2111–2124. https://doi.org/10.4236/ce.2018.914153
This material may be protected under the Copyright Act, which governs the making of photocopies or reproductions of copyrighted materials. You may use the digitized material for private study, scholarship, or research.