Exploring Item Bias Analysis Methods for Enhanced Digital Assessment in the Tertiary Education Sector

Authors

  • Inko-Tariah, D. C., Department of Educational Psychology, Guidance and Counselling, Faculty of Education, Ignatius Ajuru University of Education, Rumuolumeni, Rivers State.
  • Anwuri, Owhorchukwu, Department of Educational Psychology, Guidance and Counselling, Faculty of Education, Ignatius Ajuru University of Education, Rumuolumeni, Rivers State.

DOI:

https://doi.org/10.66545/jhbak736

Keywords:

Item analysis methods, digital assessment, tertiary education

Abstract

This paper explored item bias analysis methods for enhanced digital assessment in the tertiary education sector. With the growing adoption of digital platforms for university examinations, concerns about the psychometric quality of Computer-Based Tests (CBTs) are increasing. The Lord-Raju method, grounded in Item Response Theory (IRT), offers a sophisticated approach to evaluating item discrimination and difficulty, enhancing precision and validity. In contrast, the Mantel-Haenszel method, which relies on contingency tables, provides a more straightforward and accessible alternative. This paper compared the two methods to assess their applicability, reliability, and suitability for tertiary assessments. From a population of 3,851 third-year undergraduates at Ignatius Ajuru University of Education, the study used matrix sampling to select 800 students and 50 test items. Data were gathered with the validated General Studies English Language Performance Test (GNSELPT), which has a reliability coefficient of 0.84. The findings indicate that both methods effectively identified items showing Differential Item Functioning (DIF), though their consistency varied with gender and mode of entry. The study recommends regular monitoring and evaluation of test items with multiple methods as part of a quality assurance process.
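The Mantel-Haenszel comparison described in the abstract pools 2x2 (group x correct/incorrect) tables across matched ability levels and tests whether the odds of answering the item correctly differ between reference and focal groups at the same ability. As a rough illustration only, here is a minimal sketch in Python using the standard MH formulas; the function name and the two-stratum example counts are invented for demonstration and are not the study's GNSELPT data:

```python
from math import log

def mantel_haenszel_dif(strata):
    """Mantel-Haenszel DIF statistics for one test item.

    `strata` is a list of 2x2 tables, one per matched ability level:
    (ref_correct, ref_wrong, focal_correct, focal_wrong).
    Returns the common odds ratio alpha_MH, the MH chi-square
    (with continuity correction), and the ETS delta-scale value.
    """
    num = den = a_sum = e_sum = var_sum = 0.0
    for a, b, c, d in strata:
        n = a + b + c + d
        if n < 2:
            continue  # skip empty or degenerate score levels
        num += a * d / n
        den += b * c / n
        n_ref, n_foc = a + b, c + d    # row margins (group sizes)
        m1, m0 = a + c, b + d          # column margins (correct/incorrect)
        a_sum += a
        e_sum += n_ref * m1 / n        # E[a] under the no-DIF hypothesis
        var_sum += n_ref * n_foc * m1 * m0 / (n * n * (n - 1))
    alpha = num / den
    chi2 = (abs(a_sum - e_sum) - 0.5) ** 2 / var_sum
    delta = -2.35 * log(alpha)         # ETS delta metric; |delta| > 1.5 flags large DIF
    return alpha, chi2, delta

# Illustrative two-stratum data: alpha > 1 means the item favors the
# reference group at matched ability; chi2 > 3.84 is significant at 0.05.
alpha, chi2, delta = mantel_haenszel_dif([(40, 10, 30, 20), (30, 20, 20, 30)])
```

In practice the strata are the matched total-score levels of the examinees (here, for example, male versus female candidates, or the two modes of entry), and items flagged by both this procedure and the IRT-based method would be prioritized for review.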

References

Abbott, W. T. (2023). DIF detection sensitivity of Lord's chi-square, Raju's area, logistic regression, Mantel-Haenszel, standardization, and transformed item difficulties methods, in comparison, using R. EPRA International Journal of Multidisciplinary Research (IJMR), 7(7), 1-12.

Annan-Brew, R. (2020). Differential item functioning of West African Senior School Certificate Examination in core subjects in Southern Ghana [Published doctoral thesis, University of Cape Coast]. Sam Jonah Library.

Anwuri, O. (2022). Detection of item bias in computer-based tests of Ignatius Ajuru University of Education [Unpublished master's thesis]. Ignatius Ajuru University of Education.

Bassman, M. (2023). A comparison of the efficacies of differential item functioning detection methods. International Journal of Assessment Tools in Education, 10(1), 145-159.

Brown, G. T., & Kennedy, K. J. (2019). Computer-based testing: Practices, advantages, challenges, and practical guidelines. Cham: Springer.

Don, Y., & Kayla, C. (2020). Gender-related differential item functioning analysis on an ESL test. Journal of Language Testing & Assessment, 3(1), 5-19.

Hartoyo, V., Putra, I. E., & Syafryadin, S. (2020). Utilization of online quiz as a learning media to improve student learning outcomes. Journal of Physics: Conference Series, 1467(1), 012046.

Published

2025-06-03

How to Cite

Exploring Item Bias Analysis Methods for Enhanced Digital Assessment in the Tertiary Education Sector. (2025). Journal of Innovations in Educational Assessment, 7(2), 82-100. https://doi.org/10.66545/jhbak736
