Abstract: Context: Multiple-choice tests and exams are widely used as a means to assess students' knowledge, particularly due to their efficiency for large classes. Their highly structured format also means that they can provide useful statistics to educators. However, the binary nature of the per-question feedback (e.g. correct or incorrect) tends to mask gaps in students' knowledge, since it does not reveal how confident students were in selecting a particular alternative. Indeed, assessing the reliability of one's knowledge and inference is a key academic skill, and one whose failure has particularly dire consequences in Engineering. This paper disseminates the results of introducing a Confidence Based Marking (CBM) online practice exam in a large first-year Engineering subject, in which students were required to indicate their confidence level for each answer and were provided feedback on the reliability of their assessment of their knowledge.
Purpose: This study investigated whether the introduction of a CBM online practice exam enabled students to judge their knowledge more accurately and so better prepare for the final exam, resulting in improved academic performance, and whether any insights could be gained from differences in performance and confidence measures between different subsets of students.
Approach: During semester, an online multiple-choice quiz question repository system was made available that required students to indicate the confidence of their answers according to a three-point CBM scale. Having become familiar with the system, students then sat an assessed online multiple-choice CBM practice exam in the final week of semester that resembled the format of part of the final exam. Feedback was provided to them on the reliability of their assessment of their knowledge in order to identify their strengths and weaknesses in the subject material. Data were collected on the performance and confidence level of each student for each question and analysed to identify relationships. Final exam data were collated and related to the online practice quiz data in order to measure any improvement in student performance.
Results: Student feedback indicated that the use of the quiz question repository during semester significantly improved feedback and increased their confidence with the subject material over time. Results from the online practice exam revealed some differences in performance and confidence levels across particular subsets of students. Students' results in the final exam were correlated with their CBM online practice exam performance and indicated a small improvement in overall results compared to previous years.
Conclusions: CBM assessment has been shown to provide a valuable measure of the reliability of students' knowledge, especially in Medicine. Applying such a methodology to a large Engineering subject, in the form of an online quiz question repository and online practice exam, has provided feedback that allows students to better judge their knowledge and to improve in the key areas where they need to. It is anticipated that further refinement of the system will translate to improved retention of knowledge and improved final exam performance.
To cite this article: Buskes, Gavin. Effects of the introduction of an online practice exam utilising confidence based marking [online]. In: 27th Annual Conference of the Australasian Association for Engineering Education: AAEE 2016. Lismore, NSW: Southern Cross University, 2016: 113-121.