Using online quiz assessments
There has been ongoing discussion about the types of assessment that can be used to gauge students' ability to achieve the intended learning outcomes of a particular module or course. This discussion has been enriched by the move to online teaching and learning, where it has focused mainly on summative assessment. In this case study, however, I will share my experiences of using formative online quizzes. As technology has advanced, there have been many attempts to identify types of assessment that can effectively measure student outcomes while encouraging continued engagement. One of these is the short online quiz.
Within the context of the SAMR model (Puentedura, 2014), the online quiz improves cost efficiency by substituting for the offline quiz without changing the pedagogical nature of the learning activity, while also enhancing its functionality and time efficiency. The quiz therefore suits learners who need efficient tools that speed up their learning process. It falls under constructive learning activities (with both an individual and a social focus), allowing students to engage in active learning either on their own or in collaboration with others, as some students tend to answer the quiz together (Beetham, 2004). The quiz tool is accompanied by a discussion forum in the virtual learning environment, in which students can ask questions and engage in online discussion. This also improves time efficiency for both the students and myself, since it works as a complement to my office hours.
I have used online quizzes for the past couple of years in one of my final year undergraduate economics modules — labour economics — with a cohort of 60+ students. It has been interesting to identify the possible effect of the change in delivery mode on the effectiveness of the quizzes, as this year I delivered the module entirely online.
When I designed the quizzes, I included them as a form of formative assessment focused on students' immediate understanding of the material. This was to help identify any gaps in how they engaged with the lectures to form their understanding. Another reason for not making them summative was to allow students to test their understanding without the pressure of being marked on it. See Moving Multiple-Choice Tests Online: Challenges and Considerations for a discussion of online tests for summative as opposed to formative assessment.
As a supporting learning activity I use the Moodle Quiz module, which is available to every student. It gives all students easy access to take the quiz online, with no need to register on any additional platform or software.
The purpose of the quiz tool is to help students assess their level of understanding and identify knowledge gaps to address in their independent study time. When I taught the module face-to-face last year, I posted each quiz online at the end of the lecture on a particular topic, providing an instant assessment of students' understanding immediately after the lecture.
The quiz was designed to be short: 10 true/false questions, taking approximately five minutes on average. Students were free to answer the quiz in the lecture room or in their own time. The ability to use an online tool to assess their understanding right after the lecture, without further study, motivates students to attend the lecture (Credé et al., 2010), and attendance has been shown to be a strong predictor of grades (Spence, 2003). Indeed, the average score was 8/10.
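Question banks of this kind can be authored directly in the Moodle interface or imported in bulk. As an illustration, a true/false bank could be written in Moodle's GIFT import format; the question text below is hypothetical, not taken from my module:

```
// Hypothetical GIFT-format question bank
// (imported via Question bank > Import > GIFT format in Moodle)
$CATEGORY: LabourEconomics/Topic1

// Simple true/false question: {T} marks the statement as true.
::T1Q1::A backward-bending labour supply curve arises when the income effect of a wage rise outweighs the substitution effect.{T}

// Optional feedback fields: the first is shown for the wrong answer,
// the second for the correct answer.
::T1Q2::In a competitive labour market model, a binding minimum wage set above the equilibrium wage increases employment.{F#Revisit the supply-and-demand diagram for this case.#Correct — in this model it reduces employment.}
```

Writing questions in a plain-text format like this makes it straightforward to reuse and version the quizzes from one year to the next.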
This academic year I taught the module entirely online, designing the materials as a mix of asynchronous videos and synchronous lectures: the videos contained the main analysis of each topic, while the synchronous lectures focused on its empirical examination. With this structure I continued to use the quizzes as a form of assessment, and I have not noticed any change in overall performance or in students' attitude towards taking them. The average score this year was still 8 out of 10.
The only difference between the two years was timing. Last year, with face-to-face lectures, students were tested immediately after the lecture ended, so they had no additional time to go back, study the materials and identify gaps in their understanding. This year, by contrast, students could take each quiz after studying the asynchronous video materials a week ahead and attending the synchronous lecture at the beginning of the week in which the quiz was taken. Despite this difference in delivery and timing, the effectiveness did not change: the average score remained very high. I would like to think that these sustained high scores reflect students' ability to achieve the learning outcomes of each topic and to assess their own understanding, regardless of when they choose to take the formative online quizzes.
Successes and challenges
Students found the quizzes very useful for assessing their understanding and seeing where their knowledge gaps lay for each topic. An online discussion forum was set up on Moodle to accompany the quizzes; here, students can put further questions to their peers and to me, address any gaps in understanding, or discuss the literature on a particular topic. One comment I received from a student was: 'This gives me confidence that I could do well for my exams'.
Encouraged by this positive feedback, I discussed the rationale, implementation and effectiveness of the quiz tool with colleagues, especially those teaching the same module at postgraduate level, who are now likely to adopt it as well. One comment I received from a colleague was:
The Moodle Quiz module review presents an insightful view on students gaining a quick “5 minute” indication of knowledge gaps. The pedagogical style is maintained and students are not pressured to obtain a high score, only to obtain an indication of required effort for future studies to plug the gap. This is clearly dependent upon engaged students and is a complementary activity to taught sessions. In this way students can fully engage in learning and self-identify areas that required additional attention. The simplicity and economy of effort (both time and cost) of using this tool make it a credible and flexible option.
Perhaps one of the key challenges of online quizzes, especially formative ones, is that not all students are keen to take them. I try to overcome this by posting frequent announcements on the Moodle page about the importance of each quiz and what it tests for every topic, relating this to the summative assessment that students will take later in the module.
Beetham, H. (2004). Effective practice with e-learning. Bristol: JISC.
Credé, M., Roch, S. G. and Kieszczynka, U. M. (2010). Class attendance in college: A meta-analytic review of the relationship of class attendance with grades and student characteristics. Review of Educational Research, 80(2), 272–295. https://doi.org/10.3102/0034654310362998
Spence, C. (2003). LTSN Teaching Development Project report. Department of Historical and Cultural Studies, Goldsmiths College, University of London.
Puentedura, R. (2014). SAMR: A brief introduction.
The SAMR model stands for Substitution, Augmentation, Modification and Redefinition, and was created by Ruben Puentedura.