Theme 2: Future-proof Assessment
This was the second session of the Economics Network's 2021 Virtual Symposium, taking place on the morning of 18th June.
The session included:
- Introduction (Christian Spielmann, University of Bristol)
- Open book exams, online and on-campus (Ralf Becker, University of Manchester)
- Take-home exams for technical subjects (Robert Riegler, Aston University)
- Sampling problem sheets (Jon Guest, Aston University)
- Viva voce assessment (Fabio Arico, University of East Anglia)
- Feedback from breakout groups:
- Programme-wide assessment (Alvin Birdi, University of Bristol)
- Online MCQs – possibilities and boundaries (Ralf Becker)
- There is no cheating - it's all collaboration (Guglielmo Volpe, City University, London)
- Feedback to large numbers of students (Jon Guest, Aston University, and Mike Reynolds, University of Leeds)
- Slides from all sessions (.pptx)
Discussion group summaries
Thesis: “There is no cheating, it’s all collaboration”
- In the real world/workplace, we are often tasked with completing a project and reporting back on it. To produce such output we need to show our skill in consulting and combining the appropriate resources, and in the process we display creativity and organisational, critical and analytical (and probably more) skills. The move to more coursework-based, open-book assessment is therefore, perhaps, an opportunity to replicate such real-world scenarios.
- Collaboration is generally welcome among students, but collusion with the intent of ‘gaming’ the assessment is not. Tools such as Turnitin, or the randomisation of questions (in particular technical/quantitative questions), have been suggested and implemented to mitigate (though not necessarily eliminate) the potential for collusion;
- Open-ended questions that do not necessarily have set answers, and that require students to display the higher-order skills of analysis, synthesis and evaluation, are suggested;
- Cooperation and collusion seem to be more prevalent among year 2 and year 3 students because those years carry more weight in determining the final degree classification;
- There is some consensus that the ‘imposed’ changes in assessment are pushing us to think more carefully and creatively about assessment design;
- One issue that has perhaps been overlooked over the past year or so is communication with students about the reasons for the changes in assessment. Should universities and students work together towards establishing a new social norm around assessment?
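The randomisation of quantitative questions mentioned above can be sketched in code. The example below is a hypothetical illustration, not material from the session: the function name, the elasticity question, the number ranges and the seeding scheme are all my own choices. The idea is that each student receives a reproducible variant of the same calculation, so sharing numerical answers is of little use while the assessed skill is unchanged.

```python
import random

def make_variant(seed):
    """Generate one student's variant of a price-elasticity question.

    Seeding with a per-student identifier makes each variant
    reproducible for marking, while different students see
    different numbers.
    """
    rng = random.Random(seed)
    p0 = 10
    p1 = p0 + rng.choice([1, 2, 4, 5])                # price rises
    q0 = rng.choice([100, 120, 150, 200])
    q1 = round(q0 * (1 - rng.uniform(0.1, 0.3)))      # quantity falls
    question = (
        f"Price rises from {p0} to {p1} and quantity demanded falls "
        f"from {q0} to {q1}. Calculate the arc own-price elasticity "
        f"of demand."
    )
    # Model answer via the midpoint (arc) elasticity formula
    elasticity = ((q1 - q0) / ((q0 + q1) / 2)) / ((p1 - p0) / ((p0 + p1) / 2))
    return question, round(elasticity, 2)

# Example: generate a variant keyed to a (hypothetical) student ID
question, model_answer = make_variant(seed="student-1234")
```

The same approach extends to any question with numerical parameters; most virtual learning environments offer built-in calculated-question types that do this without custom code.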
Online MCQs – possibilities and limitations
- MCQs (and other computer-aided assessments) can be used to assess some intended learning outcomes (ILOs), but other ILOs are too complex to be assessed by MCQs
- Differentiation at the top is difficult to achieve (other than with time pressure – but is working fast without recourse to available resources an ILO?)
- Computer-aided assessment is certainly useful as an element of continuous assessment throughout the term (summative or not)
- Asking students to design questions is an excellent way to engage them in creative thinking around the content and, at the same time, helps you build up a good database of questions
Take-home exams for technical subjects
- A balance between calculations and conceptual questions can prevent grade inflation and ensure a variety of marks.
- Avoid questions that only ask for definitions; application and communication are important.
- Short notes are useful for asking conceptual questions. Set word limits to reduce “brain dump”.
Related resources
- Michelle Pauli (20 May 2021) "Rethinking Assessment", Jisc
- "Two Stage Midterm Exam for Large Class" (2018) YouTube video by University of British Columbia
- David Nicol and Geethanjali Selvaretnam (2020) "Analysis of Two Stage Exams from an Inner feedback Perspective" YouTube video