On-line delivery and marking of Excel based assessments

Introduction

As members of the Division of Economics at Glasgow Caledonian University, the authors have been responsible for the provision of a number of modules, at a variety of levels, that both teach and make use of Excel spreadsheet skills. Some of these simply provide instruction on how to use basic Excel facilities and routines, such as formula writing, copying, charting, sorting and filtering, in the context of a variety of generic business problems. Others use these, and some more advanced routines, in the areas of mathematical and statistical business modelling.

With student numbers in excess of 1,200 per annum, the marking tasks were becoming extremely onerous. Consequently, it was decided to develop a series of Excel workbooks for each of these modules that would be able to mark the students' assessments on-line. At first, these automatic procedures were on a modest scale, but as experience was gained we were able to expand the range of questions that could be marked in this way into an assessment bank of considerable size and variety.

The Tests

At the start of each semester, teaching files pertaining to each module are mounted on the intranet, and numerous practice assessment files and other resources are made available. These practice assessments are also automatically marked, and therefore give students the opportunity both to practise answering questions and to familiarise themselves with the protocols that will be required in the 'real' assessments.

The ability to see the marks awarded is at the student's discretion, with the default being that the marks will not be shown. This feature was incorporated after protracted discussion amongst the teaching team. Some felt that if a student were performing poorly, seeing that this was the case would act as a disincentive. On the other hand, seeing that an answer was wrong gave the student the opportunity to correct it and perhaps improve performance. Consequently, it was decided to give students the choice of seeing their marks, as well as the ability to change their minds at any time.

Security and robustness

The marking sheet is contained within the actual assessment file, but is hidden, password protected and formatted to be invisible to the student. Nevertheless, a number of further difficulties had to be resolved before we felt confident that the marking sheet was secure. The following are just two:

  • Student answers that produced Excel error messages had to be trapped and ignored by the marking sheet so that an eventual score for the correct answers could be computed.
  • The default blank response to a question had to be differentiated from zero, so that a blank response would not be marked as correct when the correct answer to the question happened to be zero.

Both these and other difficulties were satisfactorily resolved by judicious use of formulae in the marking sheet. However, the sensitivity of the tests to either intentional or unintentional corruption by the student was a more worrying problem. At base, the issue of robustness stems from the inability to protect many of the sheets when they require the use of Excel routines such as Sorting, Filtering, the Solver or Data Analysis.
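By way of illustration, a marking-sheet entry might award a question's marks along the following lines. This is only a sketch of the kind of formula involved, not the exact formulae used in our files; the sheet name, cell addresses and the two-mark weighting shown here are assumptions.

    Sub WriteMarkingFormula()
        ' Illustrative only: the formula awards marks when the student's
        ' answer cell (Q1!B10) is non-blank, free of Excel errors and equal
        ' to the correct answer held in C10 of the (hidden) marking sheet.
        Worksheets("Marking").Range("D10").Formula = _
            "=IF(ISBLANK(Q1!B10),0,IF(ISERROR(Q1!B10),0,IF(Q1!B10=C10,2,0)))"
    End Sub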

Despite intensive testing by the authors, it was still found that both the question and marking sheets could become corrupted in a variety of ways when subjected to student use. Fortunately, however, we were able to monitor students doing the practice assessments and thereby discover a wealth of potentially destructive processes that could then be incorporated into the safety checks prior to the 'real' assessments. This was done in a variety of ways, usually involving Visual Basic (VB) routines to remove those menu options and toolbar buttons that could otherwise cause damage but were not essential to the assessment. Then, so that a student who was VB-literate could not put them back again, the VB menu was itself removed. As can be imagined, this added considerably to the time involved in composing the assessments, since after removing the VB toolbar we would frequently find that we needed to use VB again - a classic case of 'painting oneself into a corner'!
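The lockdown routines looked broadly like the following sketch. The built-in command-bar names are those of the pre-ribbon versions of Excel in use at the time, and the particular items disabled here are illustrative rather than the actual list used in our assessments.

    Sub LockDownWorkbook()
        ' Disable interface elements that are not needed for the assessment
        ' and could otherwise corrupt the question or marking sheets.
        Application.CommandBars("Standard").Enabled = False
        Application.CommandBars("Formatting").Enabled = False
        Application.CommandBars("Worksheet Menu Bar").Controls("Tools").Enabled = False
        ' Remove access to the VB editor so the restrictions cannot be undone.
        Application.CommandBars("Visual Basic").Enabled = False
    End Sub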

However, even with this fairly high level of protection established, it was decided to include a restore button in each question sheet that would clone the original question in the event of a mishap. Inevitably this meant that the file size increased, but even for an assessment containing 20 questions it never exceeded 200 KB.
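The restore button simply runs a short macro of the following kind. The sheet names are assumptions; the idea is that each question sheet has a hidden master copy that can be pasted back over the working sheet.

    Sub RestoreQuestion3()
        ' Overwrite the working question sheet with its hidden master copy,
        ' returning it to its original state after an accidental corruption.
        Application.ScreenUpdating = False
        Worksheets("Q3").Cells.Clear
        Worksheets("Q3_Master").Cells.Copy Destination:=Worksheets("Q3").Range("A1")
        Application.ScreenUpdating = True
    End Sub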

Finally, since all of the 'real' assessments would be invigilated by experienced Excel tutors, we felt confident that any untrapped problems that arose could be rectified with the minimum amount of disruption to the student. Extra time would, of course, be allowed where this happened.

Limitations

It will be clear that the marking sheet can only assess answers to questions that require numeric, non-binary responses, and this raises two issues:

  • How 'right' must the answer be before it is awarded full marks?
  • How can the marking sheet assess responses that are non-numeric - charts or comments for example?

In regard to the first issue, our original approach was to employ zero tolerance of arithmetic deviation from the correct answer. This created difficulties, however. The default decimal-place and number formatting frequently varied across the computers being used for the assessment, so that the marking program could return a zero mark for an answer that was only fractionally different from the correct answer, or expressed in a different format. This was particularly true of questions requiring use of the Excel Solver. The fix we devised was to include a hidden auto-open front end in the question files that ensured a homogeneous Excel configuration when the assessment commenced.
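In outline, such a front end is a routine along the following lines. This is a sketch only: the particular settings standardised, and the answer range shown, are illustrative assumptions rather than a list of what the actual files configure.

    Sub Auto_Open()
        ' Runs automatically when the file is opened and imposes a common
        ' configuration on every lab machine (illustrative settings only).
        With Application
            .Calculation = xlCalculationAutomatic
            .UseSystemSeparators = False
            .DecimalSeparator = "."
            .ThousandsSeparator = ","
        End With
        ' A uniform display format for the answer cells (range is hypothetical).
        Worksheets("Q1").Range("B10:B20").NumberFormat = "0.00"
    End Sub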

Nevertheless, we still had to address the question of whether, given a homogeneous configuration, an answer of, say, 1.2 to a question for which the correct answer was, say, 1.25 was close enough to warrant any marks. Our eventual decision reflected our backgrounds as economists: sometimes it was and sometimes it was not! To be more precise, we decided that where the question called for an extensive series of calculations, each of which could feed an error into subsequent calculations, a margin of 5% error around the final correct answer could be allowed (but see our comments under Customisation as to how this can be changed). Thereafter, an answer outside the 5% margin but within 10% would be awarded half marks, and any answer outwith a 10% error would receive no marks. The logic of this approach is that coming so close to the correct answer implies that the method(s) employed must have been more or less correct. On the other hand, there were question types where even the slightest mistake would cause severe problems in real life - Sorting being a classic example.
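The banding just described can be expressed compactly. The following VB function is a sketch of that rule (in the workbooks themselves the equivalent logic sits in the marking-sheet formulae, and the treatment of a correct answer of exactly zero shown here is an assumption).

    Function BandedMark(studentAnswer As Double, correctAnswer As Double, _
                        fullMarks As Double) As Double
        ' Full marks within 5% of the correct answer, half marks between
        ' 5% and 10%, nothing beyond 10% (the default bands described above).
        Dim relError As Double
        If correctAnswer = 0 Then
            relError = Abs(studentAnswer)   ' assumption: absolute deviation when the answer is zero
        Else
            relError = Abs(studentAnswer - correctAnswer) / Abs(correctAnswer)
        End If
        If relError <= 0.05 Then
            BandedMark = fullMarks
        ElseIf relError <= 0.1 Then
            BandedMark = fullMarks / 2
        Else
            BandedMark = 0
        End If
    End Function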

There were a number of questions that required students to enter the text version of the formula that they had used, with the aim of testing whether students could write appropriate formulae and then copy them, rather than simply use a calculator on each and every required calculation. The latter requirement (that the formula be one that could be copied) ensured that the marking program could assess these responses without recourse to a variety of dollar-sign permutations to check all of the possible entries that would actually work. With regard to charts and graphs, we could do no more than ask the student to show the actual graph and then mark it manually whilst it was still on the screen.
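Marking such a response amounts to a text comparison. A minimal sketch is shown below; the normalisation applied (ignoring case and internal spaces) and any cell references used with it are assumptions, not a description of the checks in the actual files.

    Function FormulaTextCorrect(entered As String, expected As String) As Boolean
        ' Compare the text version of a formula with the expected one,
        ' ignoring case and internal spaces.
        Dim a As String, b As String
        a = Replace(UCase(Trim$(entered)), " ", "")
        b = Replace(UCase(Trim$(expected)), " ", "")
        FormulaTextCorrect = (a = b)
    End Function

A call such as FormulaTextCorrect(Worksheets("Q2").Range("D5").Text, "=B5*$C$1") would then compare a hypothetical student entry against the expected copyable formula.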

Because of the nature of the network configuration in our main teaching labs, we currently have no alternative but to go round each student and manually record their final mark at the end of the assessment. At this stage we also give credit for anything that the student had done well but which had been zero-marked by the marking sheet. This was not an unacceptable burden since, as we have already said, the marking sheet is not able to identify credit-deserving attempts that fall outside the tolerance limits, and so to a large extent the manual marking only applied to questions that had been zero-marked.

Efficiency

The efficiency of the system is easy to compute. With an average of 1,200 students per annum, each attempting an average of 12 questions in formal assessments, there are 14,400 questions to be marked. Prior to the introduction of this system, manual marking took approximately one minute per question (12 minutes per student), giving 240 staff marking hours. With the on-line marking we can record each student's mark in under a minute (including any manual marking of graphs, etc.), which represents a reduction in marking time in excess of 90%. The actual improvement in efficiency is, of course, dependent upon current student numbers and lab capacity. At the time of writing, we are able to access ten labs, each with twenty computers. This allows us to invigilate the entire assessment with 10 'Excel-literate' tutors over six sittings, meaning that 1,200 students taking a 90-minute assessment can be invigilated and marked in (10 tutors x 1½+ hours x 6 sittings) = 100+ staff hours. A conventional written exam for the same number of students and the same duration would easily absorb 300+ staff hours.

Of course, there are also the staff hours involved in creating these assessments, but we now regard these as a fixed cost, and they are effectively zero for those assessors who decide to use the material that we have produced.

Customisation

The marks awarded to each question clearly reflect our perception of the difficulty of the question in relation to the nature and extent of the tuition received by our students. Consequently, we recognise that users of the system may well want to alter these parameters. This is easily done from the marking sheet, and any changes will automatically spread into the question sheets, the marks sheet and the front page where the marks details are displayed. Where it was impossible to protect a question sheet because an Excel routine had to be used, it is clearly feasible for a student to change the marks displayed for each question or question part. However, this is a purely cosmetic change and will not affect either the mark sheet or the marking sheet, both of which are fully protected.

It is also an easy matter for the instructor to change the 'threshold' mark out of which the assessment is to be marked, and this will then feed into the calculation of the percentage mark awarded.

Wherever possible, the correct answers have been calculated in the marking sheet as 'live' formulae. This allows the assessor to alter the data in the question, secure in the knowledge that the marking sheet will have adjusted the correct answer automatically. That said, there are clearly question types where the correct answers must be fixed data items, since no formulae are employed: Sorting and Filtering are the most obvious cases. Here, data variation by the assessor will require that the correct answers are computed and then pasted into the appropriate range of the marking sheet.
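For instance, a 'live' correct answer is simply a marking-sheet cell whose formula points back at the question data, as in the following sketch. The sheet names, ranges and the particular functions shown are hypothetical.

    Sub WriteLiveCorrectAnswers()
        ' The correct answers are held as live formulae referencing the
        ' question data, so editing that data re-computes them automatically.
        With Worksheets("Marking")
            .Range("C2").Formula = "=SUM(Q1!B2:B13)"
            .Range("C3").Formula = "=NPV(Q2!B1,Q2!B2:B6)"
        End With
    End Sub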

As regards tolerances on the correctness of the answer, these too are under the assessor's control from the marking sheet, and clearly need not be the same for each question, or question part.

Experience to date

To date, once the initial teething troubles had been eliminated, student response has been excellent. We distributed a questionnaire to students that focused on the merits or demerits of being assessed in this way. Their responses were almost unanimously in favour of this form of assessment and of the ability to see their marks. This was further confirmed in the actual assessments by the fact that very few students elected not to see their marks.

Students were also highly appreciative of the practice tests, one of the commonest comments being that they could use them at home and receive immediate feedback on their performance. These practice tests contained both on-line marking of attempted answers and fully worked, explained solutions that became visible only once the student had made an attempt at a particular question.

Conclusion

We hope that this case study has provided a cogent explanation of the processes employed in developing what we believe to be a novel method of marking Excel-based assessments. All of the extant question files are freely available by contacting the authors.
