I have sampled a number of questionnaires in use in economics departments in the UK and have grouped questions into the following broad categories:
overall quality indicators;
student characteristics, behaviour and status;
the skills of the lecturer;
reading and facilities;
contribution to learning.
These are discussed in turn. I try to draw out the key features, illustrating with examples of questions in use. In the subsequent section, there is a broader discussion of questionnaires in economics, containing some ideas and tips regarding best practice.
Fewer than half of the questionnaires sampled include questions or statements inviting students to rate the overall quality of modules and lecturers. Asking students to rate the overall quality of the lecturer is rare. The following are examples of these kinds of questions and statements, drawn from the sample of questionnaires reviewed:
All questionnaires contain at least one open question, although they vary significantly in the number of open questions and in the proportion of open to closed questions – the largest number of open questions used is 13. I have listed the most common questions asked – the percentage figures refer to the proportion of sampled questionnaires containing that question or a closely related one:
Here is a selection of other open questions used in economics questionnaires. Some of these are probably better dealt with as closed questions (for example, the question on the technical level of the course). One questionnaire asks what textbook(s) students have bought. In the light of increasing numbers of students and difficulties accessing library resources, this is an interesting question:
A small proportion of questionnaires ask questions about the students’ characteristics and behaviour. The most common question of this sort concerns student attendance at lectures and tutorials. Typically, students are asked to rate their level of attendance on a scale from excellent to poor.
In some cases, students are asked whether they agree or not with the following statement:
Students may not wish to admit to poor attendance, so responses may be biased upwards. It can help to be more precise in the question – one questionnaire asked students:
Other questions/statements that measure characteristics and status of students include:
All questionnaires contain a number of closed questions about the structure, coherence and level of the module as a whole. The key areas of concern are:
On a number of questionnaires, students are asked to respond to the following statement: ‘The course material stimulated my interest’ (strongly agree, …, strongly disagree).
Example: ‘The overall level of the course was about right, given my background’ (strongly agree, …, strongly disagree).
Design and organisation
Example: ‘The course was well organised’ (strongly agree, …, strongly disagree).
Clarity of course objectives
Example: ‘The course objectives were clearly explained at the outset’ (strongly agree, …, strongly disagree).
Difficulty of material (much too difficult, …, much too easy).
How did the level of difficulty of the material and quantity of material compare to other courses? (much more difficult, …, much easier).
Quantity of work required (much too much, …, much too little).
Consistency of content of course with course outline
One questionnaire contained a single question relating to the method of assessment. Students were asked:
Are you happy with the means of assessment?
I think this is an important question, simply because assessment is such a key and contentious area, and it may yield valuable information that can be used in the design of assessment procedures. The wording of this particular question is not ideal, however, as it is very likely to invite a negative response. It would be more useful to ask students to suggest alternative forms of assessment, possibly in the form of an open question.
Questionnaires contain relatively few questions that relate directly to the qualities and skills of the lecturer. In many cases, questions relate to aspects of the module, and it is open to interpretation whether this implies anything about the performance of the lecturer. For example, it is common for questionnaires to ask whether a module is interesting or intellectually stimulating – it is quite a different matter to ask whether the lecturer sought to make the course interesting or stimulating.
Questions relating to the skills of the lecturer cover the following broad areas:
Speed of delivery
Instructor’s ability to stimulate interest in the subject
Students are asked: ‘Were lectures well prepared and organised?’
Use of and quality of visual aids, overheads and handouts
Examples: ‘Did the lecturer use visual aids?’, ‘Were the visual aids helpful?’
Instructor’s availability and helpfulness to students (excellent, …, very poor)
Were your essays/assignments marked and returned promptly? (always, …, never)
Has the lecturer been accessible to answer questions or give advice? (yes, …, no)
Most questionnaires ask about reading material. These are typical questions:
Did you receive helpful guidance regarding reading material?
Was the reading material readily available?
Some questionnaires include questions about facilities. For example, students are asked about the quality of the lecture rooms and access to computing facilities:
The computing facilities I needed for this module were adequate (agree, …, disagree).
As the objective of a course is to promote learning, it can be useful to ask students whether they believe the course has promoted learning and the development of key skills. Questionnaires very rarely address these issues directly, but there are some questions of this sort. For example, in one questionnaire students were asked to rate:
Contribution of the module to improving general analytic skills (excellent, …, very poor).
The third case study in section 5 provides the most comprehensive example of a questionnaire that addresses these issues.