
Using a Personal Response System in Economics Teaching

Caroline Elliott[1]
International Review of Economics Education, volume 1, issue 1 (2003), pp. 80-86
DOI: 10.1016/S1477-3880(15)30213-9


Abstract

This paper offers a brief introduction to a Personal Response System that can be used in group-teaching scenarios, reporting the results of a trial using the technology in a second-year undergraduate Microeconomics Principles course. Advantages and disadvantages of the technology are discussed, and the possibilities for using this technology more widely are explored.

JEL Classification: A22

Introduction

A Personal Response System (PRS) is a form of technology that offers a lecturer/tutor the opportunity to ask a group of students multiple-choice questions to which they reply individually by selecting an answer on a hand-held wireless transmitter. Receivers connected to a computer pick up these answers. Computer software then aggregates the responses, and the students can see the results on a large screen using a standard projector. Hence, there are similarities with the technique of ‘asking the audience’ on the game show Who Wants to be a Millionaire? (Seenan, 2000). However, very few companies produce PRSs and the use of this technology remains relatively uncommon in UK universities.[2]
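
To make the aggregation step concrete, the sketch below shows the kind of tallying the software performs: each handset submits one answer, and the class-wide distribution is computed for display. The PRS software itself is not reproduced here, so all names and structures in this fragment are illustrative assumptions rather than the vendor's code.

    from collections import Counter

    def aggregate_responses(responses, options=("1", "2", "3", "4")):
        """Tally one response per handset and return the class distribution.

        `responses` maps a handset ID to the option it selected, e.g.
        {"handset_07": "3", "handset_12": "1"}.  Only options offered for
        the current question are counted.
        """
        counts = Counter(choice for choice in responses.values() if choice in options)
        total = sum(counts.values())
        # Express the distribution as percentages, much as the PRS projects
        # a summary to the class once the clock is stopped.
        return {opt: (100 * counts[opt] / total if total else 0.0) for opt in options}

    example = {"handset_01": "2", "handset_02": "2", "handset_03": "4"}
    print(aggregate_responses(example))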

I chose to trial the PRS in the second-year undergraduate Microeconomics Principles lectures at Lancaster University in spring 2001. The 47 students on the course were given ten lectures by another lecturer using traditional methods, and then ten lectures by myself using the PRS. Consequently, the students had the opportunity to compare the learning experiences offered by standard lectures and lectures incorporating the use of a PRS. During a 50-minute lecture I would punctuate my delivery by asking approximately five multiple-choice questions, to which the students would reply using their handsets.

Rationale

Having taught second-year microeconomics for a number of years, I was aware that it was a course that students have historically often found challenging. Further, in a lecture environment students may be unwilling to volunteer information regarding their level of understanding of material covered. Consequently, I primarily used the PRS questions as a means of anonymously testing students’ understanding of material recently covered. If, after observing the results of a question, I was concerned that students had not fully understood the material on which the question was based, I could briefly review the material for them, and also tailor follow-up tutorial content accordingly. I also used multiple-choice questions as a way of introducing a subject, asking students to apply their economic reasoning skills prior to being formally introduced to a new Microeconomics topic. In addition, I used the PRS to gauge how much information students had remembered about a topic from the first year of their economics degree studies.[3]

The undergraduate Microeconomics lectures are timetabled to last 50 minutes. However, there is considerable evidence suggesting that a lecture audience struggles to maintain concentration levels for such a long time period. For example, MacManaway (1970) reported that for the vast majority (84%) of his students, lecture concentration was limited to 20–30 minutes. Hence, another motivation for introducing the PRS was to break the lecture into segments using an activity that periodically required the students to respond, thereby varying students’ lecture learning experience. Evidence also suggests that it may take students 5 minutes to settle down at the beginning of a lecture (McLeish, 1968; Lloyd, 1968).[4] It was therefore hoped that asking a PRS question at the beginning of lectures would stimulate students’ initial concentration levels.

It has long been recognised that the lecture environment often fosters passive learning.[5] By contrast, the PRS offers a method of active engagement: ‘if students are to learn to think, they must be placed in situations where they have to do so. The situations in which they are obliged to think are those in which they have to answer questions because questions demand an active mental response’ (Bligh, 1998, p. 15).

Prior to the development of the PRS technology, lecturers experimented with posing multiple-choice questions in lectures, asking students to respond using coloured cards to indicate their answers, as reported by, for example, Harden et al. (1968) and Dunn (1969). Harden et al. (1968) concluded that, whilst the preparation of lectures (and the lecture questions) inevitably took longer, the result was more effective and interesting teaching – that is, the same intended outcomes as from the use of a PRS. Nevertheless, research into the benefits of using a PRS specifically has now also begun. Hake (1998) reports on the adoption of a PRS in physics teaching, concluding that it can have a significant impact on students’ problem-solving skills, this being reflected in significantly improved test results. Hake’s (1998) results were confirmed by Cue (1998), who also found that the PRS increased active learning, depth of learning and student interest, again in the context of physics teaching. Meanwhile, Draper (2001) details the potential array of PRS uses, his analysis being relevant to a broad range of teaching disciplines.

PRS details

When in operation, the screen for the PRS displays the number of the question being asked, the time allotted to the question, and the number of chances each student has to answer the question. Once a question is asked, the clock is started and the time remaining in which to answer is continually shown. A count of the number of handsets that have responded to the question asked is shown, and as each different handset is used to answer a question another cell on the screen changes colour.

The PRS can be operated in two modes: anonymous and named. In the anonymous mode, when students respond to a question using their handset, a cell on the screen changes colour and the number of the handset responding is displayed. If students are allocated a handset, and a file is set up to associate handsets with student names, then when screen cells change colour they can also indicate the name of the student answering. Each handset’s response to a question can be shown on the screen or kept hidden. If it remains hidden from the audience, double-clicking on a cell reveals the answer selected and the time taken to answer the question after the clock was started. This means that, for example, by double-clicking on the first cell to change colour it is possible to congratulate the first student to answer, provided they have answered correctly.
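
The difference between the two modes can be illustrated with a small sketch: if a roster file associating handset numbers with student names is available, responses are labelled by name; otherwise only the handset number is reported. The function and data layout below are invented for illustration and do not describe the PRS's own file format.

    def label_responses(responses, roster=None):
        """Label each response by handset number (anonymous mode) or, if a
        roster mapping handsets to students is supplied, by name (named mode).

        `responses` maps handset IDs to (choice, seconds_taken) pairs.
        """
        labelled = {}
        for handset, (choice, seconds) in responses.items():
            who = roster.get(handset, handset) if roster else handset
            labelled[who] = {"choice": choice, "seconds": seconds}
        return labelled

    responses = {"07": ("3", 12.4), "12": ("1", 20.1)}
    print(label_responses(responses))                               # anonymous mode
    print(label_responses(responses, roster={"07": "A. Student"}))  # named mode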

The handsets have ten digits (including zero), corresponding to ten possible answers to the multiple-choice questions. However, I only offered the students four possible answers to ensure that the questions did not take too long to read and attempt. The handsets also have high-confidence and low-confidence buttons. When either the time runs out for answering the question, or the lecturer chooses to stop the clock, a bar-chart summary of the results is displayed, with the bars made up of different coloured segments to indicate the proportions of students answering with different confidence levels.
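
A hedged sketch of how such a summary might be assembled is given below: answers are tallied separately by confidence level so that each bar can be drawn in two segments. The text-based chart is only a stand-in for the PRS display, and all names are assumptions made for this example.

    from collections import defaultdict

    def confidence_summary(responses, options=("1", "2", "3", "4")):
        """Print a bar-chart-style summary in which each answer's bar is split
        into high- and low-confidence segments, echoing the PRS display.

        `responses` is a list of (choice, confidence) pairs, where confidence
        is "high" or "low".
        """
        tally = defaultdict(lambda: {"high": 0, "low": 0})
        for choice, confidence in responses:
            if choice in options:
                tally[choice][confidence] += 1
        for opt in options:
            high, low = tally[opt]["high"], tally[opt]["low"]
            # '#' marks a high-confidence answer, '-' a low-confidence one.
            print(f"Answer {opt}: {'#' * high}{'-' * low} ({high + low})")

    confidence_summary([("2", "high"), ("2", "low"), ("4", "high")])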

The computer keeps a record of the responses of the numbered handsets for the entire class session when the PRS is in the anonymous mode, and records the responses of individual students when in the named mode.[6] The saved information also includes details of the time taken by each handset to answer every question, the number of attempts made by a handset when each question was asked, and the confidence levels of the answers selected.
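
For illustration, the saved record might resemble the comma-separated log sketched below, with one row per handset response. The field names and file format are assumptions made purely for this example; they are not the PRS's actual storage format.

    import csv
    from datetime import date

    # Hypothetical record layout for illustration only.
    FIELDS = ["date", "question", "handset", "answer", "seconds_taken",
              "attempts", "confidence"]

    def save_session(rows, path="prs_session.csv"):
        """Append one row per handset response so that answers, response times,
        attempts and confidence levels can be reviewed after the lecture."""
        with open(path, "a", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=FIELDS)
            if f.tell() == 0:          # write a header only for a new file
                writer.writeheader()
            writer.writerows(rows)

    save_session([{"date": date.today().isoformat(), "question": 3, "handset": "07",
                   "answer": "2", "seconds_taken": 12.4, "attempts": 1,
                   "confidence": "high"}])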

PRS trial results including student feedback

In the ‘Rationale’ section it was suggested that using a PRS could enhance a lecturer’s ability to monitor students’ understanding of lecture material; provide an opportunity for students to engage in active learning; and boost students’ concentration levels at times when they might otherwise flag in lectures.

I can confirm that the PRS has provided a very useful means of checking students’ understanding of material covered, both quickly during the lectures and again afterwards. This has meant that I can more accurately determine what material should be revisited in tutorials, as well as in the lectures. Further, I appreciate that it has offered students an easy method of gauging their own understanding, and comparing their performance against that of their peers. Whilst some of these benefits also arise from the active learning methods reported by Harden et al. (1968) and Dunn (1969), the PRS has additional advantages. Bar-chart summaries of students’ answers are produced and visible to the lecturer and students alike, whilst responses can also be accurately recalled after the lecture has ended, including the responses of individual students when the PRS is used in the named mode.

I have also found that the PRS has had a very significant effect on students’ performance in lectures, stimulating their interest and concentration, as well as their enjoyment of lectures. It has proved to be an excellent method of encouraging active learning, whilst offering a means of varying the stimuli received by students in a lecture environment. Further, they have found the PRS very easy to use.

Arguably, a potential drawback is that less material can be covered in lectures. However, I feel that this is more than compensated for by the greater awareness I have of how well students have understood the material. In addition, it may be argued that using the PRS has enabled me to relate the pace of my presentation of material more closely to the pace of student understanding.

Initially, I was concerned about the reliability of the PRS technology. Yet so far it has proved totally reliable. It should also be noted that the technology is very easy to set up and even easier to dismantle. Set-up takes approximately 5 minutes. Before using the PRS some colleagues had registered concern that students might take the handsets away after lectures. This would add considerably to the cost of the system. However, the fear was unfounded, and the students were very careful with the handsets. They are, after all, useless for any other purpose. A related concern was that students might ‘fiddle’ with the handsets in lectures, but this proved not to be the case. The students very rapidly came to accept the PRS as a standard teaching tool on their course.

At the end of the lecture course, I asked the students (anonymously) to complete a questionnaire about the PRS as well as a standard lecturer feedback questionnaire. The PRS questionnaire contained five statements to which students could respond by selecting answers 1 to 5, 1 indicating strong disagreement and 5 denoting strong agreement. Students were also given the opportunity to add any additional comments at the bottom of the questionnaire.[7]
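
The medians and means reported below can be reproduced from any set of 1–5 responses with a few lines of code; the scores in this snippet are invented for illustration and are not the actual questionnaire returns.

    from statistics import mean, median

    def summarise_item(scores):
        """Summarise one statement scored from 1 (strong disagreement)
        to 5 (strong agreement) by its median and mean."""
        return {"median": median(scores), "mean": round(mean(scores), 2)}

    # Invented scores for illustration only.
    print(summarise_item([5, 4, 5, 3, 4, 5, 4]))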

To the statement ‘The PRS is easy to use’, the median response was 5 and the mean response was 4.96. I fully expected this result and believe that it was helpful that I introduced the students to different features of the technology gradually. Hence, I only explained the high- and low-confidence buttons on the handsets after the students had used the PRS in a couple of lectures. Similarly, I only used the named mode of operation after a number of lectures in which the PRS was used in the anonymous mode.

The statements ‘Using a PRS has increased my enjoyment of lectures’ and ‘Using a PRS has helped my concentration levels in lectures’ both gave rise to encouraging median responses of 4 and mean responses of 4.3. Clearly, not only was I aware that using the PRS improved students’ alertness, but also the vast majority of students recognised that their concentration levels improved when using the technology. Unfortunately, it cannot be deduced to what extent this reflects greater active learning or the changes in stimuli received during lectures. ‘Using a PRS has encouraged me to attend lectures’ produced a median answer of 4 and a mean response of 3.6, with some students pointing out that they would have attended lectures anyway.

The final statement ‘Using a PRS has increased my confidence on this course’ led to a median response of 4 and a mean response of 3.8. A number of students felt it necessary to add qualifying comments to their responses to this statement. These comments all expressed the view that they felt much less confident about using the PRS when it was in the named mode, such that I could check their individual answers by inspecting the results file. Nevertheless, I felt that students were more willing to ask questions in both lectures and follow-up tutorials, possibly when responses to PRS questions had revealed that a number of students found a topic difficult. I have continued to use the PRS since the trial ended, but mindful of the trial feedback I have always used the PRS in its anonymous mode, and will continue to do so unless particularly concerned about monitoring the progress of individual students.

Conclusions

The higher education sector has undergone considerable expansion in recent years and this trend can be expected to continue. Consequently, university teachers regularly need to consider the most effective methods of teaching large numbers of students. PRSs offer an innovative method of maintaining student interest and concentration, enhancing active learning and the level of interaction in a lecture setting, whilst allowing students as well as lecturers an opportunity to monitor the level of student understanding.

Using a PRS has forced me to think carefully about the extent of active learning that is undertaken in economics lectures. Attention has to be given both to developing questions that will test students’ understanding of lecture material, and to incorporating these questions successfully in a lecture context. Further, the lecturer must be willing to revisit material if the PRS results suggest that lecture material has not been well understood by a majority of the students. This can impact upon the material covered not only in lectures, but also in follow-up tutorials.

Although I have been impressed by the PRS system purchased, research into its effective uses must continue. For example, its potential to be used more formally as a tool of assessment is a topic for future research.[8] Although I have found that a PRS may be easily adopted as a teaching method for theoretical and quantitative material in economics as well as other subjects, the value of its use in more discursive and contentious subjects remains to be explored. In such contexts it may instead be used to encourage debate, with students initially asked to select a multiple-choice answer most closely corresponding to their personal views. Once students’ responses have been gleaned, discussion could focus on the range of responses elicited and the reasons for the different responses, and perhaps at the end of the discussion another question could be asked to see if views have changed as a result of the discussion.

The PRS has provided me with a valuable teaching tool that I have continued using since the trial ended, and which I intend to use on a larger number of courses in the future, including quantitative economics courses.

Contact details

Caroline Elliott
Department of Economics
The Management School
Lancaster University
Lancaster LA1 4YX

Tel: (01524) 594225
Fax: (01524) 594244
Email: C.Elliott@lancaster.ac.uk

References

Bligh, D. (1998) What’s the Use of Lectures?, Exeter: Intellect.

Cue, N. (1998) ‘A universal learning tool for classrooms?’, paper presented at the First Quality in Teaching and Learning Conference, December, Hong Kong.

Draper, S. (2001) ‘Electronically enhanced classroom interaction’, http://www.psy.gla.ac.uk/~steve/ilig/handsets.html, 16 September.

Dunn, W. R. (1969) ‘Programmed learning news, feedback devices in university lectures’, New University, vol. 3, no. 4, pp. 21–2.

Hake, R. R. (1998) ‘Interactive-engagement versus traditional methods: a six-thousand-student survey of mechanics test data for introductory physics courses’, American Journal of Physics, vol. 66, no. 1, pp. 64–74.

Harden, R. McG., Wayne, Sir E. and Donald, G. (1968) ‘An audio-visual technique for medical teaching’, Journal of Medical and Biological Illustration, vol. 18, no. 1, pp. 29–32.

Lloyd, D. H. (1968) ‘A concept of improvement of learning response in the taught lesson’, Visual Education, October, pp. 23–5.

McLeish, J. (1968) The Lecture Method, Cambridge Monographs in Teaching Methods, no. 1, Cambridge: Cambridge Institute of Education.

MacManaway, L. A. (1970) ‘Teaching methods in higher education – innovation and research’, Universities Quarterly, vol. 24, no. 3, pp. 321–9.

Seenan, G. (2000) ‘Ask the audience – the way to a better degree’, Guardian, 4 January.

Varitronix plc, http://www.varitronix.com/

Notes

[1] I would like to thank Jim Boyle and the Department of Mechanical Engineering at the University of Strathclyde for taking the time to discuss and demonstrate the PRS technology with me. Also, thanks to Geraint Johnes, an editor and an anonymous referee for providing very helpful comments on this paper. All errors of course remain the responsibility of the author.

[2] I purchased a PRS from Varitronix plc, with the aid of a Lancaster University Teaching Quality Enhancement Fund (TQEF) grant.

[3] Alternatively, students could simply be asked if they understand the material covered so far, to which they can reply ‘1’ or ‘2’, corresponding to ‘yes’ or ‘no’.

[4] However, after the initial settling period students’ concentration levels are likely to be at their peak.

[5] For example, see Bligh (1998).

[6] As such, it can also serve as an attendance record if this information is required.

[7] Further details about the questionnaire results are available from the author on request.

[8] I am also aware of companies using PRSs to measure support for proposals in meetings.
