Peer-instruction unveiled: unlocking the power of Student Response Systems

Introduction

I convene a large first-year core “Introductory Economics” module, and I am responsible for the design and delivery of its Macroeconomics components. Taking a blended learning approach, the module is organised into lectures, small-group seminars, and large-group workshops. From Week 4 of each semester, seminars and workshops alternate every second week, for a total of four sessions each. The learning material overlaps across seminars and workshops, so that the same material is delivered and formatively assessed within different teaching environments. Aside from class size, seminars and workshops differ along several other dimensions: in contrast with seminars, workshops are two-hour events (rather than one hour), in which students walk in to a problem-set they have never seen before (as opposed to the pre-assigned problem-sets used in seminars).

Workshop sessions pose several teaching challenges, as we aim to:

  1. deliver a personalised learning experience even in a large-class environment,
  2. engage students in group-work activities, and
  3. maximise both the number of applications covered and the learning-gains generated within each session.

I believe that we successfully achieved these three fundamental goals by combining the use of Student Response Systems (SRS) with the Peer-instruction pedagogy of Mazur (1997).

Methodology

Each Economics student at UEA is now issued with a TurningTechnologies clicker, which they can retain free of charge until the end of their course of study. Students bring their own device to all teaching events, as SRS are widely used across many core modules. The first hour of each workshop is based entirely on multiple-choice quizzes, whose answers are polled via SRS. The second hour of each workshop follows a more traditional paper-and-pencil problem-solving format, in which students are given time to work through problems, and solutions are discussed at the end of the session.

In this case-study I focus on the SRS-based sessions. These are delivered using PowerPoint slides managed through the TurningPoint software, which allows for interaction with the class via the clicker devices. In all workshops the ‘clicking-session’ follows a precise algorithm, adapted from the practice of Bates and Galloway (2012), with which students quickly become acquainted (a schematic sketch of one question cycle follows the list):

  1. Students receive a sheet containing a set of multiple-choice questions (between four and ten, according to needs and to the difficulty of the material taught). All questions cover material previously taught in lectures.
  2. An opening question is polled, asking students to rate how prepared they feel on the material about to be assessed.
  3. A question is polled, and students are asked to respond autonomously and independently. Each question generally offers four answer choices. When the poll is closed, neither the distribution of answers nor the correct answer is revealed.
  4. Students are then asked to rate, on a 4-level Likert scale, their confidence in having answered the question just polled correctly. When this self-assessment question is polled, the confidence distribution is revealed.
  5. Students are invited to discuss the question and compare their answers with each other. This is the Peer-instruction moment of the session, and students are also free to use their own notes while they work together.
  6. The question from point 3 is polled a second time. When the poll is closed, the distribution of answers and the correct answer are revealed. The facilitator discusses the answers, making sure to clarify any remaining doubts.
  7. A comparison between the proportions of correct responses given in the first and second rounds is shown to the students, who can appreciate the power of the Peer-instruction pedagogy to which they have just been exposed.
  8. Points 3 to 7 are repeated until all the questions in the problem set are covered. A final set of questions asks students: (i) whether they found the use of SRS useful for their learning, (ii) whether they felt that they learnt from their peers, and (iii) whether they attributed their mistakes to the difficulty of the questions, to poor phrasing, or to their own lack of engagement with the material taught.
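For readers who wish to adapt this routine, the following is a minimal sketch of one question cycle (points 3 to 7) in Python. It is purely illustrative: the poll function is a hypothetical placeholder that simulates vote collection from the keyboard, since in our sessions the polling itself is handled by TurningPoint through PowerPoint rather than by a script, and the example question at the end is invented.

    from collections import Counter

    def poll(prompt, options):
        # Hypothetical stand-in for an SRS poll: a real session collects
        # one keypress per clicker; here votes are typed in as a single
        # string such as "AABCD".
        print(prompt)
        for key, text in options.items():
            print(f"  {key}: {text}")
        votes = input("Collected votes: ").strip().upper()
        return Counter(v for v in votes if v in options)

    def question_cycle(question, options, correct):
        # Point 3: first, independent poll; results are not revealed.
        first = poll(question, options)
        # Point 4: confidence self-assessment on a 4-level Likert scale;
        # this distribution is shown to the class.
        confidence = poll("How confident are you in your answer?",
                          {"1": "Not at all", "2": "Slightly",
                           "3": "Fairly", "4": "Very"})
        print("Confidence distribution:", dict(confidence))
        # Point 5: peer discussion takes place away from the screen.
        input("Discuss with your neighbours, then press Enter to re-poll...")
        # Point 6: second poll; now reveal distribution and correct answer.
        second = poll(question, options)
        print("Answer distribution:", dict(second), "| correct answer:", correct)
        # Point 7: compare the proportions of correct answers across rounds.
        p1 = first[correct] / max(sum(first.values()), 1)
        p2 = second[correct] / max(sum(second.values()), 1)
        print(f"Correct responses: {p1:.0%} -> {p2:.0%} (learning-gain {p2 - p1:+.0%})")

    # Invented example question, for illustration only:
    question_cycle("Which of the following shifts the AD curve to the right?",
                   {"A": "A rise in interest rates",
                    "B": "A rise in government spending",
                    "C": "A rise in income taxes",
                    "D": "A fall in the money supply"},
                   correct="B")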

Findings

Through repeated iterations of the methodology described above (over the academic year 2013-14 and the first semester of 2014-15), we collected a wide set of interesting findings:

  • Students are generally able to self-assess correctly. In particular, the number of students who perform above average in each session is positively associated with the number of students who self-report high levels of confidence in having given the right answer. The reverse holds for low-performing students, who report themselves as not confident.
  • Subjective and objective evaluations of student confidence are generally aligned. In particular, self-reported levels of confidence are negatively associated with the degree of entropy in the distribution of answers given to each multiple-choice question over the first round of polling. Thus, if the facilitator observes a widely dispersed distribution of answers for a question, they can comfortably infer that students do not feel confident about the material covered by that question (the sketch after this list shows how this entropy can be computed from raw poll counts).
  • Peer-instruction generates positive learning-gains, measured as the difference between the proportions of correct answers polled in the second and first rounds of the teaching algorithm. Learning-gains can be interpreted as a measure of the success of the Peer-instruction pedagogy.
  • Learning-gains are higher when the proportion of correct responses in the first round is lower. This suggests that Peer-instruction levels the playing field, bringing the class to a more even level of knowledge.
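To make these two measures concrete, the sketch below computes the Shannon entropy of a first-round answer distribution and the learning-gain between the two rounds. The counts are invented for illustration and do not come from our data.

    import math

    def entropy(counts):
        # Shannon entropy (in bits) of an answer distribution;
        # higher values indicate more widely dispersed answers.
        total = sum(counts.values())
        return -sum((c / total) * math.log2(c / total)
                    for c in counts.values() if c > 0)

    def learning_gain(first, second, correct):
        # Difference between the proportions of correct answers
        # in the second and first rounds of polling.
        p1 = first[correct] / sum(first.values())
        p2 = second[correct] / sum(second.values())
        return p2 - p1

    # Invented counts for a four-option question:
    first_round = {"A": 18, "B": 30, "C": 25, "D": 27}   # dispersed answers
    second_round = {"A": 6, "B": 72, "C": 12, "D": 10}   # converging on B
    print(f"First-round entropy: {entropy(first_round):.2f} bits")
    print(f"Learning-gain: {learning_gain(first_round, second_round, 'B'):+.0%}")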

Overall, our findings validate the effectiveness of Peer-instruction pedagogies. Moreover, the methodology developed to deliver workshop sessions facilitates the creation of rich datasets that can be employed to monitor the performance of students, their ability to self-assess, and their ability to work in groups. These features endow the teacher with sharp tools that can be used to adjust teaching in real time and to share specific suggestions for improvement with the class.

Student feedback

Based on investigation through SRS-polled responses, questionnaires, and focus-group sessions, we can comfortably claim that students report consistently high levels of satisfaction with the workshop teaching methodology. A large proportion of students also feel, at the end of each session, that they have benefitted from learning from their peers.

Perhaps surprisingly, when asked about the main cause of their mistakes in answering the multiple-choice questions, the great majority of students always mention their own lack of engagement with the material (‘I should have been more careful, or studied more’). This result is particularly interesting from the perspective of students taking ownership of their own learning and developing self-regulation mechanisms to improve their performance.

Suggestions and closing remarks

  • The introduction of Peer-instruction sessions, such as the ones described in this case-study, requires careful planning. The guides provided by Mazur (1997) and material from the Peer Instruction Network can be extremely useful. Doubts that arise might pertain to the duration of each session and the optimal timing for closing a poll. Personally, I follow my instincts when addressing these issues, continuously taking the temperature of the class. Observing the number of responses obtained, and setting up a timer that counts down the last 30 seconds before a poll closes, are good strategies to maximise the number of responses received (see also Miller et al., 2014). Another important caveat concerns the quality of the questions asked. Multiple-choice formative assessment can easily be trivialised by asking banal questions with immediate answers: students may or may not know the answer, but they will quickly grow bored of such questions. With a little experience, however, one can engineer problem-based questions whose multiple-choice alternatives differ only minimally from one another. These are the kind of questions that spark the liveliest debates and truly unlock the potential of Peer-instruction, maximising the learning opportunities for students (Lancaster, 2014).
  • Could Peer-instruction work without the aid of SRS technology? I would think so, but SRS-aided teaching offers some distinct advantages: (i) it enables me to cover a larger amount of material, as questions and answers are handled efficiently by the technology, and (ii) it engages all my students, leading them to self-assess their skills and take ownership of their learning. The positive act of clicking a button on a keypad gives each student ownership of the response just given. At the same time, seeing that other students also answered a question incorrectly makes students more confident about asking for clarification to improve their future performance.
  • Another important gain related to the use of SRS technology is the opportunity to close the feedback loop. At the end of each workshop session, I post a report on the module’s VLE showing all the statistics generated within the clicking-session, the proportion of correct answers for each question asked, and the feedback received on the session. Students enjoy being able to appreciate the power of Peer-instruction and to see with their own eyes the learning-gains generated through this pedagogy. This is an invaluable asset in establishing a feeling of partnership with the students, ensuring that they are confident not only about the material learnt, but also about their whole learning experience.

Acknowledgements

This case study was developed thanks to the support granted by the Higher Education Academy Teaching Development Grant (GEN957, Oct 2013-Oct 2014) for the project “When Student Confidence Clicks: Self-Efficacy and Learning in HE”. Free resources can be found on the project website (Aricò, 2014). I also gratefully acknowledge the suggestions and support offered by Dr Duncan Watson (UEA), Prof Simon Lancaster (UEA), the participants of the DEE Economics Network Conference 2013, and the participants of the HEA and SRHE Conferences in 2013 and 2014. The invaluable help provided by the student research assistants, Mr Chris Thomson, Mr Jack Kelhear, and Ms Phuong Bui, is also gratefully acknowledged.

References

Aricò, F.R., (2014), “When Student Confidence Clicks: Self-Efficacy and Learning in HE”, Project Webpage: https://sites.google.com/site/fabioarico/hea_tdg.

Bates, S., and Galloway, R., (2012), “The Inverted Classroom in a Large Enrolment Introductory Physics Course: a case study”, Proceedings of the HEA STEM Learning and Teaching Conference.

Lancaster, S., (2014), “Ask Tougher Questions”, Learning Highlights, Autumn, University of East Anglia.

Mazur, E., (1997), “Peer Instruction: A User’s Manual”, Prentice Hall: Upper Saddle River.

Miller, K., Lasry, N., Lukoff, B., Schell, J., and Mazur, E., (2014), “Conceptual Question Response Times in Peer Instruction Classrooms”, Physical Review Special Topics – Physics Education Research, 10, 2, 020113(1-6). https://doi.org/10.1103/PhysRevSTPER.10.020113
