Using Student Response Pads (“Clickers”) in the Principles of Marketing Classroom
Douglas J. Lincoln, Boise State University
While the existing literature documents the potential and realized student learning benefits
accruing from the use of student response pads (aka “clickers”), there is a lack of such
literature addressing their use in the marketing classroom. This paper reports on how
Principles of Marketing students and their instructor (author) reacted to the use of clicker
technology in the class. In general, students feel clickers are easy and fun to use and most like
their ability to provide instant feedback. However, a majority of students do not feel using a
clicker helps them learn course content or improve their grade. Still, most students wanted to
see clickers used in their future courses and the instructor achieved his objectives for adopting the technology.
Clicker Technology and Classroom Use
Murphy and Riddle (2003) describe a number of student response systems for classroom use.
These systems are based on a variety of technologies, ranging from laptop computers to the
radio-frequency, handheld clicker devices used in this study. Using clickers, students
respond to questions or other stimuli given verbally or visually (e.g., PowerPoint, other
projection means or written on the board) by the instructor or even students. One popular use
of clickers has the instructor posing a multiple-choice question to the entire class to check
their level of understanding of a key concept; collecting their responses; and then immediately
projecting a table or graph (e.g. bar chart) displaying how the entire class responded. The
instructor can then use this feedback to decide on the next step: one might be to revisit
material that was not well understood; another might be to ask students to break into smaller
discussion groups and explore why certain (incorrect) response choices were given.
It appears that the majority of instructors using clickers are housed within the hard sciences.
Physics faculty and departments have probably employed this technology longer than any
other academic discipline. While some literature exists that contains best practices
suggestions (Beatty 2004, Draper and Brown 2004, and Duncan, 2005 and 2006), this
literature does not appear to address uses in marketing education. The purpose of this paper is
to describe how clicker technology was used in a Principles of Marketing class and how
students and I reacted to the experience. The goal is to provide others with insight as to what
students like and dislike about clicker use. This is done with the hope that future instructors
adopting the technology are prepared to maximize the potential value clickers offer for
increasing student involvement in their learning processes.
The Clicker Application and Research Setting
I used eInstruction’s CPS® brand of clicker technology in my spring 2007 semester
Principles of Marketing class that met twice a week for 15 weeks and enrolled 83
undergraduate students; most in their third year of a four-year business degree program. My
reasons for adopting the technology were six-fold: (1) gain and keep student attention, (2)
generate and provide instant feedback to students on in-class activities (e.g., group quizzes,
case questions, etc.), (3) use instant feedback to adjust my lecture and other classroom
activities, (4) take attendance, and (5) minimize quiz paper handling/shuffling/scanning, and
(6) begin to learn how to use the technology. Clickers were specifically used by students to
take twelve weekly quizzes at the beginning of the class period on the first class day of each
week. Quizzes were first taken on an individual basis and immediately followed by taking the
same quiz on a group basis. Students also used clickers to respond to other questions/stimuli
during the second day of the week. Approximately 11% of each student's final grade was
determined by responses to these second day clicker questions. Following the advice of
Duncan (2005), students earned half-credit for “wrong” responses to questions or other
stimuli on the second day of class but earned no credit for no response.
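The second-day scoring rule can be sketched as follows (the function name and point values are illustrative, not the actual gradebook implementation):

```python
# Illustrative sketch of the second-day clicker scoring rule: full credit
# for a correct response, half credit for any wrong response (per Duncan
# 2005), and no credit when no response is given.
def clicker_points(response, correct_answer, full_credit=1.0):
    """Return the points earned for one clicker question."""
    if response is None:            # student did not respond at all
        return 0.0
    if response == correct_answer:  # correct response earns full credit
        return full_credit
    return full_credit / 2          # wrong response still earns half credit

print(clicker_points("B", "B"))   # → 1.0
print(clicker_points("A", "B"))   # → 0.5
print(clicker_points(None, "B"))  # → 0.0
```

The half-credit floor rewards participation while still differentiating correct answers, which matches the stated goal of crediting engagement rather than only accuracy.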
A clicker-administered survey with twelve questions was run in anonymous mode during the
last week of class to assess student perceptions of clickers and their reactions to
specific aspects of their use. Survey participation was optional for the 72 students in attendance.
A total of 68 students participated for a response rate of 94.4%. The following section briefly
reviews what other clicker instructors/researchers have reported on their experience with
clickers, including their students' reactions and outcomes. This literature review helps in
offering eight study hypotheses.
Current Knowledge on Clicker Use, Student Reactions, and Outcomes
Many lecturers of large classes note that students at the back of their lecture halls
participate less in in-class activities. Strategies have been developed in which well over 90%
of students attending lectures respond with clickers (Ohio State University 2005). It has been
demonstrated that clickers improve classroom dynamics for both student-lecturer and student-
student interactivity (Draper and Brown, 2004). These findings lead to Hypotheses 1 and 2:
Students will feel that clicker use helps keep their attention and that clickers make their class enjoyable.
Peer Instruction (PI) is a widely-used pedagogy in which lectures are interspersed with short
conceptual questions (Concept Tests) designed to reveal common misunderstandings and to
actively engage students in lecture courses (Crouch and Mazur, 2001). Clickers have the
potential to help both students and instructors identify students’ misconceptions and deal with
them at the time they are recognized (Hatch, Jensen, and Moore, 2005). With this technology,
instructors obtain instant, specific feedback and students get the chance to express themselves
and see what others in the class are thinking (Terreri and Simons, 2005). Thus, clickers can
provide real time feedback leading both parties to adjust their future behaviors. These findings
lead to Hypotheses 3a and 3b: Students will like the instant feedback capability of clickers for
both quiz and non-quiz applications.
No studies were found that assessed how students feel about having to bring clickers to the
classroom on a regular basis; a possible inconvenience or hassle factor. However, given that
clickers were one more thing to bring to the classroom, it was my belief that students might
not appreciate such and might generate negative attitudes toward clicker use. This leads to
Hypothesis 4: Students will feel it is a hassle to bring their clicker to class.
Copas’ (2004) study found nine out of ten students reporting clickers easy to use, and about
three out of four felt clickers were fun to use. This likeability for clickers was also reported in
a University of Minnesota study where 96% of surveyed students said they liked using
clickers (Hatch, Jensen, and Moore, 2005). These findings lead to Hypotheses 5a and 5b:
Students will feel clickers are easy and fun to use.
A Grand Valley State University study reported that 72% of all students felt clickers increased
their understanding of course (geology) material (Wampler, 2006). Williams (2003) found 91%
of a study’s MBA students reporting that the quality of discussion following clicker questions
deepened their learning of course content. These findings lead to Hypothesis 6: Students will
feel that clicker use helps them learn Principles of Marketing.
Crouch and Mazur (2001) demonstrated that student electronic voting systems, when
combined with peer instruction, produce statistically significant improvements in standardized
test scores. Hake (1998) found that students in a course employing clickers scored 25%
better on a post-instruction exam than students who took the same physics course
without clickers. These findings lead to Hypothesis 7: Students will feel that clicker use helps
improve their grades.
Williams (2003) found 98% of studied MBA students saying they would like to see clickers
used in their future classes. These findings lead to Hypothesis 8: Students will like to see
clickers used in more of their future classes.
Student responses to attitude-based survey questions and associated statistics are reported in
Table 1. As shown in this table, a majority of students believe clickers help keep their
attention (65.6% agreeing), make the class enjoyable (63.3% agreeing), are not a hassle to
bring to class (83.6% agreeing), are easy to use (89.6% agreeing), are fun to use (62.7%
agreeing), and want to see more use in future classes (64.1% agreeing). Given survey
response choices of yes, no, and indifferent, a majority of students said they did like how
clickers provide instant feedback on quiz and non-quiz activities (76.5% and 72.3%,
respectively). A majority of students (54.4% disagreeing) do not think clicker use helps them
learn Principles of Marketing or improve their grade (53.0% disagreeing). Questions on these two items
earned the lowest overall mean scores among all eight attitude questions.
Table 1. Student Attitudes towards Classroom Use of Clickers

Hypothesis and Survey Item                          Strongly   Agree   Disagree   Strongly    Mean
                                                    Agree (%)   (%)      (%)    Disagree (%) Score*
H1. Helped keep my attention                          12.5      53.1     31.3       3.1      2.25
H2. Made class enjoyable                              11.8      51.5     22.1      14.7      2.40
H4. Not a hassle to bring to class                    35.8      47.8      6.0      10.4      1.91
H5a. Easy to use                                      44.8      44.8      9.0       1.5      1.67
H5b. Fun to use                                       13.4      49.3     23.9      13.4      2.37
H6. Helped me learn Principles of Marketing            7.4      38.2     38.2      16.2      2.63
H7. Helped improve my grade                            4.5      42.4     39.4      13.6      2.62
H8. Like to see used in more of my future classes     13.4      50.7     17.9      17.9      2.40

*Where Strongly Agree = 1, Agree = 2, Disagree = 3, and Strongly Disagree = 4
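The mean scores in Table 1 are simply the percentage-weighted averages of the four response codes (Strongly Agree = 1 through Strongly Disagree = 4). A minimal sketch of the computation, using the H1 row as input:

```python
# Reproduce a Table 1 mean score as the percentage-weighted average of
# the Likert response codes: Strongly Agree = 1, Agree = 2,
# Disagree = 3, Strongly Disagree = 4.
CODES = [1, 2, 3, 4]

def mean_score(percentages):
    """Weighted mean of the response codes, given the four column percentages."""
    return sum(c * p for c, p in zip(CODES, percentages)) / sum(percentages)

# H1 ("Helped keep my attention") row from Table 1
h1 = [12.5, 53.1, 31.3, 3.1]
print(round(mean_score(h1), 2))  # → 2.25
```

Dividing by the percentage total (rather than by 100) keeps the result correct even when a row's percentages do not sum to exactly 100 due to rounding.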
Hypotheses 1 and 2 (keeping attention and making class enjoyable) are supported. Study
results echo the existing literature on how useful clickers can be at keeping student
attention. Hypotheses 3a and 3b (liking for instant feedback on quiz and non-quiz questions)
are also supported. A majority of students liked clickers for their ability to provide instant
feedback on these elements. However, it is interesting that nearly 18% said “no” to this for
quiz feedback. Perhaps this feeling is held by students who tend not to do well on quizzes, or
instant feedback is considered a waste of time by those who tend to always earn high marks. Hypothesis 4 (the
hassle of bringing clickers to class) was not supported as nearly the entire respondent pool did
not seem to mind this requirement. Perhaps this is due to the clicker’s relatively small size and
students being accustomed to regularly carrying other small electronic devices (e.g., cell
phones, iPods, etc.). Hypotheses 5a and 5b (clickers being easy and fun to use) are supported.
Identical to Copas’ (2004) findings, nine out of ten students say clickers are easy to use.
Nearly two-thirds of students feel clickers are fun to use. This likeability for clickers
echoes that found in the Hatch, Jensen, and Moore (2005) study. Neither Hypothesis 6 (helps
learning of Principles of Marketing) nor Hypothesis 7 (helps improve grades) is
supported. A minority (45.6%) believe clicker use helps them learn Principles of Marketing.
This is nearly 26% lower than found in the Wampler (2006) study. This percentage is also low
in contrast to the 91% reported by Williams (2003). A minority (46.9%) of
students believe clickers help improve their course grade. This seems low given Hake’s
(1998) finding that students attending a course employing clickers scored 25% better on a
post-instruction exam than those who took the same physics course not employing clickers.
Hypothesis 8 (students will like to see clickers used in more of their future classes) is only
partially supported. While nearly two-thirds of the students want to see this occur, nearly 18% strongly
disagreed with this statement. This percentage is substantially higher than the 2%
disagreement level found by Williams (2003). It is possible that students in this class were
more sensitive to the incremental costs associated with using clickers than MBA students who
are often reimbursed for educational expenses. This finding may reflect student attitudes that
clickers did not help them learn course material or improve their grades.
Implications and Recommendations
Principles of Marketing students appear to like using clicker technology in their course and
instructors should not worry about students being inconvenienced by the requirement to bring them to class.
However, if the instructor wishes to increase the students’ perception of their “in use” value,
then he or she should clearly explain and reinforce the benefits that students will receive from
such use as this connection was not perceptually evident in my class. While my six objectives
for employing clicker technology in the class were met, these objectives are most likely not of
any direct value to the student. However, one benefit of clicker use, the ability to provide
instant feedback, benefited both students and me. After taking the group-based quiz,
response frequencies were displayed on the whiteboard. When most groups responded
correctly, no class time was wasted on further review or discussion of the topic. But, the
opposite was true when more than 20% of the groups responded incorrectly. This feedback
capability allows classroom time to be efficiently and effectively utilized. I suspect that low
scores for clickers helping students learn course material could have been driven by my
choice and use of non-quiz questions. Many were simply recall or recognition type questions
(e.g. on textbook cases) and did not fully employ the Peer Instruction capabilities touted by
Crouch and Mazur (2001). Thus, I encourage future users to spend more time carefully
designing and executing classroom activities for which clickers are only tools for student
participation and not the focus of attention. Recognizing the inherent problems with student
self-reports on learning, future studies measuring the impact of clicker use on actual learning
may be better structured using experimental designs more conducive to establishing cause and effect.

References
Beatty, I., 2004. Transforming student learning with classroom communication systems.
Educause Center for Applied Research Bulletin (3), February 3.
Copas, G.M., 2004. Where's my clicker? Bringing remote into the classroom. Usability News.
Crouch, C.H., Mazur, E., 2001. Peer instruction: ten years of experience and results.
American Journal of Physics 69: 970-977.
Draper, S.W., Brown, M.I., 2004. Increasing interactivity in lectures using an electronic
voting system. Journal of Computer Assisted Learning 20: 81-94.
Duncan, D., 2005. Clickers in the classroom: How to enhance science teaching using
classroom response systems. San Francisco: Pearson-Addison Wesley.
Duncan, D., 2006. Clickers: A new teaching aid with exceptional promise. The Astronomy
Education Review 5(1):70-88. [Online] http://aer.noao.edu/cgi-bin/article.pl?id=194.
Hake, R., 1998. Interactive engagement versus traditional methods: A six-thousand-student
survey of mechanics test data for introductory physics courses. American Journal of Physics.
Hatch, J., Jensen, M., Moore, R., 2005. Manna from heaven or “clickers” from
hell? Journal of College Science Teaching (July/August): 36-39.
Murphy, P., Riddle, R., 2003. Interactive learning tools and techniques: Personal response
systems resource guide. Duke University Center for Instructional Technology.
Ohio State University, 2005. Final report: Committee on classroom response systems. March.
Terreri, A., Simons, T., 2005. What are they thinking? Presentations 19(2): 36.
Wampler, P.J., 2006. Clickers in the classroom: Rewards and regrets of using student
response systems in a large-enrollment geology course. Geological Society of America
Abstracts with Programs 38(7): 497.
Williams, J.B., 2003. Learning by remote control: Exploring the use of an audience
response system as a vehicle for content delivery. Proceedings of the 20th Annual Conference
of the Australasian Society for Computers in Learning in Tertiary Education (ASCILITE).