


                        A Paper

                      Presented to

  The Department of Instructional Design and Technology



                  In Partial Fulfillment

           Of the Requirements for the Degree

                    Master of Science



                     Brian Newport

                     December 2004


                Dr. Armand Seguin, Chair

                   Dr. Marc Childress

                    Dr. Harvey Foyle

                                     TABLE OF CONTENTS

Chapter 1: Introduction

Chapter 2: Review of Literature

Chapter 3: Method

Chapter 4: Results

Chapter 5: Conclusions

References

Appendices

                                            CHAPTER 1
                                          INTRODUCTION

        When I first started teaching, I taught in a very small high school where I had from five

to twelve students in a class. However, when I switched schools, the class size increased

significantly. Now it is not uncommon to have 24 students to a class and there have been some

years in which that number approached 30. With this increasing number of students in each

class, the individual student became lost and in some cases withdrawn to the point that he or she

would not say a word unless called upon. Even then, the student was sometimes hesitant to

respond due to fear of being ridiculed by peers, or possibly due to shyness. Thus, during

lectures or classroom discussions, some students would go through the entire class without any

useful monitoring. I was having a much harder time evaluating student comprehension of

lectures in a timely manner. Because of the number of students, papers were sometimes not

graded by the next class period and students’ problems were not being discovered before tests

were administered.

        Last year I attended a national science convention where I learned of a piece of

technology called a student response system in which the instructor could monitor the progress

of an entire class in real time, as well as obtain reports for each individual student. I felt that I

should somehow try to obtain one of these devices in an attempt to give more feedback to

individual students. I wanted to see if the students’ science concepts would improve by having

instant feedback from an instructor. Can this type of technology increase a student’s learning?

Over the summer I was able to persuade The Clicker Guys (a distributor of eInstruction’s

Classroom Performance System®) to allow me to use such a device for a semester. I hoped to

learn for myself the capabilities of this technology.

       This type of technology goes by many names, such as “classroom response system,”

“classroom performance system (CPS),” and “peer response system,” and is sometimes

affectionately called “clickers.” Many companies make and distribute these hand-held

student devices. Some of the companies that manufacture this type of technology include eInstruction

Corporation, Quizdom, TurningPoint, and Options Technologies Interactive (OTI), as well as

many others. Each company offers a variety of hand-held response systems with various features

ranging in price from about $1,500 to $8,000 for a complete system. Sets in this price range

include close to 30 hand-held devices. Price differences are mostly due to the kind of receiver

(wired, infrared, or radio frequency) being used. The price also varies based on the kinds of

responses the students can enter into the keypad. Some keypads only allow multiple-choice,

true-false, or yes-no questions to be answered. Other, more expensive keypads are equipped

with a keyboard for entering numbers or letters. The common factor is that they all are used by

instructors to gather real-time data about the students’ understanding of the concepts being

presented. I chose the CPS system by eInstruction because The Clicker Guys, an

eInstruction system distributor, allowed me to use the system free for a semester so that I could

research this type of product. I would like to thank The Clicker Guys for providing me with this

opportunity. They have been tremendously helpful in setting up the system for me as well as

always being available to answer the numerous questions I had about their system.

                                       CHAPTER 2
                                REVIEW OF THE LITERATURE

       Educational research in schools identifies the practice of monitoring student learning as

an essential part of high-quality education. In environments where there is careful monitoring of

students’ progress, students are more likely to succeed (Cotton, 1988). For the purposes of this

paper, I will use the definition of monitoring to mean “activities pursued by teachers to keep

track of student learning for purposes of making instructional decisions and providing feedback

to students on their progress” (Cotton, 1988).

       Instruction in the classroom can be accomplished in a variety of ways. Presentation is

one of many different parts of the learning process. Until just recently, monitoring students’

progress and giving students productive feedback during lecture or presentation has been a very

difficult, if not impossible, task. In a recent article in Educational Technology Research and

Development (Fitch, 2004, p. 72), research suggests that “interactivity is one of the most

important factors in the design and development of effective computer-based instruction

materials…and that interactive learning heightened student interest and improved higher

cognitive learning.” The article goes on to state that “it can be concluded that there is convincing

evidence that interactivity [and feedback] is a critical part of any form of technology-based

learning” (Fitch, 2004, p. 72). The author of the article believes that interaction may be the most

important factor in computer-based instruction. Traditionally, this interaction was obtained only

through direct questioning or a show of hands in a classroom. Thus, a limited amount of

interaction and feedback was possible, and not every student benefited from this type of

interaction. To help alleviate this problem, new technology has been developed. This type of

technology allows instructors to ask questions and to obtain feedback from the entire class

simultaneously. This new technology is called a classroom response system.

       “A classroom response system is a small network (radio frequency, infrared, or wired) for

an individual classroom” (Wiley, 2004). These devices are made up of a computer, a projection

device, student transmitters, a receiver, and sometimes an instructor unit. The computer runs

software that records each student’s answers and provides feedback to the instructor as it comes

to the receiver. Students get immediate feedback to questions presented by the instructor. The

system works by recognizing the individual ID of each student’s response unit. Instructors can

ask questions that consist of yes/no, true/false, or multiple choice formats. Other uses for these

include polling students or even simulating a game-like situation, much like the popular TV

show Who Wants to Be a Millionaire, as well as other competition-like games.

       Other advantages for the instructor include being able to take attendance quickly, detect

attendance patterns in large classes, obtain baseline knowledge, correct persistent

misconceptions, check for concept mastery, aid in preparing students for new material and

obtain 100% student participation. Questions can be graded on the spot and reports of

individuals viewed instantly. Most systems also integrate with PowerPoint.

Advantages for the students include interactive participation, instant feedback, confidential

student-to-teacher responses, immediate correction of mistakes, and students’ increased

enjoyment of the class (Smartroom, 2004). It was surprising to see that classroom response

systems have been around for almost three decades. However, these devices are finally starting

to become more prominent in schools at all levels.

       In a presentation at a No Child Left Behind leadership summit, Susan Patrick, the U.S.

director of educational technology, stated that technology can play a major role in supporting the

No Child Left Behind Act (a law signed by President Bush in 2002 that requires schools to

educate every child to his or her fullest potential). Ms. Patrick states that one way to accomplish

this is by “equipping teachers with productive tools, empowering teachers, parents, and decision

makers with real-time data, engaging students in their education, and individualizing learning by

personalizing instruction for each student’s unique learning needs” (Patrick, 2004). The

classroom response system promises exactly these capabilities. This project hopes to show that the classroom

response system is a valid piece of technology that can be used to improve student learning.

       Early research from the science department at Rutgers University provides some

interesting data collected from students who used a classroom response system in large lecture

halls in an introductory physics class. A survey about the use of the response devices was given

to the students in the class, and 95% of the 85 students who responded said that they felt they

got more out of lecture by using the CPS. Over 85% of the students also said that it was helpful

to discuss the wrong answers, and they felt that the CPS made them more involved in lecture.

Students also said that they gained confidence in themselves with a correct response and that

they were more likely to respond than if they had to raise their hand (Shapiro, 1997).

       Ball State University also did some preliminary research in classroom response systems,

but the data are very general: 90% of the students liked the system and thought it had

value. However, the nine faculty members who experimented with this technology reported that it

takes a great deal of time to implement correctly. They felt that the response system was useful

and that it improved interaction during lectures (Ober, 1997).

       In an article in The Physics Teacher magazine, Milner-Bolotin suggests ten tips to

follow to achieve better results when using a classroom response system in an introductory

physics classroom. They are as follows:

           1. Think carefully about the main concepts of every part of your lecture and
              create one conceptual multiple-choice question targeting this concept. Try
              to make the question as clear as possible. Do not overload it with

               mathematical calculations, since it will require more time to answer and
               will make more students guess.
           2. While coming up with the distracters, use the ones with wrong units, the
               conceptually impossible, as well as the distracters representing common
               novice misconceptions. Do not underestimate the power of humorous…
           3. Do not allow the students to answer the question right away. After posting
               the question, ask students to discuss their answer with each other.
           4. Only after you see that most of the students came up with their answer,
               start the response system.
           5. After the answers are displayed on a large screen as a histogram, make a
               decision regarding what to do next. If the majority of the students answer
                correctly, go on. However, if the students’ answers are random or split
               between two or three choices, ask for a few student volunteers to explain
               their choices. … Give students the opportunity to figure out the answers
           6. After this discussion, ask the students to input their answers a second time.
           7. Make the response questions part of your assessment. For instance, you
               can announce in advance that 30% of the exam will be composed of the
               questions based on the problems discussed in lecture. This will make the
               students focus more on the lecture and on the group discussions of the
               response questions.
           8. Do not feel disappointed if you spend too much time on one question and
               did not cover all the material. If you actively involved the students in the
               lecture, helping them understand rather than memorize the material, you
               did your job.
           9. Create a learning friendly environment of mutual respect and
               responsibility via getting to know your students.
           10. Always remember the goal of the response system questions is not to
               punish the students who could not answer the questions correctly, but to
               help every one of them to be successful in physics learning via active
               involvement in the lecture. (Milner-Bolotin, 2004, p. 253-254)

       The article also suggests that the instructor must first become acquainted with the

software and its uses before any significant learning can take place. In conjunction with

the above suggestions, Smartroom Learning Solutions advises that instructors using a

response system start their classes by asking three to five questions about the

homework to determine whether students did it. Smartroom also advises that

while lecturing, instructors ask a couple of questions every 10-15 minutes to see if the

students understand the material and to keep the students interested (Smartroom, 2004).

       Another study found at the eInstruction website, obtained from the physics department of

Eindhoven University of Technology, suggests a format of asking four different types of

questions. The first is exploration, which involves gathering the opinions of students. For

example, the instructor would ask “Have you understood my arguments regarding this

equation?” The second type is a form of verification that allows the instructor to see the level of

student understanding of a concept. This includes a question like, "Does this apply to high

temperatures?” A third form of questioning aims at the student’s ability to apply something he

or she learned to a new situation. This involves a multiple-choice question for which the student

has to decide on the correct equation to use and hence find a solution. The fourth type of

question such as “Are you ready for me to continue my lecture?” helps the instructor to know

when he or she can move on to a different topic (Poulis, Massen, Robens, & Gilbert, 2001).

       Accounts from others who have used these devices include one from Dr. Rick Groseberg,

Professor of Evolutionary Biology & Ecology at the University of Colorado. He once presented

at a conference attended by 400 distinguished professors. He wanted to give a “level of

interactivity to the audience.” He had heard of these kinds of systems and was able to borrow

one for use in this lecture. It was an immediate success. The audience loved it and gave “a roar

of applause and whistles as the presentation came to a close.” Several professors in the audience

wanted to know how they could get one of these systems. Dr. Groseberg stated that “This was

such a great way to be engaging without being patronizing.” Even though a Hawthorne-like

effect (the test group performing well because it receives special attention) may have been

present, the response of the audience shows how valuable these systems can be if they are used

correctly (Conference Success with PRS, 2004).

         A professor of Chemistry at the University of Cincinnati thought the use of a personal

response system would be a great way to interact with his class. His concern was that he never

knew if his lectures were effective. If he asked the students for their opinion, most would not

even give a response. The response system gave students a level of anonymity from their peers,

and because the professor could tell who had responded, every student had to participate. In

presenting the material to the students, he would ask questions that the students would have to

respond to with their hand held devices. The immediate results were shown and the rest of the

lecture could be modified if needed to help students understand the concepts better. The

personal response system took the guessing (of how students were grasping the material) out of

his class. The comment by the professor was, “I believe the personal response system is an

excellent classroom tool which gives professors feedback as to how effectively students are

learning the classroom materials. I would use it in every class that I teach.” This helps verify

the claims that shy students can feel free to participate without feeling embarrassed (Mack,


         One case study was found outside of the traditional school atmosphere in which IBM

studied the use of a classroom response system vs. the use of traditional lecture in its training of

managers. Several instructional sessions of both types were studied to try to find out if the

classroom response system would be a valuable asset to them. The initial trials in a classroom in

which the student response system was tried as a pilot “did stimulate student interest beyond just

the Hawthorne effect, but it did not make the dramatic differences in the classroom environment

as was predicted. … This was attributed to the fact that the instructors were unfamiliar with

the system, the system was used too infrequently, and the questions posed to the students were

almost all multiple choice” (Horowitz, 1987).

       Based on the results of this pilot classroom, corrections were made according to the

research tips addressed earlier in this paper. Instructors learned to ask questions every 15-20

minutes, used a variety of questions, became better acquainted with the software, and obtained

very positive results. Test scores in the class in which the CPS was used increased by up to 27%.

A survey rated from 1 to 7 (7 being a strong vote for the new system) yielded an average rating

of 6.6, showing a strong desire to continue using the CPS. The findings go on to state that the

response system was shown to improve the learning process, but that this concept should also be

explored further (Horowitz, 1987).

       Lately, many more schools and universities such as Rutgers, North Dakota State,

University of California, Berkeley, Dartmouth, and Harvard are using these hand-held devices in

their classrooms (Russell, 2003). Beginning in the spring semester of 2005 at Purdue University

in Indiana, students will be able to buy one response keypad for about 12 dollars. They will be

able to use the keypad in any of the 315 Purdue classrooms that use this type of technology.

According to an article at the eInstruction website, “Purdue has become the first university in the

country to implement a system-wide license for using audience response pads in the classroom.”

Purdue did this “to keep classroom technology convenient, affordable and reliable for students

and faculty” (Russell, 2003).

       After considerable searching, very little hard data on these systems could be found. Most

studies suggest an improvement in student performance of some kind, but not all are statistically

significant. Also, all agree that larger long term studies should be undertaken to truly see the

overall effects of these hand held devices. Each study that included questions regarding

participants’ views of the system indicated that students thought their classroom experience was

more enjoyable because of the response systems. These students seemed to pay more attention

during lectures if questions were asked at intervals not exceeding 15-20 minutes. Every

indication showed that instructors who use the system have to put a great deal of upfront time

into learning the system and writing questions for it; however, they all felt that the real-time

monitoring and feedback made them better presenters.

       There are a variety of reports that the instructor can generate from the information

collected from the clickers in class. Some of these reports are very useful to the instructor. The

Instructor Summary can be easily converted to a spreadsheet, and the names can be omitted so

that students’ scores can be quickly displayed by student ID number. Another important

report for the instructor is the Item Analysis. This report shows the percentage of students who

selected each choice and identifies the correct answer. This type of report quickly shows

the instructor the questions that need to be addressed again for better understanding. A partial

listing of available reports and the information each includes is given below.

       Various types of reports that can be obtained from the CPS

Instructor Summary       A report that shows each student’s name, student ID, number of
                         answers correct and attempted, as well as the percent correct.

Study Guide              A report generated for each selected student that includes the
                         student’s name, ID, and a list of all of the questions asked,
                         with the correct answer displayed as well as the answer that the
                         student selected. In this manner, each student can receive an
                         individualized study guide. The disadvantage is the large
                         amount of paper used.

Study Guide class        This report is much shorter, but less detailed. It includes each
summary                  student’s name, ID, and pad number, and only the numbers of the
                         questions that the student missed, with the correct answer and
                         the answer that he or she gave.

Question report          Includes the question asked, lists the choices with the correct
                         answer, and also lists each student with the answers that he or
                         she gave, for the instructor’s quick reference.

Response report          Gives only the question, the choice of answers, the correct
                         answer, and the percentage of students who chose each answer.

Item Analysis            Shows only the number of each question with the percentages
                         that the class picked, and identifies the correct answer.

Item analysis with       Same as above, only with the standard identified at the end.
standards

Opinion survey           Shows only the question number and the total number of students
                         choosing each item, with the average of the totals in the last
                         column. Very handy for a quick summary of the totals in a
                         survey.

Post report              Gives the instructor an easy method of printing results.
                         Information is transferred into a spreadsheet with a choice of
                         student name, ID, pad number, and student full name. This can
                         be easily printed and posted for viewing results if names are
                         left off the report.
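The posting workflow described above — export the Instructor Summary to a spreadsheet with names omitted so scores can be displayed by student ID — might look like the following sketch. The field names here are illustrative, not eInstruction’s actual export format.

```python
import csv
import io

def anonymized_summary(rows):
    """Build a spreadsheet-ready CSV of scores keyed by student ID,
    with student names left off so results can be posted publicly.

    rows: dicts with (hypothetical) keys "name", "id", "correct", "attempted".
    """
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["student_id", "correct", "attempted", "percent"])
    for r in rows:
        pct = round(100 * r["correct"] / r["attempted"], 1)
        # The name field is deliberately never written out.
        writer.writerow([r["id"], r["correct"], r["attempted"], pct])
    return buf.getvalue()
```

A row like `{"name": "Jane Doe", "id": "1234", "correct": 8, "attempted": 10}` would appear in the output only as ID 1234 with 80.0 percent correct.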

   In an attempt to get the most dependable results possible in this case study, I have decided to

use a popular physics assessment called the Force Concept Inventory (FCI), designed to test

students’ understanding of the most basic concepts in Newtonian physics. The test contains 30

multiple-choice conceptual questions that require no numerical computation. The authors of the

test have designed the multiple-choice answers to reflect the common misconceptions students

have about concepts dealing with forces in physics. The test is designed to be given as a pre-test

and post-test. “Various versions of the multiple-choice test were administered to more than 1000

college students and the validity and reliability were established in different ways” (Savinainen

& Scott, 2002). The FCI can be used for three main purposes: as a diagnostic tool, for

evaluating instruction, and as a placement exam for future physics courses, although not for an

introductory course (Hestenes, Wells, & Swackhamer, 1992). “The FCI was first developed in

North America and is used as a diagnostic assessment tool at every level of introductory physics

instruction, from high school to university.” It has been determined by the authors of the FCI

that “a score below 60% means that a student has not made the transition to thinking within the

Newtonian paradigm, while a score of 85% means that the student is a Newtonian thinker”

(Savinainen & Scott, 2002).

       The authors of the FCI test suggest the following formula to measure the gains of

students from the pretest to the posttest FCI results.

                       Gain = (posttest % − pretest %) / (100 − pretest %)

This is then averaged for the whole class so that a single gain is reported for any given course

(Pearce & Roux, 2001). In an extensive study involving 62 introductory physics courses with

over 6000 students participating, FCI gains were recorded. These courses were broken into two

different delivery modes: 14 traditional courses (in which lecture was the major mode of

delivery), and 48 courses involving some type of interactive engagement. The average FCI gain

for the traditional courses was 0.23 +/- 0.04 (std dev) and for the interactive-engagement courses,

the average FCI gain was 0.48 +/- 0.14 (std dev) (Pearce & Roux, 2001).
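The gain formula and its class-wide average are straightforward to compute. The sketch below is a minimal Python rendering of the calculation described above; note that the formula is undefined for a student with a perfect pretest score, a case the paper’s sources do not address.

```python
def normalized_gain(pre_pct, post_pct):
    """Gain = (posttest % - pretest %) / (100 - pretest %): the fraction of
    the possible improvement that the student actually achieved."""
    return (post_pct - pre_pct) / (100 - pre_pct)

def class_gain(score_pairs):
    """Average the individual gains so that a single gain describes the course,
    as suggested by the FCI authors (Pearce & Roux, 2001)."""
    gains = [normalized_gain(pre, post) for pre, post in score_pairs]
    return sum(gains) / len(gains)
```

For example, a student who moves from 40% to 70% has closed half of the remaining gap, for a gain of 0.5, regardless of how high the starting score was.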

                                            CHAPTER 3
                                             METHOD

       This pilot study of a classroom response system (CPS) was conducted in two introductory

Physics classes consisting of 18 and 24 students. The class of 24 students consisted of seven

girls and 17 boys. The class of 18 consisted of four female students and 14 male students. It

should be noted that in the class of 24, there were two foreign exchange students from Mexico;

both were female and had taken a one-semester introductory Physics class in Mexico three

years prior. The rest of the students could be considered White, middle- to upper-class students

who had a background of introductory physical science (which includes one semester of very

basic Physics) in their freshman year. The math backgrounds of the students ranged from

Trigonometry to Calculus. However, mathematical calculations were limited in this study and

were not a part of the FCI pretest/posttest used. The career interests of these students also ranged

greatly, from engineering to English to art and theater.

       The students were all high school seniors in a Midwest private Catholic high school.

Students meet on a block schedule in which classes are held for 90 minutes every other day. The

classroom was a large science room designed for Physics, with movable student tables at which

students sit two to a table. The room has media capabilities to support the CPS system: an

LCD projector, a computer, and a pull-down screen at the front of the classroom.

       The first nine weeks of the school year served as a trial period in which the students in

the instructor’s freshman Physical Science classes were given an opportunity to use the response

system so that the instructor could get used to using the software and so that the bugs could be

worked out of the system without any significant data being collected for those classes. This was

a direct result of the research, which suggested that the instructor become accustomed to the

software before any significant data were collected.

       During the start of the second nine weeks, students in the two introductory Physics

classes were given the FCI as a pretest just before the topics of Newtonian dynamics were

studied. Students were introduced to the clickers several weeks earlier, but only used them

briefly to familiarize themselves with the system. The two Physics classes were split so that one

class used the response system and one class did not. This was decided, in part, by the overall

class averages of each Physics class. For the first nine weeks, the class of 24 students had an

overall class average of 85.5%, 1.5% lower than the class of 18 students, which averaged 87%.

Perhaps this lower percentage was due to the larger class having fewer opportunities for

feedback and questioning. Neither class used the clickers during the first nine-week period. The

class with the lower average (24 students) was assigned the classroom response system to use in the study of

Newtonian dynamics. The smaller, higher-average class served as the control group and

did not get to use the devices. It was thought that the larger class with the lower average could

benefit more from the CPS system because of the more frequent feedback it would provide.

       The group of students that used the CPS system was assigned keypad numbers based on

the alphabetical order of their last name. After four and a half weeks of using the response

system (the minimum time needed to cover the concepts on the FCI), the FCI was given for a

second time to see which of the two groups had the larger increase. Note that the response system

was used to take the FCI both times for the larger class, and a Scantron™ format was used in the

smaller control class.

       There was an effort on the part of the instructor to eliminate as many variables between

the control group and the test group as possible with the exception of the use of the CPS system.

The same questions were asked in both classes, but the control group answered the questions

with the traditional paper and pencil and the results were graded and given back the next class

period, or students graded the assignment immediately after all the students had finished

answering the questions. There was an effort to ask at least 10 multiple-choice, true/false or

yes/no questions each day, with minimal mathematical calculations required. The distracters that

were used in these questions tried to focus on misconceptions, wrong units, or the conceptually

impossible. These were determined by using several years of questions from an “experienced” instructor.


        In the test group, the questions were projected on the large screen in front of the room

with the possible answers to the questions. There was an effort to use pause time so that students

could have time to read and formulate answers before they were allowed to answer with

their keypads. Then a decision to move into a new concept or discuss the concept more was

determined from the results shown on screen. If a large number of students missed the question,

(not an exact number, but around four or less students) the instructor moved on to the next


                                           CHAPTER 4

       The complete results of the pre-test and post-test FCI scores along with the calculated

FCI gains are included in Appendix A. For the class that used the clickers, a very respectable

FCI gain of 0.44 +/- 0.17 (std dev) was obtained. This is very similar to the 0.48 +/- 0.14 (std

dev) results found in the research for an interactive-engagement class. However, the control

group fell just shy of these results too. The control class had an FCI gain of 0.38 +/- 0.21 (std

dev). The FCI Gain chart shows that the class using the response devices consistently had results higher than the control group, but a two-tailed t test showed that the difference was not statistically significant, t(39) = 1.02, p > .05. Because of the short time that the CPS was used in the Physics classroom and the students’ new, exciting lecture

format, there could have been some Hawthorne effect (meaning some short-term benefits associated with the new technology and the special attention that may have been given to the test group). Also, a possible explanation for the control group’s relatively large gain may be a lesser-known effect called the John Henry effect. This is where the control group has a

desire to compete against the test group. However, as instructors and students continue to use

these types of response systems, these possible effects should decrease.
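The reported t statistic can be checked from the summary statistics alone. A short sketch of the standard two-sample pooled-variance t test (the function name is illustrative); with the means, standard deviations, and group sizes from this study it gives t(39) ≈ 1.01, matching the reported t(39) = 1.02 to within rounding of the summary values:

```python
import math

def pooled_t(m1, s1, n1, m2, s2, n2):
    """Two-sample t statistic with pooled variance (equal variances assumed)."""
    df = n1 + n2 - 2
    # pooled variance: weighted average of the two sample variances
    sp2 = ((n1 - 1) * s1 ** 2 + (n2 - 1) * s2 ** 2) / df
    t = (m1 - m2) / math.sqrt(sp2 * (1 / n1 + 1 / n2))
    return t, df

# clicker group: mean gain 0.44, sd 0.17, n = 23
# control group: mean gain 0.38, sd 0.21, n = 18
t, df = pooled_t(0.44, 0.17, 23, 0.38, 0.21, 18)
print(f"t({df}) = {t:.2f}")
```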

       It is also possible that the instructor had a tendency “to teach to the test”. However, the

instructor did not purposely do anything to allow this to happen. The only change in the method

of teaching that resulted was less emphasis on mathematical calculations in solving problems and

more emphasis on the conceptual aspects of the physics concepts. After the study was

completed, the instructor was able to see the specific weaknesses in both groups’ understanding of the introductory physics concepts.

      Data table for the FCI gains of both the clicker group and the control group

                                                                        FCI         FCI
           FCI pre-        FCI post-     FCI Gain       FCI pre-        post-       Gain
           test            test          clicker        test no         test no     control
           clickers        clickers      group          clickers        clickers    group
                      13            26           0.77              10         25       0.75
                       4            22           0.69              16         25       0.64
                       9            23           0.67               5         20       0.60
                       6            21           0.63              11         22       0.57
                      14            23           0.57              10         21       0.55
                       9            21           0.57               9         20       0.53
                       5            19           0.55              11         20       0.48
                      12            22           0.55               9         18       0.43
                      11            21           0.52              11         18       0.37
                       8            19           0.49              10         17       0.36
                       8            15           0.47              13         19       0.35
                       4            16           0.46              10         16       0.30
                       5            16           0.43               6         13       0.29
                       4            11           0.37              15         18       0.20
                       5            14           0.36               8         12       0.18
                      11            17           0.32               9         12       0.14
                       3            11           0.30               6           8      0.09
                       6            13           0.29               9           8     -0.04
                       9            15           0.29
                       6            12           0.25
                       6            12           0.25
                      12            16           0.22
                       8            12           0.18

Average
Score             7.7           17.26           0.44           9.9         17.33      0.38
SDV              3.22            4.47           0.17          2.85          5.03      0.21
Median              8              16           0.46            10            18      0.37
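The gain values in the table are consistent with the Hake normalized gain, g = (post - pre) / (30 - pre), assuming the standard 30-question FCI; a minimal sketch (the helper function is illustrative) applied to a few rows of the clicker group:

```python
from statistics import mean, stdev

FCI_MAX = 30  # the Force Concept Inventory contains 30 questions

def normalized_gain(pre: int, post: int) -> float:
    """Hake normalized gain: fraction of the possible improvement achieved."""
    return (post - pre) / (FCI_MAX - pre)

# a few (pre, post) rows of the clicker group from the table above
scores = [(4, 22), (9, 23), (6, 21)]
gains = [normalized_gain(pre, post) for pre, post in scores]
print([round(g, 2) for g in gains])
print(round(mean(gains), 2), round(stdev(gains), 2))
```

The group averages and standard deviations in the table follow by applying the same computation to every row.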
[Figure: FCI gain for each student, comparing the FCI Gain clicker group and the FCI Gain control group]

   Comments that the students gave on the clicker survey were very positive. Out of the 20

students that took the survey, 19 felt they gained more from the lectures and felt more involved

in the classroom because of the use of these response devices. Thirteen of the 20 students

reported that the clickers made them pay more attention in class. Sixteen of the 20 students said

that it was helpful seeing the histograms after each question was asked. Only one student gave a

negative response to the use of clickers in the classroom, but this student did not give any comment as to why he or she did not like using the clickers. This same student was also the only one who did not feel that it was helpful discussing the wrong answers, nor that the clickers were helpful in learning the material. The students, in general, did not have a strong opinion for or against using the clickers for the whole lecture. All of the students surveyed agreed that they liked the anonymous responses of the clickers. Most of the students felt that the

questions that were asked during the time that they used the clickers were neither too difficult

nor too simple.

                                          CHAPTER 5

       In conclusion, the pre-post FCI test did appear to favor the CPS group. However, this

increase was not statistically significant. It appeared to the instructor that the CPS made the students more excited about the learning process by allowing them to participate in the lectures, letting them see misconceptions immediately, and giving every student a chance to take part in class. The survey of the CPS verified that students enjoyed using the system and felt that they benefited more in class by using it. Allowing all students to participate in class discussions and

allowing them to see the immediate feedback seems to be the major benefit for using the CPS.

Because of the anonymity of the CPS, even the shy or hesitant students can interact without

feeling intimidated. Because this was only one pilot study used in relatively small classes for a

very short period of time, further study needs to be performed before any conclusive findings can

be stated. With increased usage, instructors will find better ways to benefit students by asking

better questions, adding questions, or creating other innovative uses. The fact that the

instructor’s questions can be saved, analyzed and improved on from year to year allows for the

possibility of increased student performance.

For those classrooms that already have a computer and an LCD projector, the CPS is a relatively inexpensive way to ensure that all students participate in class, to give active feedback in real time, to increase the instructor’s knowledge of classroom performance, to get papers graded and returned to students quickly, and to provide a fun review for tests. This technology shows

promise and should be a serious consideration for future purchases in all classrooms.


                                          REFERENCES

Brocklebank, J. (2000, January 5). Who wants to be a millionaire format for engineering
       students. Daily Mail. Retrieved September 17, 2004 from

Burnstein, R. & Lederman, L. M. (2001). Using wireless keypads in lecture classes. The Physics

       Teacher, 39, 8-11.

Duke University, Arts & Sciences and Trinity College. (2003). Classrooms personal response

       systems. Retrieved September 3, 2004 from

Conference Success with PRS. Evolution 2004 Conference, Ft. Collins, Colorado. Retrieved

       September 10, 2004 from

Cotton, K. (1988). Teaching thinking skills. SIRS, 11. Retrieved September 13, 2004 from

Cox, A. & Junkin, W. (2002, January). Enhanced student learning in the introductory physics
       laboratory. Physics Education, 37(1). Retrieved September 17, 2004 from

Fitch, J. (2004). Student feedback in the college classroom: a technology solution. Educational
       Technology, Research and Development, 52(1), 71-81.

Hake, R. (1998). Interactive engagement vs. traditional methods: a six-thousand-student survey
       of mechanics test data for introductory physics courses. Department of Physics, Indiana
       University. Retrieved September 19, 2004 from

Horowitz, H. M. (1987). Student response systems: interactivity in a classroom
       environment. IBM Corporate Education Center. Retrieved September 6, 2004 from

Mack, J. (2003). The EduCue personal response system: true class participation.
       Faculty Technology Resource Center. Retrieved August 28, 2004 from

Milner-Bolotin, M. (2004). Tips for using a peer response system in a large introductory
       physics class. The Physics Teacher, 42, 253-254.

Ober, D. (1997). A student response system in an electronic classroom: technology aids for large

       classroom instruction. The Compleat Learner, 2(4). Retrieved October 15, 2004 from

Poulis, J., Massen, C., Robens, E., & Gilbert, M. (2001, July 11). Physics lecturing with
       audience-paced feedback. Retrieved October 19, 2004 from


Russell, J. (2003, September 13) On campuses, handhelds replacing raised hands. The Boston

       Globe. Retrieved September 17, 2004 from

Savinainen, A., & Scott, P. (2002). The force concept inventory: a tool for monitoring student

       learning. Physics Education, 37, 45-52. Retrieved October 23, 2004, from

Shapiro, J. (1997, May 16). Pedagogical uses of the srs. Retrieved September 5, 2004 from

Smartroom Learning Solutions. (2004). How to incorporate Beyond Question into your
       curriculum. Retrieved September 9, 2004 from

U.S. Department of Education Secretary’s No Child Left Behind Leadership Summits. (2004,
       March 10). Empowering accountability and assessment: the road ahead. PowerPoint
       presentation by Susan Patrick, U.S. Department of Education Educational Technology
       Director. Retrieved October 17, 2004 from

Wiley Higher Education. (2004). Classroom response systems faq. Retrieved August 28, 2004


Williams, R., & Stockdale, S. (2004, Winter). Classroom motivation strategies for prospective
       teachers. The Teacher Educator, 49(3), 212-221.

                                               Appendix A

                 A Listing of Pros and Cons of the Classroom Response System

                   Pros                                               Cons
Each student must participate                    Students may be able to see the buttons that
                                                 other students are pushing
Roll can easily be taken in large classes        If questions are discussed while the quiz is
                                                 taken, the quiz takes much longer to complete

Instructor can easily see answers of entire      With this particular system, data was not
class                                            easily transferred over the network

Results can instantly be posted for tests or     Hardware/software trouble could cause
quizzes                                          problems

Students can answer anonymously                  The instructor is forced to spend a large
                                                 amount of time thinking up good questions
                                                 each time the clickers are used to be sure
                                                 that results are effective.

Students get instant feedback as to the          Higher-level questions cannot be given
correct answer

Instructor can re-teach or review concepts       It takes some time for the instructor to
immediately when necessary                       effectively learn how to use the system

Questions can be saved, revised and used         An additional cost for the system could be
easily year after year                           great, especially if an LCD projector and/or
                                                 computer has not been previously purchased

                                                       Appendix B
                    Questionnaire (and students’ responses) for the Classroom Performance System
                                         (Clickers) in the Physics Classroom

     Strongly         Moderately             No            Moderately          Strongly
      Agree             Agree              Opinion          Disagree           Disagree

1.   The use of the clickers helps me get more out of the lectures?
         5                  4                3                  2                   1                 avg. – 4.3

2.   I feel more involved in the classroom with the use of the clickers?
          5                  4                 3                  2                 1                 avg. – 4.6

3.   The use of the clickers helps me pay attention more in lectures?
         5                  4                 3                 2                   1                 avg. – 4.1

4.   It is helpful seeing the histogram of the class responses displayed on the screen after each question.
           5                  4                  3                 2                 1                 avg. – 4.2

5.   I gain confidence when I correctly respond to the questions that employ the clickers.
          5               4                   3                 2                  1                  avg. – 4.2

6.   I find it helpful when Mr. Newport discusses the wrong answers as well as the right answers to the questions
     with the clickers?
           5                4                3                2                 1                  avg. – 4.6

7.   The whole lecture should be given as a series of clicker questions and discussions.
         5                4                  3                  2                 1                   avg. – 3.2

8.   When using the clickers, the questions that are asked are too simple and a waste of time.
        5                  4                  3                 2                 1                   avg. – 2.2

9.   When using the clickers, the questions that are asked are too difficult and a waste of time.
        5                  4                  3                  2                   1                avg. – 2.3

10. I would prefer that Mr. Newport not use the clickers in class, they are too distracting.
        5                  4                 3                  2                   1                 avg. – 1.6

11. I like the idea that the clickers are anonymous.
          5                  4                 3                  2                 1                 avg. – 4.4

12. I feel that the strengths of the clickers are:
The students felt that the strengths of the clickers were as follows:
    They force everyone to participate not just the ones that were called on
    The results are instant
    I seem to learn and remember more
    More questions can be asked in a shorter time
    It helps to know if you are understanding the concepts correctly
    It makes the class more interesting
    It gives the class more immediate feedback
    It helps us find our own weakness
    Lets the class interact with the discussions

13. I feel that the weaknesses of the clickers are:

The students felt that the weakness of the clickers were as follows:
    That is all we do, it’s the same clicker/discussion everyday
    Sometimes it is hard to get them to register (the response)
    Limits the types of questions (used in lecture)
    People that do not know the answers can guess or watch other clickers (for the answer)
    Distracting
    Some questions are confusing
    When using them for tests (or quizzes) I can punch in the wrong answer because I was
       not sure what question I was on
    I have trouble reading the questions on the overhead (screen)
    If I do not write something down, it is harder to remember
    I can’t think of any weakness
    Students could mess up on an answer
    Sometimes the time limit is too short
    They waste too much time

14. Comments/Suggestions:

Thank you for completing this survey for me. I appreciate your honesty.

                                          Appendix C
                                Student Permission Form
Dear Parents of Students of Mr. Brian Newport’s Science Classes,

As part of their science class, your son/daughter has been invited to participate in a research
project called The Effects of a Classroom Response System in the Science Classroom conducted
by Brian Newport for his Master’s Project at Emporia State University. The goal of this project
is to improve the ability of all students to understand science concepts by becoming more active
participants in lecture and gathering immediate student feedback to questions presented by the
instructor. This is accomplished by each student using a handheld device to submit answers that are posted anonymously in a classroom chart. Students and the instructor can then see the correct answers and immediately know how many students in the class understood the concepts and how many did not. Thus the instructor can assess the entire class at one time.

Your permission for your son/daughter to participate in this project means that during the
duration of this semester, data from your son or daughter’s work will be collected for research.
Participation is voluntary and involves no unusual risks. Your son/daughter can refuse to
participate or withdraw from the project at any time with no negative consequences to his or her
grades. All students’ data will be kept confidential; all personal data will be removed from all
written documentation and assigned an identification number. Even if you agree to participate
by signing this form, you can change your mind at any time.

If you have any questions about this study, you may contact Brian Newport at 634 – 0315 ext
227 or by e-mail at

My signature below indicates that I give permission for my son/daughter,
(If the student is age 18 or above, please indicate below.)

__________________________________ to participate in the study.

Parent Signature                                                   Date

Student signature if over the age of 18                            Date

                   Appendix D

Specifications of the CPS System From eInstruction
