                     An Analysis of the New York State June 2003 Regents Physics Exam

                                  Prepared for STANYS and the NYS AAPT




Support for this manuscript was provided by the Department of Earth Sciences and Science Education, the Department
     of Elementary Education and Reading, and the Department of Physics of SUNY Buffalo State College.




                            Findings and any errors reflect the work of the authors.




                                               August 17, 2003



                           Joseph L. Zawicki, Buffalo State College, STANYS DAL
                                   Kathleen Falconer, Buffalo State College
                                     Dan MacIsaac, Buffalo State College
                                        Michael Jabot, SUNY Fredonia
                                      Dave Henry, Buffalo State College
                                       Rita Fischer, Elba Central School


                                                      Abstract



                                                      Introduction
          A host of national and international studies has indicated the need for reform in physics education (Schmidt,
McKnight & Raizen, 1997; O’Sullivan & Reese, 1997). Conceptual instruments have been developed that allow the
further elucidation of student comprehension (Halloun, Hake, Mosca, & Hestenes, 1995; Engelhardt & Beichner, in
press; Enochs & Riggs, 1990). Standards, at both the national and state level, have been developed for science
education; these tools provide direction and guidance for the development of local programs (National Research
Council, 1991; American Association for the Advancement of Science, 1989; NYSED, 2001). National legislation,
such as the No Child Left Behind Act, has provided further impetus for the development of statewide assessments
that measure student achievement, particularly at the commencement level (Public Law 107-110, 2001).
          The process of change is often difficult; unanticipated problems often arise during the implementation of
new programs and philosophies. The shift to standards-based assessment in New York State has not been an
exception to this rule. Of all the New York State Regents high school science content areas, the transition in physics
has been the most problematic (Sullivan, 2003; Sullivan, 2002; Zawicki & Jabot, 2002; Winerip, 2003).
          Newly developed content exams in physics have generated considerable discussion across the state
(Zawicki and Jabot, 2002; Sullivan, 2003; Sullivan, 2002; Dillon, 2003). The passing rate on the statewide physics
assessment dropped from approximately 83% in June 2001 (the last “old,” norm-referenced, syllabus exam) to 61%
in June 2002 (the first “new,” standards-based core exam). Currently, the physics exam has the lowest passing rate
of all New York State Regents science content exams (NYSED, 2003b). While the statewide scores are not yet
available for the June 2003 exam, preliminary evidence on a statewide physics teacher listserv indicates that the
passing rate held steady at around 60% (Johnson, 2003).
          The official response to these concerns was to propose that physics was an advanced science, taken only by
elite students (NYSED, 2002). This suggestion was offered by NYSED despite the fact that core curricula were
intended to be accessible to all students in grades 9-12 (NYSED, 1997).
          These results spurred a spate of commentary on the statewide physics listserv hosted by SUNY Oneonta
following the June administration of both the 2002 and 2003 exams (SUNY Oneonta, 2003a). Physics teachers
voiced their concerns about the 2002 exam during sessions at statewide and regional conferences held by the
Science Teachers Association of New York State and by the New York State Section of the American Association
of Physics Teachers (Zawicki, 2002). While teachers expressed a number of concerns on the SUNY Oneonta
listserv, suggestions that students were taking much longer to complete the exam, or that they could not finish the
exam in the allotted three-hour time period, were particularly striking. Overall, teachers’ comments support the idea
that the June 2003 exam was both long and scored quite harshly (SUNY Oneonta, 2003b). Recent newspaper
articles and comments on the listserv express concerns that the number of students enrolling in Regents Physics
will sharply decline in light of these trends. At least one large district has opted to link with a community college
for its introductory physics course (Palladium Times, 2003).
          The statistical mechanisms that are used by the New York State Education Department (NYSED) are
designed to produce exams that have a consistent level of difficulty (NYSED, 1999). Item Response Theory (IRT)
is used to analyze the pretest data that is the basis for establishing the overall difficulty of the exam (NYSED, 1999).
The exams are developed in-house, written largely by teachers working with consultants and NYSED staff. Test
development occurs over a period of four years. Classroom teachers write test items during the first year; they are
edited and placed on pre-test forms that are then administered in selected schools near the end of the second year.
The test items are edited and reviewed by various groups of teachers and consultants; the items are organized into
test forms. These test forms are administered to students late in the third year; the statistical data collected from
these administrations is used to determine the difficulty estimates and the probability of students answering items
correctly.
          Student ability levels representing proficiency and distinction are determined using a separate standard
setting process. During the standard setting process, expert teachers review an anchor exam in which the test items
are ranked from easiest to most difficult. The teachers then select questions that are representative of the ability
levels of proficiency and of distinction (mastery). The standard setting data is used to determine the scaled scoring
for several years’ worth of exams.


          An analysis of the June 2002 exam suggested that the major difference between the June 2001 and June 2002
exams was the scaled scoring (Zawicki and Jabot, 2002). The data from the 2002 exam clearly suggested that the
change to scaled scoring was a major factor in the observed shift in passing rates between the two exams. The level
for proficiency that was established in the standard setting process was considerably higher than previous benchmarks
on syllabus-based examinations.
          The system is designed to ensure consistency between different administrations of exams for any topic. A
very real concern exists that physics enrollment across the state will decline; recent figures suggest that enrollment
before the shift to standards-based assessment was near 25%. Many teachers who had reported much higher
enrollments are facing greatly reduced student rosters for the 2003-2004 academic year. The New York State Board
of Regents, the governing body of the NYSED, will review issues related to the physics exam in September of 2003.
          It is clear that if the system continues to operate as it is, we should expect to see similar results on the June
2004 administration of the exam. In order to determine the nature of the issues surrounding the exam, and to inform
decisions that will be made about the exam and the concomitant issues, we undertook a study of the June 2003
Regents Examination in Physics (NYSED, 2003a). Our assessment focused on several issues. In order to assess the
overall composition of the exam, a Rasch analysis was completed to determine the difficulty estimates of individual
test items. Since teachers did not keep track of the time that was required to complete the exam, we addressed
issue of latency by examining three factors: the reading level of the examination, the average conceptual (Bloom’s)
level of assessment items, and the format of the exam (Bloom & Krathwohl, 1956). We assessed the difficulty of
the most recent examination by comparing student scores on the AP B Physics exam with scores on the New York
State assessment.
          The data suggest that the reading level, the conceptual level of individual items, and the format are all
significantly different from earlier, syllabus-based administrations of the exam. Since the field testing did not
involve a full-length exam, it is suggested that the field test data may not accurately capture the combined impact of
these factors on the overall difficulty of the assessment. These findings support the anecdotal evidence from the
listserv suggesting that the exam took a longer time to complete.
          Correlations between the AP B scores that were collected and the Regents exams have ranged from 0.53 to
0.86. The data from this June’s exam suggested a correlation of 0.68; approximately half of the variance in a
student’s Regents exam score could be explained by their AP B score. The data was further analyzed by plotting the
mean Regents examination score for each AP B score (1-5). There is a statistically significant difference between
the two sets of data. Students earning a “5” on the AP B exam who took the syllabus-based version of the Regents
exam scored approximately 5 points higher than did students completing the June administrations of either the 2002
or 2003 examinations. These findings support the conclusion that the core-based assessments are scored more
harshly than previous, syllabus-based iterations.

                                                    Exam Analysis
          A request for copies of student papers was made late in June through the SUNY Oneonta listserv, as well as
through the websites of STANYS and the NYSS of the AAPT. The response was overwhelming. Over 2000 student
answer sheets from the June 2003 Regents Physics Exam have been collected; about half of these papers have been
included in this analysis. Most of the papers were photocopied and shipped by mail; one large BOCES collected
papers from most districts and directly forwarded the materials to us.
          The papers were collected from across New York State, and included samples from urban, rural and
suburban districts; student genders and grade level were also captured when the data was available. Over 55 school
districts contributed papers; these represent 15 of the 16 statewide STANYS sections. Commitments to contribute
additional papers are still being made. The Rasch analysis is currently based upon approximately 1000 papers; the
remaining exams will be added to the analysis as quickly as possible.
          Each of the student responses on Parts A and B-1 of the exam, or the credit earned on items appearing in
sections B-2 and C, was entered into a spreadsheet; the data was then analyzed using a Rasch model (Bond and
Fox, 2001). Some difficulties were encountered in capturing the data for sections B-2 and C of the exam. (Some
districts only submitted copies of exam sections A and B-1, while the teacher scoring on some papers failed to copy
clearly due to both the red ink typically used for correcting and the unstructured nature of the student answer sheet
for these sections.) The analysis indicated the overall difficulty of the individual test items; the relatively “difficult”
questions, as well as the “easier” questions, were identified. The analysis enabled us to make estimates of the overall
student ability required to pass the exam. This is significant, since the data from the 2002 exam suggested that
students correctly answered about as many test questions as they had in 2001; the change in the scaled scoring that
was used to calculate scores on the exam was a significant factor in the decreased passing rate from June 2001
(~83%) to June 2002 (~61%). It is important to note that, on the syllabus-based exam, students correctly answering
55 questions on Part 1 earned a scaled score of 65. On the core-based exams, the scaled scores are more closely
related on a question-to-point basis for the multiple-choice sections of the exam. Scaled scoring issues have not been
revisited by SED since the standard setting sessions that were held in the spring of 2002 (Zawicki and Jabot, 2002).

Rasch Analysis
          The item difficulty estimates are presented in Figure 1. Difficulties were calculated using a Rasch model
(Bond and Fox, 2001). Item difficulties ranged from –3.0714, a relatively easy question that required students to
draw a line of best fit, to 0.6462, a relatively difficult question that asked students to select the object with the
greatest inertia. While 96% of the students in our sample were able to draw the line of best fit correctly, only 34%
of the students were able to select the object with the greatest inertia. Most students incorrectly answered that an
object with a smaller mass and a high velocity would have more inertia than a larger object at rest, a well-known
misconception (Arons, 1997).

Figure 1. Item difficulty estimates for questions #1-76.




          In general, a similar pattern was observed for the additional items on the exam; students performed well on
straightforward (“plug and chug”) calculations, and typically had more difficulty with the conceptual items on the test.
The list of test items and their difficulty estimates is included in Appendices A and B.
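
          To make the estimation procedure concrete, the following is a minimal sketch of joint maximum likelihood
estimation of Rasch item difficulties for a dichotomous (0/1) persons-by-items response matrix. It illustrates the
general method described by Bond and Fox (2001); it is not the software actually used for this analysis, the function
name and convergence settings are our own, and it does not handle the partial-credit items in sections B-2 and C.

    import numpy as np

    def rasch_jmle(X, n_iter=100, tol=1e-4):
        # X: persons x items array of 0/1 responses (rows or columns with
        # perfect or zero scores must be removed before calling).
        n_persons, n_items = X.shape
        theta = np.zeros(n_persons)   # person abilities (logits)
        b = np.zeros(n_items)         # item difficulties (logits)
        for _ in range(n_iter):
            # Rasch model: P(correct) = 1 / (1 + exp(-(theta - b)))
            p = 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))
            # Newton-Raphson step for item difficulties, persons held fixed.
            b_new = b + (p - X).sum(axis=0) / (p * (1.0 - p)).sum(axis=0)
            b_new -= b_new.mean()     # anchor the scale: mean difficulty = 0
            # Newton-Raphson step for person abilities, items held fixed.
            theta_new = theta + (X - p).sum(axis=1) / (p * (1.0 - p)).sum(axis=1)
            done = max(np.abs(b_new - b).max(), np.abs(theta_new - theta).max()) < tol
            b, theta = b_new, theta_new
            if done:
                break
        return b, theta

Difficulty estimates such as those reported in Appendices A and B would be read directly from b; more negative
values correspond to easier items.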

Reading Level
          Several other aspects of the June 2003 exam were also explored. The reading level of the last four June
exams was determined using a McLaughlin-SMOG test of readability (McLaughlin, 1969); a sketch of the
computation follows Table 1. As indicated in Table 1, the data suggest that the reading level of the exam has risen
substantially over the past four years.

Table 1. Reading Comprehension Level (McLaughlin-SMOG)

                                   Exam Administration        Grade equivalent
                                       June 2000                     8th
                                       June 2001                  10th (Low)
                                       June 2002                  10th (High)
                                       June 2003                     11th
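
          For reference, McLaughlin’s formula estimates a grade level from the count of words of three or more
syllables, scaled to a 30-sentence sample. The sketch below uses a crude vowel-group heuristic for syllable counting
(a careful analysis would use a pronunciation dictionary), so it illustrates the computation rather than reproducing
our exact grade equivalents; it assumes a non-empty prose sample.

    import math
    import re

    def smog_grade(text):
        # Rough sentence and word segmentation; assumes non-empty prose.
        sentences = [s for s in re.split(r'[.!?]+', text) if s.strip()]
        words = re.findall(r"[A-Za-z']+", text)
        # Approximate syllables as runs of vowels (heuristic only).
        def syllables(word):
            return len(re.findall(r'[aeiouy]+', word.lower()))
        polysyllables = sum(1 for w in words if syllables(w) >= 3)
        # McLaughlin's (1969) regression formula for arbitrary-length samples.
        return 3.1291 + 1.0430 * math.sqrt(polysyllables * 30.0 / len(sentences))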

Conceptual Question Level
         The conceptual levels of the questions on the exam were also analyzed. The individual test items on four
June exams (2000, 2001, 2002 and 2003) were analyzed and placed into one of four categories that were loosely
based upon Bloom’s Taxonomy (Bloom & Krathwohl, 1956). The categories used in the study were Knowing (which
included Bloom’s levels of knowledge and comprehension), Using 1 (application), Using 2 (analysis), and Integrating
(synthesis and evaluation). Five college physics faculty, with expertise in both physics and assessment, were
recruited to analyze and rank the items on each test. The results are shown in Table 2. There is a significant
difference in conceptual level (F(1, 8) = 4.649, p = 0.02).

Table 2. Average conceptual level, recent exam administrations (2000-2003).
                               Exam Administration       Average Conceptual Level
                                     June 2000                    1.614286
                                     June 2001                    1.592063
                                     June 2002                    1.892754
                                     June 2003                    1.736842
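
          The comparison reported above is a standard one-way analysis of variance on the item ratings. A minimal
sketch follows, with hypothetical rating lists standing in for the faculty classifications (1 = Knowing through
4 = Integrating); these values are illustrative only, not the study data.

    from scipy import stats

    # Hypothetical conceptual-level codes for items on two administrations;
    # the actual study pooled ratings from five faculty across four exams.
    june_2001 = [1, 2, 1, 2, 1, 2, 2, 1, 3, 1]
    june_2002 = [2, 2, 3, 1, 2, 2, 3, 2, 1, 2]
    f_stat, p_value = stats.f_oneway(june_2001, june_2002)
    print(f"F = {f_stat:.3f}, p = {p_value:.3f}")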

          In and of themselves, these two changes might well be considered quite appropriate: the reading level of
the test is approaching the grade at which most students have traditionally taken the course, and the number of
higher-level questions appears to be increasing. These two trends must, however, be considered in light of an
additional factor: the format of the test has changed significantly since the implementation of the core curriculum.

Exam Format
         As shown in Table 3, the syllabus exam required students to answer 75 multiple-choice questions and to
analyze approximately 3-4 problems, providing about 10-11 written responses based upon those problems. The core
exams now require students to answer approximately 45-50 multiple-choice questions and to consider between 8 and
16 problem situations, providing written responses to 24-29 individual questions. The students are being asked to read
and write more and to consider a much larger number of problems.

Table 3. Regents Physics Exam Formats
     Exam Administration Multiple Choice Items        Written Responses     Problems (for written responses)
           June 2000                75                        11                           3
         January 2001               75                        10                           3
           June 2001                75                        11                           4
         January 2002               75                        11                           3
           June 2002                45                        24                          12
         August 2002                47                        21                           8
         January 2003               50                        27                          10
           June 2003                47                        29                          16

Latency
         Anecdotal evidence, gathered from the New York State Physics Listserv hosted by SUNY Oneonta (SUNY
Oneonta, 2003a), suggests that students take more time to complete the core exams than they did the syllabus exams.
Several teachers reported a substantial number of students working on the exam for the full three-hour time limit;
these teachers indicated that a number of students did not have sufficient time to complete the exam. While we do
not have numbers to support this claim, the increased reading level of the core exams, together with the increased
conceptual level of the test items and the more rigorous test format all provide support for the conclusion that the
exams are substantially longer than the prior syllabus-based assessments.

AP B Physics Exam: Regents Exam Relationships
          Several teachers, also on the listserv, reported a high level of correlation between the AP Physics B exam
and the June 2003 Regents exam (SUNY Oneonta, 2003a). The data initially posted on the Oneonta Listserv
suggests a correlation of about 0.68, a fairly high value. The authors were able to locate both Regents exam scores
and AP B test scores for a group of students from 1994 to date; this data was combined with the data that was posted
on the listserv. The pooled data included approximately 223 students.
          Correlations between the Regents physics exam and the AP B Physics exam ranged from 0.53 to 0.86; no
pattern to the correlations was apparent, nor was there an obvious difference between the syllabus and core exams.
The data was then sorted based upon the AP B score; the average Regents scores for each AP B grade (1-5) were
then determined for both the syllabus and the core-based NYS exams. The results are shown in Figure 2.
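
          A minimal sketch of this analysis follows, using synthetic stand-in data (the variable names, the random
scores, and the even era split are our own illustration, not the pooled study records).

    import numpy as np

    # Synthetic stand-in for the pooled sample of ~223 students.
    rng = np.random.default_rng(0)
    n = 223
    ap_b = rng.integers(1, 6, size=n)              # AP B grades (1-5)
    core_era = rng.random(n) < 0.5                 # True for 2002/2003 core exams
    regents = 50.0 + 8.0 * ap_b - 5.0 * core_era + rng.normal(0.0, 6.0, n)

    # Pearson correlation between Regents and AP B scores.
    r = np.corrcoef(regents, ap_b)[0, 1]
    print(f"r = {r:.2f}")

    # Mean Regents score for each AP B grade, split by era (cf. Figure 2).
    for era, label in [(False, "before 2002"), (True, "2002 & 2003")]:
        means = [regents[(ap_b == g) & (core_era == era)].mean() for g in range(1, 6)]
        print(label, [round(m, 1) for m in means])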

Figure 2. AP B Physics Exam Scores vs. NYS Regents Physics Exam Scores. [Plot of mean Regents scaled score
(0-100) against AP B grade (1-5) for two series: exams administered before 2002 and the 2002 & 2003 core exams.]

         As shown in Figure 2, the data demonstrate that students earning a “5” on the AP B exam scored, on
average, five points lower on the more recent core exams than did those students earning a “5” on the AP B exam
but completing the previously administered syllabus-based assessments. At the other end of the spectrum, students
earning a “1” on AP B exams who also took the recent core exam earned scaled scores approximately 15 points
lower than their counterparts completing previous, syllabus-based, exams. The “new,” standards-based physics
exams have become significantly more difficult (t(9) = -34.434, p < 0.001).
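
          The reported significance test compares matched group means between the two eras. A sketch of the
comparable computation follows, with hypothetical values; the published analysis does not specify the exact
grouping behind the reported t(9), so this illustrates only the form of the test.

    from scipy import stats

    # Hypothetical mean Regents scores for matched groups under the
    # syllabus-based and core-based exams (not the study's actual values).
    syllabus_means = [68.0, 75.0, 83.0, 90.0, 97.0]
    core_means = [53.0, 66.0, 76.0, 86.0, 92.0]
    t_stat, p_value = stats.ttest_rel(syllabus_means, core_means)
    print(f"t = {t_stat:.3f}, p = {p_value:.4f}")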

                                                         Discussion
          The issue has been raised that physics is an “advanced” science, targeting only “elite” students (NYSED, 2002). If
this is the case, then surely it was an advanced science prior to the adoption of the New York State Core Curriculum
in Physics. The reading level, the conceptual level of the individual items, and the overall difficulty of the exam
have increased significantly. Prior to the implementation of the core, the syllabus was designed to address physics
as a senior-level elective. While it may be entirely appropriate to increase the rigor of the current statewide
assessments, it is difficult to see why such a dramatic shift was required over such a brief time period. In fact,
the core writing team was charged with producing a document amenable to teaching physics at any high school
grade level. Given this charge, the readability of the exam, as well as its overall length, need to be carefully
reconsidered.

Reaction of the Leading Educational Agencies
          The Physics First conference hosted by Cornell this past summer indicated a strong interest in opening up
physics to more students across New York State. Recent national trends indicate that physics enrollment has reached
an all-time high of approximately 25%. A significant number of physics educators, including these authors, believe
that physics classrooms should be accessible to all. Physics is well suited to both inquiry and constructivist learning
and teaching; recent developments, however, may not allow us to take advantage of such opportunities statewide.
          Several issues have been raised in the field related to the standard setting process. The overall passing rate
of the student papers included in this sample, 64%, is consistent with the results observed last year. It is assumed
that the statistical preparation of the scaled score conversions will ensure that the difficulty of future exams will be
consistent with the most recent administrations, unless standard setting is revisited.
          Anecdotal evidence has suggested that fewer students are taking physics as a direct result of the difficulty
of last year’s exam and the relatively low passing rate statewide. The Oswego City School District announced plans
to move from Regents physics to a local course with credit awarded through an area community college (Palladium
Times, 2003). The New York State Council of School Superintendents has submitted documents expressing its
concerns to Commissioner Mills (Sullivan, 2002; Sullivan, 2003). The data concerning the number of exams taken in
June 2003 should be examined carefully, once the statewide numbers become available.
          There is a disconnect among the expectations for student achievement established by the NYSED,
teachers in the field, and postsecondary institutions using exam scores as indicators of student preparation for college
study. The dramatically lower student scores observed over the past two years stand in stark contrast to scores just
the year before; such fluctuations make interpretations of student abilities difficult, at best.


          It is important to recognize that many of these changes may be appropriate over the long haul. If test
questions are able to indicate misconceptions and misunderstandings that our students have, then a careful review of
these items can foster program growth. However, it is absolutely essential that any such changes be communicated
to the field in a clear and direct manner. Physics Mentors and STANYS SARs can only present the information that
is supplied by SED; the lack of financial support for the Oneonta Mentor Network has exacerbated a difficult
situation. Changes in the reading level of the exam, and the length of time that is required to complete an exam need
to be addressed in a straightforward manner; teachers cannot be expected to adapt to new conditions which are not
clearly established. Two veteran teachers, both with passing rates of ~85% last year, indicated that they reviewed
last year’s exam and modified their curricula appropriately. They both expected their students’ scores to increase;
both teachers experienced about a 12% decrease in the passing rate on this year’s exam.
          Physics teachers have a sense that the physics exams have become increasingly rigorous while the exams in
the other content areas have not followed suit. SED has indicated that the student ability level that is required to
pass an exam in one area is not particularly related to the ability to pass an exam in another content area. While this
may be true in an absolute sense, it seems reasonable that there should be some relationship between the student
ability levels across each course. The major concern is that the low passing rates that have been observed on the
past two June exams will negatively impact student enrollment in high school physics classes or will lead to a
boycott of state exams in favor of other assessments.

Recommendations
          Several suggestions should be considered at this point. The student ability levels established at the
last round of standard setting should be revisited. There is clearly a disconnect with the field. Whether or not the
current student ability level is appropriate, the evidence suggests that such changes are occurring too rapidly for the
field to adapt. Additional resources, such as contact with content specialists within the department or with
specialists from outside organizations, such as the STANYS SAR network or the Oneonta Physics Mentor network,
need to be either maintained or established. Any changes, such as those in reading level, conceptual level, focus, or
overall exam difficulty, should be effectively communicated to the field.
          The expectations for proficiency and distinction are not clearly defined in the NYS Core Curriculum for
Physics; other states have included such expectations (Arizona Department of Education, 1997). New York State
should consider developing and publishing such expectations prior to the publication of the end-of-course
assessments.
          The length of time required for students to complete the exam needs to be carefully evaluated. During
future administrations, teachers should collect data about the length of time that students use to complete the exam.
Item analysis, through BOCES or similar organizations, should be routinely completed; this data would serve to
resolve testing issues as well as to foster appropriate program review.


                                                     References Cited

American Association for the Advancement of Science. (1989). Science for all Americans. New York: Oxford
       University Press.

Arons, A. B. (1997). Teaching Introductory Physics. New York: Wiley.

Arizona Department of Education. (1997). Science standards. Retrieved August 17, 2003, from
        http://www.ade.state.az.us/standards/science/

Bloom, B. S., & Krathwohl, D. R. (1956). Taxonomy of educational objectives: The classification of educational
        goals, by a committee of college and university examiners. Handbook I: Cognitive domain. New York:
        Longmans, Green.

Bond, T. G., & Fox, C. M. (2001). Applying the Rasch model: Fundamental measurement in the human sciences.
        Mahwah, NJ: Lawrence Erlbaum Associates.

Dillon, S. (2003, July 18). Outcry over Regents physics test, but officials in Albany won't budge. New York
        Times. Retrieved July 18, 2003, from http://www.nytimes.com

Engelhardt, P., & Beichner, R. (in press). The development and evaluation of a DC circuits concept test. To appear
        in American Journal of Physics.

Enochs, L. G., & Riggs, I. M. (1990). Further development of an elementary science teaching efficacy belief
        instrument: A preservice scale. School Science and Mathematics, 90(8), 694-706.

Halloun, I., Hake, R. R., Mosca, E. P., & Hestenes, D. (1995). Force Concept Inventory (revised 1995). Online
        (password protected) at http://modeling.asu.edu/R&E/Research.html

Johnson, T. (2003). Personal communication, July 17, 2003.

McLaughlin, G. H. (1969). SMOG grading - a new readability formula. Journal of Reading, 12(8), 639-646.

National Research Council. (1991). National Science Education Standards. Washington, DC: National Academy
        Press.

New York State Education Department. (2003a). Regents Examination in Physics, June 2003. Retrieved July 30,
       2003, from http://www.nysedregents.org/testing/hsregents.html

New York State Education Department. (2003b). School district report cards. Retrieved July 30, 2003, from
       http://usny.nysed.gov/schooladmin/repstat.html

New York State Education Department. (2002). Decision reached on June physics exam; August exam to be
        offered. Albany, NY: Author. Retrieved from http://users.erols.com/lipta/nysed/update.htm

New York State Education Department. (2001). Resource guide with core curriculum. Retrieved July 30, 2003,
       from http://www.emsc.nysed.gov/ciai/mst/scirg.html

New York State Education Department. (1999). Standard setting and equating on the new generation of New
        York State assessments. Retrieved August 17, 2003, from
        http://www.emsc.nysed.gov/ciai/testing/assesspubs/standard%20settingand%20equatingon%20thenew%20
        generation.pdf

New York State Education Department. (1997). Math, science and technology resource guide. Retrieved August
       17, 2003, from http://www.emsc.nysed.gov/guides/mst



O’Sullivan, C. Y., & Reese, C. M. (1997). NAEP 1996 report card for the nation and the states. Washington,
        DC: National Center for Education Statistics.

Palladium Times. (2003, August 14). Oswego schools superintendent trashes physics Regents. Palladium Times,
        online edition. Retrieved August 18, 2003, from http://www.pallp-times.com

Public Law 107-110, the No Child Left Behind Act of 2001. (2001). Available at
        http://www.ed.gov/legislation/ESEA02/pg107.html

Schmidt, W. H., McKnight, C. C., & Raizen, S. A. (1997). A splintered vision: An investigation of US science
        and mathematics education. Third International Mathematics and Science Study. Michigan State
        University: US National Research Center for the Third International Mathematics and Science Study.

Sullivan, W. (2003). Analysis of June, 2003 administration of Physics & Math A Regents. New York State
        Council of School Superintendents. Retrieved August 17, 2003, from
        http://www.nyscoss.org/CMT/nyscossnews/upload/curricanal.pdf

Sullivan, W. (2002). Assessing the assessments, 2002. New York State Council of School Superintendents.
        Retrieved from http://www.superintendentofschools.com/File_Cabinet/pdf%20-%20Curriculum%20-
        %20NYSCOSS%20Paper%20ASSESSING%20THE%20ASSESSMENTS%202002.pdf

SUNY Oneonta. (2003a). Messages posted to listserv. http://employees.oneonta.edu/ebertjr/listserv.html

SUNY Oneonta. (2003b). Listserv archives. Messages may be retrieved from
        http://listserv.oneonta.edu/archives/ophun-l.html

Winerip, M. (2003, March 12). On education: When a passing grade defies laws of physics. New York Times.
        Retrieved July 18, 2003, from http://www.nytimes.com

Zawicki, J. L., & Jabot, M. (2002). Analysis of the June 2002 Regents physics exam. Perspectives. New York
        State Science Education Leadership Association.

Zawicki, J. L. (2002). Personal communication, December 2002.


Appendix A Regents Physics Exam, June 2003, Rasch Analysis, Multiple choice items

                     Item   Key   R1    R2    R3    R4    RC (%)   Papers (n)   Difficulty Estimate
                     Q45     3    57    26    860   15     89.8       958              -2.17
                     Q40     3    40     3    807   106    84.2       956              -1.69
                     Q36     2    122   777   38    21     81.1       958              -1.46
                     Q38     3    12    54    759   131    79.2       956              -1.35
                      Q3     2    129   749   38    42     78.2       958              -1.28
                      Q1     4    25    136   47    750    78.3       958              -1.28
                     Q39     3    38    145   739   33     77.1       955              -1.23
                     Q13     4    17    61    153   723    75.5       954              -1.14
                     Q28     1    708   176   35    34     73.9       953              -1.06
                     Q26     3    55    102   706   94     73.7       957              -1.03
                      Q5     1    703   114   118   22     73.4       957              -1.02
                     Q19     2    53    701   149   53     73.2       956              -1.01
                     Q21     3    62    159   695   39     72.5       955              -0.98
                     Q42     4    28    211   26    690    72.0       955              -0.96
                      Q2     2    11    689   17    240    71.9       957              -0.94
                     Q12     3    66    172   682   37     71.2       957              -0.91
                     Q43     4    50    167   60    678    70.8       955              -0.90
                     Q17     1    653   49    197   57     68.2       956              -0.77
                     Q32     1    646   190   39    80     67.4       955              -0.74
                     Q24     3    220   88    643    7     67.1       958              -0.71
                     Q25     4    71    150   93    641    66.9       955              -0.71
                     Q18     3    55    266   637    1     66.5       959              -0.68
                     Q46     2    108   617   48    185    64.4       958              -0.59
                     Q20     3    23    298   607   27     63.4       955              -0.56
                     Q37     2    258   597   49    52     62.3       956              -0.51
                     Q16     4    33    302   26    594    62.0       955              -0.50
                      Q8     4    178   68    135   571    59.6       952              -0.40
                     Q29     4    90    147   152   566    59.1       955              -0.38
                     Q47     4    135   73    200   545    56.9       953              -0.29
                     Q27     2    308   536   40    71     55.9       955              -0.25
                     Q41     3    201   153   535   66     55.8       955              -0.24
                     Q23     3    312   77    528   39     55.1       956              -0.21
                     Q10     1    517   318   60    59     54.0       954              -0.17
                     Q30     2    304   509   139    3     53.1       955              -0.13
                      Q6     3    202   242   504    9     52.6       957              -0.11
                      Q9     2    201   503   159   89     52.5       952              -0.11
                     Q14     2    424   479   17    35     50.0       955              -0.01
                     Q35     2    146   463   256   91     48.3       956               0.06
                     Q31     2    100   460   210   186    48.0       956               0.08
                     Q33     2    184   449   93    229    46.9       955               0.12
                     Q22     2    121   441   395    1     46.0       958               0.16
                     Q44     4    188   54    292   422    44.1       956               0.24
                     Q34     4    122   371   46    418    43.6       957               0.25
                      Q7     2    95    394   320   148    41.1       957               0.36
                      Q4     4    313   20    237   387    40.4       957               0.39
                     Q11     3    69    76    352   458    36.7       955               0.54
                     Q15     1    324   13    42    576    33.8       955               0.67


Appendix B Regents Physics Exam, June 2003, Rasch Analysis, Constructed items

                       Item   Key   R0    R1    R2    RC (%)   Papers (n)   Difficulty Estimate
                       Q59     1    16    458    0     96.6       474              -3.35
                       Q54     2    27    51    398    89.0       476              -2.84
                       Q65     1    27    447    0     94.3       474              -2.81
                       Q58     1    34    440    0     92.8       474              -2.56
                       Q69     2    23    60    391    91.0       474              -2.32
                       Q67     1    45    430    0     90.5       475              -2.26
                       Q52     1    46    430    0     90.3       476              -2.24
                       Q57     1    52    422    0     89.0       474              -2.09
                       Q70     2    37    50    387    86.9       474              -1.89
                       Q64     2    36    56    381    86.5       473              -1.85
                       Q51     1    66    410    0     86.1       476              -1.83
                       Q75     2    19    118   337    83.5       474              -1.62
                       Q76     1    87    387    0     81.6       474              -1.49
                       Q73     2    37    127   310    78.8       474              -1.31
                       Q63     1    106   367    0     77.6       473              -1.24
                       Q55     2    61    92    322    77.5       475              -1.24
                       Q66     2    50    131   293    75.6       474              -1.13
                       Q60     1    125   350    0     73.7       475              -1.03
                       Q61     1    150   324    0     68.4       474              -0.77
                       Q72     2    132   106   236    61.0       474              -0.45
                       Q50     1    201   273    0     57.6       474              -0.31
                       Q68     1    206   269    0     56.6       475              -0.27
                       Q49     1    210   266    0     55.9       476              -0.24
                       Q71     1    238   237    0     49.9       475               0.00
                       Q56     1    242   233    0     49.1       475               0.04
                       Q74     1    243   231    0     48.7       474               0.05
                       Q53     1    246   230    0     48.3       476               0.07
                       Q48     1    253   223    0     46.8       476               0.13
                       Q62     1    266   207    0     43.8       473               0.25

				