SECTION XI — APPENDICES FOR PART TWO

APPENDIX A

FORMULAS FOR RELIABILITY AND SEM
(1)   Dressel KR-20 Reliability and Standard Error of Measurement (SEM)

reliability = [n / (n − 1)] × [1 − (Σ_{i=1}^{n} p_i q_i + k² Σ_{i=1}^{n} p_i′ q_i′ + 2k Σ_{i=1}^{n} p_i p_i′) / σ_t²],

where
p_i = proportion of total sample responding correctly to item i,
p_i′ = proportion of total sample responding incorrectly to item i,
q_i = 1 − p_i,
q_i′ = 1 − p_i′,
k = correction factor for formula scoring (0.250 for 5-choice and 0 for SPR items),
n = total number of items in the subscore or test section, and
σ_t² = variance of total formula score for the item type or test section.

Estimate based on the Dressel adaptation of the Kuder-Richardson 20 (KR-20) reliability statistic for
formula scored tests. Dressel, P.L. (1940). Some remarks on the Kuder-Richardson reliability coefficient.
Psychometrika, 5, 305-310.

SEM = σ_x √(1 − reliability),

where
σ_x = standard deviation of the formula scores.

This estimate is identical to coefficient alpha.
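As an illustration, the Dressel KR-20 and its SEM can be computed directly from the item statistics defined above. This is a minimal sketch, not ETS production code; the function names and the list-based inputs are assumptions.

```python
import math

def dressel_kr20(p, p_prime, k, var_total):
    """Dressel KR-20 reliability for a formula-scored test.

    p[i]       -- proportion responding correctly to item i
    p_prime[i] -- proportion responding incorrectly to item i
    k          -- formula-scoring correction (0.250 for 5-choice, 0 for SPR)
    var_total  -- variance of total formula scores (sigma_t squared)
    """
    n = len(p)
    num = sum(pi * (1 - pi) for pi in p)                        # sum p_i q_i
    num += k ** 2 * sum(pp * (1 - pp) for pp in p_prime)        # k^2 sum p'_i q'_i
    num += 2 * k * sum(pi * pp for pi, pp in zip(p, p_prime))   # 2k sum p_i p'_i
    return (n / (n - 1)) * (1 - num / var_total)

def sem(sd, reliability):
    """SEM = sigma_x * sqrt(1 - reliability)."""
    return sd * math.sqrt(1 - reliability)
```

With k = 0 (no formula-scoring correction) the expression reduces to the ordinary KR-20, consistent with the note that this estimate is identical to coefficient alpha.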

(2)   Kristof Reliability and Standard Error of Measurement (SEM)

reliability = [cov_12 cov_13 + cov_12 cov_23 + cov_13 cov_23]² / (cov_12 cov_13 cov_23 var_T),

where
1 = 1st section,
2 = 2nd section,
3 = 3rd section, and
var_T = var_1 + var_2 + var_3 + 2cov_12 + 2cov_13 + 2cov_23.

Estimate based on Kristof, W. (1974). Estimation of reliability and true score variance from a split of a
test into three arbitrary parts. Psychometrika, 39, 491-499.

SEM = σ_x √(1 − reliability),

where
σ_x = standard deviation of the formula scores.
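The Kristof three-part estimate is equally direct to compute once the part variances and covariances are in hand. The sketch below assumes those six quantities are already available; the function name is illustrative.

```python
def kristof_reliability(var1, var2, var3, cov12, cov13, cov23):
    """Kristof (1974) reliability from a split of a test into three parts."""
    var_t = var1 + var2 + var3 + 2 * (cov12 + cov13 + cov23)
    num = (cov12 * cov13 + cov12 * cov23 + cov13 * cov23) ** 2
    return num / (cov12 * cov13 * cov23 * var_t)
```

As a sanity check, three parallel parts with unit variances and common covariance 0.5 give 0.75, matching coefficient alpha for that case.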

(3)   Angoff/Feldt (Split-Halves) Reliability and Standard Error of Measurement (SEM)

reliability = cov_12 var_T / (cov_1T cov_2T),

where
1 = 1st section,
2 = 2nd section,
var_T = var_1 + var_2 + 2cov_12,
cov_1T = var_1 + cov_12, and
cov_2T = var_2 + cov_12.

Estimate based on Angoff, W. H. (1953). Test reliability and effective test length. Psychometrika, 18, 1-
14. (Formula 15, page 6.) and Feldt, L.S. (1975). Estimation of the reliability of a test divided into two
parts of unequal length. Psychometrika, 40, 557-561. (Formula 1, page 560).

SEM = σ_x √(1 − reliability),

where
σ_x = standard deviation of the formula scores.
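The Angoff/Feldt split-halves estimate follows the same pattern; this sketch assumes the two part variances and their covariance are given (names are illustrative).

```python
def angoff_feldt_reliability(var1, var2, cov12):
    """Angoff/Feldt split-halves reliability for parts of unequal length."""
    var_t = var1 + var2 + 2 * cov12
    return cov12 * var_t / ((var1 + cov12) * (var2 + cov12))
```

For two equal halves with unit variances and covariance 0.5 this yields 2/3, the Spearman-Brown value 2r/(1 + r) for r = 0.5.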

(4)   Variance-Components Reliability and Standard Error of Measurement (SEM)

reliability = 1 − Σ SEM² / σ_t²,

where the standard errors of measurement (SEMs) are the Dressel KR-20 SEMs for the appropriate
sections and σ_t² is the corresponding total-score variance.

SEM = √(Σ SEM²),

where the SEMs are those for the appropriate sections.
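Both composite quantities are one-liners; the sketch below assumes the section SEMs and total-score variance are supplied directly.

```python
import math

def composite_sem(section_sems):
    """Composite SEM: square root of the sum of squared section SEMs."""
    return math.sqrt(sum(s ** 2 for s in section_sems))

def composite_reliability(section_sems, var_total):
    """Variance-components reliability: 1 - (sum of squared SEMs) / total variance."""
    return 1 - sum(s ** 2 for s in section_sems) / var_total
```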

(5)   IRT Reliability and Standard Error of Measurement (SEM)

SEM = √[ Σ_x (SEM_x² N_x) / N_T ],

where
SEM_x = conditional SEM for score x,
N_x = number of analysis sample test takers obtaining score x, and
N_T = total number of analysis sample test takers.

Algorithms used to compute SEMx are described in Dorans, N. (1984). Approximate IRT formula score
and scaled score standard errors of measurement at different ability levels. SR-84-118.

reliability = 1 − SEM² / σ²,

where SEM is the average IRT conditional SEM and σ² is the variance of the scores.
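Given the conditional SEMs and the score frequency counts, the overall SEM and reliability can be sketched as follows (the list-based inputs are an assumption):

```python
import math

def irt_sem(cond_sems, counts):
    """Overall SEM: RMS of conditional SEMs weighted by test takers at each score."""
    n_total = sum(counts)
    return math.sqrt(sum(s ** 2 * n for s, n in zip(cond_sems, counts)) / n_total)

def irt_reliability(overall_sem, score_variance):
    """IRT reliability: 1 - SEM^2 / score variance."""
    return 1 - overall_sem ** 2 / score_variance
```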

(6)   True-Score Intercorrelations of Scores

r_ab′ = r_ab / √(r_aa r_bb),

where
r_ab′ = estimated true-score correlation between scores on part a and scores on part b,
r_ab = correlation between the observed scores,
r_aa = reliability (Dressel KR-20 or Kristof) of scores on part a, and
r_bb = reliability (Dressel KR-20 or Kristof) of scores on part b.
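This correction for attenuation is a single expression; the function name below is illustrative.

```python
import math

def true_score_correlation(r_ab, r_aa, r_bb):
    """Disattenuate an observed correlation for unreliability of both parts."""
    return r_ab / math.sqrt(r_aa * r_bb)
```

For example, an observed correlation of 0.48 between two parts each with reliability 0.80 corresponds to an estimated true-score correlation of 0.60.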

(7)   Mantel-Haenszel D-DIF

The formula for the estimate of constant odds ratio is:
α_MH = [Σ_m R_rm (W_fm / N_tm)] / [Σ_m R_fm (W_rm / N_tm)],

where
R = number right,
W = number wrong,
N = total number, and the subscripts denote:
fm = focal group at ability level m,
rm = reference group at ability level m, and
tm = total group at ability level m.

This can then be used in the following formula:
MH D-DIF = −2.35 ln[α_MH].

Holland, P.W. and Thayer, D.T. (1985). An alternative definition of the ETS delta scale of item difficulty.
RR-85-43.

Items are classified as A for a particular combination of reference and focal groups if either MH D-DIF is
not statistically different from zero or if the magnitude of the MH D-DIF values is less than one delta unit
in absolute value. Items are classified as C if MH D-DIF both exceeds 1.5 in absolute value and is
statistically significantly larger than 1.0 in absolute value. All other items are classified as category B. B
and C items are further classified into two signed categories.
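The α_MH and D-DIF computation can be sketched directly from the counts. The input layout is an assumption: one tuple of right/wrong counts per ability level.

```python
import math

def mh_d_dif(tables):
    """Mantel-Haenszel D-DIF from per-ability-level counts.

    tables -- iterable of (R_rm, W_rm, R_fm, W_fm): right/wrong counts for the
              reference and focal groups at each ability level m.
    """
    num = 0.0
    den = 0.0
    for r_rm, w_rm, r_fm, w_fm in tables:
        n_tm = r_rm + w_rm + r_fm + w_fm          # total group at level m
        num += r_rm * w_fm / n_tm
        den += r_fm * w_rm / n_tm
    alpha_mh = num / den
    return -2.35 * math.log(alpha_mh)
```

When the right/wrong odds are identical for both groups at every level, α_MH = 1 and MH D-DIF = 0 (no DIF).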

(8)   Expected Formula Score (EFS)

The formula for the expected item formula score is:
EIFS_i = Σ_s [ (PR_is − (1/(k_i − 1)) PW_is) × CBP_s ],

where
PR_is = percent right for item i at rounded scaled score level s,
PW_is = percent wrong for item i at rounded scaled score level s,
CBP_s = percent of examinees in the 1990 college-bound senior cohort at rounded scaled score level s, and
k_i = number of response options for item i.

This can then be summed across all of the items in the test to obtain:
EFS = Σ_i EIFS_i.
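The per-item expected formula score can be sketched as below; the function works with proportions rather than percents, and its name and list inputs are assumptions.

```python
def expected_item_formula_score(pr, pw, cbp, k):
    """EIFS_i: formula-scored item performance weighted by the cohort distribution.

    pr[s], pw[s] -- proportion right/wrong for the item at scaled score level s
    cbp[s]       -- proportion of the reference cohort at scaled score level s
    k            -- number of response options for the item
    """
    return sum((pr_s - pw_s / (k - 1)) * cbp_s
               for pr_s, pw_s, cbp_s in zip(pr, pw, cbp))
```

Summing this quantity over all items in the test gives the EFS.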

Formulas for Computing Alpha Reliability and SEM for Maine SAT Raw Scores

The purpose of this document is to describe formulas used for computing alpha reliability and SEM and
the related indices for the Maine SAT scores.

1. Item difficulty Pi
Item difficulty is the proportion of examinees that answered an item correctly, and can be calculated
using the formula:

P_i = m_i / n_i

Where
m_i is the number of students who answered item i correctly.
n_i is the total number of students who were administered item i.

2. Item variance

Var_i = P_i (1 − P_i)

Where P_i is the proportion correct for item i.

3. Test variance
The variance for the number correct total raw score for the test, s², can be computed using the raw score
points and associated frequency counts:

s² = [ Σ_m f_m x_m² − (Σ_m f_m x_m)² / n ] / (n − 1)

Where
x_m is the raw score point
f_m is the number of students who attained raw score point x_m
n is the total number of students taking the test
The summation is over all score points of the test.

4. Alpha Reliability
For items that are scored dichotomously (0,1), the Kuder-Richardson 20 (KR-20) coefficient is the same as
Cronbach's alpha and can be calculated using the formula:
R_KR-20 = [k / (k − 1)] × [1 − Σ_i p_i (1 − p_i) / s²]
where
k is the total number of items on the test
p_i is the proportion correct for item i
s² is the variance for the number correct total raw score for the test.
The summation is over all items on the test.

5. Standard error of measurement (SEM)
The magnitude of SEM depends on the reliability of the test and test variance, and can be calculated as
follows:

SEM = s √(1 − R_KR-20)

Where
s = standard deviation of number correct raw scores, which can be computed by taking the square root
of s².
R_KR-20 = the KR-20 reliability.
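Steps 1 through 5 above can be chained into a single computation. This sketch assumes dichotomous 0/1 responses in a rectangular list of per-student item scores, and follows the document's n − 1 definition of the test variance; the function name is illustrative.

```python
import math
from collections import Counter

def kr20_and_sem(responses):
    """Return (KR-20 reliability, SEM) for 0/1 item responses (one list per student)."""
    n = len(responses)
    k = len(responses[0])
    # Steps 1-2: item difficulties P_i and summed item variances P_i(1 - P_i)
    p = [sum(r[i] for r in responses) / n for i in range(k)]
    item_var_sum = sum(pi * (1 - pi) for pi in p)
    # Step 3: test variance from raw-score frequency counts (n - 1 denominator)
    freq = Counter(sum(r) for r in responses)
    sum_fx = sum(f * x for x, f in freq.items())
    sum_fx2 = sum(f * x * x for x, f in freq.items())
    s2 = (sum_fx2 - sum_fx ** 2 / n) / (n - 1)
    # Steps 4-5: KR-20 and SEM
    rkr20 = (k / (k - 1)) * (1 - item_var_sum / s2)
    return rkr20, math.sqrt(s2) * math.sqrt(1 - rkr20)
```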
APPENDIX B
INTERPRETING SCORES ON THE SAT
Section 5 of the SAT Program Handbook is included on the following pages to aid in score interpretation.
The full report may be obtained at the following link:

Section 5: Using and Understanding SAT Scores and Score Reports

Test scores have long been useful in helping admissions staff and other educators better understand and
interpret students’ preparation and qualifications. SAT Program tests provide information about a
student’s developed critical reading and mathematical reasoning abilities and writing skills (through the
SAT) and mastery of specific subject areas (through the Subject Tests)—all of which are academic skills
generally associated with success in college. Because students from more than 27,000 U.S. secondary
schools experience vastly different educational models and grading systems, SAT Program test results
provide a consistent and objective measure of students’ abilities and achievement in these specific areas.

At the same time, there are also major differences among the nearly 4,000 two- and four-year colleges
and universities throughout the United States and in the types of admissions decisions they need to
make. In some cases, the primary admissions decision is whether or not a student has met certain basic
qualifications. In other situations, there may be many highly qualified applicants but not enough space.
Many institutions have some programs that may be essentially “open door,” while other programs are
highly competitive. At virtually all institutions, “admissions” means much more than simply deciding
who will be admitted. Outreach, recruiting, placement, and retention are often integral aspects of
admissions work. Thus, there is not a single approach to admissions and how SAT scores might be used
in the process.

In all cases, the use should be appropriate in the context of the institution’s mission and be based on
empirical data and/or a solid rationale. What may be an ideal use of SAT Program data at one institution
may be ineffective or inappropriate at another.

Please refer to the College Board’s Guidelines on the Uses of College Board Test Scores and Related
Data, which is available at www.collegeboard.com/research.

The following list illustrates some of the ways SAT scores might be used at different colleges and
universities:
• Use SAT scores to better understand other information in an applicant’s folder, such as grades and
courses taken.
• Include SAT scores as one element in an admissions index to determine basic qualifications or
preliminary screening.
• Review SAT scores to identify students who might be “at risk” and who might benefit from special
advisors, developmental programs, and/or persistence support.
• Recruit students (through the Student Search Service or from among those who have sent scores)
who have SAT scores similar to those of accepted or enrolled students at that institution.
• Conduct research on SAT scores and other criteria to identify characteristics used in decision making
that predict success in course placement, completion of freshman year, and/or graduation.
• Include information about SAT scores (such as the middle 50 percent) of all applicants, accepted
students, or enrolled freshmen in promotional materials so that students and counselors can develop
an understanding of how the student might fit in that particular institution.
Using the Writing Section
The addition of the essay provides colleges and universities a new measure with which to evaluate
applicants, as well as an opportunity for students to provide real evidence of their writing abilities.
Images of the original essays can be downloaded and printed by colleges and universities that students
designate as score recipients. Some of the reasons for reading students’ essays are:
• To compare the SAT essay with the application essay
• To use it as an additional placement essay

Understanding SAT and Subject Test Scores
Students tend to focus on their single score, but the score range offers a better picture of their skills. This
range usually extends from 30 or 40 points below the score to 30 or 40 points above, showing where a
student’s score would probably fall if the student took the test many times in a short period of time. Any
score within the range is considered to demonstrate the same level of ability as the score the student
included with scores. Colleges accept students with a wide range of test scores. If the middle 50 percent
of freshmen at a college had SAT mathematics scores between 500 and 560, then one in four students
had a score below 500, and another one in four had a score above 560. Colleges look at many factors
when choosing their students.

Percentiles

Students, high schools, and colleges can compare performance on any SAT Reasoning Test or Subject
Test with the performance of other college-bound seniors by looking at percentile ranks, listed on the
score report. The percentile rank shows what percentage of college-bound seniors earned a score lower
than theirs. For example, if a student’s critical reading national percentile rank is 64, then that student
did better than 64 percent of students in the nation who took the test.

The SAT Program defines a college-bound senior as a student who is graduating in the current year and
took an SAT Program test any time during high school. Colleges use college-bound seniors as a
reference group to make decisions about students who have applied for admission to their institution.

National and State Percentiles
The national percentile rank can differ from the state percentile rank because the national group of test-
takers is often a larger, more diverse group than the state group.

Score Ranges

Scores are approximations rather than precise measures of skill. The score range around the score
presents a better picture of a student’s performance. It provides an estimate of how a student’s scores
might vary if he or she were tested many times over a short period. The score usually falls within a range
of about 30 to 40 points (the standard error of measurement, or SEM) above or below a student's true ability.

Average Scores

SAT Reasoning Test
Average scores are based on the most recent scores earned by students in the previous year’s graduating
class who took SAT tests anytime during high school. Average scores for the SAT critical reading,
mathematics, and writing sections are available at www.collegeboard.com/satdata.
SAT Subject Tests
The average score varies from Subject Test to Subject Test because different groups of students take
different tests.

Average scores for the SAT Subject Tests are available at www.collegeboard.com/satdata.

Comparing Scores

SAT Reasoning Test
When comparing section scores, remember that the student’s true score is not a single number—a test-
taker may score slightly higher in one area but still be equal in both skills. There must be a 60-point
difference between critical reading and mathematics scores, and an 80-point difference between writing
and another section, before more skill can be assumed in one area than another.

SAT Subject Tests
Different groups of students take different Subject Tests. For this reason, scores and percentiles of
different Subject Tests should not be compared. For example, a Biology percentile cannot be compared
with a Literature percentile.

Subscores

SAT Reasoning Test Writing Section
The raw scores for the multiple-choice writing section are converted to scaled scores that are reported as
a subscore on a 20–80 scale. The essay subscore is reported on a 2–12 scale. The multiple-choice
writing section counts for approximately 70 percent and the essay counts for approximately 30 percent
of the total raw score, which is used to calculate the 200–800 score. For more information, refer to the
Essay Scoring Guide on page 30.

SAT Subject Tests
Subscores are provided for listening, usage, and reading sections of some Language Tests. These
subscores, reported on the 20–80 scale, reflect a student’s knowledge of a specific language skill. For
example, reading subscores measure understanding of main or supporting ideas within a passage.

SAT/ACT Concordance Table— Comparing Scores
The table below compares combined critical reading and mathematics scores on the SAT with composite
scores on the ACT Assessment, and vice versa. SAT scores do not cover the full range of the ACT scale
due to differences in how percentiles are distributed at the top and bottom of the two scales.

Note: Scores on the critical reading section are comparable to scores on the former verbal section, and
scores on the mathematics section are comparable to scores on the former mathematics section.
Therefore, current concordance tables can still be used to compare SAT and ACT scores.
Evaluating Student Performance

Use the tables in this section to compare a student’s performance on SAT Program tests with the
performance of groups of students. These tables can give you a better idea of what a student’s scores and
percentile ranks mean by showing who else took the test and how well they did.

The following tables can be found online at www.collegeboard.com/satdata.

•   SAT Percentile Ranks This table compares the performance of groups of students who took the
SAT. The percentile ranks in this table are based on the most recent scores earned by high school
students who are members of the 2006 graduating class and took the SAT any time during high
school.
•   Critical Reading, Mathematics, and Writing Percentile Ranks Use this table to see how a student’s
critical reading, mathematics, and writing scores compare with those of college-bound seniors. The
percentile ranks shown are used on SAT score reports in 2006-07.
•   Effects of Repeating the SAT This table shows what effect repeating the SAT has on students’
scores.
•   SAT One-Year Mean Changes This table shows the percentage of high schools where mean scores
fluctuate from one year to the next.
•   Subject Test Percentile Ranks Use this table to see how a student’s performance on a Subject Test
compares with that of other students who took the same test. These percentile rankings appear on
score reports for the 2006-07 test administrations.
•   Language Subject Test Percentile Ranks Students who are native or heritage speakers of a
language, as well as students learning the language in high school, take these tests. Percentile ranks
for the combined groups are in the tables Subject Test Percentile Ranks and Subject Test Subscore
Percentile Ranks. Percentile ranks for students learning the language in high school are in the table
Subject Tests in Languages— Total Score and Subscore Percentile Ranks for Students Who Studied
a Language in High School.

There are also additional interpretative data tables available at www.collegeboard.com/satdata.
Comparing Group Scores on the SAT—For High Schools
Use this table to determine whether score changes for your school are statistically significant or are most
likely the result of chance.

How to Use This Graph
• Use this graph when comparing the mean scores of similar groups of high school students across
different years or within a given year.
1. First determine the average size of the two groups for which you are comparing scores,
then locate that point on the horizontal axis.
2. Next locate the point on the vertical axis corresponding to the difference in the mean
scores of the groups being compared.
3. Locate the point where the two values intersect. Score differences that lie in the area to
the left and beneath the curve are most likely to be due to chance; i.e., the chance of the
two mean scores being different is 5 percent or less. Those that lie in the area to the right
and above the curve are considered statistically significant.

Points to Note
• Many of the small year-to-year changes in the mean scores of a particular group (e.g., a school or
school district) are NOT statistically significant; i.e., they are probably the result of chance.
• When comparing group mean scores, note that the significance of a change in scores depends on
the average size of the groups. Small groups require a large change in order for the change to be
significant. In large groups, a smaller change can be significant.
• A table with additional data on score change is available at www.collegeboard.com/satdata.
Comparing Group Scores on the SAT—For Colleges and Universities
Use this table to determine when a difference between two group mean scores is statistically significant
and when it is not.

How to Use This Graph
• Use this graph when comparing the mean scores of similar groups of students across different
years or within a given year.
• First determine the average size of the two groups for which you are comparing scores, then
locate that point on the horizontal axis.
• Next locate the point on the vertical axis corresponding to the difference in the mean scores of
the groups being compared.
• Locate the point where the two values intersect. Score differences that lie in the area to the left
and beneath the curve are most likely due to chance; i.e., the chance of the two mean scores
being different is 5 percent or less. Those that lie in the area to the right and above the curve are
considered statistically significant.

Points to Note
• Many of the small year-to-year changes in the mean scores of a particular group (e.g., entering
freshmen) are not statistically significant; i.e., they are probably the result of chance.
• When comparing group mean scores, note that the significance of a change in scores depends on
the average size of the groups. Small groups require a large change in order to be significant;
large groups require smaller changes.
• A table with additional data on score change is available at www.collegeboard.com/satdata.
SAT as a Predictor of College Grades

Use these tables to evaluate the SAT as a predictor of college freshman grade point average.

Correlations for Total Group

Points to Note
• This table and the table below are from a 2005 report “Understanding What SAT Reasoning Test
Scores Add to High School Grades: A Straightforward Approach” (see Section 6).
• Results are shown for verbal (critical reading) and mathematics scores separately and combined;
for high school grade point average alone; and for scores combined with high school grade point
average (HSGPA).
• The best way to predict freshman grade point average is to use a combination of SAT scores and
high school grade point average.
• These correlations show how well various indicators predict a student's freshman year
performance in college. The higher the correlation, the better the indicator as a predictor.
• When colleges studied their entire freshman classes, verbal and mathematics scores were very
similar in predicting freshman grade point average.
• Because students select colleges and colleges select students, the range of admission test scores
and high school grade point averages found among the enrolled students at a particular college
can be much narrower than the range found in the potential applicant population. To adjust for
this restriction of range, the Pearson-Lawley multivariate correction was used.
• Correlation values are weighted averages. That is, correlations were first computed for each
college and then averaged across the group of colleges.
• Data are based on 110,468 students in the freshman classes of 1995, 1996, and 1997 from 26
colleges.
• For both tables, high school grade point average was self-reported on the SAT Questionnaire.
• For “Correlations for Total Group” table, colleges were divided into four groups based on
average SAT scores (verbal + mathematics) in the freshman classes of 1995, 1996, and 1997.
Correlations by Gender and Ethnic/Race Groups

Points to Note
• For this table, gender and ethnicity/race were self-reported on the SAT Questionnaire. Hispanic
includes students who described themselves as Puerto Rican, Latin American, South American,
Central American, or Other Hispanic or Latino. The sample of students identifying themselves as
American Indian was too small to be included in this table. A total of 5,373 students either
omitted the ethnic/race question or identified themselves as Other or American Indian.

Fairness, Difficulty, and Reliability
Fairness

Meticulous care goes into developing and evaluating each test for fairness. Test developers write the
questions for the SAT and Subject Tests, sometimes incorporating questions submitted by high school
and college teachers from around the country.

Test development committees made up of high school and college faculty and administrators who are
geographically and ethnically diverse review each test before it is administered. To ensure that the SAT
and Subject Tests are valid measures of the skills and knowledge specified for the tests, as well as fair to
all students, the SAT Program maintains rigorous standards for administering and scoring the tests.

Careful and thorough procedures are involved in creating the test. Educators monitor the SAT Program’s
practices and policies and scrupulously review each new question over more than a two-year period to
ensure its utility and fairness. Each test question is then pretested before use in an actual SAT or Subject
Test. Not until this rigorous process is completed are newly developed questions finally used in an SAT or Subject Test.

Difficulty

The data show that the difficulty level of the material on the SAT and Subject Tests and the time
allocated to each section are appropriate for the intended test-taking population. Typically, students
answer only half of the questions correctly.

Reliability

The SAT Reasoning Test and Subject Tests are highly reliable. The data show that students who take an
SAT or Subject Test more than once within a short time earn similar scores at each testing. There are
detailed explanations of reliability rates for the SAT Reasoning Test and specific Subject Tests at
www.collegeboard.com/satdata.
Validity of the SAT Reasoning Test

The SAT is a very good predictor of first-year college grades. Social scientists express a positive
correlation on a scale of 0 to +1.0, with 0 indicating no correlation and +1.0 indicating a perfect
association between the two measures. Based on a study conducted on students in the freshman classes
of 1995, 1996, and 1997 from 26 colleges, the average correlation between high school grades and first-
year college grades is +0.58, while the correlation between SAT scores and freshman grades is +0.55.
The best predictor of all, and the use that the College Board recommends, is a combination of high
school grades and test scores, which has a correlation of +0.65, a level that social scientists consider
high (Bridgeman, Pollack, and Burton, 2005). Research has shown that the ability of the SAT to predict
freshman grades is fairly consistent across all ethnic groups, although the test seems to predict Asian
American performance best. The SAT also seems to be a better predictor of women’s performance than
of men’s, although this pattern is reversed for the most highly selective colleges.
SAT Essay Scoring Guide
SCORE OF 6
An essay in this category demonstrates clear and consistent mastery, although it may have a few minor
errors. A typical essay
• effectively and insightfully develops a point of view on the issue and demonstrates outstanding
critical thinking, using clearly appropriate examples, reasons, and other evidence to support its position
• is well organized and clearly focused, demonstrating clear coherence and smooth progression of ideas
• exhibits skillful use of language, using a varied, accurate, and apt vocabulary
• demonstrates meaningful variety in sentence structure
• is free of most errors in grammar, usage, and mechanics

SCORE OF 5
An essay in this category demonstrates reasonably consistent mastery, although it will have occasional
errors or lapses in quality. A typical essay
• effectively develops a point of view on the issue and demonstrates strong critical thinking, generally
using appropriate examples, reasons, and other evidence to support its position
• is well organized and focused, demonstrating coherence and progression of ideas
• exhibits facility in the use of language, using appropriate vocabulary
• demonstrates variety in sentence structure
• is generally free of most errors in grammar, usage, and mechanics

SCORE OF 4
An essay in this category demonstrates adequate mastery, although it will have lapses in quality. A
typical essay
• develops a point of view on the issue and demonstrates competent critical thinking, using adequate
examples, reasons, and other evidence to support its position
• is generally organized and focused, demonstrating some coherence and progression of ideas
• exhibits adequate but inconsistent facility in the use of language, using generally appropriate vocabulary
• demonstrates some variety in sentence structure
• has some errors in grammar, usage, and mechanics

SCORE OF 3
An essay in this category demonstrates developing mastery, and is marked by ONE OR MORE of the
following weaknesses:
• develops a point of view on the issue, demonstrating some critical thinking, but may do so
inconsistently or use inadequate examples, reasons, or other evidence to support its position
• is limited in its organization or focus, or may demonstrate some lapses in coherence or progression of ideas
• displays developing facility in the use of language, but sometimes uses weak vocabulary or
inappropriate word choice
• lacks variety or demonstrates problems in sentence structure
• contains an accumulation of errors in grammar, usage, and mechanics

SCORE OF 2
An essay in this category demonstrates little mastery, and is flawed by ONE OR MORE of the following
weaknesses:
• develops a point of view on the issue that is vague or seriously limited, and demonstrates weak critical
thinking, providing inappropriate or insufficient examples, reasons, or other evidence to support its position
• is poorly organized and/or focused, or demonstrates serious problems with coherence or progression of ideas
• displays very little facility in the use of language, using very limited vocabulary or incorrect word choice
• demonstrates frequent problems in sentence structure
• contains errors in grammar, usage, and mechanics so serious that meaning is somewhat obscured

SCORE OF 1
An essay in this category demonstrates very little or no mastery, and is severely flawed by ONE OR
MORE of the following weaknesses:
• develops no viable point of view on the issue, or provides little or no evidence to support its position
• is disorganized or unfocused, resulting in a disjointed or incoherent essay
• displays fundamental errors in vocabulary
• demonstrates severe flaws in sentence structure
• contains pervasive errors in grammar, usage, or mechanics that persistently interfere with meaning

Essays not written on the essay assignment will receive a score of zero.
APPENDIX C
STANDARDS VALIDATION
FOR THE
MAINE HIGH SCHOOL ASSESSMENT
MEETING AGENDA

PANELIST TRAINING PRESENTATION

FACILITATOR INSTRUCTION DOCUMENT

ACHIEVEMENT LEVEL DEFINITIONS

SAMPLE RATING FORM

EVALUATION RESULTS

PANELISTS

Measured Progress                     1                  Draft MHSA 2006-07 Technical Manual
MHSA Mathematics Standards Validation
Maine Department of Education, Augusta
June 13–14, 2007

Agenda

Wednesday, June 13, 2007

8:00 – 9:00         Registration
Continental Breakfast available

9:00 – Noon         Welcome and Introductions (Department of Education, College Board and
Measured Progress)
Overview of Standards Validation Process (Measured Progress)

Break

Begin Work Session
Materials Orientation

12:00 – 1:00        Lunch

1:00 – 4:30         Work Session Continues

Thursday, June 14, 2007

7:30 – 8:30         Continental Breakfast

8:30 – 12:00        Work Session Continues

12:00 – 1:00        Lunch

1:00 – 4:00         Work Session Concludes Standards Validation
Provide Feedback on Achievement Level Definitions
Complete Standards Validation Evaluation

PANELIST TRAINING PRESENTATION

FACILITATOR INSTRUCTION DOCUMENT

GENERAL INSTRUCTIONS FOR MAINE HIGH SCHOOL ASSESSMENT
(MHSA) STANDARDS VALIDATION FOR MATHEMATICS GROUP
FACILITATOR

Prior to Round 1 Ratings
Introductions:
1. Welcome group, introduce yourself (name, affiliation, a little background information).
2. Have each participant introduce him/herself.

Take the Test

Overview: In order to establish an understanding of the Maine High School Assessment (MHSA)
mathematics test items, and for panelists to gain an understanding of the experience of the students who
take the test, each participant will take the test, including both the SAT and Math-A augmented items.
This is the assessment that was administered to Maine students, and it is this set of items on which we
must set standards.

Activities:
1) Introduce the MHSA and cover each of the following:
a. Tell panelists that they are about to take the actual assessment.
b. The purpose of the exercise is to help them establish a good understanding of the test
items and to gain an understanding of the experience of the students who take the
assessment.
2) Have each panelist sign the nondisclosure agreement and hand it to you.
3) Give each panelist a test booklet.
4) Tell panelists to try to take on the perspective of a student as they complete the test.
5) When the majority of the panelists have finished, pass out the answer key.

Fill Out Item Map
Overview: The primary purpose of filling out the item map is for panelists to think about and document
the knowledge, skills, and abilities students need to answer each question. Panelists should have an
understanding of what makes one test item harder or easier than another. The notes panelists take here
will be useful in helping them place their bookmarks and in discussions during the two rounds of ratings.

Activities:
1.          Pass out the following materials:
a. Item map
b. Ordered item book

2.          Review the ordered item book and item map with the panelists. Explain what each is, and
point out the correspondence of the ordered items between the two. Explain that the
items are ordered from easiest to hardest.

3.         Provide an overview of the task, paraphrasing the following:
a. The primary purpose of this activity is for panelists to think about what makes one
question harder or easier than another. For example, it may be that the concept tested
is a difficult concept, or that the concept isn’t difficult but that the particular wording
of the question makes it a difficult question. Similarly, the concept may be a difficult
one, but the wording of the question makes it easier.
b. Panelists should take notes about their thoughts regarding each question. These will
be useful in the rating activities and later discussions.

4.         Tell panelists they will work individually at first. After they have completed the item
map, they will then discuss it as a group.

5.         Panelists will begin the item mapping process approximately five ordered items before
the lowest (Does Not Meet the Standards/Partially Meets the Standards) starting cut.

6.         Each panelist will begin with the starting ordered item and compare it to the next ordered
item. What makes the second item harder than the first? Panelists should not agonize
over these decisions. It may be that the second item is only slightly harder than the first.

7.         Panelists should work their way through the item map, stopping about five ordered items
after the Does Not Meet the Standards/Partially Meets the Standards starting cut.

8.         Panelists will then do the same process for the Partially Meets the Standards/ Meets the
Standards and Meets the Standards/Exceeds the Standards cuts, each time starting
approximately five ordered items before the cut and ending approximately five ordered
items after the cut.

9.         Note that panelists may feel that they need to expand the range of items they consider in
one direction or the other. Five ordered items before and after the starting cuts is a
guideline, but they may consider more items if necessary.

10.        Once panelists have completed their item maps, they should discuss them as a whole group.

11.        Based on the whole group discussion, the panelists should modify their own item map
(make additional notes, cross things out, etc.).

Discuss Achievement Level Definitions and Describe Characteristics of the
“Borderline” Student

Overview: In order to establish an understanding of the expected performance of borderline students on
the test, panelists must have a clear understanding of:

1)         The definitions of the four achievement levels, and

2)         Characteristics of students who are “just able enough” to be classified into each achievement
level. These students will be referred to as borderline students, since they are right on the
border between achievement levels.

The purpose of this activity is for the panelists to obtain an understanding of the Achievement Level
Definitions with an emphasis on characteristics that describe students at the borderline: both what
these students can and cannot do.

This activity is critical since the ratings panelists will be making in Rounds 1 and 2 will be based on
these understandings.

Activities:
1)       Introduce the task. In this activity they will:
a. Individually review the Achievement Level Definitions;
b. discuss the Definitions as a whole group;
c. generate whole group descriptions of borderline Partially Meets the Standards, Meets the
Standards and Exceeds the Standards students.
The facilitator should compile the descriptions as bulleted lists on chart paper; the chart paper
will then be posted so the panelists can refer to the lists as they go through the bookmark
process.

2)      Pass out the Achievement Level Definitions and have panelists individually review them.
Panelists can make notes if they like.

3)      After individually reviewing the Definitions, have panelists discuss each one as a whole
group, starting with Partially Meets the Standards, and provide clarification. The goal here is
for the panelists to have a collegial discussion in which to bring up/clarify any issues or
questions, and to come to a common understanding of what it means to be in each
achievement level. It is not unusual for panelists to disagree with the Definitions they will
see; almost certainly there will be some panelists who will want to change them. However,
the task at hand is for panelists to have a common understanding of what knowledge, skills,
and abilities (KSAs) are described by each Achievement Level Definition. Panelists will
have an opportunity to provide feedback and suggestions for edits to the Definitions for
future consideration by the Department of Education after the standards validation activities
are completed.

4)      Once panelists have a solid understanding of the Achievement Level Definitions, have them
focus their discussion on the knowledge, skills, and abilities of students who are in the
Partially Meets the Standards category, but just barely. The focus should be on those
characteristics and KSAs that best describe the lowest level of performance necessary to
warrant a Partially Meets the Standards classification.

5)      After discussing Partially Meets the Standards, have the panelists discuss characteristics of
the borderline Meets the Standards student and then characteristics of the borderline Exceeds
the Standards student. Panelists should be made aware of the importance of the Meets the
Standards cut.

6)      Using chart paper, generate a bulleted list of characteristics for each of the levels. Post these
on the wall of the room.

Round 1

Overview of Round 1: The primary purpose of Round 1 is to ask the panelists to evaluate and, if
necessary, revise the starting cut points. For this round, panelists will work as a whole group.
Beginning with the starting Does Not Meet the Standards/Partially Meets the Standards cut point,
panelists will evaluate each item, starting approximately five ordered items before the cut and ending
approximately five ordered items after the cut. (Note, again, that panelists may feel that they need to
expand the range of items they consider in one direction or the other. Five ordered items before and
after the starting cuts is a guideline, but they may consider more items if necessary.) The panelists will
gauge the level of difficulty of each of the items for those students who barely meet the definition of
Partially Meets the Standards. The task that panelists are asked to do is to estimate whether a borderline
Partially Meets the Standards student would answer each question correctly. More specifically, for each
item panelists ask:

•   Would at least 2 out of 3 students performing at the borderline answer the question correctly?

This same process is then repeated for the five or so items above and below the starting Partially Meets
the Standards/Meets the Standards cut and the starting Meets the Standards/Exceeds the Standards cut.

Starting Cuts:         DNM/PM         between OI#6 & OI#7
PM/M           between OI#14 & OI#15
M/E            between OI#51 & OI#52

Activities:
1. Panelists should have their ordered item books, item maps, and the Achievement Level
Definitions. Pass out one rating form to each panelist.

2. Have panelists write round number 1 and their ID number on the rating form. The ID number is
on the back of their name tags.

3. Provide an overview of Round 1, covering each of the following:
a. Orient panelists to the ordered-item book. Explain that the items are ordered from easiest
to hardest. Tell them where the starting cut points are (i.e., between which two ordered
items), and have them place bookmarks in the appropriate places in the ordered item
booklet. Make sure panelists understand that the ordered item cut point for DNM/PM is
not the same as the raw score a student must obtain in order to be classified into Partially
Meets the Standards. The starting cut point is between ordered items 6 and 7, but a
student actually needs 17 points on the test to be classified as Partially Meets the
Standards.

b. The primary purpose of this activity is for the panelists to discuss whether students whose
performance is barely Partially Meets the Standards would correctly answer each item,
beginning approximately five positions prior to the starting Does Not Meet the
Standards/Partially Meets the Standards cut, and to place their bookmark where they
believe the answer of ‘yes’ turns to ‘no’. Remind panelists that they should be thinking
about two-thirds of the borderline students. Once they have completed the process for
the Does Not Meet the Standards/Partially Meets the Standards cut, they will proceed to
the remaining two cut points.

c. Each panelist needs to base his/her judgments on his/her experience with the content,
understanding of students, and the definitions of the borderline students generated
previously.

d. One bookmark will be placed for each cut point.

e. If panelists are struggling with placing a particular bookmark they should use their best
judgment and move on. They will have an opportunity to revise their ratings in Round 2.

f. Panelists should feel free to take notes if there are particular points about where they
placed their bookmarks that they think are worthy of discussion in Round 2.

4. Tell panelists that they will be discussing each cut point with the other panelists, but that they
will be placing the bookmarks individually. It is not necessary for the panelists to come to
consensus about whether and how the cut points should be revised.

5. Go over the rating form with panelists.
a. Lead panelists through a step-by-step demonstration of how to fill in the rating form.
b. Answer questions the panelists may have about the work in Round 1.
c. Once everyone understands what they are to do in Round 1, tell them to begin.

6. The panelists begin approximately five ordered items prior to the starting Does Not Meet the
Standards/Partially Meets the Standards cut and proceed through the ordered item book, each
time asking whether at least two out of three borderline students would correctly answer the
question. They will place their first bookmark at the point where the answer changes from “yes”
to “no.”

7. Once they have placed the first bookmark, they proceed to the Partially Meets the
Standards/Meets the Standards cut, beginning approximately five ordered items prior to the
starting cut.

8. Once they have placed the second bookmark, they will proceed to the Meets the
Standards/Exceeds the Standards cut, again beginning approximately five ordered items prior to
the starting cut.

9. After they have placed all three bookmarks, have panelists fill out their rating forms. Ask them
to carefully inspect their rating forms to ensure they are filled out properly.
a. The round number and ID number must be filled in.
b. The item numbers identifying each cut score must be adjacent.
c. Check each panelist’s rating form before you allow them to leave for a short break.
d. When all the rating forms have been collected, the group will take a break. Immediately
bring the rating forms to the R&A work room for tabulation.

Tabulation of Round 1 Results
Tabulation of Round 1 results will be completed by R&A as quickly as possible after receipt of the
rating forms.
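The tabulation can be sketched as a simple averaging of bookmark placements across panelists. The exact R&A procedure is not specified in this document; the placements below are invented for illustration.

```python
from statistics import mean

# Hypothetical Round 1 bookmark placements (ordered-item numbers),
# one list per cut point, one entry per panelist.
round1 = {
    "DNM/PM": [6, 7, 7, 8, 6],
    "PM/M":   [14, 15, 16, 14, 15],
    "M/E":    [51, 50, 52, 52, 51],
}

# Room average placement for each cut, rounded to the nearest
# ordered item for presentation back to the panel.
averages = {cut: round(mean(marks)) for cut, marks in round1.items()}
print(averages)  # -> {'DNM/PM': 7, 'PM/M': 15, 'M/E': 51}
```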

Round 2
Overview of Round 2: In Round 2, the panelists will discuss their Round 1 ratings as a whole group and
revise their ratings on the basis of that discussion. They will discuss their ratings in the context of the
ratings made by other members of the group. The panelists with the highest and lowest ratings should
comment on why they gave the ratings they did. The group should get a sense of how much variation
there is in the ratings. Panelists should also consider the question, “How tough or easy a panelist are
you?” The purpose here is to allow panelists to examine their individual expectations (in terms of their
experiences) and to share these expectations and experiences in order to attain a better understanding of
how their experiences impact their decision-making.

To aid with the discussion, a psychometrician will present the group with the room average bookmark
placements from Round 1, as well as impact data. The impact data consist of the approximate
percentage of students statewide that would be classified into each achievement level category based on
the room average bookmark placements from Round 1.
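The impact computation described above can be sketched as follows. The function name, the score distribution, and the cut scores are all invented for illustration; they are not the operational values.

```python
def impact(scores, cuts, labels):
    """Percentage of students classified into each achievement level.

    cuts: ascending raw-score cut points; a student scoring at or
        above a cut is classified into the next level up.
    labels: level names from lowest to highest (len(cuts) + 1 of them).
    """
    counts = [0] * len(labels)
    for score in scores:
        level = sum(score >= cut for cut in cuts)  # number of cuts met
        counts[level] += 1
    return {label: 100 * n / len(scores) for label, n in zip(labels, counts)}

# Invented raw scores for ten students and hypothetical cut points.
levels = ["Does Not Meet", "Partially Meets", "Meets", "Exceeds"]
scores = [5, 12, 18, 20, 25, 33, 40, 47, 55, 60]
# Yields 20.0, 30.0, 30.0, and 20.0 percent for the four levels.
print(impact(scores, cuts=[17, 30, 50], labels=levels))
```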

Once panelists have reviewed and discussed their bookmark placements, they will be given the
opportunity to change or revise their Round 1 ratings.

Activities:
1. Make sure panelists have their ordered item booklets, item maps, and achievement level
definitions. Pass out one rating form to each panelist.

2. Have panelists write round number 2 and their ID number on the rating form.

3. A psychometrician will present and explain the following information to the panelists:
a. The average bookmark placement for the whole group based on the Round 1 ratings.
Based on their Round 1 ratings, panelists will know where they fall relative to the group
average. This information is useful so that panelists get a sense if they are more stringent
or more lenient than other panelists.

b. Impact data, showing the approximate percentage of students statewide that would be
classified into each achievement level category based on the room average bookmark
placements.

4. Provide an overview of Round 2. Paraphrase the following:
a. As in Round 1, the primary purpose is to place bookmarks where you feel the
achievement levels are best distinguished, considering the additional information and
further discussion.

b. Each panelist needs to base his/her judgments on his/her experience with the content area,
understanding of students, the definitions of the borderline students generated previously,
discussions with other panelists, and the knowledge, skills, and abilities required to answer
each item.

5. Panelists should be given a few minutes to review the bookmark placements based on the room
average cut points from Round 1.

6. Once they have reviewed the materials, the panelists will discuss their Round 1 ratings,
beginning with the first cut point.

a. The discussion should focus on differences in where individual panelists placed their cut
points.

b. Panelists should be encouraged to listen to their colleagues as well as express their own
points of view.

c. If the panelists hear a logic/rationale/argument that they did not consider and that they
feel is compelling, then they may adjust their ratings to incorporate that information.

d. On the basis of the discussions and the feedback presented, panelists should make a
second round of ratings.

e. When placing their Round 2 bookmarks, panelists should not feel compelled to change
their ratings.

f. The group does not have to achieve consensus. If panelists honestly disagree, that is fine; no
panelist should feel compelled or coerced into making a rating they disagree with.

Encourage the panelists to use the discussion and feedback to assess how stringent or lenient
a judge they are. If a panelist is consistently higher or lower than the group, they may have a
different understanding of the borderline student than the rest of the group, or a different
understanding of the Achievement Level Definitions, or both. It is O.K. for panelists to
disagree, but that disagreement should be based on a common understanding of the
Achievement Level Definitions.

7. When the group has completed their second ratings, collect the rating forms, carefully
inspecting them to ensure they are filled out properly.
a. The round number and panelist ID number must be filled in.
b. The item numbers identifying each cut score must be adjacent.
c. Provide the completed rating forms to R&A. The panelists will not see the results from
this round.

Recommendations for Enhancements or Modifications to Achievement Level
Definitions
Upon completion of Round 2, ask the panelists to review the Achievement Level Definitions and the
items that fall into each level according to the final recommended cut points. Working as a group, the
panelists will then compile a list of recommended modifications or enhancements to the Achievement
Level Definitions to reflect the specific KSAs required to successfully complete the items in each
achievement level. Panelists may also recommend edits that reflect skills that are measured on the test
but don’t appear in the KSAs, or vice versa. Make sure panelists know that these are recommendations
for future consideration by the Department of Education and that they may not all be implemented.

Complete Evaluation Form
Upon completion of Round 2, have panelists fill out the evaluation form. Emphasize that their honest
feedback is important.

ACHIEVEMENT LEVEL DEFINITIONS

Purpose: Achievement level definitions describe the quality of a student’s responses on state-level
assessments in relation to the mathematics standards for achieving Maine’s Learning Results.

Maine state-level assessments measure the knowledge and skills of students by sampling identified standards
within mathematics at the grade level assessed. Evidence includes responses to a combination of multiple-choice
items and items requiring student-created responses in an “on demand” setting.

Achievement Levels:

Exceeds the Standards - The student’s work demonstrates in-depth understanding of essential concepts in
mathematics, including the ability to make multiple connections among central ideas. The student’s responses
demonstrate the ability to synthesize information, analyze and solve difficult or unfamiliar problems, and apply
complex concepts.

Meets the Standards - The student’s work demonstrates an understanding of essential concepts in mathematics,
including the ability to make connections among central ideas. The student’s responses demonstrate the ability to
reason, analyze and solve problems, and apply concepts.

Partially Meets the Standards - The student’s work demonstrates incomplete understanding of essential
concepts in mathematics and inconsistent connections among central ideas. The student’s responses demonstrate
some ability to analyze and solve problems and apply concepts.

Does Not Meet the Standards - The student’s work demonstrates limited understanding of essential concepts
in mathematics and infrequent or inaccurate connections among central ideas. The student’s responses
demonstrate minimal ability to solve problems and apply concepts.

SAMPLE RATING FORM
Maine High School Assessment
Mathematics Rating Form

Round _________________

ID ____________________

Does Not Meet the         Partially Meets the          Meets the                  Exceeds the
Standards                 Standards                    Standards                  Standards

Ordered Item              Ordered Item                 Ordered Item               Ordered Item
Numbers                   Numbers                      Numbers                    Numbers

First          Last       First          Last         First         Last         First          Last
1              ___        ___            ___          ___           ___          ___            72

Directions: Please enter the range of ordered item numbers that fall into each achievement level category
according to where you placed your cut points.

Note: The ranges must be adjacent to each other. For example: Does Not Meet the Standards: 1-18, Partially
Meets the Standards: 19-36, Meets the Standards: 37-54, Exceeds the Standards: 55-72.
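The adjacency requirement in the note can be sketched as a validation check. This is an illustrative sketch; the function name and defaults are hypothetical, with the 72-item range taken from the form above.

```python
def ranges_valid(ranges, first_item=1, last_item=72):
    """Check that the achievement-level ranges on a rating form are
    adjacent and together cover ordered items first_item..last_item.

    ranges: (first, last) tuples, lowest achievement level first.
    """
    if ranges[0][0] != first_item or ranges[-1][1] != last_item:
        return False
    for (_, prev_last), (next_first, _) in zip(ranges, ranges[1:]):
        # Each level must begin exactly one item after the previous ends.
        if next_first != prev_last + 1:
            return False
    return True

# The example ranges from the directions pass the check:
print(ranges_valid([(1, 18), (19, 36), (37, 54), (55, 72)]))  # -> True
# A gap between levels (item 19 is skipped) fails it:
print(ranges_valid([(1, 18), (20, 36), (37, 54), (55, 72)]))  # -> False
```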

College Board / Measured Progress                                     21                                             MeCAS 2006-07 Technical Manual
EVALUATION RESULTS
Maine High School Assessment (MHSA) Mathematics Standards Validation

NOTE: 14 participants attended the meeting. Only 13 evaluation forms were collected.

Evaluation of the standards validation procedures for the MHSA in mathematics

1.       How would you rate the training you received? (Circle one)

Appropriate                    13
Somewhat Appropriate           0
Not Appropriate                0

2.       How clear were the achievement level definitions? (Circle one)

Very Clear                 2
Clear                      7
Somewhat Clear             4
Not Clear                  0

3.       How do you feel about the length of time allotted at this meeting for validating achievement
standards? (Circle one)

Too little time            1
Too much time              0

4.       What factors influenced the standards you validated? (For each letter, please circle the most
appropriate rating from 1=Not at all Influential to 5=Very Influential)

A.       The achievement level definitions

Not Influential           1    2     3       4     5        Very Influential

0    0     2       9     2

B.       The assessment items

Not Influential            1    2     3       4     5        Very Influential

0    0     1       6     6

C.       Other panelists

Not Influential            1    2     3       4     5        Very Influential

0    3     3       5     2
D.       My experience in the field

Not Influential            1    2       3       4       5      Very Influential

0    0       2       6       5

Other factors (write-in responses, rated on the same scale):

Not Influential            1    2       3       4       5      Very Influential

The data set. Knowing which one was #1 and which one was #72.               X
Starting cut points                              X

5.       How could the standards validation process have been improved?

•   I would have liked being able to get a little more immersed in the process to have a
    more comfortable grasp.
•   You did an excellent job facilitating discussion, although we got off topic a couple of
    times. Keeping on task would be the area to be improved.
•   Excellent process overall.
•   I enjoyed the process. I really don’t know what if anything should have been different.
•   Don’t cram things in after scheduled time to leave. Developed student descriptions
    (those that just partially met, just met, and just exceeded) all on first day so they could
    be typed and used for everyone to get when bookmarking.
•   Clock in room to help keep focus on work. Tricky to understand what to do when gray
    area came to put above or below or in the middle.
•   Process good. Room too cold (wrong place for this comment).
•   No starting cut points (maybe a suggested range to look at).
•   I thought it was a great experience.

For each statement below, please circle the rating that best represents your judgment.

6.       Reviewing the assessment materials was:

Not Useful                 1    2       3       4       5        Very Useful

0    0       0       3       10

7.       The discussion with other panelists was:

Not Useful                 1    2       3       4       5        Very Useful

0    0       1       5       7

8.       The standards validation task was:

Not Clear                  1   2       3      4       5     Very Clear

0   0       1      9       3

9.       My level of confidence in validating cut-points is:

Very Low                   1   2       3      4       5     Very High

0   0       2      8       3

•   Work needs to be done to break out the performances of students by individual learning result
    so educators can identify the areas they need to do remediation with in their lead curriculum.
•   I appreciate the opportunity to not only be a member of this committee in making
    recommendations to the DOE regarding cut-points, but also in the knowledge it gives me to
    help better prepare students and certainly colleagues.
•   I would love to be asked back. I feel that this process will help me view my teaching,
    curriculum, and assessments with a different perspective.
•   Thanks for inviting me. Very interesting process and product time well spent.
•   I felt validated and comfortable to state my views in the group.
•   Will be helpful when we develop a way to make SAT/Augment a more formative assessment
    (especially in special levels of each standard).
•   Thanks to all involved for a well planned event.

PANELISTS

Panelist                         Qualification                                Affiliation

Fred Brown                  Instructor of Mathematics and Head of       University of Maine at Augusta
                            the Math Tutoring Lab

Susan Hamilton              Math Dept Chair                             Ellsworth High School

Will Brooks                 Teacher                                     Waldo County Technical Center

Richard Paul                Associate Professor of Mathematics and      University of Maine at Machias
                            Mathematics Education

Audrey Carter               Teacher                                     Mt. Desert Island HS

Bob Osterblom               Teacher                                     Fort Fairfield Middle/High School

Catherine Menard            Curriculum Coordinator                      MSAD 31

Crystal St. Onge            Teacher                                     Winslow High School

Gloria Doody Powers         Teacher                                     Houlton High School

Jodi Abbott                 Teacher                                     Skowhegan Area High School

Judith E. Clark             Teacher                                     Woodland Junior/Senior High School

Mark D. Snow                Teacher                                     Lake Region High School

Susan Finch                 Teacher                                     A. R. Gould School

Valerie E. Brown            Mathematics Dept. Chair                     Edward Little High School

APPENDIX D

REPORT
Alignment Analysis of Secondary
Mathematics Standards and an
Augmented SAT Assessment

Maine

Norman L. Webb

June 15, 2007
This study was conducted for the State of Maine. An Alignment Analysis Institute was
held May 11, 2007, to analyze the secondary mathematics standards and an augmented
mathematics SAT Reasoning Test. This report consists of a description of the four criteria
used to judge the alignment between Maine Content Standards for mathematics for high
school and one form of the SAT and its augmentation. This report includes tables listing
the results of eight reviewers’ coding of the assessments and standards.
Acknowledgements

Reviewers:

Kristen Bieda                             WI
Mathew Felton                             WI
Jennifer Ruef                             WI
Cheryl Rose                               ME
David Bowie                               ME
Monique Culbertson                        ME
Margaret Moore                            ME

The State of Maine funded this analysis. Dan Hupp, Mathematics Specialist and SAT
Initiative Coordinator, and Valerie Seaberg, Coordinator of the Maine Educational
Assessment, from the Maine Department of Education were the main contacts for the
Department and oversaw the coordination of the study.


Executive Summary

Introduction

Alignment Criteria Used for This Analysis
      Categorical Concurrence
      Depth-of-Knowledge Consistency
      Range-of-Knowledge Correspondence
      Balance of Representation
      Source of Challenge

Findings
      Standards
      Alignment of Curriculum Standards and Assessments
      Reviewers’ Notes
      Reliability Among Reviewers

Summary

References

Appendix A
Maine Mathematics Standards and Group Consensus Values Grade 11

Appendix B
Data Analysis Tables Maine Mathematics Standards and Augmented SAT Assessment

Appendix C
Standards to Augmented SAT Assessment

Appendix D
Debrief Summary Notes Maine Grade 11 Mathematics Standards to Augmented SAT
Assessment

Executive Summary

On May 11, 2007, eight reviewers analyzed the alignment between the Maine
Mathematics Standards (Maine Learning Results for secondary education) and an
augmented SAT assessment. The reviewers included mathematics education content
experts, mathematics teachers, mathematics education graduate students, and district
mathematics coordinators. Four of the reviewers were from Maine and four were from
Wisconsin. The reviewers were in three different locations and interacted with each other
via a telephone conference call.

Overall, the alignment between the Maine standards and the augmented SAT was
found to be acceptable. The 72-item augmented assessment had 12 or more items for each
of the four clusters and satisfied the Categorical Concurrence criterion. The augmented
assessment and three of the four clusters met the Depth-of-Knowledge Consistency
criterion. The DOK criterion was only weakly met for Cluster 3 (Mathematical Decision
Making); however, only one item would need to be replaced to reach an acceptable level
on this criterion. The assessment had items distributed among the performance
indicators so as to satisfy the Range-of-Knowledge Correspondence criterion for all four
clusters. Two standards or performance indicators were overemphasized, one for Cluster 1
and one for Cluster 3. However, since the other three alignment criteria were acceptably
met for these two clusters, these small imbalances were not considered critical alignment issues.

A majority of reviewers coded 11 of the 72 items (15%) to standards within the
clusters rather than performance indicators because the items did not precisely match any
performance indicator. This does indicate there were items on the assessment that only
generally targeted the Maine expectations. However, since the Range criterion was
successfully met for all four clusters, having some items that did not precisely match
some performance indicators is not considered a serious alignment issue. Considering all
factors, this analysis indicates that the Maine mathematics standards and the augmented
SAT assessment are aligned.

Alignment Analysis of Secondary Mathematics Standards and an
Augmented SAT Assessment
Maine

Norman L. Webb

Introduction

The alignment of expectations for student learning with assessments for
measuring students’ attainment of these expectations is an essential attribute for an
effective standards-based education system. Alignment is defined as the degree to which
expectations and assessments are in agreement and serve in conjunction with one another
to guide an education system toward students learning what they are expected to know
and do. As such, alignment is a quality of the relationship between expectations and
assessments, not an attribute of either of these two system components. Alignment
describes the match between expectations and assessment that can be legitimately
improved by changing either student expectations or the assessments. As a relationship
between two or more system components, alignment is determined by using the multiple
criteria described in detail in a National Institute for Science Education (NISE) research
monograph, Criteria for Alignment of Expectations and Assessments in Language Arts
and Science Education (Webb, 1997).

An Alignment Analysis Institute was conducted May 11, 2007. Eight reviewers
were in three locations—Augusta, Maine; Madison, Wisconsin; and Washington, D.C.
Four of the reviewers were from Maine and four were from Wisconsin. The reviewers
included mathematics education content experts, mathematics high school teachers,
mathematics education graduate students, and district mathematics coordinators.
Reviewers were trained in the alignment process and completed the analysis of the Maine
standards over a telephone conference call. Reviewers used the Web-based Alignment
Tool (WAT) to enter data on the assessment and standards.

For the purposes of this analysis, we have employed the convention of standards
and objectives to describe two levels of expectations for what students are to know and
do. Standard as used here refers to the Maine Learning Results secondary (grades 9–12)
content standards organized into four clusters. Each cluster was delineated by two or
three mathematics standards (A through K). Each standard comprised one to five
performance indicators, or objectives. It is assumed that the performance indicators or
objectives were intended to span the content of the standards under which they fall. The
clusters, standards, and objectives are reproduced in Appendix A.

As part of the alignment process training, reviewers were trained to identify the
depth-of-knowledge (DOK) of the objectives and assessment items. This training
included reviewing the definitions of the four DOK levels for mathematics and reviewing
examples of each. Then the reviewers participated in 1) a consensus process to determine
the depth-of-knowledge levels of the Maine content objectives and 2) individual analyses
of the assessment items of each of the assessments.

To derive the results on the degree of agreement between the Maine mathematics
standards and the augmented SAT, the reviewers’ responses were averaged. Any variance
among reviewers is considered legitimate, with the true depth-of-knowledge level for the
item falling somewhere between two or more assigned values. Such variation could
signify a lack of clarity in how the objectives were written, the robustness of an item that
can legitimately correspond to more than one objective, and/or a depth of knowledge that
falls in between two of the four defined levels. Reviewers were allowed to identify one
assessment item as corresponding to up to three objectives—one primary hit (objective)
and up to two secondary hits. Reviewers were instructed to use multiple hits for one item
if appropriate. Reviewers could only code one depth-of-knowledge level to each
assessment item, even if the item corresponded to more than one objective.

Reviewers were instructed to focus primarily on the alignment between the state
standards and the augmented SAT. The augmented SAT included the complete
mathematics section of the SAT (54 items) and 18 items developed specifically for
Maine. Reviewers were encouraged to offer their opinions on the quality of the standards,
or of the assessment activities/items, by writing a note about the item. Reviewers could
also indicate whether there was a source-of-challenge issue with the item—i.e., a problem
with the item that might cause the student who knows the material to give a wrong
answer, or enable someone who does not have the knowledge being tested to answer the
item correctly. For example, a mathematics item that involves an excessive amount of
reading may represent a source-of-challenge issue because the skill required to answer is
more a reading skill than a mathematics skill. Two reviewers did identify one item as
having a source-of-challenge issue. Reviewers wrote several notes clarifying their
rationale for their coding. Reviewers were required to write a note if they coded an item to a
“generic objective” (an item that does not fit any objective, but does match a standard or
cluster). In many cases, reviewers’ notes referenced a difficulty in finding a precise
match between an assessment item and a performance indicator.

The results produced from the institute pertain only to the issue of agreement
between the Maine state standards and the augmented SAT assessment. Note that this
alignment analysis does not serve as external verification of the general quality of the
state’s standards or the augmented SAT. Rather, only the degree of alignment is
discussed in these results. The averages of the reviewers’ coding were used to determine
whether the alignment criteria were met. When reviewers did vary in their judgments, the
averages lessened the error that might result from any one reviewer’s finding. Standard
deviations are reported, which give one indication of the variance among reviewers.
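The averaging step described above is simple to make concrete. A minimal sketch, with reviewer codes that are invented for illustration and not taken from the study's data:

```python
from statistics import mean, stdev

# Hypothetical DOK levels assigned to a single item by eight reviewers
codes = [2, 2, 3, 2, 2, 3, 2, 2]

avg = mean(codes)   # average used when judging the alignment criteria
sd = stdev(codes)   # sample standard deviation, one indication of reviewer variance

print(round(avg, 2), round(sd, 2))  # 2.25 0.46
```

An average of 2.25 would be read as a depth of knowledge falling between Levels 2 and 3, exactly the kind of legitimate between-level variation the report describes.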

To report on the results of an alignment study of Maine’s Learning Results and
an augmented SAT, the study addressed specific criteria related to the content agreement
between the standards and the assessment: categorical concurrence, depth-of-knowledge
consistency, range-of-knowledge correspondence, and balance of representation.

Alignment Criteria Used for This Analysis

This analysis, which judged the alignment between standards and assessments on
the basis of four criteria, also reported on the quality of items by identifying items with
sources of challenge and other issues. For each alignment criterion, an acceptable level
was defined by what would be required to assure that a student had met the standards.

Categorical Concurrence

An important aspect of alignment between standards and assessments is whether
both address the same content categories. The categorical-concurrence criterion provides
a very general indication of alignment if both documents incorporate the same content.
The criterion of categorical concurrence between standards and assessment is met if the
same or consistent categories of content appear in both documents. This criterion was
judged by determining whether the assessment included items measuring content from
each standard. The analysis assumed that the assessment had to have at least six items
measuring content from a standard in order for an acceptable level of categorical
concurrence to exist between the standard and the assessment. The number of items, six,
is based on estimating the number of items that could produce a reasonably reliable
subscale for estimating students’ mastery of content on that subscale. Of course, many
factors have to be considered in determining what a reasonable number is, including the
reliability of the subscale, the mean score, and cutoff score for determining mastery.
Using a procedure developed by Subkoviak (1988) and assuming that the cutoff score is
the mean and that the reliability of one item is .1, it was estimated that six items would
produce an agreement coefficient of at least .63. This indicates that about 63% of the
group would be consistently classified as masters or nonmasters if two equivalent test
administrations were employed. The agreement coefficient would increase to .77 if the
cutoff score were raised to one standard deviation above the mean, and to .88 with a
cutoff score of 1.5 standard deviations above the mean. Usually, states do not report
student results by standards, or require students to achieve a specified cutoff score on
subscales related to a standard. If a state did do this, then the state would seek a higher
agreement coefficient than .63. Six items were assumed as a minimum for an assessment
measuring content knowledge related to a standard and as a basis for making some
decisions about students’ knowledge of that standard. If the mean for six items is 3 and
one standard deviation is one item, then a cutoff score set at 4 would produce an
agreement coefficient of .77. Any fewer items with a mean of one-half of the items would
require a cutoff that would only allow a student to miss one item. This would be a very
stringent requirement, considering a reasonable standard error of measurement on the
subscale.
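The agreement-coefficient figures above can be sanity-checked numerically. The sketch below is an assumption-laden simplification, not Subkoviak's actual procedure: it lengthens the single-item reliability of .1 to six items with the Spearman-Brown formula and then models two parallel administrations as bivariate normal scores correlated at that reliability.

```python
import numpy as np

def six_item_reliability(item_reliability=0.1, n_items=6):
    # Spearman-Brown prophecy formula for a test of n_items parallel items
    return n_items * item_reliability / (1 + (n_items - 1) * item_reliability)

def agreement_coefficient(reliability, cutoff_sd, n_sims=500_000, seed=0):
    # Simulate two parallel administrations as standardized scores that
    # correlate at the test reliability; count examinees classified the
    # same way (master/nonmaster) on both.
    rng = np.random.default_rng(seed)
    cov = [[1.0, reliability], [reliability, 1.0]]
    x, y = rng.multivariate_normal([0.0, 0.0], cov, size=n_sims).T
    return float(((x >= cutoff_sd) == (y >= cutoff_sd)).mean())

rel = six_item_reliability()                       # 0.40
print(round(agreement_coefficient(rel, 0.0), 2))   # cutoff at the mean -> 0.63
```

Raising the cutoff to 1 or 1.5 standard deviations pushes the simulated agreement upward, in the direction of the .77 and .88 the report cites, though this normal-theory shortcut does not reproduce Subkoviak's exact values.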

Depth-of-Knowledge Consistency

Standards and assessments can be aligned not only on the category of content
covered by each, but also on the basis of the complexity of knowledge required by each.
Depth-of-knowledge consistency between standards and assessment indicates alignment
if what is elicited from students on the assessment is as demanding cognitively as what
students are expected to know and do as stated in the standards. For consistency to exist
between the assessment and the standard, as judged in this analysis, at least 50% of the
items corresponding to an objective had to be at or above the level of knowledge of the
objective. This 50% cutoff, a conservative point, is based on the assumption that a minimal
passing score for any one standard of 50% or higher would require the student to
successfully answer at least some items at or above the depth-of-knowledge level of the
corresponding objectives. For example, assume an assessment included six items related
to one standard and students were required to answer correctly four of those items to be
judged proficient—i.e., 67% of the items. If three, 50%, of the six items were at or above
the depth-of-knowledge level of the corresponding objectives, then to achieve a proficient
score would require the student to answer correctly at least one item at or above the
depth-of-knowledge level of one objective. Some leeway was used in the analysis on this
criterion. If a standard had between 40% and 50% of items at or above the
depth-of-knowledge levels of the objectives, then it was reported that the criterion was
“weakly” met.
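The decision rule in the preceding paragraph reduces to a short computation. A sketch under my own naming; the study's Web-based Alignment Tool presumably does something equivalent, but this is not its code:

```python
def dok_consistency(item_vs_objective_dok):
    """Apply the report's rule to the items coded to one standard.

    item_vs_objective_dok: (item_dok, objective_dok) pairs, one per item.
    """
    at_or_above = sum(item >= obj for item, obj in item_vs_objective_dok)
    pct = 100.0 * at_or_above / len(item_vs_objective_dok)
    if pct >= 50:
        return "YES"   # criterion met
    if pct >= 40:
        return "WEAK"  # criterion weakly met
    return "NO"

# Three of six items at or above the objective's DOK level: exactly 50%
print(dok_consistency([(2, 2), (1, 2), (3, 2), (2, 3), (1, 2), (2, 2)]))  # YES
```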

Interpreting and assigning depth-of-knowledge levels to both objectives within
standards and assessment items is an essential requirement of alignment analysis. These
descriptions help to clarify what the different levels represent in mathematics:

Level 1 (Recall) includes the recall of information such as a fact, definition, term,
or a simple procedure, as well as performing a simple algorithm or applying a formula.
That is, in mathematics, a one-step, well-defined, and straight algorithmic procedure
should be included at this lowest level. Other key words that signify a Level 1 include
“identify,” “recall,” “recognize,” “use,” and “measure.” Verbs such as “describe” and
“explain” could be classified at different levels, depending on what is to be described and
explained.

Level 2 (Skill/Concept) includes the engagement of some mental processing
beyond a habitual response. A Level 2 assessment item requires students to make some
decisions as to how to approach the problem or activity, whereas Level 1 requires
students to demonstrate a rote response, perform a well-known algorithm, follow a set
procedure (like a recipe), or perform a clearly defined series of steps. Keywords that
generally distinguish a Level 2 item include “classify,” “organize,” “estimate,” “make
observations,” “collect and display data,” and “compare data.” These actions imply more
than one step. For example, to compare data requires first identifying characteristics of
the objects or phenomenon and then grouping or ordering the objects. Some action verbs,
such as “explain,” “describe,” or “interpret,” could be classified at different levels
depending on the object of the action. For example, interpreting information from a
simple graph, which requires reading information from the graph, is also at Level 2. Interpreting
information from a complex graph that requires some decisions on what features of the
graph need to be considered and how information from the graph can be aggregated is at
Level 3. Level 2 activities are not limited to just number skills, but can involve
visualization skills and probability skills. Other Level 2 activities include noticing and
describing non-trivial patterns, explaining the purpose and use of experimental
procedures; carrying out experimental procedures; making observations and collecting
data; classifying, organizing, and comparing data; and organizing and displaying data in
tables, graphs, and charts.

Level 3 (Strategic Thinking) requires reasoning, planning, using evidence, and a
higher level of thinking than the previous two levels. In most instances, requiring
students to explain their thinking is at Level 3. Activities that require students to make
conjectures are also at this level. The cognitive demands at Level 3 are complex and
abstract. The complexity does not result from the fact that there are multiple answers, a
possibility for both Levels 1 and 2, but because the task requires more demanding
reasoning. An activity, however, that has more than one possible answer and requires
students to justify the response they give would most likely be at Level 3.
Other Level 3 activities include drawing conclusions from observations; citing evidence
and developing a logical argument for concepts; explaining phenomena in terms of
concepts; and using concepts to solve problems.

Level 4 (Extended Thinking) requires complex reasoning, planning, developing,
and thinking, most likely over an extended period of time. The extended time period is
not a distinguishing factor if the required work is only repetitive and does not require
applying significant conceptual understanding and higher-order thinking. For example, if
a student has to take the water temperature from a river each day for a month and then
construct a graph, this would be classified as at Level 2. However, if the student is to
conduct a river study that requires taking into consideration a number of variables, this
would be at Level 4. At Level 4, the cognitive demands of the task should be high and the
work should be very complex. Students should be required to make several
connections—relate ideas within the content area or among content areas—and have to
select one approach among many alternatives on how the situation should be solved, in
order to be at this highest level. Level 4 activities include developing and proving
conjectures; designing and conducting experiments; making connections between a
finding and related concepts and phenomena; combining and synthesizing ideas into new
concepts; and critiquing experimental designs.

Range-of-Knowledge Correspondence

For standards and assessments to be aligned, the breadth of knowledge required
on both should be comparable. The range-of-knowledge criterion is used to judge
whether a comparable span of knowledge expected of students by a standard is the same
as, or corresponds to, the span of knowledge that students need in order to correctly
answer the assessment items/activities. The criterion for correspondence between span of
knowledge for a standard and an assessment considers the number of objectives within
the standard with one related assessment item/activity. Fifty percent of the objectives for
a standard had to have at least one related assessment item in order for the alignment on
this criterion to be judged acceptable. This level is based on the assumption that students’
knowledge should be tested on content from over half of the domain of knowledge for a
standard. This assumes that each objective for a standard should be given equal weight.
Depending on the balance in the distribution of items and the necessity for having a low
number of items related to any one objective, the requirement that assessment items need
to be related to more than 50% of the objectives for a standard increases the likelihood
that students will have to demonstrate knowledge on more than one objective per
standard to achieve a minimal passing score. As with the other criteria, a state may
choose to make the acceptable level on this criterion more rigorous by requiring an
assessment to include items related to a greater number of the objectives. However, any
restriction on the number of items included on the test will place an upper limit on the
number of objectives that can be assessed. Range-of-knowledge correspondence is more
difficult to attain if the content expectations are partitioned among a greater number of
standards and a large number of objectives. If 50% or more of the objectives for a
standard had a corresponding assessment item, then the range-of-knowledge criterion was
met. If between 40% and 50% of the objectives for a standard had a corresponding
assessment item, the criterion was “weakly” met.
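The same thresholds apply here, counted over objectives rather than items. A minimal sketch with names of my own choosing:

```python
def range_of_knowledge(n_objectives_with_items, n_objectives_total):
    # Report's rule: at least 50% of a standard's objectives must have one
    # or more corresponding items; 40-50% counts as "weakly" met.
    pct = 100.0 * n_objectives_with_items / n_objectives_total
    if pct >= 50:
        return "YES"
    if pct >= 40:
        return "WEAK"
    return "NO"

print(range_of_knowledge(3, 5))  # 60% of objectives hit -> YES
```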

Balance of Representation

In addition to comparable depth and breadth of knowledge, aligned standards and
assessments require that knowledge be distributed equally in both. The range-of-
knowledge criterion only considers the number of objectives within a standard hit (a
standard with a corresponding item); it does not take into consideration how the hits (or
assessment items/activities) are distributed among these objectives. The balance-of-
representation criterion is used to indicate the degree to which one objective is given
more emphasis on the assessment than another. An index is used to judge the distribution
of assessment items. This index only considers the objectives for a standard that have at
least one hit—i.e., one related assessment item per objective. The index is computed by
considering the difference in the proportion of objectives and the proportion of hits
assigned to the objective. An index value of 1 signifies perfect balance and is obtained if
the hits (corresponding items) related to a standard are equally distributed among the
objectives for the given standard. Index values that approach 0 signify that a large
proportion of the hits are on only one or two of all of the objectives hit. Depending on the
number of objectives and the number of hits, a unimodal distribution (most items related
to one objective and only one item related to each of the remaining objectives) has an
index value of less than .5. A bimodal distribution has an index value of around .55 or .6.
Index values of .7 or higher indicate that items/activities are distributed among all of the
objectives at least to some degree (e.g., every objective has at least two items); .7 is used
as the acceptable level on this criterion. Index values between .6 and .7 indicate the
balance-of-representation criterion has only been “weakly” met.
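In Webb's methodology this index is commonly computed as 1 − (Σ|1/O − I(k)/H|)/2, where O is the number of objectives hit, H the total number of hits, and I(k) the hits on objective k. The sketch below assumes that formula (the function name is mine):

```python
def balance_index(hits_per_objective):
    # hits_per_objective: hit counts for the objectives of one standard
    # that received at least one hit (objectives with zero hits are
    # excluded by definition).
    o = len(hits_per_objective)
    h = sum(hits_per_objective)
    return 1 - sum(abs(1 / o - k / h) for k in hits_per_objective) / 2

print(balance_index([4, 4, 4]))                    # evenly distributed -> 1.0
print(round(balance_index([10, 1, 1, 1, 1]), 2))   # unimodal -> 0.49, below .5
```

These two cases match the report's reading of the index: equal distribution yields 1, and a strongly unimodal distribution falls under .5.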

Source-of-Challenge

The source-of-challenge criterion is only used to identify items on which the
major cognitive demand is inadvertently placed and is other than the targeted
mathematics skill, concept, or application. Cultural bias or specialized knowledge could
be reasons for an item to have a source-of-challenge problem. Such item characteristics
may result in some students not answering an assessment item, or answering an
assessment item incorrectly, or at a lower level, even though they possess the
understanding and skills being assessed.

Findings

Standards

Eight reviewers participated in the depth-of-knowledge (DOK) level consensus
process for the clusters, standards, and performance indicators for the Maine mathematics
standards. A summary of their deliberations is presented in Table 1. The complete group
consensus values for each standard and objective can be found in Appendix A. The group
of eight reviewers found only one objective judged to have DOK level 1. Two-thirds of
the objectives were judged to have a DOK level 2. Nine objectives (29%) had a DOK
level 3 (strategic reasoning). The complexity of the performance indicators was
reasonable for the high school level, with nearly all of the indicators at a DOK level of 2
or higher and nearly one-third at DOK level 3.

Table 1
Percent of Objectives by Depth-of-Knowledge (DOK) Levels for Secondary Standards,
Maine Alignment Analysis for Mathematics

Cluster                                    Total Number    DOK      Number of       Percent within
                                           of Objectives   Level    Objectives      Cluster by Level
Cluster 1. Number and Operations                 8           2          7                 87
                                                             3          1                 12
Cluster 2. Shape and Size                        5           1          1                 20
                                                             2          3                 60
                                                             3          1                 20
Cluster 3. Mathematical Decision Making          8           2          4                 50
                                                             3          4                 50
Cluster 4. Patterns                             10           2          7                 70
                                                             3          3                 30
Total                                           31           1          1                  3
                                                             2         21                 67
                                                             3          9                 29

The reviewers were told that within each of the four clusters, the standards
were intended to fully span the content of the cluster and, in turn, each standard is
spanned by the performance indicators that fall under it. For this reason, the reviewers
only coded items to a standard or cluster if there was no performance indicator that the
item appeared to target. Two or more reviewers coded 34 of the 72 items to a generic
objective (performance indicator) (Table 2). These reviewers did not think these items
precisely matched a performance indicator, but did correspond to a standard. Such a large
number of items coded to generic objectives indicates that the match between the
assessment and the standards is at a general level rather than at the more specific
performance indicator level. The majority of reviewers coded a total of eleven items to
generic objectives. For example, Item 31 required knowledge of powers and factors.
Although this knowledge fits under 1.A. (Number Sense), reviewers found no
performance indicator that precisely required powers and factors. Other reviewers felt
Item 31 matched performance indicators under 4.H. that expected students to use
equations and expressions. But the majority of reviewers did not agree on a performance
indicator that best matched this item, which in itself indicates a lack of precise alignment
between the standards and the assessment in this case. Reviewers’ notes (Appendix C)
provide more explanation of why reviewers used generic objectives. The relatively large
number of items coded by two or more reviewers to generic objectives does indicate a
lack of a precise match between the assessment and the standards.

Table 2
Items Coded to Generic Objectives by More Than One Reviewer, Maine Alignment
Analysis for Mathematics with the Augmented SAT

Grade             Item             Generic Objective (Number of Reviewers)
11                 2                   3.D. (6)
11                 3                   2.E. (6)
11                 5                   2.E. (6)
11                 7                   2.E. (4)
11                 9                   1.B. (2)
11                11                   3.C. (2)
11                12                    1.I. (3)
11                13                   1.B. (3)
11                15                   1.B. (5)
11                16                   4.G. (6)
11                18                   2.E. (3)
11                19                   4.G. (3)
11                21               1.B. (4) 3.C. (3)
11                22                   2.E. (3)
11                24                   1.B. (2)
11                29                   1.B. (3)
11                30                   2.E. (4)
11                31                   1.A. (3)
11                35                   2.E. (3)
11                36                   4.G. (5)
11                37                   4.G. (3)
11                39                   1.B. (5)
11                 42                   1.B. (4)
11                 43                   1.B. (2)
11                 46                   2.F. (2)
11                 47                   1.B. (3)
11                 48               2.E. (2) 4.H. (2)
11                 50                   1.A. (5)
11                 51                   1.A. (2)
11                 52                   4.G. (3)
11                 53                   4.G. (6)
11                 56                   2.F. (3)
11                 60                   1.B. (2)
11                 75                   2.F. (2)

Alignment of Curriculum Standards and Assessments

The SAT mathematics assessment had 54 multiple-choice items, all assigned one
point (Table 3). The Maine Augmentation included 18 operational items and 17 field test
items. Only the operational items were included in this analysis. All 18 items in the
Maine Augmentation were multiple choice and each item was assigned a value of one
point. Thus, each of the 72 items was assigned an equal weight in the alignment analysis.

Table 3
Number of Items and Point Value for the Augmented SAT Assessment

Subject Area             Number of Items    Number of Multi-Point Items    Total Point Value
SAT Mathematics                54                         0                       54
Maine Augmentation             18                         0                       18
Total                          72                         0                       72

The results from the alignment analysis are presented in Table 4. “YES” indicates
that an acceptable level on the criterion was fully met. “WEAK” indicates that the
criterion was nearly met, within a margin that could simply be due to error in the system.
“NO” indicates that the criterion was not met by a noticeable margin. More detailed data
on each of the criteria are given in Appendix B in the first three tables for each of the
grade levels. The first table in Appendix B, Table 9.1, lists the average number of items
coded by the eight reviewers for each standard.

Reviewers could code an item as measuring content related to more than one
performance indicator. Reviewers were instructed to assign each item to a primary

9
indicator. If the item produced information about a student’s knowledge of more than one
indicator, then the reviewer could code an item to up to two secondary indicators.
Reviewers used, on average, 4 secondary hits in this analysis. In general, reviewers
found that each assessment item targeted only one performance indicator or standard.

In general, the alignment between the four Maine mathematics clusters of
standards and the augmented SAT assessment was reasonable. Each for the four clusters
of items had 12 or more corresponding items and met the acceptable level of the
Categorical Concurrence criterion of at least six items. The Depth-of-Knowledge
Consistency criterion was fully met by three of the four clusters. For these three clusters
over 50% of the items had a DOK level that was the same or higher than the DOK level
of the corresponding items. The criterion was only weakly met for Cluster 3 because, on
the average, 47% of the 12 items were judged as targeting content knowledge related to
this cluster that had a DOK level that was the same or higher than the DOK level of the
targeted performance indicator. The assessment and all four clusters successfully met the
Range-of-Knowledge Correspondence criterion. Two of the four clusters, 1 and 4, only
weakly met the acceptable level on the Balance of Representation criterion. This is not
considered a major alignment issue since the other three criteria were adequately attained
for these two clusters. A high proportion of the items judged to target Cluster 1
corresponded to the generic objective or standard 1.B. (Computation). For Cluster 4, a
greater number of items was judged to target performance indicator 4.H.3. (formulate and
solve equations and inequalities) than to any other performance indicator.

Overall, at the cluster level, the augmented SAT is considered aligned to the
Maine clusters of standards. Full alignment, using the acceptable levels in this analysis,
would be attained by replacing only one item. For a number of SAT assessment items,
reviewers did not find a precise match to a performance indicator; a majority of the
reviewers judged 11 items this way. However, the alignment is acceptable at the standard and
cluster level.

Table 4
Summary of Acceptable Levels on the Four Alignment Criteria for Maine Mathematics
Standards and Augmented SAT Assessment

Cluster                                            Alignment Criteria
                                  Categorical   Depth-of-Knowledge    Range of      Balance of
                                  Concurrence      Consistency        Knowledge    Representation
Cluster 1. Number and Operations      YES              YES               YES           WEAK
Cluster 2. Shape and Size             YES              YES               YES            YES
Cluster 3. Mathematical
  Decision Making                     YES              WEAK              YES            YES
Cluster 4. Patterns                   YES              YES               YES           WEAK


Reviewers were instructed to comment about any items that contained an
inappropriate source of challenge. Their comments can be found in Tables (grade).5 in
Appendix C. Two reviewers did identify an issue with the response to one item (Item 31).
Otherwise, reviewers did not find any serious flaws with any of the items.

Reviewers were encouraged to provide any other notes they took. These comments can be
found in Tables (grade).7 in Appendix C. After coding each grade-level assessment,
reviewers were also asked to
respond to five debriefing questions. All the comments made by the reviewers are given
in Appendix D. The notes in general offer an opinion on the item or give an explanation
of the reviewer’s coding. Reviewers did provide a number of explanations why they felt
an item did not precisely match a performance indicator. Reviewers’ debriefing
comments indicated that only a couple of performance indicators were not included on
the assessment, 3.J. and 4.K. More than one reviewer indicated that the DOK levels of the
items could be higher: more items needed to be at DOK level 3 or at the higher end of
DOK level 2. Also, reviewers noted that the wording of the standards could be improved;
the wording is too narrow, which makes an exact match with SAT items more difficult.

Reliability Among Reviewers

The overall intraclass correlation among the mathematics reviewers’ assignment
of DOK levels to items was reasonable for all the analyses (Table 5). An intraclass
correlation value greater than 0.8 generally indicates a high level of agreement among the
reviewers. The intraclass correlation for assigning DOK levels was high, .85. The
pairwise agreement in assigning items to clusters and performance indicators was only
moderate. These values are without any adjudication and are a little low compared to
other alignment studies. Time pressure and the remote coding of items prevented an
adjudication of reviewers’ coding differences.

Table 5
Intraclass and Pairwise Comparisons, Maine Mathematics Standards and Augmented
SAT Assessment

Subject Area      Intraclass Correlation   Pairwise Comparison   Pairwise Performance   Pairwise
                       DOK Levels              DOK Levels             Indicator          Cluster
Mathematics              .85                      .60                   .47                .78
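The pairwise figures in Table 5 are exact-agreement rates over all reviewer pairs. The sketch below (a minimal Python illustration; the function name and sample data are mine) computes that statistic. The report’s intraclass correlation comes from a variance-components model that is not reproduced here.

```python
from itertools import combinations

def pairwise_agreement(ratings):
    """Mean proportion of reviewer pairs giving identical codes per item.

    `ratings` is a list of per-item lists, one code per reviewer.
    (Illustrative statistic only; the report's intraclass correlation
    uses an ANOVA-based variance-components model instead.)
    """
    per_item = []
    for item in ratings:
        pairs = list(combinations(item, 2))
        agree = sum(1 for a, b in pairs if a == b)
        per_item.append(agree / len(pairs))
    return sum(per_item) / len(per_item)

# Two hypothetical items, four reviewers each:
# perfect agreement on the first, a 3-vs-1 split on the second.
print(pairwise_agreement([[2, 2, 2, 2], [1, 1, 1, 2]]))  # prints 0.75
```

With eight reviewers per item, as in this study, each item contributes 28 pairs to the statistic.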

Summary

On May 11, 2007, eight reviewers analyzed the alignment between the Maine
Mathematics Standards (Maine Learning Results for secondary education) and an
augmented SAT assessment. The reviewers included mathematics education content
experts, mathematics teachers, mathematics education graduate students, and district
mathematics coordinators. Four of the reviewers were from Maine and four were from
Wisconsin. The reviewers were in three different locations and interacted with each other
via a telephone conference call.

Overall the alignment between the Maine standards and the augmented SAT was
found to be acceptable. The 72-item augmented assessment had 12 or more items for each
of the four clusters and satisfied the Categorical Concurrence criterion. The augmented
assessment and three of the four clusters met the Depth-of-Knowledge Consistency
criterion. The DOK criterion was only weakly met for Cluster 3 (Mathematical Decision
Making). However, only one item would need to be replaced to have an acceptable level
on this criterion. The assessment had items that were distributed among the performance
indicators to satisfy the Range-of-Knowledge Correspondence for all four clusters. Two
standards or performance indicators were overemphasized, one for Cluster 1 and one for
Cluster 4. However, since the other three alignment criteria were acceptably met for these
two clusters, these small imbalances were not considered as critical alignment issues.

A majority of reviewers coded 11 of the 72 items (15%) to standards within the
clusters rather than performance indicators because the items did not precisely match any
performance indicator. This does indicate there were items on the assessment that only
generally targeted the Maine expectations. However, since the Range criterion was
successfully met for all four clusters, having some items that did not precisely match
some performance indicators is not considered a serious alignment issue. Considering all
factors, this analysis indicates that the Maine mathematics standards and the augmented
SAT assessment are aligned.


Appendix A

Maine Mathematics Standards and Group
Consensus DOK Values

Table 11.14
Group Consensus
Maine High School Mathematics Standards, Mathematics, Grade 11

Level    Description                                                                                                DOK
1.       Cluster 1. Number and Operations                                                                            2
1.A.     NUMBERS AND SENSE: Students will understand and demonstrate a sense of what numbers mean and how they are used.    2
1.A.1.   Describe the structure of the real number system and identify its appropriate applications and limitations.    2
1.A.2.   Explain what complex numbers (real and imaginary) mean and describe some of their many uses.                2
1.B.     COMPUTATION: Students will understand and demonstrate computation skills.                                   2
1.B.1.   Use various techniques to approximate solutions, determine the reasonableness of                            2
1.B.2.   Explain operations with number systems other than base ten.                                                 2
1.I.     DISCRETE MATHEMATICS: Students will understand and apply concepts in discrete mathematics.                  2
1.I.1.   Use linear programming to find optimal solutions to a system.                                               3
1.I.2.   Use networks to find solutions to problems.                                                                 2
1.I.3.   Apply strategies from game theory to problem-solving situations.                                            2
1.I.4.   Use matrices as tools to interpret and solve problems.                                                      2
2.       Cluster 2. Shape and Size                                                                                   2
2.E.     GEOMETRY: Students will understand and apply concepts from geometry.                                        2
2.E.1.   Draw coordinate representations of geometric figures and their transformations.                             2
2.E.2.   Use inductive and deductive reasoning to explore and determine the properties of and relationships among geometric figures.    3
2.E.3.   Apply trigonometry to problem situations involving triangles and periodic phenomena.                        2
2.F.     MEASUREMENT: Students will understand and demonstrate measurement skills.                                   2
2.F.1.   Use measurement tools and units appropriately and recognize limitations in the precision of the measurement tools.    1
2.F.2.   Derive and use formulas for area, surface area, and volume of many types of figures.                        2
3.       Cluster 3. Mathematical Decision Making                                                                     3
3.C.     DATA ANALYSIS AND STATISTICS: Students will understand and apply concepts of data analysis.                 3
3.C.1.   Determine and evaluate the effect of variables on the results of data collection.                           3
3.C.2.   Predict and draw conclusions from charts, tables, and graphs that summarize data from practical situations.    3
3.C.3.   Demonstrate an understanding of concepts of standard deviation and correlation and how they relate to data analysis.    2
3.C.4.   Demonstrate an understanding of the idea of random sampling and recognition of its role in statistical claims and designs for data collection.    2
3.C.5.   Revise studies to improve their validity (e.g., in terms of better sampling, better controls, or better data analysis techniques).    3
3.D.     PROBABILITY: Students will understand and apply concepts of probability.                                    2
3.D.1.   Find the probability of compound events and make predictions by applying probability theory.                2

3.D.2.   Create and interpret probability distributions.                                                             2
3.J.     MATHEMATICAL REASONING: Students will understand and apply concepts of mathematical reasoning.              3
3.J.1.   Analyze situations where more than one logical conclusion can be drawn from data presented.                 3
4.       Cluster 4. Patterns                                                                                         2
4.G.     PATTERNS, RELATIONS, FUNCTIONS: Students will understand that mathematics is the science of patterns, relationships, and functions.    2
4.G.1.   Create a graph to represent a real-life situation and draw inferences from it.                              2
4.G.2.   Translate and solve a real-life problem using symbolic language.                                            2
4.G.3.   Model phenomena using a variety of functions (linear, quadratic, exponential, trigonometric, etc.).         2
4.G.4.   Identify a variety of situations explained by the same type of function.                                    2
4.H.     ALGEBRA CONCEPTS: Students will understand and apply algebraic concepts.                                    2
4.H.1.   Use tables, graphs, and spreadsheets to interpret expressions, equations, and inequalities.                 2
4.H.2.   Investigate concepts of variation by using equations, graphs, and data collection.                          3
4.H.3.   Formulate and solve equations and inequalities.                                                             2
4.H.4.   Analyze and explain situations using symbolic representations.                                              3
4.K.     MATHEMATICAL COMMUNICATION: Students will reflect upon and clarify their understanding of mathematical ideas and relationships.    2
4.K.1.   Restate, create, and use definitions in mathematics to express understanding, classify figures, and determine the truth of a proposition or argument.    3
4.K.2.   Read mathematical presentations of topics within the Learning Results with understanding.                   2

Appendix B

Data Analysis Tables

Maine Mathematics Standards and
Augmented SAT Assessment

Brief Explanation of Data in the Alignment Tables by Column

Table 11.1
Goals #        Number of standards plus one for a generic standard for each goal.
Standards #    Average number of standards coded by reviewers. If the number is
greater than the actual number in the goal, then at least one
reviewer coded an item for the goal/standard but did not find any
standard in the goal that corresponded to the item.
Level          The Depth-of-Knowledge level coded by the reviewers for the
standards for each goal.
# of standards
by Level       The number of standards coded at each level.
% w/in std
by Level       The percent of standards coded at each level.
Hits
Mean & SD      Mean and standard deviation of the number of items reviewers coded as
               corresponding to the goal. The total is the total number of coded hits.
Cat. Conc.
Accept.        “Yes” indicates that the goal met the acceptable level for the criterion:
               “Yes” if the mean is six or more, “Weak” if the mean is five to six, and
               “No” if the mean is less than five.
Table 11.2
First five columns repeat columns from Table 11.1.
Level of Item
w.r.t. Stand Mean percent and standard deviation of items coded as “under” the
Depth-of-Knowledge level of the corresponding standard, as “at”
(the same) the Depth-of-Knowledge level of the corresponding
standard, and as “above” the Depth-of-Knowledge level of the
corresponding standard.
Depth-of-
Know.
Consistency
Accept.        “Yes” indicates that 50% or more of the items were rated as “at” or
“above” the Depth-of-Knowledge level of the corresponding
standards.
“Weak” indicates that 40% to 50% of the items were rated as “at”
or “above” the Depth-of-Knowledge level of the corresponding
standards.
“No” indicates that less than 40% of the items were rated as “at” or
“above” the Depth-of-Knowledge level of the corresponding
standards.
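The acceptability bands above are simple threshold checks on the percent of items rated at or above the DOK level of the corresponding standards. A minimal Python sketch (the function name is illustrative; the 50%/40% cut points come from the text):

```python
def dok_consistency(percent_at, percent_above):
    """Classify Depth-of-Knowledge Consistency for a goal.

    Inputs are the mean percent of items rated "at" and "above"
    the DOK level of the corresponding standards.
    Bands: >= 50% at-or-above -> YES; 40-50% -> WEAK; < 40% -> NO.
    """
    at_or_above = percent_at + percent_above
    if at_or_above >= 50:
        return "YES"
    if at_or_above >= 40:
        return "WEAK"
    return "NO"

# Cluster 3 in Table 11.2: 40% "at" plus 7% "above" gives 47%.
print(dok_consistency(40, 7))  # prints WEAK
```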

Table 11.3
First five columns repeat columns from Tables 11.1 and 11.2.
Range of
Standards
# Standards Hit        Average number and standard deviation of the standards
coded as hit by reviewers.
% of Total     Average percent and standard deviation of the total standards that
had at least one item coded.
Range of
Know.
Accept.        “Yes” indicates that 50% or more of the standards had at least one
coded item.
“Weak” indicates that 40% to 50% of the standards had at least one
coded item.
“No” indicates that less than 40% of the standards had at least one
coded item.
Balance
Index
% Hits in
Std/Ttl Hits Average and standard deviation of the percent of all hits that fall in
each goal (see the total under the Hits column).
Index          Average and standard deviation of the Balance Index.

Note: BALANCE INDEX = 1 − ( Σ_{k=1}^{O} | 1/O − I(k)/H | ) / 2,

where O    = total number of standards hit for the goal,
      I(k) = number of items hit corresponding to standard k, and
      H    = total number of items hit for the goal.

Bal. of Rep
Accept.        “Yes” indicates that the Balance Index was .7 or above (items
evenly distributed among standards).
“Weak” indicates that the Balance Index was .6 to .7 (a high
percentage of items coded as corresponding to two or three
standards).
“No” indicates that the Balance Index was below .6 (a high
percentage of items coded as corresponding to one standard).
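Putting the formula and the acceptability bands together, the Balance Index and its rating can be computed directly. A minimal Python sketch (names are mine; the formula and the .7/.6 cut points come from the text, which leaves values exactly at .6 ambiguous, so this sketch treats .6 as Weak):

```python
def balance_index(hits_per_objective):
    """Balance of Representation index for one goal.

    `hits_per_objective` holds the item hits for each objective that
    received at least one hit.  BI = 1 - (sum |1/O - I(k)/H|) / 2.
    """
    O = len(hits_per_objective)        # standards hit for the goal
    H = sum(hits_per_objective)        # total item hits for the goal
    return 1 - sum(abs(1 / O - i / H) for i in hits_per_objective) / 2

def balance_rating(bi):
    # Bands from the text: .7 or above YES, .6 to .7 WEAK, below .6 NO.
    if bi >= 0.7:
        return "YES"
    if bi >= 0.6:
        return "WEAK"
    return "NO"

# Hypothetical goal: four objectives hit, one drawing half of all hits.
bi = balance_index([6, 2, 2, 2])
print(round(bi, 2), balance_rating(bi))  # prints 0.75 YES
```

A perfectly even distribution of hits across the objectives hit gives an index of 1.0.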

Table 11.4
Summary of whether each goal met the acceptable level for the four criteria.

Table 11.5
Comments made by reviewers on items identified as having a source of challenge
issue by item number.

Table 11.6
The DOK value for each assessment item given by each reviewer. The intraclass
correlation for the group of reviewers is given on the last row.

Table 11.7
All notes made by reviewers on items by item number.

Table 11.8
The DOK level and standard code assigned by each reviewer for each item.

Table 11.9
This lists for each item all of the standards coded by the eight reviewers as
corresponding to the item. Repeat of a standard indicates the number of reviewers
who coded that standard as corresponding to the item.

Table 11.10
This lists for each standard all of the items coded by the eight reviewers as
corresponding to the standard. Repeat of an item indicates the number of
reviewers who coded the item as corresponding to the standard.

Table 11.11
This table summarizes the number of reviewers who coded an item as
corresponding to a standard. It contains the same information as Table 11.10.

Table 11.12
This table can be used to compare the DOK level of a standard to the average
DOK level of the items reviewers assigned to the standard. This table is helpful to
identify items with a lower DOK level that should be replaced by an item with a
higher DOK level to improve the Depth-of-Knowledge Consistency.

Table 11.1
Categorical Concurrence Between Standards and Assessment as Rated by Eight
Reviewers
Maine Mathematics SAT Augmented Study
Number of Assessment Items - 72

Standards                            Goals   Objs       Level by Objective        Hits            Cat.
Title                                  #      #      Level  # of objs  % w/in    Mean    S.D.   Concurr.
1. - Cluster 1. Number and             3    10.38      2        7        87      14.38   2.06     YES
     Operations                                        3        1        12
2. - Cluster 2. Shape and Size         2     6.88      1        1        20      22.25   1.30     YES
                                                       2        3        60
                                                       3        1        20
3. - Cluster 3. Mathematical           3     9.38      2        4        50      12.5    1.41     YES
     Decision Making                                   3        4        50
4. - Cluster 4. Patterns               3    11.62      2        7        70      27.25   3.38     YES
                                                       3        3        30
Total                                 11    38.25      1        1         3      76.38   4.03
                                                       2       21        67
                                                       3        9        29

Table 11.2
Depth-of-Knowledge Consistency Between Standards and Assessment as Rated by Eight
Reviewers (Assumes Each Objective Should Have Equal Representation)
Maine Mathematics SAT Augmented Study
Number of Assessment Items - 72

Standards                            Goals   Objs     Hits           Level of Item w.r.t. Standard        DOK
Title                                  #      #     Mean   S.D.    % Under      % At       % Above     Consistency
                                                                   M    S.D.   M    S.D.   M    S.D.
1. - Cluster 1. Number and             3    10.38   14.38  2.06   43    46     51    47     6    23        YES
     Operations
2. - Cluster 2. Shape and Size         2     6.88   22.25  1.30   41    39     46    34    12    25        YES
3. - Cluster 3. Mathematical           3     9.38   12.5   1.41   53    46     40    44     7    25       WEAK
     Decision Making
4. - Cluster 4. Patterns               3    11.62   27.25  3.38   39    42     54    40     7    17        YES
Total                                 11    38.25   76.38  4.03   44    44     48    42     8    23

Table 11.3
Range-of-Knowledge Correspondence and Balance of Representation Between Standards and Assessment as Rated by Eight
Reviewers
Maine Mathematics SAT Augmented Study
Number of Assessment Items - 72

Standards                            Goals   Objs     Hits         Range of Objectives     Rng. of         Balance Index              Bal. of
Title                                  #      #     Mean   S.D.  # Objs Hit   % of Total    Know.   % Hits in Std/Ttl Hits  Index    Represent.
                                                                 Mean  S.D.   Mean  S.D.            Mean    S.D.          Mean  S.D.
1. - Cluster 1. Number and             3    10.38   14.38  2.06  7.25  0.83    70     8      YES     19      2            0.66  0.08    WEAK
     Operations
2. - Cluster 2. Shape and Size         2     6.88   22.25  1.30  6.38  0.48    93     9      YES     29      2            0.77  0.07     YES
3. - Cluster 3. Mathematical           3     9.38   12.5   1.41  6.25  1.09    67     8      YES     16      1            0.75  0.06     YES
     Decision Making
4. - Cluster 4. Patterns               3    11.62   27.25  3.38  7.12  1.36    61    11      YES     36      3            0.69  0.06    WEAK
Total                                 11    38.25   76.38  4.03  6.75  1.09    73    15              25      8            0.72  0.08

Table 11.4
Summary of Attainment of Acceptable Alignment Level on Four Content Focus Criteria
as Rated by Eight Reviewers
Maine Mathematics SAT Augmented Study
Number of Assessment Items - 72

Standards                                          Alignment Criteria
                                  Categorical   Depth-of-Knowledge    Range of      Balance of
                                  Concurrence      Consistency        Knowledge    Representation
1. - Cluster 1. Number and
     Operations                       YES              YES               YES           WEAK
2. - Cluster 2. Shape and Size        YES              YES               YES            YES
3. - Cluster 3. Mathematical
     Decision Making                  YES              WEAK              YES            YES
4. - Cluster 4. Patterns              YES              YES               YES           WEAK

Table 11.6
Depth-of-Knowledge Levels by Item and Reviewers
Intraclass Correlation
Maine Mathematics SAT Augmented Study

Item   Rater 1   Rater 2   Rater 3    Rater 4     Rater 5   Rater 6   Rater 7   Rater 8
1      1         1         2          2           1         1         2         2
2      1         1         1          1           2         1         1         1
3      1         2         2          1           2         1         1         1
4      1         1         2          1           2         1         1         2
5      2         2         2          2           2         2         2         2
6      1         1         2          2           1         1         2         2
7      1         1         2          2           1         1         1         2
8      1         1         2          1           1         2         1         1
9      2         2         2          2           2         2         3         2
10     2         2         2          2           2         2         2         2
11     2         2         2          2           2         2         2         2
12     1         1         1          1           1         1         1         1
13     2         2         2          2           2         2         2         2
14     2         2         2          3           2         2         2         3
15     1         2         1          1           2         2         2         1
16     2         2         2          2           2         2         3         2
17     3         2         2          3           2         2         2         3
18     3         2         2          3           2         2         2         2
19     3         2         3          3           2         2         3         3
20     2         2         2          2           2         2         2         2
21     2         2         2          1           2         2         2         1
22     1         1         2          1           1         2         1         2
23     1         2         1          2           2         1         2         1
24     2         1         3          1           1         2         1         2
25     2         1         2          2           1         2         2         2
26     2         2         1          2           2         2         3         2
27     2         1         1          2           1         2         1         3
28     2         2         2          3           2         2         3         3
29     1         1         1          2           2         1         1         2
30     1         1         1          2           2         2         1         3
31     2         2         2          2           2         2         1         2
32     1         1         1          1           1         2         1         1
33     2         3         2          2           2         2         1         2
34     2         2         2          2           2         2         1         2
35     3         2         2          2           2         2         2         3
36     3         2         3          3           2         2         2         2
37     2         2         2          2           2         2         2         3
38     2         2         2          3           2         2         2         3
39     1         1         1          2           1         2         1         1
40     1         1         2          2           1         2         1         1

41     1         2         1          1           1         1         1         2
42     2         1         2          2           2         1         1         2
43     1         1         2          1           1         1         1         1
44     2         2         1          1           1         1         1         2
45     2         1         1          2           1         2         1         1
46     2         2         2          2           2         2         2         2
47     2         1         1          2           2         1         2         1
48     2         2         2          2           2         2         2         2
49     1         1         1          2           1         2         1         2
50     3         2         3          1           2         2         3         3
51     2         2         2          2           2         2         2         2
52     2         2         2          1           2         2         2         2
53     2         2         2          2           2         2         2         2
54     2         2         2          2           2         3         2         3
55
56     1         1         1          1           1         1         1         2
57     1         1         1          2           1         1         1         1
58
59     2         2         2          3           2         2         3         2
60     1         1         1          2           2         2         1         2
61     1         2         2          1           2         1         1         1
62
63     1         1         2          2           1         2         2         1
64     1         2         2          2           2         2         1         2
65     1         1         1          2           1         2         1         2
66
67     1         1         1          1           1         2         2         3
68     2         2         2          2           2         2         2         3
69     1         2         2          2           2         2         1         2
70     2         2         2          2           2         2         1         3
71
72     2         2         2          2           2         2         2         3
73     2         2         2          3           2         2         3         3
74     2         1         2          2           2         2         1         2
75     2         2         2          3           2         2         3         3
76
77     2         2         2          3           2         3         3         3
78     1         1         2          2           1         2         2         2
79


Intraclass Correlation: 0.8515
Pairwise Comparison: 0.6012

Table 11.8
DOK Levels and Objectives Coded by Each Reviewer
Maine Mathematics SAT Augmented Study

Item       DOK0    PObj0   S1Obj0       DOK1    PObj1       DOK2    PObj2       DOK3    PObj3 S1Obj3          DOK4    PObj4       DOK5    PObj5         DOK6    PObj6        DOK7    PObj7 S1Obj7
1        1          4.H.3.            1          4.H.3.   2          4.H.3.   2          4.H.3.             1          4.H.4.   1          4.H.3.     2          4.H.3.    2          4.H.3.
2        1          3.D.              1          3.D.     1          3.D.     1          3.D.               2          3.D.2.   1          3.D.1.     1          3.D.      1          3.D.
3        1          2.E.              2          2.E.     2          2.E.     1          2.E.2.             2          2.E.     1          2.E.       1          2.E.      1          2.E.2.
4        1          3.C.2.            1          3.C.2.   2          3.C.     1          3.C.2.             2          3.C.2.   1          3.C.2.     1          3.C.2.    2          3.C.2.
5        2          2.E.2.            2          2.E.     2          2.E.     2          2.E.3.             2          2.E.     2          2.E.       2          2.E.      2          2.E.
6        1          4.H.3.            1          4.H.3.   2          4.H.3.   2          4.H.3.             1          4.H.3.   1          4.H.4.     2          4.H.      2          4.H.3.
7        1          3.C.2.            1          2.E.     2          2.E.2.   2          2.E.               1          2.E.     1          2.E.       1          2.E.2.    2          2.E.2.
8        1          1.A.1.            1          1.A.1.   2          1.A.1.   1          1.A.1.             1          1.A.1.   2          1.A.       1          1.B.1.    1          1.B.1. 2.F.1.
9        2          4.H.3.            2          4.H.3.   2          4.H.4.   2          1.B.   4.H.2.      2          4.H.4.   2          1.B.       3          4.H.2.    2          4.H.3.
10       2          4.G.2.            2          4.G.2.   2          4.G.2.   2          4.G.2.             2          4.G.2.   2          4.G.2.     2          4.G.2.    2          4.G.2.
11       2          4.H.2.            2          4.H.     2          3.C.     2          4.H.3.             2          4.H.3.   2          1.B.       2          4.H.3.    2          4.H.3. 3.C.
12       1          4.H.4.            1          3.C.2.   1          1.I.     1          1.I.               1          3.C.2.   1          3.C.2.     1          1.I.      1          3.C.
13       2          1.B.              2          1.B.     2          1.B.1.   2          4.H.3. 4.G.2.      2          1.B.     2          4.G.2.     2          4.G.2.    2          1.B.1.
14       2          4.H.3.            2          4.H.3.   2          4.H.3.   3          4.H.3.             2          4.H.3.   2          4.         2          4.H.3.    3          4.H.3.
15       1          1.B.              2          1.B.     1          1.B.     1          1.B.1.             2          1.B.     2          1.A./2./   2          1.B.      1          1.B.1.
1.
16       2          4.G.              2          4.G.     2          4.H.3.   2          4.H.1.   4.G.      2          4.G.     2          4.G.       3          4.G.      2          4.H.1.
17       3          2.F.2.            2          2.E.2.   2          2.E.     3          2.F.2.             2          2.E.2.   2          2.E.2.     2          2.F.2.    3          2.E.2.
18       3          4.K.1.            2          2.E.     2          2.E.     3          2.E.2.             2          2.E.     2          2.E.2.     2          2.E.2.    2          2.E.1.
19       3          4.G.2.            2          4.G.     3          4.H.4.   3          4.G.2.   4.H.4.    2          4.G.     2          1.A.1.     3          4.G.4.    3          4.G.
20       2          4.G.2.            2          4.G.2.   2          4.G.2.   2          4.G.2.             2          4.G.2.   2          4.G.2.     2          2.F.2.    2          4.G.2.
21       2          3.C.              2          1.B.     2          3.C.     1          1.B.               2          1.B.     2          1.B.       2          4.H.3.    1          1.B.1.   3.C.
22       1          2.E.1.            1          2.E.     2          2.E.1.   1          2.E.1.             1          2.E.     2          2.E.       1          2.E.1.    2          2.E.2.
23       1          4.G.2.            2          4.G.2.   1          4.H.4.   2          4.G.2.   4.H.4./4.H.3.   2          4.G.3.   1          4.H.4.     2          4.G.2.    1          4.G.3.
24       2          3.J.              1          4.H.4.   3          1.B.     1          4.H.1.             1          4.H.4.   2          1.B.       1          4.H.1.    2          4.H.4.
25       2          2.F.2.            1          2.F.2.   2          2.E.3.   2          2.F.2.             1          2.F.2.   2          4.         2          2.F.2.    2          2.E.2.
26       2          4.H.2.            2          4.H.2.   1          4.H.2.   2          4.H.2.             2          4.H.2.   2          4.H.2.     3          4.H.2.    2          4.H.2.
27       2          2.E.2.            1          2.E.2.   1          2.E.2.   2          2.E.2.             1          2.E.2.   2          2.E.2.     1          2.E.2.    3          2.E.2.
28       2          4.H.3.            2          4.H.3.   2          4.H.4.   3          4.H.3.             2          4.H.3.   2          4.H.3.     3          4.H.3.    3          4.H.3.
29       1          1.B.              1          1.B.     1          2.F.     2          4.H.3.   4.G.2.    2          1.B.     1          4.H.3.     1          4.G.2.    2          1.B.1.
30       1          2.E.2.            1          2.E.2.   1          2.E.     2          2.E.               2          2.E.     2          2.E.       1          2.E.2.    3          2.E.2.
31       2          1.B.1.            2          1.A.     2          1.A.     2          4.H.4.             2          1.A./3.J.1.   2          4.H.3.   1          4.H.2.    2          1.B.1.
32       1          4.H.3.            1          4.H.3.   1          4.H.3.   1          4.H.               1          4.H.3.   2          4.H.3.     1          4.H.3.    1          4.H.3.
33       2          3.C.2.            3          3.C.2.   2          4.H.2.   2          4.H.1.             2          3.C.2.   2          3.C.2.     1          3.C.2.    2          3.C.2.
34       2          1.B.              2          4.H.3.   2          4.H.4.   2          4.H.3.             2          4.H.3.   2          4.H.3.     1          4.G.2.    2          4.H.3.
35       3          2.E.2.            2          2.E.2.   2          2.E.     2          2.E.               2          2.E.2.   2          2.E.2.   2          2.E.2./4.H.3.   3          2.E.     2.F.

Table 11.8
DOK Levels and Objectives Coded by Each Reviewer
Maine Mathematics SAT Augmented Study

Item       DOK0    PObj0    S1Obj0       DOK1    PObj1        DOK2    PObj2        DOK3    PObj3    S1Obj3       DOK4    PObj4        DOK5    PObj5       DOK6    PObj6       DOK7    PObj7    S1Obj7
36      3          4.G.               2          4.G.      3          4.G.2.    3          4.G.2.             2          4.G.      2          4.H.4.   2          4.G.   2    4.G.
37      2          4.G.               2          4.G.      2          4.G.2.    2          4.G.2.             2          4.G.      2          4.G.2.   2          2.F.2. 3    1.B.1. 2.E.
38      2          4.G.3.   4.H.3.    2          2.E.2./4.H.3.   2          4.H.1.    3          4.H.1.             2          4.H.3./2.F.2.   2          4.H.3.   2          4.H.1.   3    4.H.1.   4.G.3.
39      1          1.B.               1          1.B.      1          1.B.1.    2          1.B.               1          1.B.      2          4.H.3.   1          1.B.     1   1.A.
40      1          4.H.3.             1          4.H.4.    2          4.H.3.    2          4.H.3.             1          4.H.3.    2          4.H.4.   1          4.H.3.   1   4.H.
41      1          3.C.2.             2          3.C.2.    1          3.C.2.    1          3.C.2.             1          4.G.1.    1          3.C.2.   1          3.C.2.   2   3.C.2.
42      2          1.B.               1          1.B.      2          1.B.1.    2          4.G.2.   4.H.3.    2          1.B.      1          1.A.     1          1.B.     2   1.B.1.
43      1          1.B.               1          4.H.4.    2          4.H.3.    1          1.B.               1          4.H.3.    1          4.H.4.   1          4.H.3.   1   4.H.3.
44      2          2.E.2.             2          2.E.2.    1          2.E.2.    1          2.E.               1          2.E.2.    1          2.E.2.   1          2.E.2.   2   2.E.2.
45      2          4.H.3.   4.H.4.    1          4.H.4.    1          4.H.3.    2          1.B.     3.J.1.    1          4.H.3.    2          4.H.3.   1          4.H.3.   1   1.B.1.
46      2          2.E.2.   2.F.      2          2.E.2.    2          2.E.3.    2          2.E.2.             2          2.F.      2          2.E.3.   2          2.E.2.   2   2.E.2.
47      2          1.B.               1          1.B.      1          3.C.2.    2          3.C.2.             2          1.B.      1          3.C.2.   2          3.C.2.   1   1.B.1.
48      2          4.H.1.             2          4.H.      2          4.H.1.    2          2.E.     4.H.2.    2          4.H.      2          4.H.2.   2          2.E.     2   2.E.2.   2.E.1.
49      1          4.H.3.   1.A.1.    1          4.H.4.    1          4.H.4.    2          4.H.3.             1          4.H.3.    2          4.H.3.   1          4.H.3.   2   4.H.3.
50      3          3.J.1.   1.A.      2          1.A.      3          4.G.2.    1          1.A.1.             2          1.A.      2          1.A.     3          1.A.     3   4.H.4.
51      2          3.C.               2          1.A.      2          4.G.2.    2          1.I.3.             2          1.A.      2          1.I.3.   2          1.I.3.   2   1.I.
52      2          4.G.4.             2          4.G.      2          4.H.1.    1          4.G.               2          4.G.      2          2.E.1.   2          4.H.2.   2   4.H.1.
53      2          4.G.               2          4.G.      2          4.G.2.    2          4.G.               2          4.G.      2          4.H.4.   2          4.G.     2   4.G.
54      2          2.E.2.   2.F.      2          2.E.2.    2          2.E.      2          2.E.2.             2          2.E.2.    3          2.E.3.   2          2.F.2.   3   2.E.2.
55
56      1          2.F.1.             1          2.F.      1          2.F.1.    1          2.F.1.             1          2.F.      1          2.F.     1          2.F.1.   2   3.J.1.
57      1          2.E.1.             1          2.E.1.    1          2.E.1.    2          2.E.1.             1          2.E.1.    1          2.E.1.   1          2.E.1.   1   2.E.1.
58
59      2          3.C.2.             2          3.C.2.    2          3.C.4./3.C.2.   3          3.C.2.             2          3.C.2.    2          3.C.2.   3          3.C.2.   2   4.K.1.
60      1          1.B.2.             1          1.B.2.    1          2.F.      2          1.B.               2          1.B.2.    2          1.B.1.   1          1.B.     2   1.B.2.
61      1          2.E.1.             2          2.E.1.    2          2.E.1.    1          2.E.1.             2          2.E.1.    1          2.E.1.   1          2.E.1.   1   2.E.1.
62
63      1          2.F.1.             1          2.F.1.    2          2.F.1.    2          2.F.1.             1          2.F.1.    2          1.B.1.   2          2.F.1.   1   2.F.1.
64      1          1.A.2.             2          4.H.3.    2          1.A.2.    2          1.A.2.   4.H.3.    2          4.H.3.    2          1.A.1.   1          1.A.1.   2   1.A.1.
65      1          2.E.1.             1          2.E.1.    1          2.E.1.    2          2.E.1.             1          2.E.1.    2          2.E.1.   1          2.E.1.   2   2.E.1.
66
67      1          3.C.4.             1          3.C.4.    1          3.C.4.    1          3.C.1.             1          3.C.4.    2          3.C.4.   2          3.C.4.   3   3.C.4.
68      2          1.I.2.             2          1.I.2.    2          1.I.2.    2          1.I.2.             2          1.I.2.    2          2.E.     2          1.I.2.   3   1.I.2.
69      1          3.D.1.             2          3.D.1.    2          3.D.1.    2          3.D.1.             2          3.D.1.    2          3.D.1.   1          3.D.1.   2   3.D.1.
70      2          1.I.1.             2          1.I.1.    2          1.I.1.    2          1.I.1.             2          1.I.1.    2          1.I.1.   1          1.I.1.   3   1.I.1.

71
72      2          3.C.3.            2          3.C.3.   2          3.C.3.    2          3.C.3.             2          3.C.3.   2          3.C.3.   2          3.C.3.   3          3.D.2.   3.C.3.
73      2          2.E.3.            2          2.E.3.   2          2.E.3.    3          2.E.3.   3.J.1.    2          2.E.3.   2          2.E.3.   3          2.E.3.   3          2.E.3.
74      2          3.D.1.            1          3.D.1.   2          3.D.1.    2          3.D.1.             2          3.D.1.   2          3.D.1.   1          3.D.1.   2          3.D.1.
75      2          1.B.1.            2          2.F.     2          1.B.1./2.F.1.   3          2.F.1.             2          2.F.     2          1.B.1.   3          2.F.1.   3          1.B.1.
76
77      2          3.C.3.            2          3.C.3.   2          3.C.3.    3          3.C.2.   3.C.3.    2          3.C.3.   3          3.C.     3          3.C.1.   3          3.C.2.
78      1          2.E.3.            1          2.E.3.   2          2.E.3.    2          2.E.3.             1          2.E.3.   2          2.E.3.   2          2.E.3.   2          2.E.3.
79

Objective Pairwise Comparison: 0.4667
Standard Pairwise Comparison: 0.784
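
The two statistics above summarize inter-reviewer agreement on the Table 11.8 codings. The computation itself is not reproduced in this appendix; under one common definition of a pairwise comparison — the proportion of reviewer pairs assigning the same code, averaged over items (an assumption here, not necessarily the study's exact formula) — a minimal sketch is:

```python
from itertools import combinations

def pairwise_agreement(codes_by_item):
    """Mean, over items, of the share of reviewer pairs giving the same code.

    codes_by_item: one list of codes per item; reviewers who left an
    item uncoded are simply omitted from that item's list.
    """
    rates = []
    for codes in codes_by_item:
        pairs = list(combinations(codes, 2))
        if pairs:  # need at least two reviewers to form a pair
            rates.append(sum(a == b for a, b in pairs) / len(pairs))
    return sum(rates) / len(rates)

# Item 1: 4.H.3. coded by seven of eight reviewers, 4.H.4. by the eighth.
print(pairwise_agreement([["4.H.3."] * 7 + ["4.H.4."]]))  # 0.75
```

With multiple codes per reviewer (as in several rows above), the definition would need a convention for partial matches, which this sketch does not attempt.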

Table 11.9
Objectives Coded to Each Item by Reviewers
Maine Mathematics SAT Augmented Study

Low                    Medium                       High
0                     7.734177                      12

1 Section 2 #1     4.H.3. 4.H.3. 4.H.3. 4.H.3. 4.H.3. 4.H.3. 4.H.3. 4.H.4.
2 Section 2 #2     3.D. 3.D. 3.D. 3.D. 3.D. 3.D. 3.D.1. 3.D.2.
3 Section 2 #3     2.E. 2.E. 2.E. 2.E. 2.E. 2.E. 2.E.2. 2.E.2.
4 Section 2 #4     3.C. 3.C.2. 3.C.2. 3.C.2. 3.C.2. 3.C.2. 3.C.2. 3.C.2.
5 Section 2 #5     2.E. 2.E. 2.E. 2.E. 2.E. 2.E. 2.E.2. 2.E.3.
6 Section 2 #6     4.H. 4.H.3. 4.H.3. 4.H.3. 4.H.3. 4.H.3. 4.H.3. 4.H.4.
7 Section 2 #7     2.E. 2.E. 2.E. 2.E. 2.E.2. 2.E.2. 2.E.2. 3.C.2.
8 Section 2 #8     1.A. 1.A.1. 1.A.1. 1.A.1. 1.A.1. 1.A.1. 1.B.1. 1.B.1. 2.F.1.
9 Section 2 #9     1.B. 1.B. 4.H.2. 4.H.2. 4.H.3. 4.H.3. 4.H.3. 4.H.4. 4.H.4.
10 Section 2 #10   4.G.2. 4.G.2. 4.G.2. 4.G.2. 4.G.2. 4.G.2. 4.G.2. 4.G.2.
11 Section 2 #11   1.B. 3.C. 3.C. 4.H. 4.H.2. 4.H.3. 4.H.3. 4.H.3. 4.H.3.
12 Section 2 #12   1.I. 1.I. 1.I. 3.C. 3.C.2. 3.C.2. 3.C.2. 4.H.4.
13 Section 2 #13   1.B. 1.B. 1.B. 1.B.1. 1.B.1. 4.G.2. 4.G.2. 4.G.2. 4.H.3.
14 Section 2 #14   4. 4.H.3. 4.H.3. 4.H.3. 4.H.3. 4.H.3. 4.H.3. 4.H.3.
15 Section 2 #15   1. 1.A. 1.B. 1.B. 1.B. 1.B. 1.B. 1.B.1. 1.B.1. 2.
16 Section 2 #16   4.G. 4.G. 4.G. 4.G. 4.G. 4.G. 4.H.1. 4.H.1. 4.H.3.
17 Section 2 #17   2.E. 2.E.2. 2.E.2. 2.E.2. 2.E.2. 2.F.2. 2.F.2. 2.F.2.
18 Section 2 #18   2.E. 2.E. 2.E. 2.E.1. 2.E.2. 2.E.2. 2.E.2. 4.K.1.
19 Section 2 #19   1.A.1. 4.G. 4.G. 4.G. 4.G.2. 4.G.2. 4.G.4. 4.H.4. 4.H.4.
20 Section 2 #20   2.F.2. 4.G.2. 4.G.2. 4.G.2. 4.G.2. 4.G.2. 4.G.2. 4.G.2.
21 Section 4 #1    1.B. 1.B. 1.B. 1.B. 1.B.1. 3.C. 3.C. 3.C. 4.H.3.
22 Section 4 #2    2.E. 2.E. 2.E. 2.E.1. 2.E.1. 2.E.1. 2.E.1. 2.E.2.
23 Section 4 #3    4.G.2. 4.G.2. 4.G.2. 4.G.2. 4.G.3. 4.G.3. 4.H.3. 4.H.4. 4.H.4. 4.H.4.
24 Section 4 #4    1.B. 1.B. 3.J. 4.H.1. 4.H.1. 4.H.4. 4.H.4. 4.H.4.
25 Section 4 #5    2.E.2. 2.E.3. 2.F.2. 2.F.2. 2.F.2. 2.F.2. 2.F.2. 4.
26 Section 4 #6    4.H.2. 4.H.2. 4.H.2. 4.H.2. 4.H.2. 4.H.2. 4.H.2. 4.H.2.
27 Section 4 #7    2.E.2. 2.E.2. 2.E.2. 2.E.2. 2.E.2. 2.E.2. 2.E.2. 2.E.2.
28 Section 4 #8    4.H.3. 4.H.3. 4.H.3. 4.H.3. 4.H.3. 4.H.3. 4.H.3. 4.H.4.
29 Section 4 #9    1.B. 1.B. 1.B. 1.B.1. 2.F. 4.G.2. 4.G.2. 4.H.3. 4.H.3.
30 Section 4 #10   2.E. 2.E. 2.E. 2.E. 2.E.2. 2.E.2. 2.E.2. 2.E.2.
31 Section 4 #11   1.A. 1.A. 1.A. 1.B.1. 1.B.1. 3.J.1. 4.H.2. 4.H.3. 4.H.4.
32 Section 4 #12   4.H. 4.H.3. 4.H.3. 4.H.3. 4.H.3. 4.H.3. 4.H.3. 4.H.3.
33 Section 4 #13   3.C.2. 3.C.2. 3.C.2. 3.C.2. 3.C.2. 3.C.2. 4.H.1. 4.H.2.
34 Section 4 #14   1.B. 4.G.2. 4.H.3. 4.H.3. 4.H.3. 4.H.3. 4.H.3. 4.H.4.
35 Section 4 #15   2.E. 2.E. 2.E. 2.E.2. 2.E.2. 2.E.2. 2.E.2. 2.E.2. 2.F. 4.H.3.
36 Section 4 #16   4.G. 4.G. 4.G. 4.G. 4.G. 4.G.2. 4.G.2. 4.H.4.
37 Section 4 #17   1.B.1. 2.E. 2.F.2. 4.G. 4.G. 4.G. 4.G.2. 4.G.2. 4.G.2.
38 Section 4 #18   2.E.2. 2.F.2. 4.G.3. 4.G.3. 4.H.1. 4.H.1. 4.H.1. 4.H.1. 4.H.3. 4.H.3. 4.H.3. 4.H.3.
39 Section 7 #1    1.A. 1.B. 1.B. 1.B. 1.B. 1.B. 1.B.1. 4.H.3.
40 Section 7 #2    4.H. 4.H.3. 4.H.3. 4.H.3. 4.H.3. 4.H.3. 4.H.4. 4.H.4.
41 Section 7 #3    3.C.2. 3.C.2. 3.C.2. 3.C.2. 3.C.2. 3.C.2. 3.C.2. 4.G.1.
42 Section 7 #4    1.A. 1.B. 1.B. 1.B. 1.B. 1.B.1. 1.B.1. 4.G.2. 4.H.3.
43 Section 7 #5    1.B. 1.B. 4.H.3. 4.H.3. 4.H.3. 4.H.3. 4.H.4. 4.H.4.
44 Section 7 #6    2.E. 2.E.2. 2.E.2. 2.E.2. 2.E.2. 2.E.2. 2.E.2. 2.E.2.
45 Section 7 #7    1.B. 1.B.1. 3.J.1. 4.H.3. 4.H.3. 4.H.3. 4.H.3. 4.H.3. 4.H.4. 4.H.4.
46 Section 7 #8    2.E.2. 2.E.2. 2.E.2. 2.E.2. 2.E.2. 2.E.3. 2.E.3. 2.F. 2.F.
47 Section 7 #9    1.B. 1.B. 1.B. 1.B.1. 3.C.2. 3.C.2. 3.C.2. 3.C.2.
48 Section 7 #10   2.E. 2.E. 2.E.1. 2.E.2. 4.H. 4.H. 4.H.1. 4.H.1. 4.H.2. 4.H.2.
49 Section 7 #11   1.A.1. 4.H.3. 4.H.3. 4.H.3. 4.H.3. 4.H.3. 4.H.3. 4.H.4. 4.H.4.
50 Section 7 #12   1.A. 1.A. 1.A. 1.A. 1.A. 1.A.1. 3.J.1. 4.G.2. 4.H.4.
51 Section 7 #13   1.A. 1.A. 1.I. 1.I.3. 1.I.3. 1.I.3. 3.C. 4.G.2.
52 Section 7 #14   2.E.1. 4.G. 4.G. 4.G. 4.G.4. 4.H.1. 4.H.1. 4.H.2.
53 Section 7 #15   4.G. 4.G. 4.G. 4.G. 4.G. 4.G. 4.G.2. 4.H.4.
54 Section 7 #16   2.E. 2.E.2. 2.E.2. 2.E.2. 2.E.2. 2.E.2. 2.E.3. 2.F. 2.F.2.
55 Aug Form 1 #1
56 Aug Form 1 #2   2.F. 2.F. 2.F. 2.F.1. 2.F.1. 2.F.1. 2.F.1. 3.J.1.
57 Aug Form 1 #3   2.E.1. 2.E.1. 2.E.1. 2.E.1. 2.E.1. 2.E.1. 2.E.1. 2.E.1.
58 Aug Form 1 #4
59 Aug Form 1 #5   3.C.2. 3.C.2. 3.C.2. 3.C.2. 3.C.2. 3.C.2. 3.C.2. 3.C.4. 4.K.1.
60 Aug Form 1 #6   1.B. 1.B. 1.B.1. 1.B.2. 1.B.2. 1.B.2. 1.B.2. 2.F.
61 Aug Form 1 #7   2.E.1. 2.E.1. 2.E.1. 2.E.1. 2.E.1. 2.E.1. 2.E.1. 2.E.1.
62 Aug Form 1 #8
63 Aug Form 1 #9   1.B.1. 2.F.1. 2.F.1. 2.F.1. 2.F.1. 2.F.1. 2.F.1. 2.F.1.
64 Aug Form 1 #10  1.A.1. 1.A.1. 1.A.1. 1.A.2. 1.A.2. 1.A.2. 4.H.3. 4.H.3. 4.H.3.
65 Aug Form 1 #11  2.E.1. 2.E.1. 2.E.1. 2.E.1. 2.E.1. 2.E.1. 2.E.1. 2.E.1.
66 Aug Form 1 #12
67 Aug Form 1 #13  3.C.1. 3.C.4. 3.C.4. 3.C.4. 3.C.4. 3.C.4. 3.C.4. 3.C.4.
68 Aug Form 1 #14  1.I.2. 1.I.2. 1.I.2. 1.I.2. 1.I.2. 1.I.2. 1.I.2. 2.E.
69 Aug Form 1 #15  3.D.1. 3.D.1. 3.D.1. 3.D.1. 3.D.1. 3.D.1. 3.D.1. 3.D.1.
70 Aug Form 1 #16  1.I.1. 1.I.1. 1.I.1. 1.I.1. 1.I.1. 1.I.1. 1.I.1. 1.I.1.
71 Aug Form 1 #17
72 Aug Form 1 #18  3.C.3. 3.C.3. 3.C.3. 3.C.3. 3.C.3. 3.C.3. 3.C.3. 3.C.3. 3.D.2.
73 Aug Form 1 #19  2.E.3. 2.E.3. 2.E.3. 2.E.3. 2.E.3. 2.E.3. 2.E.3. 2.E.3. 3.J.1.
74 Aug Form 1 #20  3.D.1. 3.D.1. 3.D.1. 3.D.1. 3.D.1. 3.D.1. 3.D.1. 3.D.1.
75 Aug Form 1 #21  1.B.1. 1.B.1. 1.B.1. 1.B.1. 2.F. 2.F. 2.F.1. 2.F.1. 2.F.1.
76 Aug Form 1 #22
77 Aug Form 1 #23  3.C. 3.C.1. 3.C.2. 3.C.2. 3.C.3. 3.C.3. 3.C.3. 3.C.3. 3.C.3.
78 Aug Form 1 #24  2.E.3. 2.E.3. 2.E.3. 2.E.3. 2.E.3. 2.E.3. 2.E.3. 2.E.3.
79 Aug Form 1 #25

Table 11.10
Items Coded by Reviewers to Each Objective
Maine Mathematics SAT Augmented Study

Low                   Medium                       High
0                      13                          83

1.      15
1.A.    8 15 31 31 31 39 42 50 50 50 50 50 51 51
1.A.1.  8 8 8 8 8 19 49 50 64 64 64
1.A.2.  64 64 64
1.B.    9 9 11 13 13 13 15 15 15 15 15 21 21 21 21 24 24 29 29 29 34 39 39 39 39 39 42 42 42 42 43 43 45 47 47 47 60 60
1.B.1.  8 8 13 13 15 15 21 29 31 31 37 39 42 42 45 47 60 63 75 75 75 75
1.B.2.  60 60 60 60
1.I.    12 12 12 51
1.I.1.  70 70 70 70 70 70 70 70
1.I.2.  68 68 68 68 68 68 68
1.I.3.  51 51 51
1.I.4.
2.      15
2.E.    3 3 3 3 3 3 5 5 5 5 5 5 7 7 7 7 17 18 18 18 22 22 22 30 30 30 30 35 35 35 37 44 48 48 54 68
2.E.1.  18 22 22 22 22 48 52 57 57 57 57 57 57 57 57 61 61 61 61 61 61 61 61 65 65 65 65 65 65 65 65
2.E.2.  3 3 5 7 7 7 17 17 17 17 18 18 18 22 25 27 27 27 27 27 27 27 27 30 30 30 30 35 35 35 35 35 38 44 44 44 44 44 44 44 46 46 46 46 46 48 54 54 54 54 54
2.E.3.  5 25 46 46 54 73 73 73 73 73 73 73 73 78 78 78 78 78 78 78 78
2.F.    29 35 46 46 54 56 56 56 60 75 75
2.F.1.  8 56 56 56 56 63 63 63 63 63 63 63 75 75 75
2.F.2.  17 17 17 20 25 25 25 25 25 37 38 54
3.
3.C.    4 11 11 12 21 21 21 51 77
3.C.1.  67 77
3.C.2.  4 4 4 4 4 4 4 7 12 12 12 33 33 33 33 33 33 41 41 41 41 41 41 41 47 47 47 47 59 59 59 59 59 59 59 77 77
3.C.3.  72 72 72 72 72 72 72 72 77 77 77 77 77
3.C.4.  59 67 67 67 67 67 67 67
3.C.5.
3.D.    2 2 2 2 2 2
3.D.1.  2 69 69 69 69 69 69 69 69 74 74 74 74 74 74 74 74
3.D.2.  2 72
3.J.    24
3.J.1.  31 45 50 56 73
4.      14 25
4.G.    16 16 16 16 16 16 19 19 19 36 36 36 36 36 37 37 37 52 52 52 53 53 53 53 53 53
4.G.1.  41
4.G.2.  10 10 10 10 10 10 10 10 13 13 13 19 19 20 20 20 20 20 20 20 23 23 23 23 29 29 34 36 36 37 37 37 42 50 51 53
4.G.3.  23 23 38 38
4.G.4.  19 52
4.H.    6 11 32 40 48 48
4.H.1.  16 16 24 24 33 38 38 38 38 48 48 52 52
4.H.2.  9 9 11 26 26 26 26 26 26 26 26 31 33 48 48 52
4.H.3.  1 1 1 1 1 1 1 6 6 6 6 6 6 9 9 9 11 11 11 11 13 14 14 14 14 14 14 14 16 21 23 28 28 28 28 28 28 28 29 29 31 32 32 32 32 32 32 32 34 34 34 34 34 35 38 38 38 38 39 40 40 40 40 40 42 43 43 43 43 45 45 45 45 45 49 49 49 49 49 49 64 64 64
4.H.4.  1 6 9 9 12 19 19 23 23 23 24 24 24 28 31 34 36 40 40 43 43 45 45 49 49 50 53
4.K.
4.K.1.  18 59
4.K.2.

Table 11.11
Number of Reviewers Coding an Item by Objective (Item Number: Number of Reviewers)
Maine Mathematics SAT Augmented Study

Low                  Medium                   High
1                     4                       8

1.      15:1
1.A.    8:1 15:1 31:3 39:1 42:1 50:5 51:2
1.A.1.  8:5 19:1 49:1 50:1 64:3
1.A.2.  64:3
1.B.    9:2 11:1 13:3 15:5 21:4 24:2 29:3 34:1 39:5 42:4 43:2 45:1 47:3 60:2
1.B.1.  8:2 13:2 15:2 21:1 29:1 31:2 37:1 39:1 42:2 45:1 47:1 60:1 63:1 75:4
1.B.2.  60:4
1.I.    12:3 51:1
1.I.1.  70:8
1.I.2.  68:7
1.I.3.  51:3
1.I.4.
2.      15:1
2.E.    3:6 5:6 7:4 17:1 18:3 22:3 30:4 35:3 37:1 44:1 48:2 54:1 68:1
2.E.1.  18:1 22:4 48:1 52:1 57:8 61:8 65:8
2.E.2.  3:2 5:1 7:3 17:4 18:3 22:1 25:1 27:8 30:4 35:5 38:1 44:7 46:5 48:1 54:5
2.E.3.  5:1 25:1 46:2 54:1 73:8 78:8
2.F.    29:1 35:1 46:2 54:1 56:3 60:1 75:2
2.F.1.  8:1 56:4 63:7 75:3
2.F.2.  17:3 20:1 25:5 37:1 38:1 54:1
3.
3.C.    4:1 11:2 12:1 21:3 51:1 77:1
3.C.1.  67:1 77:1
3.C.2.  4:7 7:1 12:3 33:6 41:7 47:4 59:7 77:2
3.C.3.  72:8 77:5
3.C.4.  59:1 67:7
3.C.5.
3.D.    2:6
3.D.1.  2:1 69:8 74:8
3.D.2.  2:1 72:1
3.J.    24:1
3.J.1.  31:1 45:1 50:1 56:1 73:1
4.      14:1 25:1
4.G.    16:6 19:3 36:5 37:3 52:3 53:6
4.G.1.  41:1
4.G.2.  10:8 13:3 19:2 20:7 23:4 29:2 34:1 36:2 37:3 42:1 50:1 51:1 53:1
4.G.3.  23:2 38:2
4.G.4.  19:1 52:1
4.H.    6:1 11:1 32:1 40:1 48:2
4.H.1.  16:2 24:2 33:1 38:4 48:2 52:2
4.H.2.  9:2 11:1 26:8 31:1 33:1 48:2 52:1
4.H.3.  1:7 6:6 9:3 11:4 13:1 14:7 16:1 21:1 23:1 28:7 29:2 31:1 32:7 34:5 35:1 38:4 39:1 40:5 42:1 43:4 45:5 49:6 64:3
4.H.4.  1:1 6:1 9:2 12:1 19:2 23:3 24:3 28:1 31:1 34:1 36:1 40:2 43:2 45:2 49:2 50:1 53:1
4.K.
4.K.1.  18:1 59:1
4.K.2.
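
Tables 11.10 and 11.11 are tallies of the same codings shown item-by-item in Table 11.8. Assuming the raw codings are available as (item, objective) pairs, one per reviewer code (a hypothetical representation for illustration, not the study's actual data format), the `item:count` entries of Table 11.11 can be derived as:

```python
from collections import Counter, defaultdict

def tally_by_objective(codings):
    """codings: iterable of (item, objective) pairs, one per reviewer code.

    Returns {objective: "item:count ..."} in the Table 11.11 layout.
    """
    counts = Counter(codings)          # (item, objective) -> reviewer count
    by_obj = defaultdict(list)
    for (item, obj), n in sorted(counts.items()):
        by_obj[obj].append(f"{item}:{n}")
    return {obj: " ".join(entries) for obj, entries in by_obj.items()}

# Item 10 coded 4.G.2. by all eight reviewers, item 13 by three.
codings = [(10, "4.G.2.")] * 8 + [(13, "4.G.2.")] * 3
print(tally_by_objective(codings)["4.G.2."])  # 10:8 13:3
```

Transposing the same counts by item rather than by objective yields the Table 11.12 layout.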

Table 11.12
Number of Reviewers Coding an Objective by Item (Objective: Number of Reviewers)
Maine Mathematics SAT Augmented Study

Low                   Medium                       High
1                      4                           8

1 Section 2 #1   4.H.3.:7   4.H.4.:1
2 Section 2 #2    3.D.:6    3.D.1.:1   3.D.2.:1
3 Section 2 #3    2.E.:6    2.E.2.:2
4 Section 2 #4    3.C.:1    3.C.2.:7
5 Section 2 #5    2.E.:6    2.E.2.:1   2.E.3.:1
6 Section 2 #6    4.H.:1    4.H.3.:6   4.H.4.:1
7 Section 2 #7    2.E.:4    2.E.2.:3   3.C.2.:1
8 Section 2 #8    1.A.:1    1.A.1.:5   1.B.1.:2 2.F.1.:1
9 Section 2 #9    1.B.:2    4.H.2.:2   4.H.3.:3 4.H.4.:2
10 Section 2 #10   4.G.2.:8
11 Section 2 #11    1.B.:1     3.C.:2     4.H.:1 4.H.2.:1 4.H.3.:4
12 Section 2 #12    1.I.:3     3.C.:1    3.C.2.:3 4.H.4.:1
13 Section 2 #13    1.B.:3    1.B.1.:2   4.G.2.:3 4.H.3.:1
14 Section 2 #14     4.:1     4.H.3.:7
15 Section 2 #15     1.:1      1.A.:1     1.B.:5 1.B.1.:2    2.:1
16 Section 2 #16    4.G.:6    4.H.1.:2   4.H.3.:1
17 Section 2 #17    2.E.:1    2.E.2.:4   2.F.2.:3
18 Section 2 #18    2.E.:3    2.E.1.:1   2.E.2.:3 4.K.1.:1
19 Section 2 #19   1.A.1.:1    4.G.:3    4.G.2.:2 4.G.4.:1 4.H.4.:2
20 Section 2 #20   2.F.2.:1   4.G.2.:7
21 Section 4 #1    1.B.:4    1.B.1.:1    3.C.:3 4.H.3.:1
22 Section 4 #2    2.E.:3    2.E.1.:4   2.E.2.:1
23 Section 4 #3   4.G.2.:4   4.G.3.:2   4.H.3.:1 4.H.4.:3
24 Section 4 #4    1.B.:2     3.J.:1    4.H.1.:2 4.H.4.:3
25 Section 4 #5   2.E.2.:1   2.E.3.:1   2.F.2.:5   4.:1
26 Section 4 #6   4.H.2.:8
27 Section 4 #7   2.E.2.:8
28 Section 4 #8   4.H.3.:7   4.H.4.:1
29 Section 4 #9    1.B.:3    1.B.1.:1    2.F.:1    4.G.2.:2 4.H.3.:2
30 Section 4 #10    2.E.:4    2.E.2.:4
31 Section 4 #11    1.A.:3    1.B.1.:2   3.J.1.:1   4.H.2.:1 4.H.3.:1 4.H.4.:1
32 Section 4 #12    4.H.:1    4.H.3.:7
33 Section 4 #13   3.C.2.:6   4.H.1.:1   4.H.2.:1
34 Section 4 #14    1.B.:1    4.G.2.:1   4.H.3.:5 4.H.4.:1
35 Section 4 #15    2.E.:3    2.E.2.:5    2.F.:1 4.H.3.:1
36 Section 4 #16    4.G.:5    4.G.2.:2   4.H.4.:1
37 Section 4 #17   1.B.1.:1    2.E.:1    2.F.2.:1 4.G.:3 4.G.2.:3
38 Section 4 #18   2.E.2.:1   2.F.2.:1   4.G.3.:2 4.H.1.:4 4.H.3.:4


39 Section 7 #1    1.A.:1     1.B.:5    1.B.1.:1 4.H.3.:1
40 Section 7 #2    4.H.:1    4.H.3.:5   4.H.4.:2
41 Section 7 #3   3.C.2.:7   4.G.1.:1
42 Section 7 #4    1.A.:1     1.B.:4    1.B.1.:2 4.G.2.:1 4.H.3.:1
43 Section 7 #5    1.B.:2    4.H.3.:4   4.H.4.:2
44 Section 7 #6    2.E.:1    2.E.2.:7
45 Section 7 #7    1.B.:1    1.B.1.:1   3.J.1.:1 4.H.3.:5 4.H.4.:2
46 Section 7 #8   2.E.2.:5   2.E.3.:2    2.F.:2
47 Section 7 #9    1.B.:3    1.B.1.:1   3.C.2.:4
48 Section 7 #10    2.E.:2    2.E.1.:1   2.E.2.:1 4.H.:2 4.H.1.:2 4.H.2.:2
49 Section 7 #11   1.A.1.:1   4.H.3.:6   4.H.4.:2
50 Section 7 #12    1.A.:5    1.A.1.:1   3.J.1.:1 4.G.2.:1 4.H.4.:1
51 Section 7 #13    1.A.:2     1.I.:1    1.I.3.:3 3.C.:1 4.G.2.:1
52 Section 7 #14   2.E.1.:1    4.G.:3    4.G.4.:1 4.H.1.:2 4.H.2.:1
53 Section 7 #15    4.G.:6    4.G.2.:1   4.H.4.:1
54 Section 7 #16    2.E.:1    2.E.2.:5   2.E.3.:1 2.F.:1 2.F.2.:1
55 Aug Form 1 #1
56 Aug Form 1 #2     2.F.:3    2.F.1.:4   3.J.1.:1
57 Aug Form 1 #3    2.E.1.:8
58 Aug Form 1 #4
59 Aug Form 1 #5    3.C.2.:7 3.C.4.:1 4.K.1.:1
60 Aug Form 1 #6     1.B.:2 1.B.1.:1 1.B.2.:4        2.F.:1
61 Aug Form 1 #7    2.E.1.:8
62 Aug Form 1 #8
63 Aug Form 1 #9    1.B.1.:1 2.F.1.:7
64 Aug Form 1 #10    1.A.1.:3 1.A.2.:3 4.H.3.:3
65 Aug Form 1 #11    2.E.1.:8
66 Aug Form 1 #12
67 Aug Form 1 #13    3.C.1.:1 3.C.4.:7
68 Aug Form 1 #14    1.I.2.:7  2.E.:1
69 Aug Form 1 #15    3.D.1.:8
70 Aug Form 1 #16    1.I.1.:8
71 Aug Form 1 #17
72 Aug Form 1 #18    3.C.3.:8 3.D.2.:1
73 Aug Form 1 #19    2.E.3.:8 3.J.1.:1
74 Aug Form 1 #20    3.D.1.:8
75 Aug Form 1 #21    1.B.1.:4 2.F.:2 2.F.1.:3
76 Aug Form 1 #22
77 Aug Form 1 #23     3.C.:1 3.C.1.:1 3.C.2.:2 3.C.3.:5
78 Aug Form 1 #24    2.E.3.:8
79 Aug Form 1 #25

Table 11.13
Assessment Item DOK vs Consensus DOK (Item Number: Number of Reviewers [Average
DOK])
Maine Mathematics SAT Augmented Study

Low DOK: 1          Matched DOK: 4          High DOK: 8

1. [2]:      15:1[2]
1.A. [2]:    8:1[2] 15:1[2] 31:3[2] 39:1[1] 42:1[1] 50:5[2.4] 51:2[2]
1.A.1. [2]:  8:5[1.2] 19:1[2] 49:1[1] 50:1[1] 64:3[1.67]
1.A.2. [2]:  64:3[1.67]
1.B. [2]:    9:2[2] 11:1[2] 13:3[2] 15:5[1.6] 21:4[1.75] 24:2[2.5] 29:3[1.33] 34:1[2] 39:5[1.2] 42:4[1.5] 43:2[1] 45:1[2] 47:3[1.67] 60:2[1.5]
1.B.1. [2]:  8:2[1] 13:2[2] 15:2[1] 21:1[1] 29:1[2] 31:2[2] 37:1[3] 39:1[1] 42:2[2] 45:1[1] 47:1[1] 60:1[2] 63:1[2] 75:4[2.25]
1.B.2. [2]:  60:4[1.5]
1.I. [2]:    12:3[1] 51:1[2]
1.I.1. [3]:  70:8[2]
1.I.2. [2]:  68:7[2.14]
1.I.3. [2]:  51:3[2]
1.I.4. [2]:
2. [2]:      15:1[2]
2.E. [2]:    3:6[1.5] 5:6[2] 7:4[1.25] 17:1[2] 18:3[2] 22:3[1.33] 30:4[1.75] 35:3[2.33] 37:1[3] 44:1[1] 48:2[2] 54:1[2] 68:1[2]
2.E.1. [2]:  18:1[2] 22:4[1.25] 48:1[2] 52:1[2] 57:8[1.12] 61:8[1.38] 65:8[1.38]
2.E.2. [3]:  3:2[1] 5:1[2] 7:3[1.67] 17:4[2.25] 18:3[2.33] 22:1[2] 25:1[2] 27:8[1.62] 30:4[1.5] 35:5[2.2] 38:1[2] 44:7[1.43] 46:5[2] 48:1[2] 54:5[2.2]
2.E.3. [2]:  5:1[2] 25:1[2] 46:2[2] 54:1[3] 73:8[2.38] 78:8[1.62]
2.F. [2]:    29:1[1] 35:1[3] 46:2[2] 54:1[2] 56:3[1] 60:1[1] 75:2[2]
2.F.1. [1]:  8:1[1] 56:4[1] 63:7[1.43] 75:3[2.67]
2.F.2. [2]:  17:3[2.67] 20:1[2] 25:5[1.6] 37:1[2] 38:1[2] 54:1[2]
3. [3]:
3.C. [3]:    4:1[2] 11:2[2] 12:1[1] 21:3[1.67] 51:1[2] 77:1[3]
3.C.1. [3]:  67:1[1] 77:1[3]
3.C.2. [3]:  4:7[1.29] 7:1[1] 12:3[1] 33:6[2] 41:7[1.29] 47:4[1.5] 59:7[2.29] 77:2[3]
3.C.3. [2]:  72:8[2.12] 77:5[2.2]
3.C.4. [2]:  59:1[2] 67:7[1.57]
3.C.5. [3]:
3.D. [2]:    2:6[1]
3.D.1. [2]:  2:1[1] 69:8[1.75] 74:8[1.75]
3.D.2. [2]:  2:1[2] 72:1[3]
3.J. [3]:    24:1[2]
3.J.1. [3]:  31:1[2] 45:1[2] 50:1[3] 56:1[2] 73:1[3]
4. [2]:      14:1[2] 25:1[2]
4.G. [2]:    16:6[2.17] 19:3[2.33] 36:5[2.2] 37:3[2] 52:3[1.67] 53:6[2]
4.G.1. [2]:  41:1[1]
4.G.2. [2]:  10:8[2] 13:3[2] 19:2[3] 20:7[2] 23:4[1.75] 29:2[1.5] 34:1[1] 36:2[3] 37:3[2] 42:1[2] 50:1[3] 51:1[2] 53:1[2]
4.G.3. [2]:  23:2[1.5] 38:2[2.5]
4.G.4. [2]:  19:1[3] 52:1[2]
4.H. [2]:    6:1[2] 11:1[2] 32:1[1] 40:1[1] 48:2[2]
4.H.1. [2]:  16:2[2] 24:2[1] 33:1[2] 38:4[2.5] 48:2[2] 52:2[2]
4.H.2. [3]:  9:2[2.5] 11:1[2] 26:8[2] 31:1[1] 33:1[2] 48:2[2] 52:1[2]
4.H.3. [2]:  1:7[1.57] 6:6[1.5] 9:3[2] 11:4[2] 13:1[2] 14:7[2.29] 16:1[2] 21:1[2] 23:1[2] 28:7[2.43] 29:2[1.5] 31:1[2] 32:7[1.14] 34:5[2] 35:1[2] 38:4[2] 39:1[2] 40:5[1.4] 42:1[2] 43:4[1.25] 45:5[1.4] 49:6[1.5] 64:3[2]
4.H.4. [3]:  1:1[1] 6:1[1] 9:2[2] 12:1[1] 19:2[3] 23:3[1.33] 24:3[1.33] 28:1[2] 31:1[2] 34:1[2] 36:1[2] 40:2[1.5] 43:2[1] 45:2[1.5] 49:2[1] 50:1[3] 53:1[2]
4.K. [2]:
4.K.1. [3]:  18:1[3] 59:1[2]
4.K.2. [2]:

B-27
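Each cell in the table above reads item:reviewers [average DOK]: the assessment item number, the number of reviewers who coded the item to that objective, and the mean of their DOK ratings. A minimal sketch of that computation (the per-reviewer ratings below are hypothetical, chosen only to reproduce the 69:8 [1.75] cell; the study's raw ratings are not reproduced here):

```python
from collections import defaultdict

# Hypothetical per-review ratings: (item_number, DOK assigned by one reviewer).
# Six reviewers rated item 69 at DOK 2 and two rated it at DOK 1.
ratings = [
    (69, 2), (69, 2), (69, 2), (69, 2),
    (69, 1), (69, 1), (69, 2), (69, 2),
]

# Group the individual ratings by item number.
by_item = defaultdict(list)
for item, dok in ratings:
    by_item[item].append(dok)

# Render each cell in the table's "item:reviewers [average DOK]" form.
for item, doks in sorted(by_item.items()):
    avg = round(sum(doks) / len(doks), 2)
    print(f"{item}:{len(doks)} [{avg}]")  # 69:8 [1.75]
```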
Appendix C

Reviewers’ Notes and Source-of-Challenge Issues

Maine Mathematics Standards to Augmented SAT Assessment
Table 11.5
Source-of-Challenge Issues by Reviewer
Maine Mathematics SAT Augmented Study

4              The phrasing of this problem could be problematic.
50             The correct answer is "A - None", because they could all be zero. If the
test makers say the answer is "B - One" then this is a source of challenge.
50             This is a poorly worded item. What is it testing? Whether they consider
zero to be an integer?

C-1
Table 11.7
Notes by Reviewer
Maine Mathematics SAT Augmented Study

2             A simple event probability rather than compound.
2             The item asks the student to calculate the probability of an event, but it
does not involve compound events as described in 3.D.1 and it does not
involve a probability distribution as described in 3.D.2
2             not compound so did not choose 3d1
2             This is a simple event rather than compound. Two steps are required, but
both are level 1
2             could not find a p.i. close enough
2             the closest match is 3d1, but this is not a compound event
3             This is a spatial visualization item, but no performance indicator fits.
3             This does not involve coordinates (2E1), inductive or deductive reasoning
(2E2) or trigonometry (2E3). It is only about visualization.
3             no coordinate rep,
3             the performance indicator only addresses transformations in coordinate
geometry
3             this item did not fit a more specific category
3             could not find a p.i close enough
4             analyzing circle graphs
5             Does not use coordinates (2E1) or use trig (2E3). It is more about
applying a concept (relationships between angles) than using reasoning to
"explore" or "determine" this relationship (2E2).
5             concepts of geometry
5             no performance indicator on applying properties to solve problems
5             not a good match to more specific items
5             close to 2e2 but not using reasoning or determining properties
5             item did not fit in specific indicators within this section
6             it is evaluating an expression; not mentioned in the p.i.
7             This is a ratio item, but I did not find any pi for ratio.
7             Does not involve coordinates (2E1), reasoning to determine or explore
properties (2E2), or trig (2E3).
7             Item does not fit a specific indicator.
7             don't really need to know much geometry for this one, but there aren't any
indicators for geometry of circles
7             not a good match to more specific items
8             not a good match to more specific items
9             analyze situation
9             Item matches computation but not the specific indicators
9             not a good match to more specific items
11            Requires knowledge of mean, then manipulation of equation. Did not find
an ip on mean.
11            Makes use of algebra concepts but does not specifically require the
formulation of equations and inequalities (4H3)
11            concept of mean
11            not a good match to more specific items

11            concept of average (mean) falls under statistics, but isn't listed
12            Venn diagrams not included in PI's
12            Item does not fit specific indicators.
12            no mention of Venn diag. or sets in p.i.
12            Can't find a specific PI - deals with analysis of data in diagram
13            Asks students to compute value, not to approximate solution as in 1B1.
13            It is computational, but not about approximating, determining
reasonableness or justifying solution (1B1) or about explaining operations
in other number systems (1B2).
13            various techniques
13            not a specific performance indicator fits here
14            could be solved graphically or algebraically
15            Property of operations. No pi.
15            Primarily computational, but not approximating, determining reasonableness,
or justifying (1B1), and not in number systems other than base 10 (1B2)
15            understanding operations
15            no indicator fits this specifically
15            a sort of haiku:
i am typing and nursing
i hit the wrong field
can't delete sec. objectives
15            need to understand computation to do this problem; does not involve
estimating or explaining operations (the other p.i.s)
16            This is a function item, but without a context (other than mathematics).
Close to 4G2, but not a fit.
16            No specific indicator matches the composition of functions.
16            The item requires function knowledge but does not fit indicators.
16            no indicator addresses finding specific values of functions
16            best fit to standards/function-based
16            doesn't involve real life situation as all p.i.s indicate. involves evaluating
and I do not see it in the p.i.s
17            using properties rather than explore and determine
17            Requires multiple steps as well as high level of reasoning.
18            Not a good fit any place. This is really a combination item. Did not find
any pi.
18            Typically a geometry problem, but no specific indicator.
18            use properties to determine... could be 2E2
18            no performance indicator fits this specifically
18            can be solved using geometric relationships in a hexagon.
19            Item requires understanding properties of functions. Not a real-life
situation, but best fit is 4G2.
19            Need to understand properties of function but no specific indicator about
this.
19            symbolic rep only no phenomena or context

19            no indicator addresses finding equivalent representations of a function
evaluated for specific values of a function
19            problem deals with properties of functions; no clear match on specific PI
21            Item 1 requires use of mean. No pi relates to mean.
21            computational but not specific to approximation, reasonableness or
justification (1B1) or number systems other than base 10 (1B2)
21            mean, not standard deviation
21            Item does not fit specific indicator. Solution requires two steps, but both
are simple.
21            no specific standard on using arithmetic mean to solve problems or just
applying computational measures
21            best fit to standard
21            requires knowledge of mean - not specifically mentioned
22            While coordinates are involved, the item is not specifically about drawing
coordinate representations of geometric figures (2E1) but rather a single
point.
22            no specific standard on applying properties of circles
22            this is a geometric item, best fit to standards
22            very partial: coordinate geometry anyway
23            notion of function not indicated/needed so did not choose g
23            Item could be interpreted to fit any of the three indicators.
24            Item requires a form of reasoning (if then), but is not in a situation with
more than one conclusion.
24            DOK level borderline 2/3
PI-understanding of properties of operations on types of numbers
24            Item does not match any indicator well.
24            best fit to standards
25            sol'n could be alg. or geom./best fit to standards
27            partially
27            seemed the best fit in geometry
29            Item is a basic ratio item. Computation, but no pi requires students to
compute a ratio.
29            Proportional computation, but no specific objective dealing with this.
29            indirect measurement
29            no specific standard for proportional reasoning
30            using properties so may align with 2E2
30            Item involves properties within the same figure.
30            no specific standard for geometry of circles
30            This is clearly geometric, but it didn't seem to fit the more specific
categories.
30            Partial: applying properties and computing
31            This item requires knowledge of powers and factors, but really no pi
31            Nothing specifically about powers or factoring under number sense.

31            Number sense but not number system
31            No indicator deals specifically with factors of numbers
32            A very direct fill in values in linear equation and compute.
32            Item requires evaluation of an equation only.
34            Item could be solved by an equation or by computing with numbers, but
does not require determining reasonableness of answer or justification of
results (not 1B1).
34            may not need symbolic representation
35            using properties rather than determining properties
35            Item does not fit a specific indicator
35            no PI specific to angle measure
36            This is a sequence question, but no pi addresses sequences and series.
36            Essentially about completing a pattern but no specific standard for this,
and does not require or ask for a symbolic representation.
36            not real-life
36            no standard for finding specific terms of a sequence
36            torn between DOK 2 and 3
36            No mention of sequences in patterning; could also be descrete math
36            deals with patterns - not a specific PI
37            Item requires students to identify a pattern and complete the length. There
is no pi that asks students to detect a pattern nor to find a length.
37            Essentially about completing a pattern, but no specific indicator for this,
and does not require or ask for a symbolic representation.
37            no standard fits for applying a pattern - there is also a little bit here about
knowing properties of equilateral triangles, but not enough to double-code
it
37            partially
37            student cannot complete problem without understanding of equilateral
triangles - under geometry - but not specific to the PI
38            interpret from graph
39            Basic computation using a simple proportion or fractions. No pi for just
compute.
39            Computational, but not about approximation, determining reasonableness,
or justifying answers (1B1) or going outside of base ten (1B2)
39            various techniques
39            Item does not fit a specific indicator.
39            no indicator specifically addresses doing computations using various
techniques
39            simple computation
39            I feel that this problem addresses understanding of what 1/4 of a number
is - and is number sense. However, I don't feel that either of the specific
40            Really fill in the values of a function, but no real-life situation, therefore
not 4G2.

40            if 4h4 read evaluate rather than analyze, it would fit
41            Really requires reading a graph and not predicting or drawing conclusions.
42            A computation item. No approximations or non base ten.
42            Computational but not specific to any of the indicators.
42            no specific standard for doing exact computations
42            This item seems to be more about number sense/ratios than what the more
specific standards describe.
42            simple computation
43            Direct squaring of a radical. A computation that does not fit either PI
under 1B.
43            4h4 of analyze situations involving symbolic representation
43            Item requires a simple substitution, which does not match a specific
indicator.
43            torn between DOK 1 and 2.
44            Item involves property within a figure, not between two figures. Does not
match indicators.
44            torn between DOK 1 and 2.
45            Absolute value is the main challenge to the item, but do not see an
appropriate pi. Use 4H3 for inequalities and 4H4 for absolute value.
45            Does not match a specific indicator.
45            while this could be solved analytically, my guess is that most students
would just plug in the choices and see which yields the appropriate
answer - so just numerical computation.
46            Item requires the Pythagorean theorem and computing a perimeter. I used 2F for
perimeter.
46            no specific standard for perimeter of figures
47            Requires computing a ratio. Does not fit 1B1 nor 1B2. The graph is just a
format to hold data. Students really do very little with reading the graph.
47            Although in a table this is essentially about computing and comparing
ratios and there is no specific indicator under 1B for this.
47            no specific standard for finding and comparing ratios
47            torn between DOK 1 and 2.
48            Slopes of perpendicular lines does not require functions so it is not under
4G and there is no specific indicator under 4H dealing with slope or slope
of perpendicular lines.
48            graphs to interpret
48            No direct match to indicator.
48            no indicator addresses relationship between slopes of perpendicular lines
48            coordinate geometry is not in p.i.s
48            students need to read from coordinate graph AND need to know
relationship between slopes of perpendicular lines - not sure which to list
first
49            analyze situations with symbolic representation
50            Item requires some knowledge of integers.

50            About general properties of numbers but nothing specific to integers
under this standard.
50            may not use symbolic
50            doesn't exactly fit into a specific indicator
50            This item seems to be about number sense, but doesn't fit the more
specific standards.
50            demonstrates understanding of integers but p.i.s address systems of
numbers
51            This is a counting item. Students are to derive the number of ways. Item
usually is under data analysis. Could be under computation.
51            About understanding the relationship between numbers (how they can be
combined and broken down) but no specific standard for this.
51            may not use symbolic
51            no specific indicator fits to applying number sense to real-world problems
51            didn't see clear fit - put it in the category I felt it fit best
52            Requires understanding transformations of functions, but no specific
indicator for this.
52            use graph to interpret
52            Item does not fit specific indicator.
52            no indicator addresses transformations of functions
53            Item requires counting types of terms in a sequence. No pi addresses
sequence or series.
53            Understanding a pattern, but does not require algebraic representation and
there is no specific indicator for analyzing patterns.
53            may not use symbolic
53            Item does not fit a specific indicator.
53            no indicator addresses analyzing specific terms of sequences
53            I was torn between DOK 2 and 3, and also thought this might be standard
1A.
53            nothing in p.i.s with sequences
53            not clear fit
54            This item requires knowledge of geometric figures and the application of
the Pythagorean Theorem (linear measure). I used 2F as closest to the
Pythagorean Theorem.
54            use properties
54            partial
56            Estimation of measure... there is no specific indicator for this
56            no indicator for choosing appropriate measurement units
56            general sense of measurement
60            rates
60            Item requires using time as a base, but does not require explanation.
60            Could also be 4H2 depending on how solved but I saw it as a simple rate
problem. Could also see it as a level 2, but such a small second step.
61            DOK 1 or 2?

64            Although the solution to the equation is complex number, the item does
not require any description or explanation.
64            no explain
68            I thought this also might be 1I3 but I'm not really familiar with game
theory.
73            DOK of 2 or 3?
75            General estimation of weight (measurement) but no specific indicator for
this.
75            no indicator on estimating with different measures
75            DOK of 2 or 3?
77            Seems to fit more than one standard under this category.
77            insufficient information - I assume that the students would see a table with
summary statistics

C-8
Appendix D

Debrief Summary Notes

Mathematics Standards to Augmented SAT Assessment
Table 11.15
Debriefing Summary
Maine Mathematics SAT Augmented Study

A. For each standard, did the items cover the most important topics you expected by
the standard? If not, what topics were not assessed that should have been?

· Standards 4K and 3J were not addressed at all as I recall. I also felt that the
probability questions were often basic probability questions and did not address the
two specific indicators listed in the standards (3D1 and 3D2).
· Explaining or describing uses of complex numbers. No linear programming
questions required the whole process. Game theory was marginally covered (and
that's fine). No trig using periodic phenomena. Probability DISTRIBUTIONS.
Identify a variety of situations using the same type of function. Reading math
presentations did not seem to be covered specifically.
· no questions assessing items in 4K; few questions from the actual SAT fit in 1.I
· 3J1, 4G3,4 and 4K were not addressed.
· It was hard to tell as there were many verbs and conjunctions. I found the most
missing topics in evaluating expressions and applying properties of simple figures.
· The complete assessment, including the augment, covered the majority of the
topics. Standard 4G3 didn't have the variety as discussed

B. For each standard, did the items cover the most important performance (DOK
levels) you expected by the standard? If not, what performance was not assessed?

· The items tended to be on the lower end of the DOK levels. Many DOK 2s were
right on the border between 1 and 2.
· There is a need for additional level 3 items based on the DOKs of the standards, but
given the type of assessment I am not sure of the expected balance.
· Level 2 dominated. That is expected from a multiple-choice format.
· Overall, the SAT questions and the augmented questions were at lower DOK levels
than the standards
· There seemed to be a pretty narrow band of level 2 questions, not surprising given
that these were multiple choice items. There was no place for proof, explanation or
discourse.
· Generally so. The performance that was not assessed was that for which the standard
seemed to suggest level 3 or 4, with verbs such as create, evaluate, derive, and
model.
· The standards were assessed mostly at the 2-3 DOK level - as I expected in an
assessment of this type.

C. Were the standards written at an appropriate level of specificity and directed
towards expectations appropriate for the grade level?

· No, it was often quite difficult to find a specific indicator for the test item. It
seemed that there were a few very specific standards, but not enough standards to
cover the breadth of possible items, thus many items were coded at a more general
indicator. In particular problems dealing with: proportion, measurement estimation,
and patterns did not seem to be well represented in the standards.
· The STANDARDS are OK. The INDICATORS seem to have more examples than
a complete list of expected skills. Many indicators were too specific to align fully
with SAT questions.
· The standards were almost too specific to fit the SAT questions. For instance, there
were many questions about properties of circles, but this content is not represented
in the ME standards. Not sure if 4.K indicators can even be assessed by a test.
· I think they might be better subdivided, and attention to multiple solutions would
· I believe that they were too specific and too narrow. They were appropriate for the
· I have not had extensive experience with the standards (PI) and thus found myself
struggling at times to make a clear fit.

D. What is your general opinion of the alignment between the standards and
assessment:

ii. Acceptable Alignment (3) : 38%
iii. Needs slight improvement (3) : 38%
iv. Needs major improvement (2) : 25%
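The percentages above can be checked directly from the reviewer counts (3 + 3 + 2 = 8 reviewers in the listed categories). A quick sketch, assuming rounding to whole percentages:

```python
# Reviewer tallies from the alignment-opinion question above.
counts = {
    "Acceptable Alignment": 3,
    "Needs slight improvement": 3,
    "Needs major improvement": 2,
}

total = sum(counts.values())  # 8 reviewers
for label, n in counts.items():
    pct = round(100 * n / total)  # 3/8 = 37.5 rounds to 38
    print(f"{label} ({n}) : {pct}%")
```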

· Augment was acceptable but for other sections, the alignment seemed to be forced
or hitting a more generic standard.
· I rated alignment 'acceptable' realizing that the standards have already been
rewritten. Also, it is difficult to depend too much on trying to base all judgment on a
one-time test with relatively few questions and formats.
· These standards strike me as a sort of benchmark checklist, not a bad thing, but
they might benefit from additional standards for process, technology and even
pedagogy.
· The alignment was not a neat fit, but the assessment aligned with what I believe
was the intent of the standards. The standards are just poorly written. So going back
to the bigger picture of the standards, I think the alignment is generally there.

D-2
APPENDIX E
IRT CALIBRATION COMMAND FILE
TITLE MHSA0607 MAT11
>COMMENT ;
>FILE DFNAME='C:\PSL3\ME07\MAT11\MAT11NEW.TXT',
NFNAME='C:\PSL3\OMIT_DOT.KEY',SAVE;
>SAVE PARM='C:\PSL3\ME07\MAT11\MAT11_1.PAR',
FIT='C:\PSL3\ME07\MAT11\MAT11_1.FIT';
>INPUT NIDW=10,NTOTAL=072,NTEST=1;
(T5,10A1,072A1)

>TEST1 TNAME=MAT11,NBLOCK=072;
>BLOCK001 BNAME='SAT01 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>BLOCK002 BNAME='SAT02 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>BLOCK003 BNAME='SAT03 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>BLOCK004 BNAME='SAT04 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>BLOCK005 BNAME='SAT05 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>BLOCK006 BNAME='SAT06 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>BLOCK007 BNAME='SAT07 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>BLOCK008 BNAME='SAT08 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>BLOCK009 BNAME='SAT09 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>BLOCK010 BNAME='SAT10 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>BLOCK011 BNAME='SAT11 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>BLOCK012 BNAME='SAT12 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>BLOCK013 BNAME='SAT13 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>BLOCK014 BNAME='SAT14 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>BLOCK015 BNAME='SAT15 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>BLOCK016 BNAME='SAT16 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>BLOCK017 BNAME='SAT17 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>BLOCK018 BNAME='SAT18 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>BLOCK019 BNAME='SAT19 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>BLOCK020 BNAME='SAT20 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>BLOCK021 BNAME='SAT21 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>BLOCK022 BNAME='SAT22 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>BLOCK023 BNAME='SAT23 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>BLOCK024 BNAME='SAT24 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>BLOCK025 BNAME='SAT25 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>BLOCK026 BNAME='SAT26 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>BLOCK027 BNAME='SAT27 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>BLOCK028 BNAME='SAT28 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>BLOCK029 BNAME='SAT29 ',NITEMS=1,NCAT=2,ORIGINAL=(0,1);
>BLOCK030 BNAME='SAT30 ',NITEMS=1,NCAT=2,ORIGINAL=(0,1);
>BLOCK031 BNAME='SAT31 ',NITEMS=1,NCAT=2,ORIGINAL=(0,1);
>BLOCK032 BNAME='SAT32 ',NITEMS=1,NCAT=2,ORIGINAL=(0,1);
>BLOCK033 BNAME='SAT33 ',NITEMS=1,NCAT=2,ORIGINAL=(0,1);
>BLOCK034 BNAME='SAT34 ',NITEMS=1,NCAT=2,ORIGINAL=(0,1);
>BLOCK035 BNAME='SAT35 ',NITEMS=1,NCAT=2,ORIGINAL=(0,1);
>BLOCK036 BNAME='SAT36 ',NITEMS=1,NCAT=2,ORIGINAL=(0,1);
>BLOCK037 BNAME='SAT37 ',NITEMS=1,NCAT=2,ORIGINAL=(0,1);
>BLOCK038 BNAME='SAT38 ',NITEMS=1,NCAT=2,ORIGINAL=(0,1);
>BLOCK039 BNAME='SAT39 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>BLOCK040 BNAME='SAT40 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>BLOCK041 BNAME='SAT41 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>BLOCK042 BNAME='SAT42 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>BLOCK043 BNAME='SAT43 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>BLOCK044 BNAME='SAT44 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>BLOCK045 BNAME='SAT45 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>BLOCK046 BNAME='SAT46 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>BLOCK047 BNAME='SAT47 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>BLOCK048 BNAME='SAT48 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>BLOCK049 BNAME='SAT49 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>BLOCK050 BNAME='SAT50 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>BLOCK051 BNAME='SAT51 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>BLOCK052 BNAME='SAT52 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>BLOCK053 BNAME='SAT53 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>BLOCK054 BNAME='SAT54 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>BLOCK055 BNAME='MEA01 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>BLOCK056 BNAME='MEA02 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>BLOCK057 BNAME='MEA03 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>BLOCK058 BNAME='MEA04 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>BLOCK059 BNAME='MEA05 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>BLOCK060 BNAME='MEA06 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>BLOCK061 BNAME='MEA07 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>BLOCK062 BNAME='MEA08 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>BLOCK063 BNAME='MEA09 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>BLOCK064 BNAME='MEA10 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>BLOCK065 BNAME='MEA11 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>BLOCK066 BNAME='MEA12 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>BLOCK067 BNAME='MEA13 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>BLOCK068 BNAME='MEA14 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>BLOCK069 BNAME='MEA15 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>BLOCK070 BNAME='MEA16 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>BLOCK071 BNAME='MEA17 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>BLOCK072 BNAME='MEA18 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
>SCORE NOSCORE;
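The 72 >BLOCK statements above differ only in block number, item name, and number of score categories (items SAT29-SAT38 are scored 0/1; all others 0/1/2), so a command file like this is typically generated by script rather than typed. A minimal Python sketch of such a generator (illustrative only; this is not the tooling used in the study):

```python
def block_line(num: int, name: str, ncat: int) -> str:
    """Render one >BLOCK statement in the calibration file's format."""
    cats = ",".join(str(c) for c in range(ncat))
    return (f">BLOCK{num:03d} BNAME='{name:<6}',NITEMS=1,"
            f"NCAT={ncat},ORIGINAL=({cats});")

lines = []
for i in range(1, 73):
    if i <= 54:                            # 54 SAT items
        name = f"SAT{i:02d}"
        ncat = 2 if 29 <= i <= 38 else 3   # SAT29-SAT38 scored 0/1
    else:                                  # 18 Math-A augmentation items
        name = f"MEA{i - 54:02d}"
        ncat = 3
    lines.append(block_line(i, name, ncat))

print(lines[0])   # >BLOCK001 BNAME='SAT01 ',NITEMS=1,NCAT=3,ORIGINAL=(0,1,2);
print(lines[28])  # >BLOCK029 BNAME='SAT29 ',NITEMS=1,NCAT=2,ORIGINAL=(0,1);
```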
APPENDIX F

MHSA SAMPLE REPORTS

College Board / Measured Progress
Important Information for                                  and are based on new achievement standards. While                 Achievement Level Deﬁnitions
many students do not yet meet the Learning Results
Parents/Guardians                                       standards, keep in mind that these are challenging,        On this assessment, results are reported across four achievement
rigorous standards for student performance. These          levels. The general deﬁnitions below describe the quality of
High School Assessment                                    achievement level results are Maine-speciﬁc                student work for each achievement level.
information not contained in any previously released
Exceeds the Standards: The student’s work demonstrates
in-depth understanding of essential concepts in a content area,
All scores contained in these reports are included for     including the ability to make multiple connections among central
STATE OF MAINE         Maine reporting purposes only. While scores for most       ideas. The student’s responses demonstrate the ability to synthe-
students may also be used for college admission, they
Maine                        DEPARTMENT OF EDUCATION                                                                 size information, analyze and solve difﬁcult problems, and apply
23 State House Station    may not be used for that purpose if a student received     complex concepts.
High School                         Augusta, ME 04333
October 2007
accommodations during the test administration that
Meets the Standards: The student’s work demonstrates an un-
Assessment                                                  exceeded those made available by the College Board.
derstanding of essential concepts in a content area, including the
Susan A. Gendron                                                                   ability to make connections among central ideas. The student’s
COMMISSIONER              The Maine High School Assessment results should         responses demonstrate the ability to analyze and solve problems
be viewed as one measure of student performance            and apply concepts.
Dear Parents and Guardians,
together with multiple local measures such as
portfolios, performance exhibits, and end-of-term          Partially Meets the Standards: The student’s work demon-
The Maine Comprehensive Assessment System is
grades to create a more complete picture of a student’s    strates incomplete understanding of essential concepts in a
the State’s measure of student progress in achieving                                                                    content area and inconsistent connections among central ideas.
the State standards, known as Learning Results,                                                                         The student’s responses demonstrate some ability to analyze and
will be able to provide further information about your
adopted by the Maine Legislature in 1997. The Maine                                                                     solve problems and apply concepts.
student’s performance on the SAT as well as your
Educational Assessment (MEA) is administered in
school’s performance.
grades 3 through 8 to meet these state assessment                                                                       Does Not Meet the Standards: The student’s work demonstrates
requirements. Since the spring of 2006, the SAT Reasoning Test™ (SAT) has been administered to students in their third year of high school in place of the MEA for state and federal purposes. The move from the MEA to the SAT in grade 11 was made to encourage all students in the goal of attaining college and high-level workplace readiness as well as to measure achievement. This year, the mathematics portion of the SAT Reasoning Test™ was augmented with 18 additional mathematics items (the Math-A test) to more fully measure Maine's Learning Results.

This Maine High School Assessment Report includes information on how your student scored on the SAT Reasoning and Math-A tests administered in May/June 2007, along with data on your child's school, district, and state results. These results reflect scores based on SAT and Math-A test questions that were taken by the over 15,000 students who were enrolled in their third year of high school across all Maine public schools. The SAT Reasoning Test™ employs an assessment design that requires students to create a written response to a writing prompt, generate answers to open-ended mathematics questions, and select answers to multiple-choice questions. More information about the design, history, and use of the SAT can be found at: http://www.maine.gov/education/sat_initiative/. These results are reported across four achievement levels that describe the quality of the student work in relation to the standards of Maine's Learning Results.

We hope you find this report informative as we continue to work toward improving the quality and effectiveness of instructional opportunities so that all Maine youth will graduate from high school prepared for college, career and citizenship.

Sincerely,

Susan A. Gendron
Commissioner of Education

Information on the Maine High School Assessment
• More information about Maine's Learning Results can be found at www.maine.gov/education/lres/lres.htm.
• School reports, which allow you to review the Maine High School Assessment results by school, may be viewed at www.maine.gov/education/sat_initiative/school_reports.htm as soon as they are available for posting.

Maine High School Assessment Summary Results, May 2007 Administration
[Bar chart: statewide percentage of students at each achievement level in Critical Reading, Mathematics, and Writing. Approximate values shown: Exceeds the Standards 8/4/6; Meets the Standards 38/36/41; Partially Meets the Standards 31/31/31; Does Not Meet the Standards 23/30/21.]
This Student's Achievement Level and Score

Content Area        Does Not Meet   Partially Meets   Meets       Exceeds
Critical Reading    1100–1128       1129–1140         1141–1160   1161–1180
Mathematics         1100–1132       1133–1140         1141–1160   1161–1180
Writing             1100–1128       1129–1140         1141–1160   1161–1180

(The student's scaled score and achievement level are plotted on this scale for each content area.)

See reverse side for description of achievement levels and state summary results.
The diamond symbol represents the student's score. The bar surrounding the score represents the probable range of scores for the student if he or she were to be tested many times. This range is based on a statistic called the standard error of measurement.
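The error band around a score follows from the appendix formula SEM = σ√(1 − reliability). A minimal sketch of how such a band could be computed; the function name and the score, standard deviation, and reliability values are illustrative, not the actual MHSA statistics:

```python
import math

def sem_band(score, sd, reliability):
    """Return (sem, low, high): a +/- 1 SEM band around an observed score.

    SEM = sd * sqrt(1 - reliability), the standard formula used with
    KR-20 / coefficient-alpha reliability estimates.
    """
    sem = sd * math.sqrt(1.0 - reliability)
    return sem, score - sem, score + sem

# Illustrative values only:
sem, low, high = sem_band(score=1145, sd=15.0, reliability=0.91)
print(round(sem, 1), round(low, 1), round(high, 1))  # 4.5 1140.5 1149.5
```

A wider bar simply reflects lower reliability or a larger score spread; under the usual normal-error assumption, a retested student would land inside ±1 SEM about 68% of the time.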
The scaled scores provided in the tables above reflect the 80-point scale used in all grades throughout the MeCAS system. The first two digits (11) denote the grade level of the assessment, while the last two digits (00–80) show where the student scored on the 80-point scale. If your child took the SAT under an approved College Board administration, he or she should have received college-reportable scores directly from the College Board approximately three weeks after testing. A conversion table showing all SAT scores in reading and writing and their MHSA equivalents can be found on the State's MHSA web page at http://www.maine.gov/education/sat_initiative/index.htm. No conversion table is supplied for the mathematics portion of the MHSA, since this test is a combination of the SAT mathematics questions plus the Maine Math-A (augment).
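The digit convention just described can be applied mechanically. A small sketch; the function name is mine, not part of the MeCAS system:

```python
def split_mecas_score(scaled_score):
    """Split a MeCAS scaled score into (grade, position).

    The first two digits give the grade level of the assessment and the
    last two digits (00-80) give the position on the 80-point scale,
    so 1145 -> grade 11, position 45.
    """
    grade, position = divmod(scaled_score, 100)
    if not 0 <= position <= 80:
        raise ValueError("position must fall on the 00-80 scale")
    return grade, position

print(split_mecas_score(1145))  # (11, 45)
```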
This Student's Achievement Level Relative to Student Achievement for School, District, and State

[Table: for each content area (Critical Reading, Mathematics, Writing), the percentage of students at each achievement level (Exceeds, Meets, Partially Meets, Does Not Meet the Standards) is shown for the student's school, district, and the state, with a check mark (✔) indicating this student's achievement level.]
Maine High School Assessment (Student Label)
Name:
MEDMS ID:
School:
District:
Achievement Levels / Scaled Scores
Mathematics:
Writing:
CONFIDENTIAL
Maine High School Assessment
Student Roster
Date: May 2007    Code:    Group Size:
High School    District:    School:    Page: 1 of 1

[Roster table: Name, MEDMS ID, then Scaled Score and Achievement Level for each content area.]

Average School Scaled Score
Average District Scaled Score
Average State Scaled Score
October 2007

Maine High School Assessment
High School Report
DEPARTMENT OF EDUCATION
2006–2007 School Year Reports
Test Date: May 2007
ID:
District:
School:

Dear School Board Members and School Personnel:

The Maine Comprehensive Assessment System is the State's measure of student progress in achieving the State standards, known as the Learning Results, adopted by the Maine Legislature in 1997. The Maine Educational Assessment (MEA) is administered in grades 3 through 8 to meet these state assessment requirements. Since the spring of 2006, the SAT Reasoning Test™ (SAT) has been administered to students in their third year of high school in place of the MEA for state and federal purposes. The move from the MEA to the SAT in grade 11 was made to encourage all students in the goal of attaining college and high-level workplace readiness as well as to measure achievement. This year, the mathematics portion of the SAT Reasoning Test™ was augmented with 18 additional mathematics items (the Math-A test) to more fully measure Maine's Learning Results. The combined tests form the Maine High School Assessment (MHSA).

Due to the inclusion of the additional items in mathematics, it was necessary to set new achievement level standards for that discipline this year. The new achievement level standards are the result of a comprehensive process informed by Maine teachers and reviewed by advisory committees. The achievement level standards were not changed for the Critical Reading and Writing sections of the MHSA.

These 2006–2007 Maine High School Assessment Summary Reports contain the results of student performance on the SAT in critical reading, mathematics, and writing, reported according to the achievement standards described above and disaggregated by student and school characteristics. This report, together with individual student and subject-specific student roster reports, provides support for use in program evaluation and planning. All scores contained in these reports are included for Maine state and federal reporting purposes only. While scores for many students may also be used for college admission, they may not be used for that purpose if a student received accommodations during the test administration that exceeded those made available by the College Board.
These results reflect scores based on SAT and Math-A test questions that were taken by over 15,000 students who were enrolled in their third year of high school across all Maine public schools. The SAT Reasoning Test™ employs a design that requires students to create a written response to a writing prompt, generate answers to open-ended mathematics questions, and select answers to multiple-choice questions. More information about the design, history, and use of the SAT can be found at: http://www.maine.gov/education/sat_initiative/.

I look forward to working with you in support of our continued efforts to improve the quality and effectiveness of the instructional opportunities designed to help all students achieve the high standards of the Learning Results and graduate from any Maine high school prepared for college, career, and citizenship.

Sincerely,

Susan A. Gendron
Commissioner of Education

Contents of the Report
The report is divided into five main sections including a section describing the students tested and a separate section for the results in each content area.
Summary of Scores ................ 2
Summary of Student Participation ................ 3
Mathematics Results ................ 6–7
Writing Results ................ 8–9
Date: May 2007
Maine High School Assessment
SUMMARY OF SCORES
District:    School:

Summary of School, District, and State Scores

Average Scaled Score (School / District / State), by year:
Critical Reading, 2006–2007:
Mathematics, 2006–2007:
Writing, 2006–2007:

[Bar charts: percentage of students at each achievement level (Exceeds, Meets, Partially Meets, Does Not Meet the Standards) for the school, district, and state in each content area, including Writing.]

Page 2
Maine High School Assessment
SUMMARY OF STUDENT PARTICIPATION
Date: May 2007    District:    School:

[Table: Enrollment¹ during the testing window and content-area participation² (Critical Reading, Mathematics, Writing), reported as N and % for school, district, and state.]

CATEGORY OF PARTICIPATION
Total number of students
Ethnicity: African American; Asian/Pacific Islander; Hispanic; White; Not Reported
Identified disability
Current LEP
Migrant

MODE OF PARTICIPATION³
Participation without accommodations: Identified disability (PET/IEP); LEP; 504 plan
Participation with accommodations: Identified disability (PET/IEP); LEP; 504 plan; Other
Participation through alternate assessment (PAAP): Identified disability (PET/IEP); LEP; 504 plan
Approved non-participation in reading – 1st year LEP
Approved non-participation – special consideration
Non-participation – other

1 Percents are the percentage of students enrolled in each participation category.
2 Percents are the percentage of students, including those who participated through alternate assessment (PAAP), who participated in the content area.
3 Percents are the percentage of students in each content area by mode.

Page 3
Date: May 2007
Maine High School Assessment
CRITICAL READING RESULTS
District:    School:

ACHIEVEMENT LEVELS: Achievement level definitions describe the quality of a student's responses on state-level assessments in relation to the reading standards for achieving Maine's Learning Results. Maine state-level assessments measure the knowledge and skills of students by sampling identified standards within reading at the grade level assessed. Evidence includes responses to multiple-choice items in an "on demand" setting.

[Table: STUDENTS AT EACH ACHIEVEMENT LEVEL — the number and percentage of students at each level for the school, district, and state, for 2005–2006, 2006–2007, and the cumulative average.]

Exceeds the Standards – The student's work demonstrates the ability to read and interpret literary and informational texts appropriate for the grade level by applying a variety of reasoning skills and prior knowledge as the student draws in-depth inferences, analyzes texts for subtle clues, synthesizes information across texts, and uses knowledge of text structures and literary devices to make deeper connections within or across texts to increase comprehension. (scaled score 1161–1180)

Meets the Standards – The student's work demonstrates the ability to read and interpret literary and informational texts appropriate for the grade level by applying a variety of reasoning skills and prior knowledge as the student draws inferences, identifies summary statements, connects ideas within and across texts, and uses knowledge of text structures and literary devices to increase comprehension. (scaled score 1141–1160)

Partially Meets the Standards – The student's work demonstrates an inconsistent ability to read and interpret literary and informational texts appropriate for the grade level. The student's ability to use a variety of reasoning skills and prior knowledge varies depending on the texts as s/he draws inferences, identifies summary statements, connects ideas within and across texts, and uses knowledge of text structures and literary devices to support comprehension. (scaled score 1129–1140)

Does Not Meet the Standards – The student's work demonstrates a limited ability to read and interpret literary and informational texts appropriate for the grade level. The student's responses are often incorrect, leaving the impression that the student found it difficult to use a variety of reasoning skills and prior knowledge as s/he draws inferences, identifies summary statements, connects ideas within and across texts, or uses knowledge of text structures and literary devices to support comprehension. (scaled score 1100–1128)
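The scaled-score ranges attached to these definitions determine the achievement level mechanically. A sketch using the published cut scores (reading and writing share the 1129/1141/1161 cuts; mathematics, per the reset standards on page 6, uses 1133 as the Partially Meets cut); the names are mine, not part of the reporting system:

```python
# Cut scores from the published achievement level definitions (highest first).
READING_WRITING_CUTS = [
    (1161, "Exceeds the Standards"),
    (1141, "Meets the Standards"),
    (1129, "Partially Meets the Standards"),
]
MATH_CUTS = [
    (1161, "Exceeds the Standards"),
    (1141, "Meets the Standards"),
    (1133, "Partially Meets the Standards"),  # reset for 2007
]

def achievement_level(scaled_score, cuts):
    """Map an 1100-1180 scaled score to its achievement level."""
    if not 1100 <= scaled_score <= 1180:
        raise ValueError("MHSA scaled scores run from 1100 to 1180")
    for cut, level in cuts:  # cuts are listed highest first
        if scaled_score >= cut:
            return level
    return "Does Not Meet the Standards"

print(achievement_level(1145, READING_WRITING_CUTS))  # Meets the Standards
print(achievement_level(1130, MATH_CUTS))             # Does Not Meet the Standards
```

Note that the same score of 1130 is Partially Meets in reading or writing but Does Not Meet in mathematics, because the mathematics cut was reset to 1133.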

Page 4
Maine High School Assessment
CRITICAL READING RESULTS BY REPORTING SUBGROUPS
Date: May 2007    District:    School:

[Table: for each reporting category, the number of students tested, mean scaled score, and the number and percentage of students at each achievement level (E, M, P, D), shown for the school, district, and state.]

REPORTING CATEGORIES
All Students
Ethnicity: African American; Asian/Pacific Islander; Hispanic; White; Not Reported
Identified disability: Yes; No
Limited English proficient students: Current LEP in first year; Current LEP beyond first year
[…]: Yes; No
Migrant: Yes; No
Gender: Female; Male; Not Reported
Title 1A targeted program: Yes; No
[…]: Yes; No

E = Exceeds the Standards  M = Meets the Standards  P = Partially Meets the Standards  D = Does Not Meet the Standards
NOTE: Some achievement level results have been left blank because fewer than five (5) students were tested. N = Number

Page 5
Date: May 2007
Maine High School Assessment
MATHEMATICS RESULTS
District:    School:

ACHIEVEMENT LEVELS: Achievement level definitions describe the quality of a student's responses on state-level assessments in relation to the mathematics standards for achieving Maine's Learning Results. Maine state-level assessments measure the knowledge and skills of students by sampling identified standards within mathematics at the grade level assessed. Evidence includes responses to a combination of multiple-choice items and items requiring student-created responses in an "on demand" setting.

[Table: STUDENTS AT EACH ACHIEVEMENT LEVEL* — the number and percentage of students at each level for the school, district, and state, for 2006–2007.]

Exceeds the Standards – The student's work demonstrates in-depth understanding of essential concepts in mathematics, including the ability to make multiple connections among central ideas. The student's responses demonstrate the ability to synthesize information, analyze and solve difficult or unfamiliar problems, and apply complex concepts. (scaled score 1161–1180)

Meets the Standards – The student's work demonstrates an understanding of essential concepts in mathematics, including the ability to make connections among central ideas. The student's responses demonstrate the ability to reason, analyze and solve problems, and apply concepts. (scaled score 1141–1160)

Partially Meets the Standards – The student's work demonstrates incomplete understanding of essential concepts in mathematics and inconsistent connections among central ideas. The student's responses demonstrate some ability to analyze and solve problems and apply concepts. (scaled score 1133–1140)

Does Not Meet the Standards – The student's work demonstrates limited understanding of essential concepts in mathematics and infrequent or inaccurate connections among central ideas. The student's responses demonstrate minimal ability to solve problems and apply concepts. (scaled score 1100–1132)

*Standards were reset for mathematics in 2007 so historical data are not available.
Page 6
Maine High School Assessment
MATHEMATICS RESULTS BY REPORTING SUBGROUPS
Date: May 2007    District:    School:

[Table: for each reporting category, the number of students tested, mean scaled score, and the number and percentage of students at each achievement level (E, M, P, D), shown for the school, district, and state.]

REPORTING CATEGORIES
All Students
Ethnicity: African American; Asian/Pacific Islander; Hispanic; White; Not Reported
Identified disability: Yes; No
Limited English proficient students: Current LEP in first year; Current LEP beyond first year
[…]: Yes; No
Migrant: Yes; No
Gender: Female; Male; Not Reported
Title 1A targeted program: Yes; No
[…]: Yes; No

E = Exceeds the Standards  M = Meets the Standards  P = Partially Meets the Standards  D = Does Not Meet the Standards
NOTE: Some achievement level results have been left blank because fewer than five (5) students were tested. N = Number

Page 7
Maine High School Assessment
WRITING RESULTS
Date: May 2007    District:    School:

ACHIEVEMENT LEVELS: Achievement level definitions describe the quality of a student's responses on state-level assessments in relation to the writing standards for achieving Maine's Learning Results. Maine state-level assessments measure the knowledge and skills of students by sampling identified standards within writing at the grade level assessed. Evidence includes responses to a combination of multiple-choice items and items requiring student-created responses in an "on demand" setting.

[Table: STUDENTS AT EACH ACHIEVEMENT LEVEL — the number and percentage of students at each level for the school, district, and state, for 2005–2006, 2006–2007, and the cumulative average.]

Exceeds the Standards – The student's responses demonstrate skillful ability to select clear, precise sentence improvements that are free of awkwardness or ambiguity; to recognize grammar and usage errors; and to select revisions that add to the clarity, precision, and overall effectiveness of a passage. The student's essay demonstrates an effectively developed and insightful point of view on the issue and outstanding critical thinking, with clearly appropriate examples, reasons, and other evidence to support a position. The essay is well-organized and clearly focused, demonstrating clear coherence and smooth progression of ideas, and is free of most errors in grammar, usage, and mechanics. (scaled score 1161–1180)

Meets the Standards – The student's responses demonstrate ability to select clear sentence improvements that are free of awkwardness or ambiguity; to recognize grammar and usage errors; and to select revisions that add to the clarity and overall effectiveness of a passage. The student's essay demonstrates an effectively developed point of view on the issue and strong critical thinking, with generally appropriate examples, reasons, and other evidence to support a position. The essay is well-organized and focused, demonstrating coherence and progression of ideas, and is generally free of most errors in grammar, usage, and mechanics. (scaled score 1141–1160)

Partially Meets the Standards – The student's responses demonstrate inconsistent ability to select clear sentence improvements that are free of awkwardness or ambiguity; to recognize grammar and usage errors; and to select revisions that add to the clarity and overall effectiveness of a passage. The student's essay demonstrates a developed point of view on the issue and some critical thinking, but may do so inconsistently or with inadequate examples, reasons, or other evidence to support a position. The essay is generally organized and focused, but may demonstrate some lapses in coherence or progression of ideas and may contain errors in grammar, usage, and mechanics. (scaled score 1129–1140)

Does Not Meet the Standards – The student's responses demonstrate limited ability to select clear sentence improvements that are free of awkwardness or ambiguity; to recognize grammar and usage errors; and to select revisions that add to the clarity and overall effectiveness of a passage. The student's essay demonstrates a vague or seriously limited point of view on the issue and weak critical thinking, with inappropriate or insufficient examples, reasons, or other evidence to support a position. The essay is poorly organized and/or focused and may contain an accumulation of errors in grammar, usage, and mechanics that interfere with understanding the message of the essay. (scaled score 1100–1128)

Page 8
Maine High School Assessment
WRITING RESULTS BY REPORTING SUBGROUPS
Date: May 2007    District:    School:

[Table: for each reporting category, the number of students tested, mean scaled score, and the number and percentage of students at each achievement level (E, M, P, D), shown for the school, district, and state.]

REPORTING CATEGORIES
All Students
Ethnicity: African American; Asian/Pacific Islander; Hispanic; White; Not Reported
Identified disability: Yes; No
Limited English proficient students: Current LEP in first year; Current LEP beyond first year
[…]: Yes; No
Migrant: Yes; No
Gender: Female; Male; Not Reported
Title 1A targeted program: Yes; No
[…]: Yes; No

E = Exceeds the Standards  M = Meets the Standards  P = Partially Meets the Standards  D = Does Not Meet the Standards
NOTE: Some achievement level results have been left blank because fewer than five (5) students were tested. N = Number

Page 9
Analysis and Reporting Decision Rules
Maine High School Assessment

This document details rules for analysis and reporting. The final student-level data set used for analysis and reporting is described in the "Data Processing Specifications." This document is considered a draft until the Maine State Department of Education (DOE) signs off. If rules need to be added or modified after said sign-off, DOE sign-off will be obtained for each rule.

I.     General Information

A.     Data Used for MHSA Scaled Scores:

Grade | Subject          | Data Used for MHSA Scaled Scores
HS    | Mathematics      | SAT unrounded raw score + Math-A unrounded raw score
HS    | Critical Reading | SAT Scaled Score
HS    | Writing          | SAT Scaled Score

B.     Reports Produced:
1.     Parent Letter Report
2.     Student Labels
3.     Student Roster
(School)
4.     Summary Report Package
(School and District)
-   Summary of Scores
-   Summary of Student Participation
-   Results
(by subject)

C.      Files Produced:
1.     State Student Results Files
(With names and without names)
2.     School and District Student Results Files (by subject)

3.     Report Card
(School, District)
4.     School Level Summaries by Demographic Variables
(David Silvernail)
5.     Standard & Poor’s Report (by subject)
(School, District, State)
6.     State Accommodation Frequency Report
7.     Top 50 HS Students
8.     Standard Deviations & Average Scaled Scores for MHSA
subgroups (by subject)

D.      School Type:

SchType   Source                             Description                Included in Aggregations*
                                                                        School   District   State
'PUB'     iCore: schoolsubtypeid = '1'       Public                     ✓        ✓          ✓
'BIG'     iCore: schoolsubtypeid = '6'       Private with 60% or more
                                             Publicly Funded (Big 11)   ✓
'PSN'     iCore: schoolsubtypeid = '4', '7'  Private Special Purpose,
                                             Private Non-Sectarian      ✓
'PRI'     iCore: schoolsubtypeid = '3'       Private Sectarian
'HOM'     HomeSchool = '1'                   Home School

* ✓ indicates that every student tested at a school with the associated school type is
included in aggregations at the level indicated.

E.   Other Information
1.    The retention flag in MEDMS is not used for the MHSA 0607.
2.    -   If a student is linked to MEDMS, all demographic data of
          record are pulled from MEDMS.
      -   If the student does not link to MEDMS, then report the bubbled
          student number and demographics from the booklet. These
          students will be reported to the Login school and will be
          assigned to the 'not' group for all demographics that exist only
          in MEDMS (e.g., not LEP, not Gifted).
3.    Students are removed prior to reporting if any of the following
conditions are true:
-   Student is not in the Maine DOE provided 3rd year HS file
(HSFlag = ‘0’)
-   HomeSchool = ‘1’ in MEDMS
-   NonMaine = ‘1’ in MEDMS
-   Student is privately funded (enrolled in a BIG or PSN school
without a sending district, or enrolled at a PRI school).
4.    If a student did not test, is identified in MEDMS as actively
enrolled and meets the following criteria, the student will be added
to the enrollment at the school indicated in MEDMS, with no test
data:
-   Enrolled at a ‘PUB’ school, or enrolled at a ‘BIG’ or ‘PSN’
school and has a sending district.
-   Is a Maine Resident (NonMaine = ‘0’ in MEDMS)
-   Is a third year student (HSFlag = ‘1’ in MEDMS)
The student is reported as defined by the rules described in this
document.
5.    If a school is not in iCore, then it is assumed to be private (‘PRI’)
and students enrolled in that school will be removed prior to
reporting.

II.   Student Participation / Exclusions

A.   Test Attempt Rules (by subject)
1.   A valid multiple choice response is A, B, C, D, E or a multiple
response (denoted by an asterisk) on the Math-A, or a '+' or '-' on
an SAT item.
2.   A student attempted the test if the student provided a valid
response to at least one item.
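The attempt rule above can be sketched as a simple predicate over a student's item responses. This is a hypothetical sketch: the response encoding (single characters, with blank as an empty string) and the item-type labels are assumptions, not the actual scoring system's data model.

```python
# Sketch of the test-attempt rule (Section II.A), assuming item responses
# are stored as single-character strings. '*' denotes a multiple response
# on the Math-A; '+' and '-' are SAT gridded marks.
VALID_MATHA = set("ABCDE*")
VALID_SAT = set("+-")

def attempted(responses, item_types):
    """A student attempted the test if at least one response is valid.

    responses  -- list of response characters ('' for blank)
    item_types -- parallel list, each 'MATHA' or 'SAT'
    """
    for resp, kind in zip(responses, item_types):
        valid = VALID_MATHA if kind == "MATHA" else VALID_SAT
        if resp in valid:
            return True
    return False

# A blank answer sheet is not an attempt; one valid mark is.
print(attempted(["", "", ""], ["MATHA"] * 3))   # False
print(attempted(["", "C", ""], ["MATHA"] * 3))  # True
```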

B.   Not Tested Reasons (by subject)

If a student has more than one reason for not participating on the test, we
will assign one participation code using the following hierarchy:
1.   Alternate Assessment *
2.   Special Consideration (SpeCon = ‘1’)
3.   First Year LEP (LEP= ‘1’ and partLEP = ‘1’)
4.   Did not Participate
*Students are identified as participating in the MEA Alternate Assessment
based on the MEA Alternate Assessment Decision Rules for each subject
assessed at the 3rd year HS level in MEDMS.
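The hierarchy above resolves to a single participation code. A minimal sketch, assuming boolean flags per student (the flag names here are illustrative stand-ins for the MEDMS fields noted in the list):

```python
# Sketch of the Not Tested hierarchy in Section II.B: when several
# reasons apply, the highest-priority reason wins.
def not_tested_reason(alt_assessment, spe_con, first_year_lep):
    if alt_assessment:
        return "Alternate Assessment"
    if spe_con:                     # SpeCon = '1'
        return "Special Consideration"
    if first_year_lep:              # LEP = '1' and partLEP = '1'
        return "First Year LEP"
    return "Did not Participate"

# Special Consideration outranks First Year LEP.
print(not_tested_reason(False, True, True))  # Special Consideration
```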

C.   Student Participation Status (by subject)
1.   If the student attempted the test:
-   And has a Not Tested reason of Special Consideration or
Alternate Assessment, the student will be reported with the
Not Tested reason.
-   Otherwise the Not Tested reason is ignored and the student
will be reported as Tested on the MHSA.
2.   If the student did not attempt the test:
-   And has a Not Tested reason then the student will be reported
with the Not Tested reason.
-   Otherwise the student is reported as Did Not Participate.

D.    Student Participation Summary (by subject)

Participation Status      Participation   Scaled   Achievement   Parent    Parent Letter Text      Roster
                          Flag            Score    Level         Letter    (Achievement Level)     Code
                                                                 Report
Alternate Assessment      C                                      ✓*        Alternate Assessment    ALT
Special Consideration     D                                      ✓*        Special Consideration   ASC
First Year LEP            E                                      ✓         First Year LEP          LEP
Did not Participate       F                                      ✓         Did not Participate     DNP
Tested MHSA without       A               ✓        ✓             ✓         (Earned Achievement
accommodations                                                             Level)
Tested MHSA with          B               ✓        ✓             ✓         (Earned Achievement
accommodations                                                             Level)
(including Maine-Only)

* If a student has a participation status of Special Considerations and/or Alternate
Assessment for all subjects assessed at the grade level, a Parent Letter is not produced.

III.     Calculations

A.    Minimum N Size

If there are fewer than 5 tested participants in a subgroup (students with
achievement levels), the scaled score and achievement levels are not
reported. This applies to all reports with aggregations.
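The suppression rule is mechanical enough to sketch directly; the function name and return convention (None for a suppressed cell) are illustrative:

```python
# Sketch of the minimum-N rule (Section III.A): aggregations for
# subgroups with fewer than five tested students are left blank.
def report_aggregate(scores, min_n=5):
    """Return the mean scaled score, or None when it must be suppressed."""
    if len(scores) < min_n:
        return None
    return sum(scores) / len(scores)

print(report_aggregate([540, 560, 580, 600]))        # None (only 4 students)
print(report_aggregate([500, 520, 540, 560, 580]))   # 540.0
```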

B.    Rounding Table

Report                     Calculation                                      Rounded (to the nearest)
Parent Letter Report       Relative Achievement Level Percent               Whole value (% is displayed)
Summary of Scores          Average Scaled Score                             Whole value
Summary of Student         All percents                                     Whole value
Participation
Results                    Percent at each achievement level, Percent of    Whole value
                           points possible, Percent of students in each
                           Category, Scaled Score

C.   District and State Aggregations
If a student is publicly funded (has a sending district) and is enrolled
at a school with a school type that is 'PUB', 'BIG', or 'PSN', the student
is included in the sending district and state aggregations.
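Combining this rule with the School Type table in Section I.D, the aggregation levels a student feeds can be sketched as a small function. The field names and level labels are illustrative assumptions:

```python
# Sketch of the aggregation rules in Sections I.D and III.C. School-level
# inclusion follows the School Type table; district/state inclusion for
# 'BIG'/'PSN' students requires a sending district.
def aggregation_levels(sch_type, has_sending_district):
    levels = {"school"} if sch_type in {"PUB", "BIG", "PSN"} else set()
    if sch_type == "PUB":
        levels |= {"district", "state"}
    # III.C: publicly funded students at BIG/PSN schools roll up to the
    # sending district and the state.
    if sch_type in {"BIG", "PSN"} and has_sending_district:
        levels |= {"district", "state"}
    return levels

print(sorted(aggregation_levels("PSN", True)))   # ['district', 'school', 'state']
print(sorted(aggregation_levels("PRI", False)))  # []
```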

D.   Math Raw scores
1.    The SAT gridded response items must have scores of '-'
      converted to 'O' before calculating the SAT raw score.
2.    The SAT unrounded raw score is calculated using the following
      guide:
      '+' = 1 point
      '-' = -1/4 point
      'O' = 0 points
3.    The Math-A unrounded raw score is calculated using the
      following guide:
      A correct response = 1 point
      An incorrect response (or *) = -1/4 point
      A blank response = 0 points
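The formula scoring above can be sketched as follows. The data shapes (lists of mark characters, a set of gridded-item indices) are assumptions for illustration; the point values come directly from rules D.1-D.3:

```python
# Sketch of the formula scoring in Section III.D.
SAT_POINTS = {"+": 1.0, "-": -0.25, "O": 0.0}

def sat_raw_score(marks, gridded=()):
    """Unrounded SAT raw score. `gridded` holds indices of gridded-response
    items, whose '-' marks are converted to 'O' (rule D.1) before scoring."""
    total = 0.0
    for i, mark in enumerate(marks):
        if i in gridded and mark == "-":
            mark = "O"
        total += SAT_POINTS[mark]
    return total

def matha_raw_score(responses, key):
    """Math-A: correct = 1, incorrect (or '*') = -1/4, blank = 0 (rule D.3)."""
    total = 0.0
    for resp, correct in zip(responses, key):
        if resp == "":
            continue                      # blank earns 0 points
        total += 1.0 if resp == correct else -0.25
    return total

# Item 1 is gridded, so its '-' scores 0 rather than -1/4.
print(sat_raw_score(["+", "-", "-", "O"], gridded={1}))  # 0.75
```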

E.   Scaling
1.    For Math, scaling is done using a raw score to scaled score
conversion table provided by psychometrics, the SAT form, the
Math-A unrounded raw score and the SAT unrounded raw score.
2.    For Critical Reading and Writing, scaling is done using a look-up
table provided by psychometrics and the SAT scaled score.
3.    Scaled Scores are rounded to even integers.
4.    Achievement level coding:
1 = Does not meet the Standards
2 = Partially meets the Standards
3 = Meets the Standards
4 = Exceeds the Standards
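Rule E.3 ("Scaled Scores are rounded to even integers") can be sketched as rounding to the nearest even integer. This is an interpretation of the rule, not the vendor's actual scaling code:

```python
# Sketch of rule E.3: round a scaled score to the nearest even integer
# (halve, round, double).
def round_to_even(x):
    return int(round(x / 2.0)) * 2

print(round_to_even(467.3))  # 468
print(round_to_even(466.9))  # 466
```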

IV.   Report Specific Rules

A.   Parent Letter Report
1.   If a student is missing a first and last name, then report as ‘NAME
NOT PROVIDED’.
2.   If a student has a participation status of Special Considerations
and/or Alternate Assessment for all subjects assessed at the
grade level, a Parent Letter is not produced (ParentLetter = ‘0’).

B.   Student Roster
District name and district level summaries are only displayed for ‘PUB’
schools.

C.   Summary Report Package
1.   If there are fewer than 5 students in a school and/or district, only
page 1 and the Summary of Student Participation pages are
produced.
2.   The cumulative average for the historical section on the results
page is the average calculated over years with 5 or more
students in the subgroup. If only 1 year of data is
available, or only 1 year has 5 or more students, that year's data
will also be displayed as the average. This is not calculated for
Math this year because the standards were reset.
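The cumulative-average rule can be sketched directly; the per-year tuple shape is an assumption for illustration:

```python
# Sketch of the historical cumulative average (Section IV.C.2): average
# only the years with 5 or more students in the subgroup. A single
# qualifying year is displayed as the average itself.
def cumulative_average(yearly):
    """yearly: list of (n_students, mean_score) tuples, one per year."""
    qualifying = [mean for n, mean in yearly if n >= 5]
    if not qualifying:
        return None
    return sum(qualifying) / len(qualifying)

# The 3-student year is excluded from the average.
print(cumulative_average([(12, 540.0), (3, 610.0), (8, 548.0)]))  # 544.0
```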

3.   Summary of Student Participation
-   The Current LEP category is defined as students who are
identified in MEDMS as 1st year LEP or 2nd year and beyond
(LEP = '1' or '2').
-   Content Area Participation
I.   The numerator is the sum of students with
participation statuses of Tested with
accommodations, Tested without accommodations,
Alternate Assessment, and 1st year LEP (reading
only).
II.   The denominator is calculated using the number
enrolled minus students with a Special
Consideration status.
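The Content Area Participation rate above can be sketched using the participation flags from the table in Section II.D (A/B = tested, C = Alternate Assessment, D = Special Consideration, E = First Year LEP, F = Did not Participate). The function shape and subject labels are illustrative:

```python
# Sketch of the Content Area Participation rate (Section IV.C.3):
# numerator = tested (with/without accommodations) + Alternate Assessment
# + 1st-year LEP (reading only); denominator = enrolled minus Special
# Consideration.
def participation_rate(statuses, subject="Mathematics"):
    numerator_codes = {"A", "B", "C"}
    if subject == "Reading":
        numerator_codes.add("E")     # 1st-year LEP counts for reading only
    numerator = sum(s in numerator_codes for s in statuses)
    denominator = sum(s != "D" for s in statuses)
    return numerator / denominator if denominator else None

# One student of each status: 3/5 for math, 4/5 for reading.
print(participation_rate(["A", "B", "C", "E", "F", "D"]))                     # 0.6
print(participation_rate(["A", "B", "C", "E", "F", "D"], subject="Reading"))  # 0.8
```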

-   Mode of Participation
I.   For each Mode of Participation group (Participated
with Accommodations, without Accommodations,
Participation through PAAP, and the non-
Participation groups), the percents are calculated
using a denominator of the total number of students
enrolled. The sum of the N's for these 6 groups is
equal to the total number of students enrolled.
II.   For each subgroup within the modes (Identified
disability, LEP, 504 plan, Other), the percents are
calculated using a denominator that is the number
of students in that particular group (Mode).
III.   The Other subgroup is defined as students who
participated with accommodations, but do not have
an identified disability, are not current LEP, and do
not have a 504 plan.

V.   Data File Rules

Refer to file layouts for data elements and structure.

A.    State Student Results Files

1.    Only students from 'PUB' schools, or students with a sending
district, are included.

2.    Students with all participation statuses are included.

B.    District Student Results Files

1.    Only ‘PUB’ school districts will receive district files.

2.    Students with all participation statuses are included.

3.    A student with a sending district is included in both the tested
district file and the sending district file.

C.    School Student Results Files

1.    All schools with MHSA data will receive school files.

2.    Students with all participation statuses are included.

D.   Report Card

1.   The data reported in these files are the number of students tested,
the number and percent of students performing at each
achievement level, and the average scaled score.

2.   The district files are only produced for ‘PUB’ districts and include
students with participation statuses of Tested with or without
accommodations. A student with a sending district is aggregated
only to the sending district.
3.   The school files are only produced for ‘PUB’ and ‘BIG’ schools,
and include students with participation statuses of Tested with or
without accommodations.
4.   Schools or districts that have < 5 included students will only
include data for the number of students tested.

E.   School Level Summaries by Demographic Variables (David Silvernail)
1.   The files are only produced for ‘PUB’ and ‘BIG’ schools and
include students with participation statuses of Tested with or
without accommodations.

F.   Standard & Poor’s Report
1.   The data reported in these files are the number of students tested
and not tested, and the number of students performing at each
achievement level for the following subgroups:
-   Identified Disability, No Identified Disability
-   LEP (1st year or 2nd year and beyond), Not LEP
-   Migrant, Not Migrant
-   Gender
-   Ethnicity
2.   The state file only includes students from 'PUB' schools or
students with a sending district. Students with all participation
statuses except Special Considerations are included.

3.   The district files are only produced for ‘PUB’ districts. A student
with a sending district is aggregated only to the sending district.
4.   The school files are only produced for ‘PUB’ and ‘BIG’ schools.

G.    State Accommodation Frequency Report
The data reported in these files are the counts of each SAT
accommodation and the Maine-Only accommodation.

H.    Top 50 HS Students
Each student is rank ordered by scaled score in all subjects. These
rankings are averaged for each student who participated in all
subjects. The students are then ordered by the average ranking and
the top 50 students are identified.
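The ranking procedure can be sketched as follows. The data shapes (nested dicts keyed by subject and student ID) are illustrative assumptions; ties are broken arbitrarily here, and the source does not specify a tie-break rule:

```python
# Sketch of the Top 50 selection (Section V.H): rank students by scaled
# score within each subject, average each student's ranks across subjects
# (only students who took all subjects are eligible), then keep the best
# k average ranks.
def top_students(scores_by_subject, k=50):
    """scores_by_subject: {subject: {student_id: scaled_score}}"""
    ranks = {}
    for scores in scores_by_subject.values():
        ordered = sorted(scores, key=scores.get, reverse=True)
        for rank, sid in enumerate(ordered, start=1):
            ranks.setdefault(sid, []).append(rank)
    n_subjects = len(scores_by_subject)
    averaged = {sid: sum(r) / n_subjects
                for sid, r in ranks.items() if len(r) == n_subjects}
    return sorted(averaged, key=averaged.get)[:k]

scores = {"Math":    {"s1": 600, "s2": 580, "s3": 560},
          "Reading": {"s1": 600, "s2": 590, "s3": 550}}
print(top_students(scores, k=2))  # ['s1', 's2']
```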

I.   Standard Deviations & Average Scaled Scores for MHSA subgroups
This file includes students from 'PUB' schools or students with a
sending district. Students with participation statuses of Tested with or
without accommodations are included.

J.    Data File Table
(GG indicates grade, SSS indicates subject)

File                              Naming Convention
State Student Results             StudentGG.csv
                                  StudentNoNamesGG.csv
                                  StateStudentLayout.xls
District Student Results          StudentSSSGG.csv
                                  DistrictStudentLayout.xls
School Student Results            StudentSSSGG.csv
                                  SchoolStudentLayout.xls
Report Card                       SchoolReportCardGG.csv
                                  DistrictReportCardGG.csv
                                  ReportCardLayout.xls
School Level Summaries            MHSAGG.csv
(Silvernail)                      SchoolLevelSummaryLayout.xls
Standard & Poor's Report          SPLayout.xls
State Accommodation               AccommodationGG.xls
Frequency Report
Top 50 HS Students                Top50.rtf
Standard Deviations &             StandardDeviationGG.xls
Average Scaled Scores for
MHSA subgroups

APPENDIX G
FIT INFORMATION FOR MHSA MATHEMATICS
SECTION

APPENDIX H
IDENTIFYING MAINE MATHEMATICS
PERFORMANCE INDICATORS THAT NEED
WEBB ALIGNMENT CRITERIA
Identifying Maine Mathematics Performance Indicators that Need Additional
Measurement to Meet the Webb Alignment Criteria

The following narrative and set of graphics provide a detailed description of how the blueprint
for the Mathematics Augmentation Test (Math-A) was developed. The meeting devoted to this
effort was held at the offices of the College Board in New York City on November 20, 2006.
Present at this meeting were Robin O'Callaghan and Andrew Schwartz, mathematics specialists
at the College Board, and Dan Hupp and Tad Johnston, mathematics specialists at the Maine
Department of Education.

Overview:
The mathematics items on the May 2007 SAT Mathematics Reasoning Test were first coded to
Maine mathematics content standards and also assigned a depth of knowledge ranking. This task
was completed using a consensus model. Also produced during this process was a set of
operating conventions to help ensure greater reliability in coding items for alignment.
Once coded, the remainder of the session was used to identify and begin development of a set of
items which, when combined with the items of the SAT Mathematics Reasoning Test, would
result in an instrument that would align well with and adequately measure the Maine Learning
Results for secondary mathematics.

Plan of Action:
The alignment methodology and criteria of Dr. Norman Webb are based on categorical
concurrence, balance of representation, depth of knowledge, and range of knowledge across the
four content clusters of Maine's mathematics standards. To create a stable design over testing
years, a test blueprint describing numbers of items in the content clusters was developed (Table
H-1). To populate the Maine High School Assessment for Mathematics, the fifty-four items
from the SAT Mathematics Reasoning Test are first coded and put into the matrix and the
desired balance is achieved by selection of items to be included in Mathematics Augmentation
Test (Math-A).

Method:
The participants independently worked each SAT item and then for each item assigned a depth
of knowledge (DOK) rating based on the Norman Webb definitions and a performance indicator
to which the item was aligned. If an item was not aligned to a specific performance indicator
(PI), but was aligned to a content standard, it was coded to the standard with the generic
indicator number of 0 (Table H-2).

Participants then discussed the coding for each item and recorded a consensus score for each
item. Several items seemed to go across performance indicators for content. Some performance
indicators overlap and several SAT items can be solved using connections across content areas of
mathematics. A list of coding decisions was created to ensure consistency for the process for the
present and possible future coding decisions. (The complete “Alignment Coding Guidelines for
the Maine Learning Results in Mathematics” can be found at the conclusion of this appendix).

Following coding, items were grouped by content cluster and an inventory created (Table H-3).
Table H-1
Maine Department of Education
SAT-A Mathematics
Test Specification and Alignment Tool

The table below describes how the Maine Department of Education augmented the SAT Mathematics Test to create the Maine High
School Assessment Mathematics (MHSA) Test that meets the alignment criteria designed by Dr. Norman Webb. Specific item
selection criteria are described on the following page. Following that is a description of the work plan for mathematics augmentation
(Math-A) test construction.
Detail of Plan for MHSA Mathematics Test Specifications – Item Selection

Columns: SAT Item Count | SAT Proportion | SAT Item Count, Form Balanced |
SAT Proportion, Form Balanced | Items Needed to Balance for Form X, Year X |
SAT-A Item Count, Alignment Correction | SAT-A Proportion, Alignment Correction |
SAT-A Items Needed for Form X, Year X | Count Range of Items in Augmentation

1. Number and Operations (ME 1 Number and Operations):
   11-13 | .20-.24 | 13 | .23 | 0-2 | 17 | .24 | 4 | 4-6
2. Algebra and Functions (ME 4 Patterns):
   19-21 | .35-.39 | 21 | .37 | 0-2 | 21 | .29 | 0 | 0-2
3. Geometry and Measurement (ME 2 Shape and Size):
   14-16 | .26-.30 | 16 | .28 | 0-2 | 20 | .28 | 4 | 4-6
4. Data Analysis, Statistics, Probability (ME 3 Mathematical Decision Making):
   6-7 | .11-.13 | 7 | .12 | 0-1 | 14 | .19 | 7 | 7-8
Total Items:
   54 | 1 | 57 | 1 | 3 | 72 | 1 | 15 | 18
Table H-2
SAT Mathematics Reasoning Test, May 2007 Item coding results – Consensus

SAT                 DOK                       SAT                  DOK
item #    PI code   determined                item #     PI code   determined
1   H3                 2                      39   B0                 1
2   D0                 1                      40   K1                 2
3   E2                 2                      41   C2                 2
4   C2                 1                      42   A0                 2
5   E2                 2                      43   A0                 2
6   H3                 2                      44   E2                 2
7   E2                 2                      45   H3                 2
8   A1                 2                      46   F2                 2
9   H4                 2                      47   A0                 2
10   B0                 2                      48   H2                 2
11   C0                 2                      49   H3                 2
12   I0                 1                      50   A0                 2
13   B0                 2                      51   D0                 2
14   H3                 2                      52   G3                 2
15   B0                 2                      53   G0                 2
16   H4                 2                      54   F2                 3
17   F2                 3
18   D0                 3
19   H4                 3
20   G2                 3
21   C0                 2
22   E1                 2
23   G2                 1
24   K1                 2
25   F2                 2
26   H2                 2
27   E0                 2
28   H3                 3
29   B0                 1
30   E2                 2
31   A0                 2
32   H3                 2
33   C2                 3
34   H3                 2
35   E2                 3
36   G0                 2
37   E2                 3
38   H3                 3
Table H-3 Item Coding Organized by Maine Clusters

SAT item #   PI Code   DOK Determination   DOK required (Webb)

Cluster 1 Number and Operations
31   A0   2   2
42   A0   2   2
43   A0   2   2
47   A0   2   3
50   A0   2   2
 8   A1   2   2
10   B0   2   3
13   B0   2   3
15   B0   2   3
29   B0   1   3
39   B0   1   3
12   I0   1   2

Cluster 2 Shape and Size
27   E0   2   3
22   E1   2   1
 3   E2   2   3
 5   E2   2   3
 7   E2   2   3
30   E2   2   3
35   E2   3   3
37   E2   3   3
44   E2   2   3
17   F2   3   2
25   F2   2   2
46   F2   2   2
54   F2   3   2

Cluster 3 Mathematical Decision Making
11   C0   2   3
21   C0   2   3
 4   C2   1   3
33   C2   3   3
41   C2   2   3
 2   D0   1   3
18   D0   3   3
51   D0   2   3

Cluster 4 Patterns
36   G0   3   3
53   G0   2   3
20   G2   3   3
23   G2   1   3
52   G3   2   2
26   H2   2   3
48   H2   2   3
 1   H3   2   2
 6   H3   2   2
14   H3   2   2
28   H3   3   2
32   H3   2   2
34   H3   2   2
38   H3   3   2
45   H3   2   2
49   H3   2   2
 9   H4   2   2
16   H4   2   2
19   H4   3   2
24   K1   2   2
40   K1   2   2
The now coded SAT math inventory was evaluated and decisions made about items needed for
the Math-A to bring the combined tests into alignment with the MHSA blueprint. The results of
content decisions are summarized in the table below.

Maine Clusters                    Count of Items   Items Needed      Items to   Total Items
                                  on SAT           to Balance Test   Augment    Needed
1. Number and Operations          12               1                 4          5
2. Shape and Size                 13               3                 4          7
3. Mathematical Decision Making   8                (-1)              7          6
4. Patterns                       21               0                 0          0
Total                             54               3                 15         18

It should be noted that items involving combinations, permutations and arrangements are coded
in computation under the College Board protocol and under standard D in cluster 3 using the
Maine Learning Results (D0, based on grade 5-8 performance indicator D4). This explains the
"(-1)" in cluster 3.

Participants also looked at the items by performance indicator for balance of representation and
range of knowledge, and compared the assigned item DOK to the DOK assigned by the Norman
Webb team to the standards and performance indicators of the Maine Learning Results.
Examining the inventory of SAT items in comparison with the Maine Learning Results led to
conclusions about the characteristics of the SAT item sets by cluster and a decision about items
to be developed for the Mathematics Augmentation.

In Cluster One it was noted that many items were coded to the content standards and not to
specific performance indicators. To ensure appropriate coverage, a majority of items need to be
targeted to specific indicators with a wide range of indicators addressed. DOK weakness in the
cluster was noted in items coded B0 which require a DOK of 3. An item specific to each of five
different indicators will provide good coverage as long as there is at least one indicator in each
standard.

In Cluster Two it was noted that items coded to E2 were common, and items are often a DOK of
2 when DOK 3 is indicated. Indicator F2 is well represented. To fix balance and range issues,
items measuring indicators E1, E3, and F1 were needed. In terms of DOK, current coding shows
7 items below the required DOK and 6 items at or above. Items written must meet or exceed
the Webb DOK coding for the indicator. Geometry items must clearly reflect E1 or E3.

In Cluster Three more items are needed to represent the range of the cluster and to meet the
balance of representation specified. While probability is represented in the SAT, compound
probability (D1) and probability distributions (D2) were not present in the form. Items
measuring correlation, standard deviation, sampling and experimental design are needed as well
as an item that measures the ability to choose a best solution when multiple logical solutions are
present (J1).

In Cluster 4, the consensus coding revealed 7 of 13 indicators present, with a solid DOK.
Balance of Representation and Range of Knowledge were deemed satisfactory for state
assessment purposes. It was decided that no items were needed for this year’s augmentation.

Summary of Items to be Developed for Augment

Content Description                   DOK target

Cluster 1: Number and Operations – Need 5 Operational Items
A1 structure of real numbers             DOK 2
A2 complex numbers (imaginary)           DOK 2
B1 approximate or justify                DOK 3
B2 non-decimal bases                     DOK 2
I1 given feasible region, pick point     DOK 2
I3 game theory – no items; traditionally not measured
I4 matrix multiplication, or matrix      DOK 2
   solution to system set up
(one area will not be addressed – choice to be made based on field test results of items)

Cluster 2: Shape and Size – Need 7 Operational Items
E1 transformations and coordinates       DOK 2, 2-3 items
E3 trigonometry                          DOK 3, 2-3 items
F1 precision in measurement or units     DOK 1 (or 2), 2 items

Cluster 3: Mathematical Decision Making – Need 6 Operational Items
D1 compound probability            DOK 2, 2 items
D2 probability distributions       DOK 3
C3 standard deviation, correlation DOK 2
C4 random sampling                 DOK 2
C5 revise studies                  DOK 3 one of C5 or J1
J1 multiple logical conclusions    DOK 3

Cluster 4: Patterns – Need 0 Operational Items
Alignment Coding Guidelines for the Maine Learning Results in Mathematics
Rationale: There is considerable overlap among the mathematics performance indicators of the
Learning Results. It has been noted when aligning items that many can be coded to multiple
indicators. Historically, Maine Educational Assessment development teams noticed that even
when writing items for a specific indicator the item could be coded to other indicators as well. In
order to reduce this ambiguity, the following guidelines are suggested for use in coding and
writing items. These guidelines grew out of those used informally by the Maine Educational
Assessment (MEA) grades 3-8 development committees and were refined at a meeting between
the College Board and Maine Department of Education on November 20, 2006.

General
For items that are not directly aligned to grade 9-12 performance indicators but are aligned to
earlier grade performance indicators but with advanced levels of sophistication, the items should
be coded to the content standard letter (A – K) followed by a “0” which will denote a generic
measure of that content standard .

Cluster 1 Number and Operations

A0 - will be the code assigned for rules of exponents, application of ratio with the answer as a
ratio. If the item is a multi-step (2 or more steps) problem with proportions code as B0.
The effects of operation on a number will code as A0 building off 5-8 standards, unless the item
points to structure of the real numbers, e.g. closure (or lack thereof) of a set under an operation in
which case it will be coded to A1.
A1 - Check for A1 if use and/or interpretation of a number line is involved
A2 - Quadratic equations with no real roots, square roots of negative numbers and similar are
coded A2.

B0 - Computation items where B1 and B2 are not applicable are coded to B0.
B1 - If an item asks for or requires justification or estimation, code to B1.
B2 - Other bases may include problems involving angles and minutes or similar where, although
the numerals are written in base 10, another base must be used when computing and converting
across units. Operations defined on modulo systems or similar (group tables, for example) are
coded B2. If a defined operation uses base 10, then it is coded to K1.

I0 – These are discrete math items that do not code to I1, I2, I3, or I4. Note that counting items
(permutations and arrangements) get coded to D0 to match MLR at 5-8.
I4 – Computation involving row x column computation code to I4 even if matrices not explicitly
shown.

Cluster 2 Shape and Size

E – While deductive or inductive reasoning are present in nearly all items, items involving
transformations on the coordinate plane should be coded to E1 and trigonometry items to E3.
E2 – Line segments are also considered geometric figures so can be coded to E2.
F0 – Perimeter items are coded F0 unless using Pythagorean Theorem or working from an area
or volume in which case they are coded F2.
F2 – When finding a measure, the Pythagorean Theorem is coded to F2 (see example in MLR)

Cluster 3 Mathematical Decision Making

C0 – Items with mean, median, mode, and range where there is no change in the data set are
coded C0.
C1 – Problems with descriptive statistics where one or more elements in the data set are changed
are coded C1 (the effects of a change on measures of central tendency).
C2 – Pulling data from a graph, interpolating or extending a graph with data from a physical
situation, or using patterns in a table of data are coded C2. Note that the presence of a table does
not automatically mean a coding of C2. If the table is just a list of numbers used to organize data,
and the “meat” of the item is to then perform other operations with the numbers, the item may be
more appropriately coded to another standard or indicator.

D0 – Combinations, permutations, and other counting items are coded here based on similar
middle grade level descriptions (D4). Also coded here is the calculation of the probability of simple
(not compound) events, unless a prediction is required (then D1).
D1 – Compound probabilities OR a prediction based on probability are coded to D1. This
includes repeated events and conditional probabilities.
D2 – Probabilities determined based on probability distributions or frequency distributions are
coded to D2.

Cluster 4 Patterns

In general, building mathematical models code to standard G, while “solving” codes to H.
G0 – Patterning or sequence items (complete a sequence or pattern) code to G0 unless they
match another indicator.
G1 – This is the code for interpretation of graphs for special points, making a graph, or recognizing
a graph for an equation or a function situation.
G2 – “Translate to algebra,” “write an expression,” “write an equation that could be used to
solve” (but not required to solve) are actions that indicate coding to G2. Items that ask, “which
equation matches the graph,” code to G2 as well.
G3 – If students are asked to develop an equation from a set of points, or start with a situation and
make or recognize a graph, then the item is coded to G3.

H1 – This is the code when the graph or table for an equation is used to find an answer. Building a
model codes to G3 unless it is tied to variation, including slope, in which case it is coded to H2.
Selecting the graph or table that matches an equation also goes in H1.

H2 – Slope and rate of change are coded H2, so are items that involve directly and inversely
proportional situations.
H3 – Includes solving an equation or inequality and solving word problems that lend themselves
to an algebraic solution. Also included within H3 are substituting coordinate values into an
equation to find coefficients or the value of a second variable.
H4 – Translation with function notation is coded H4. Finding a value can be included (instead
of in H3) if finding the value reveals understanding of function notation or requires
understanding of function notation. Unless there is a geometric measurement situation (code F2),
or a derived measurement situation (code F0), “plugging values into a formula with no need to
transform the formula” is coded H4. If the formula needs to be transformed to solve for a
missing value, then the item is coded to H3.

J1 – Used especially with content primarily from MLRs in grades 5-8 where a set of
conclusions is presented and students choose all that apply (only I, only II, I and III, II and III).
In cases where content is solely secondary, the item should be coded to the primary content being
measured.

K1 – Using definitions and defined operations is coded to K1 (unless the definition is for an
operation in a system not base 10 – see B2). An example of a defined operation is
& where a&b = 2a + 3b.
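The defined-operation idea above can be sketched in code. This is an illustrative sketch only (the function name `amp` is invented for the example); it evaluates the sample operation a & b = 2a + 3b using ordinary base-10 arithmetic, which is why such items code to K1 rather than B2.

```python
# Illustrative sketch of the defined operation "&" from the K1 example,
# where a & b = 2a + 3b. The arithmetic is ordinary base 10, so an item
# built on this definition codes to K1 rather than B2.
def amp(a, b):
    """Evaluate the defined operation a & b = 2a + 3b."""
    return 2 * a + 3 * b

print(amp(4, 5))  # 2*4 + 3*5 = 23
```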
APPENDIX I
2006-07 POLICIES AND PROCEDURES FOR
ACCOMMODATIONS TO THE MAINE HIGH
SCHOOL ASSESSMENT (MHSA) AND
ALTERNATE ASSESSMENT TO THE
PSAT/MHSA

Maine’s SAT Initiative
2006-07 POLICIES AND PROCEDURES FOR ACCOMMODATIONS
TO THE MAINE HIGH SCHOOL ASSESSMENT (MHSA)
AND ALTERNATE ASSESSMENT TO THE PSAT/MHSA

The No Child Left Behind (NCLB) Act mandates that all students in one high school year be included in state
assessment. In addition, Maine Learning Results legislation requires that all students be included in a State
assessment at the eleventh grade level. In 2006-07 the SAT, in combination with math components scheduled
for a separate administration, will be used as Maine’s High School Assessment for students in Grade 11 or in
their third year of high school. Participation in the PSAT is required for second year high school students,
but there will be no PSAT State Reports. Students will participate in these assessments through standard
administration, administration with accommodations, or alternate assessment (Personalized Alternate
Assessment Portfolio [PAAP]). Legal requirements for students identified for federally
funded programs have been taken into account in the development of this document.

POLICIES AND PROCEDURES FOR THE PARTICIPATION OF STUDENTS
WITH ACCOMMODATIONS IN THE MAINE HIGH SCHOOL ASSESSMENT

An accommodation is a change in the way an assessment is given or taken that does not alter what is being
measured. These policies and procedures for accommodations are designed so that all students with unique
learning needs have a fair opportunity to demonstrate what they know and are able to do on all State
assessments at the high school level, including those required under No Child Left Behind.

NOTE: Accommodations for the PSAT are contained in a separate document.

TWO CATEGORIES OF ACCOMMODATIONS FOR THE MAINE HIGH SCHOOL ASSESSMENT

The Maine High School Assessment provides the opportunity to use results in two different ways. The first is
as a measure of a student’s progress towards achievement of Maine’s Learning Results for State and Federal
purposes. The second is the use of score reports on the SAT portion of the test in a student’s application for
college admission. The way in which a student’s scores are reported depends on the type of accommodations
used. The two types of accommodations available to Maine students are:

1)           ACCOMMODATIONS APPROVED ONLY BY THE STATE OF MAINE.
a. The students for whom a team has determined that accommodations are necessary, but who
have not been approved for their use by the College Board, may use one or more of the
accommodations listed on the last two pages of this document.
b. The scores of students using any of the accommodations on the last two pages of this
document without College Board approval will be reported for Maine Purposes Only based on
Maine Achievement Standards for the Maine High School Assessment. Their scores on the SAT
portion of the test cannot be sent to colleges by the College Board.
Codes for these accommodations can be found on pages 6 and 7. The codes must be recorded in the
student’s records as shown on page 3.

2)       ACCOMMODATIONS APPROVED BY THE COLLEGE BOARD FOR AN INDIVIDUAL STUDENT
a. Students with an identified disability who need accommodations and wish to have college
reportable scores on the SAT portion of their Maine High School Assessment must file an
official College Board Eligibility Form, identifying the accommodations they wish to use during
the administration of the SAT. The accommodations for which a student may apply include:
•       those listed by the College Board in the Eligibility Packet,
•       those needed by individual students and allowed by the College Board but not listed in
the Eligibility packet, and
•       Maine accommodations listed on pages 6 and 7, through the College Board Eligibility
Form in the “Other” category.
b. The required documentation must accompany the request for College Board approved
accommodations.
c. The College Board will determine whether the use of the accommodations requested will be
approved for the use of the individual student, based on its review.
d. The scores of all students participating in the Maine High School Assessment will be reported
based on the combination of the SAT and the augmented portion of the Maine High School
Assessment. The scores for those students who took the SAT portion of the Maine High School
Assessment through standard administration or with accommodations approved by the College
Board may also be reported to colleges.

PROCEDURES FOR DETERMINATION OF NEED FOR ACCOMMODATION

All students being considered for accommodations on the Maine High School Assessment must have their
individual situations reviewed by a team prior to the time of assessment. This team should include at least one
of the student’s teachers, the building principal, related services personnel, the parent(s)/guardian(s) and,
whenever possible, the student. If it is not possible for the parent and student to attend the meeting, they
should be consulted regarding the committee’s recommendations for accommodations prior to the time of the
assessment.

For a student who has an Individual Educational Program (IEP), schools are required to address needed
accommodations at a Pupil Evaluation Team (PET) meeting. Membership for this meeting is prescribed in
Maine Special Education Regulations, Chapter 101, part 8, November 1, 1999.

Only students with an identified disability under IDEA-2004 may be considered for accommodations for a
standard SAT administration with resulting official College Board scores (See #2 above).

Those who may be considered for accommodations on the Maine High School Assessment taken for Maine
Purposes Only (See #1 above) include, but are not limited to, those who 1) are ill or incapacitated in some
way; 2) are Limited English Proficient (LEP); 3) have an identified disability under IDEA-2004 but have not
been approved by the College Board to use accommodations; 4) are identified as having disabilities under
Section 504 of the Rehabilitation Act; or 5) are identified by a team as needing accommodations in order to
demonstrate an accurate level of academic achievement.
Maine Department of Education                                   3                                           10/19/06 v3
Recommended accommodations should be consistent with accommodations already being employed in the
student’s instructional program. Any accommodations recommended for a student will be reflected in a
statement in the cumulative folder of the student (in the IEP for a student with an identified disability under
IDEA-2004). See the section on documentation for a suggested format.

Test Center (School) personnel should be familiar with and administer all allowed accommodations in
accordance with the directions provided in trainings for SAT Test Site Supervisors and those included in the
Maine High School Assessment Administrators’ Manual.

DOCUMENTATION OF ACCOMMODATIONS

Coding of Maine accommodations (see last two pages of this document) to be used by individual students will
be entered by school personnel according to the directions provided by the College Board.

As stated earlier, any accommodations made for a student and the reasons for these choices must also be
reflected in a statement in the student’s cumulative folder (in the IEP for a student with an identified disability).
The following is a suggested statement:

This student will participate in the Maine High School Assessment with the following accommodations:
SECTION                                 REASON FOR ACCOMMODATION                ACCOMMODATION CODE *

SAT Writing

SAT Mathematics                         small group to minimize distractions    P2, P3
                                        for student and others; to keep
                                        levels from interfering with
                                        demonstration of math skills and
                                        knowledge

Maine Mathematics Component             small group to minimize distractions    P2, P3
                                        for student and others; to keep
                                        levels from interfering with
                                        demonstration of math skills and
                                        knowledge

* Refer to the last two pages of this document for the allowable accommodation codes for a Maine High School
Assessment taken for Maine Purposes Only.

REPORTING STUDENTS’ SCORES

OFFICIAL SAT REPORTS

For students taking the SAT portion of the Maine High School Assessment with accommodations approved by
the College Board for use by the individual, free official SAT score reports will be issued to three colleges
identified by that student.

MAINE REPORTS FOR ALL STUDENTS

All students taking the Maine High School Assessment will be included in the school’s accountability system,
and their scores will be included in the State assessment reports based on Maine’s Learning Results,
regardless of the avenue of participation (standard administration, administration with accommodations (both
College Board approved and Maine Only accommodations), or alternate assessment). The scores on these
reports will be determined by the combination of the SAT and State components based on Maine’s
achievement standards.

POLICIES AND PROCEDURES FOR THE PARTICIPATION OF STUDENTS IN
ALTERNATE ASSESSMENT TO THE MAINE HIGH SCHOOL ASSESSMENT AND THE PSAT
The very few students who will require an alternate assessment are those who need a modified measure of
performance because their exceptionality is so significant that it does not allow access to the standard
assessment, even with a combination of accommodations. Every effort must be made to assess students
through the standard assessment whenever possible.

Maine’s alternate assessment at grades 10 (second year of high school) and 11 (third year of high school), the
2006-2007 Personalized Alternate Assessment Portfolio (PAAP), requires the use of Tasks linked to the
Maine Learning Results for a student functioning up to an achievement level comparable to that of a fourth
grade student, including those provided for selected Content Standards in the PAAP Task Bank.

The Content Areas required in the PAAP are based on those measured in the standard assessments used at a
given grade level. For students at Grade 11, the PAAP Content Areas are Reading, Writing, Mathematics,
and Science. For students at Grade 10, the PAAP Content Areas are Reading, Writing, and Mathematics.

PROCEDURES FOR DETERMINATION OF NEED FOR ALTERNATE ASSESSMENT

Students who may be considered for alternate assessment include those who have an identified significant or
profound disability under IDEA-2004, those who are Limited English Proficient, or those who are identified as
having disabilities under Section 504 of the Rehabilitation Act. Only those special education students with a
significant cognitive disability may have their scores reported for AYP based on alternate standards.

All students being considered for alternate assessment must have their individual situations reviewed by a
team prior to the time of assessment, allowing sufficient time for appropriate administration of the alternate
assessment. This team should include at least one of the student’s teachers, the building principal, related
services personnel, the parent(s)/guardian(s) and, whenever possible, the student. If it is not possible for the
parent and student to attend the meeting, they should be consulted regarding the committee’s
recommendations. The PAAP will require the accumulation of evidence to be gathered during most of the
school year and submitted in the Spring. Student work to be included in the Grade 11 PAAP in Science may be
collected over two years (second and third years of high school). Teams are encouraged to meet during the
year prior to an administration year, allowing for the gathering of student work during the following school year.

For a student who has an Individual Educational Program (IEP), schools are required to address the need for
the alternate assessment at a Pupil Evaluation Team (PET) meeting. Membership for this meeting is
prescribed in Maine Special Education Regulations, Chapter 101, part 8, November 1, 1999.

The recommendation for a student to take an alternate assessment will be reflected in a statement in the
cumulative folder of the student (in the IEP for a student with an identified disability). See the section on
documentation below for a suggested format.

Trained school personnel should administer PAAPs at the high school level in accordance with Maine’s
Personalized Alternate Assessment Portfolio (PAAP) Training Manual, available at
www.mecas.org/paap/manual. Schools that have students requiring alternate assessments who are in out-of-
school in-state placements must assure that all information regarding the administration and submission of
PAAPs is forwarded to the students’ schools/programs.

DOCUMENTATION OF ALTERNATE ASSESSMENT

At the time of the PAAP Registration (October 13 – February 2 for 2006-07), students at grades 10 and 11 or
in their second or third years of high school who are participating in the Maine High School Assessment or
PSAT through alternate assessment must be registered online at the Measured Progress Web Site
(http://iservices.measuredprogress.org). The reasons for the use of this assessment option (i.e., SPED, LEP or
504) should be documented in the school’s MEDMS system for each student.

As stated earlier, the PAAP provided to a student and the reasons for this option must also be reflected in a
statement in the cumulative folder of the student (in the IEP for a student with an identified disability under
IDEA-2004). The following is a sample documentation format:

This student will participate in an alternate assessment to the Maine High School Assessment.
Section                   Reason for Alternate Assessment (Examples)           Student Designation:
                                                                               LEP, 504, IEP*

Writing                   PAAP                                                 IEP, 01
Mathematics               PAAP                                                 IEP, 01
Science                   PAAP                                                 IEP, 01

The PET has identified alternate assessment (PAAP) as the appropriate avenue for participation in the Maine
assessments for the third year of high school for this student because her school program is individualized
at a level approximately 6 years below her grade placement.

* Identify the specific disability from Chapter 101, part 3.

Note: Guidance on PAAP development, content, and scoring is available annually through a series of regional workshops
and online information. For details, please refer to the DOE web site at: www.mecas.org/paap/registration.

ALTERNATE ASSESSMENT SCORING AND REPORTING OF STUDENT SCORES

The alternate assessment contractor, Measured Progress, will arrange for pickup of PAAPs on April 6,
2007. PAAPs received at Measured Progress after April 9th, 2007 will not be scored and the students for
whom a late alternate assessment is submitted will be counted as non-participants in the Maine High
School Assessment for AYP reporting.

All state assessment reports, including PAAP score reports, will be sent to schools by Measured
Progress.

Note: Refer to the last two pages of this document for the allowable State accommodations.

APPROVED ACCOMMODATIONS FOR THE MAINE HIGH SCHOOL ASSESSMENT
Use of These Accommodations will Result in Scores Reportable for Maine Purposes Only

All accommodations used must:
•       not change what is being measured,
•       be approved for individual students by a team, and
•       be a regular part of the student’s daily instruction.

Code      Accommodations Category                           Details on Delivery of Accommodations

MT1.      with time extended beyond standard                Extended time may be needed by students who are unable to meet time
          administration (same day).                        constraints, are easily fatigued, or unable to concentrate for the length of time
MT2.      with time extended beyond standard                allotted for test completion. Testing may be extended until the student can no
          administration.                                   longer sustain the activity.

MT3.      with multiple or frequent breaks.                 Multiple or frequent breaks may be required by students whose attention span,
distractibility, or physical condition requires shorter working periods.
MT4.      at a time of day or a day of the week most        Individual scheduling may be used for students whose school performance is
beneficial to the student.                        noticeably affected by the time of day or day of the school week on which it is
done.
MT5.      using flexibility in the order in which content   Flexibility in the order of presentation may be used, for example, to build
area tests are given.                             confidence in the student by testing those content areas in which they are
strongest first, or to alleviate concerns by allowing them to complete the
content area about which they are most apprehensive first.

MS1.      in school site other than regular classroom.      Students may be tested in an alternative site to reduce distractions for the
                                                            student or others.

MS2.      in out-of-school setting by school personnel.     Out-of-school testing may be used for students who are hospitalized or unable
to attend school.

MP1.      individually.                                     Individual or small group testing may be used to minimize distractions for
                                                            students whose test is administered out of the classroom or so that others will
MP2.      in a small group.                                 not be distracted by accommodations being used (ex., dictation).

MP3.      using a human reader.                             A human reader may be used for a student whose inability to read would
hinder performance. A Reader’s Script will be provided based on registration
with this accommodation. NOTE: When used for the Reading Passages, MP3
becomes a modification that is not allowed on other State assessments.

MP4.      using sign language (NOT allowed for              Trained personnel may use sign language to administer the test for deaf or
reading passages).                                hearing impaired students, with the exception of the reading passages. Sign
language may be used only for questions and directions in the reading
sessions.
MP5.      with opportunity for student to move, stand,      This opportunity may be used in a setting other than the classroom for a
and/or pace during assessment.                    student who cannot focus when seated for sustained periods of time.
MP6.      using alternative or assistive technology         The test may be presented through his/her regular communication system to a
that is part of the student’s communication       student who uses alternative and assistive technology on a daily basis.
system.
MP7.      by school personnel known to the student          The test administrator may be a member of the staff who works with the
other than the student’s classroom teacher        student from time-to-time or on a daily basis, but is not the student’s regular
(e.g., ESL Title I, Special Education)            teacher for general curriculum.

APPROVED ACCOMMODATIONS FOR THE MAINE HIGH SCHOOL ASSESSMENT (CONTINUED)
Use of These Accommodations will Result in Scores Reportable for Maine Purposes Only

Code           Accommodations Category                                Details on Delivery of Accommodations
MP8.          using large print version of assessment.     A 20 pt. photo-enlarged print version of the SAT will be supplied based on
registration with this accommodation.
MP9.          using Braille version of assessment.         A braille version of the SAT will be supplied based on registration with this
accommodation.
MP10.         with student use of a bilingual dictionary   Dictionaries used must be approved by ESL/bilingual program staff. The
as needed.                                   student may have the dictionary available for individual use as needed.

MP11.         using “sheltered English” content for an     Simplification of content specific terms (ex., congruent, parallel, setting,
LEP student in a manner that does not        character) is NOT allowed. Such simplification would change what is being
compromise test integrity.                   measured. Guidance identifying those terms will be provided.

MP12.         using a cassette version of the test.        A cassette version of the SAT will be supplied based on registration with this
accommodation.
MR1.          using a scribe or recording device (oral   The student may dictate answers to trained personnel or record answers in
dictation to a scribe or a recording       an individual setting so that other students will not benefit by hearing
device is NOT allowed for the Writing      answers or be otherwise disturbed. Recorded answers must be scribed prior
session ).                                 to the return of test materials.
MR2.          using alternative or assistive               The technology is used to permit the student to read and/or respond to the
technology/devices that are part of the      test. In addition to computers, such devices might include, for example, text
student’s communication system.              enlargers, speech-to-text, amplification devices, Dynaboxes, etc. Speech-to-
text may not be used for the Writing session.

MR3.          other assistive devices.                     To enable a student to organize thinking, focus, and/or use a device that
serves as a specific strategy related to a test item, other assistive devices
may be used. They might include such things as templates, graphic
organizers, arithmetic tables (only in the calculator allowed session of the
Mathematics test), noise buffers, place markers, carrels, etc.
MR4.          with student use of a word processor.        A student may use a word processor. When used for the Writing session,
spell check, grammar check, and word prediction programs should be turned
off.
MR5.          with student use of a brailler.              A student may use a braillewriter, a slate and stylus, and/or an electronic
brailler to respond to questions. Responses would need to be recorded in
standard format by a scribe.
MR6.          with student use of visual aids.             Visual aids include any optical or non-optical devices used to enhance visual
capability. Examples include magnifiers, special lighting, markers, filters,
large-spaced paper, color overlays, etc.

MR7.          with student use of a bilingual dictionary   Dictionaries used must be approved by ESL/bilingual program staff. The
as needed.                                   student may have the dictionary available for individual use as needed.

student understanding following the          what he/she has been asked to do. If directions have been misunderstood by
reading of test directions.                  the student, the directions may be paraphrased or demonstrated. Test items
may not be paraphrased or explained.
NOTE: Due to federal regulations under No Child Left Behind, local word-for-word translation into native language is no longer an
allowable accommodation.
O.        OTHER                                             (MUST BE DOCUMENTED AND SUBMITTED TO
                                                            THE DEPARTMENT OF EDUCATION IN ADVANCE)
                                                            Contact Linda Parkin, Alternate Assessment Coordinator
                                                            Linda.parkin@maine.gov
                                                            207-624-6775

APPENDIX J
RESPONSE TO MHSA MATHEMATICS
ALIGNMENT STUDY (5/11/07)

On May 11, 2007, Dr. Norman Webb of the University of Wisconsin and the Wisconsin
Center for Education Research conducted an alignment study to determine the degree to
which the mathematics portion of the SAT in combination with the Math-A
(augmentation) was aligned with the high school expectations of the 1997 Maine
Learning Results (MLR).

Dr. Webb directed an eight-member panel of secondary and higher education mathematics
experts for this exercise – four from Maine and four from Wisconsin. Based on the
panel’s collective judgment, Dr. Webb concluded that, overall, the standards and the
assessment were aligned at an acceptable level. Additionally, the alignment study shows
that of the 16 cells in the matrix that comprises the study, 13 were determined to meet the full
and complete “yes” criterion while the remaining 3 cells were classified as “weak” in
their alignment relationship.

To strengthen the three weak-alignment areas, the Department will undertake the
following steps:

1). The Depth-of-Knowledge (DOK) Consistency rating for Cluster 3 (Mathematical
Decision Making) was .47, slightly lower than the required 50% of items judged
to be “at or above” the DOK recommendation. As is mentioned in the report, this rating
can become a “yes” by increasing the DOK of a single item. Because we will have field
test data on approximately 60 newly developed Math-A items when the MHSA
mathematics development committee meets this fall, we should be able to increase the
DOK in at least a couple of instances within the cluster to strengthen next year’s design.
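The 50% threshold described above can be illustrated with a small sketch. The item counts below are hypothetical, chosen only because 8 of 17 items yields the .47 figure mentioned in the report; they are not the actual MHSA data.

```python
# Hypothetical sketch of the Depth-of-Knowledge (DOK) consistency check
# described above: a cluster passes ("yes") when at least 50% of its items
# are judged at or above the DOK level of the objective they measure.
# The flags below are illustrative, not the actual MHSA item ratings.

def dok_consistency(at_or_above_flags):
    """Return the fraction of items rated at or above the objective's DOK."""
    return sum(at_or_above_flags) / len(at_or_above_flags)

# 17 items with 8 at/above gives 8/17 ~ 0.47, just under the 0.50 threshold;
# raising the DOK of a single item (9/17 ~ 0.53) flips the rating to "yes".
flags = [1] * 8 + [0] * 9
ratio = dok_consistency(flags)
print(round(ratio, 2), ratio >= 0.50)
```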

2). The Balance of Representation (BOR) is classified as weakly aligned in Cluster 1
(Number and Operations; 0.66 vs. the needed 0.7) and Cluster 4 (Patterns; 0.69 vs. 0.7).
In each case, the lesser rating stems from a single performance indicator that is over
emphasized. In Cluster 1, the generic computation objective, “Students will understand
and demonstrate computation skills,” is judged to be the focus of too many items, while in
Cluster 4, performance indicator H3, “Formulate and solve equations and inequalities,”
has also been rated as over-targeted. In each case above, the wording of Maine’s original
(1997) Learning Results has been a challenge to measure in a balanced way.
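The balance-of-representation idea can be illustrated with a hedged sketch. The formula below is the commonly cited Webb-style balance index, 1 - sum(|1/O - I_k/H|)/2; it is stated here as an assumption about the study's method rather than a quote from it, and the hit counts are invented for illustration.

```python
# Hedged sketch of a balance-of-representation style index for a cluster:
# an index near 1 means items are spread evenly across the cluster's
# performance indicators; concentrating items on one indicator (as with the
# generic computation objective in Cluster 1) pulls the index down.
# Assumed Webb-style formula: 1 - sum(|1/O - I_k/H|) / 2.

def balance_index(hits_per_indicator):
    O = len(hits_per_indicator)   # number of indicators hit
    H = sum(hits_per_indicator)   # total item hits in the cluster
    return 1 - sum(abs(1 / O - h / H) for h in hits_per_indicator) / 2

print(round(balance_index([5, 5, 5]), 2))   # perfectly even spread: 1.0
print(round(balance_index([9, 3, 3]), 2))   # one over-targeted indicator: 0.73
```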

These state standards were Maine’s first attempt at creating such expectations and were
not specifically designed to convey the assessment targets that have subsequently become
such an important component of the standards movement. Vast variation in grain size and
much generalized language has been problematic at all grade levels when assessing that
original document. As of June 2007, the Maine legislature passed into law a new set of
state standards that has much clearer assessment targets and a more consistent grain size.
These new standards will first be measured on state assessments during the 2008-09
school year.
For the interim school year (2007-08), the Department will again look to the newly
administered field-test items to more distinctly and directly measure the two computation
performance indicators of the 1997 MLR and thereby de-emphasize the generic
computation goal in order to address the Cluster 1 issue. Likewise in Cluster 4, we will
try to more specifically distribute the measure of some of the other Algebra performance
indicators so that fewer will be classified as predominantly formulating and solving
equations and inequalities. Within each Cluster mentioned above, we will be seeking to
adjust and focus only a couple of assessment items.
Changing the MHSA Score Scale from the Traditional 200-800 College Board Scale to the
1100–1180 Maine Comprehensive Assessment Scale

Beginning with the spring 2007 administration of the Maine High School Assessment (MHSA),
all reports will be issued using a new scale which ranges from 1100 to 1180. The following
narrative provides an explanation and rationale for the implementation of that new scale.

During Maine’s first administration of the SAT Initiative in the spring of 2006, both student and
school results were reported using the College Board’s traditional 200 – 800 scale scoring system
for the Critical Reading, Writing and Mathematics portions of the test. Because no additional
questions were added to the SAT Reasoning Test™ during that initial year of testing and to
prevent confusion, the traditional SAT scores were reported and categorized after the
achievement level standard setting was completed during May of that year (see Table 1).

In the fall of 2006, Dr. Norman Webb of the University of Wisconsin conducted an alignment
study that indicated the mathematics portion of the SAT would benefit from additional test items
in order to more fully measure the range of Maine’s Learning Results. To address this finding, an
item development committee was formed and, using Dr. Webb’s design, the committee created
18 extra mathematics items to augment the traditional 54 mathematics items on the SAT. These
additional items, known collectively as the Math-A (for Augment), were administered during the
last week of April 2007 and the results were combined with the traditional SAT mathematics
results.

Because the Maine High School Assessment (MHSA) mathematics test now consisted of 72 (54
+ 18) common items, it was no longer possible to assign a traditional College Board score to the
results from this new test. Additionally, because the 2008 MHSA testing program will resume
testing in science (a test developed in partnership with Measured Progress), it was necessary to
incorporate a new scale to which high school student achievement could be ascribed uniformly
across all disciplines. The State's assessment committee looked to the other parts of Maine's
Comprehensive Assessment System (MeCAS) to inform the design of a coherent reporting
program.

To convey student results in grades 3-8, two years ago the Maine Educational Assessment
(MEA) adopted an 80-point scale for each subject area, with the digit of the reported grade
preceding the 80-point continuum; e.g., grade 3 scores range from 300-380, grade 4 scores from
400-480, etc. This system of scale scores translates each of Maine's four achievement levels
into approximately 20-point* intervals. For example, in grade 4 "meets the standards" covers
441–460 and "exceeds the standards" covers 461–480.
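The grade-prefixed scale convention can be sketched as a small helper function. This is an illustrative aid only; the function name is hypothetical, and the fixed 80-point width simply restates the convention described above, not an official MEA specification:

```python
def mea_scale_range(grade: int) -> tuple[int, int]:
    """Return the (min, max) scaled-score range for a reported grade.

    The digit of the reported grade precedes the 80-point continuum,
    e.g. grade 3 -> 300-380, grade 4 -> 400-480, grade 11 -> 1100-1180.
    """
    base = grade * 100  # grade digit shifted into the hundreds place
    return base, base + 80
```

Applying the same rule to grade 11 reproduces the 1100–1180 MHSA range adopted for the high school assessment.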

To maintain consistency within the State assessment system, it was decided to extend and
continue the same scale used in the elementary and middle grades into the high school
assessment program using the range of 1100–1180 (11 representing grade 11) with the same 20-
point intervals as described above. This new scale was applied to the results of the spring
administration of the 2007 MHSA. For the SAT subject areas of Critical Reading and Writing,
where no augmentation was necessary, the traditional SAT scores were mapped directly to the
new MHSA range using the same cut scores established during the achievement level standard
setting in the spring of 2006 to define each achievement level. For the newly created MHSA
mathematics test, a new achievement level standard-setting committee was convened, and new
raw score cut points were established across all 72 mathematics items. The resulting cut scores
were then translated to the same previously described 1100–1180 scale (see Table 3).

The new MHSA scale will allow schools and districts to analyze all tested subject areas for
trends over time and to utilize data to improve instructional practices.
For schools wanting to compare 2006 traditional SAT scores to 2007 MHSA scores in Critical
Reading and Writing, conversion tables have been included to facilitate this process (see
Table 4).

* The scaled cut score between “does not meet the standards” and “partially meets the standards”
does not always fall exactly at the 20-point range but rather varies slightly in response to
statistical parameters defined by the “meets the standards” and “exceeds the standards”
achievement levels at each grade and subject area (see Table 2).

Table 1
2006 SAT Initiative High School Assessment Cut Scores

Mathematics
Achievement        Scaled Score        Percent In Level
Level              Range
Does Not Meet      200-370             28
Partially Meets    380-450             25
Meets              460-640             42
Exceeds            650-800             5

Critical Reading
Achievement        Scaled Score        Percent In Level
Level              Range
DNM                200-360             24
PM                 370-450             32
MS                 460-610             38
ES                 620-800             7

Writing
Achievement        Scaled Score        Percent In Level
Level              Range
DNM                200-340             21
PM                 350-440             32
MS                 450-610             40
ES                 620-800             6
Table 2
MEA 2006–07 Scaled Score Achievement Level Ranges for Grades 3-8

Reading       Exceeds        Meets           Partially Meets    Does Not Meet
Grade 3       361–380        341–360         331–340            300–330
Grade 4       461–480        441–460         431–440            400–430
Grade 5       561–580        541–560         531–540            500–530
Grade 6       661–680        641–660         629–640            600–628
Grade 7       761–780        741–760         729–740            700–728
Grade 8       861–880        841–860         829–840            800–828

Mathematics    Exceeds       Meets           Partially Meets    Does Not Meet
Grade 3        361–380       341–360         325–340            300–324
Grade 4        461–480       441–460         429–440            400–428
Grade 5        561–580       541–560         529–540            500–528
Grade 6        661–680       641–660         627–640            600–626
Grade 7        761–780       741–760         727–740            700–726
Grade 8        861–880       841–860         829–840            800–828

Science &      Exceeds         Meets          Partially         Does Not Meet
Technology                                    Meets
Grade 4        461–480        441–460         429–440           400–428
Grade 8        861–880        841–860         831–840           800–830

Writing        Exceeds        Meets           Partially Meets   Does Not Meet
Grade 5        561–580        541–560         521–540           500–520
Grade 8        861–880        841–860         817–840           800–816
Table 3
MHSA 2006-07 SCALED SCORE RANGES

MHSA               Exceeds     Meets              Partially    Does Not Meet
                                                  Meets
Critical Reading   1161–1180   1141–1160          1129–1140    1100–1128
Mathematics        1161–1180   1141–1160          1133–1140    1100–1132
Writing            1161–1180   1141–1160          1129–1140    1100–1128

Table 4
SAT - MHSA Conversion Chart

Critical Reading                           Writing
2006     2007                              2006      2007
SAT      MHSA Achievement Level            SAT       MHSA Achievement Level
200     1110  does not meet                200      1112 does not meet
210     1110  does not meet                210      1114 does not meet
220     1112  does not meet                220      1114 does not meet
230     1114  does not meet                230      1116 does not meet
240     1114  does not meet                240      1118 does not meet
250     1116  does not meet                250      1118 does not meet
260     1118  does not meet                260      1120 does not meet
270     1118  does not meet                270      1120 does not meet
280     1120  does not meet                280      1122 does not meet
290     1120  does not meet                290      1124 does not meet
300     1122  does not meet                300      1124 does not meet
310     1124  does not meet                310      1126 does not meet
320     1124  does not meet                320      1128 does not meet
330     1126  does not meet                330      1128 does not meet
340     1128  does not meet                340      1128 does not meet
350     1128  does not meet                350      1130 partially meets
360     1128  does not meet                360      1132 partially meets
370     1130  partially meets              370      1132 partially meets
380     1132  partially meets              380      1134 partially meets
390     1134  partially meets              390      1134 partially meets
400     1134  partially meets              400      1136 partially meets
410     1136  partially meets              410      1138 partially meets
420     1138  partially meets              420      1138 partially meets
430     1138  partially meets              430      1140 partially meets
440     1140  partially meets              440      1140 partially meets
450     1140  partially meets              450      1142 meets
460     1142  meets                        460      1144 meets
470     1144  meets                        470      1144 meets
480     1144  meets                        480      1146 meets
490     1146  meets                        490      1146 meets
500     1148  meets                        500      1148 meets
510     1148  meets                        510      1150 meets
520     1150  meets                        520      1150 meets
530     1150  meets                        530      1152 meets
540     1152  meets                        540      1152 meets
550     1154  meets                        550      1154 meets
560     1154  meets                        560      1154 meets
570     1156  meets                        570      1156 meets
580     1158  meets                        580      1158 meets
590     1158  meets                        590      1158 meets
600     1160  meets                        600      1160 meets
610     1160  meets                        610      1160 meets
620     1162  exceeds                      620      1162 exceeds
630     1164  exceeds                      630      1164 exceeds
640     1164  exceeds                      640      1164 exceeds
650     1166  exceeds                      650      1166 exceeds
660     1168  exceeds                      660      1166 exceeds
670     1168  exceeds                      670      1168 exceeds
680     1170  exceeds                      680      1170 exceeds
690     1170  exceeds                      690      1170 exceeds
700     1172  exceeds                      700      1172 exceeds
710     1174  exceeds                      710      1172 exceeds
720     1174  exceeds                      720      1174 exceeds
730     1176  exceeds                      730      1174 exceeds
740     1178  exceeds                      740      1176 exceeds
750     1178  exceeds                      750      1178 exceeds
760     1180  exceeds                      760      1178 exceeds
770     1180  exceeds                      770      1180 exceeds
780     1180  exceeds                      780      1180 exceeds
790     1180  exceeds                      790      1180 exceeds
800     1180  exceeds                      800      1180 exceeds
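Cut-score classification of the kind underlying the conversion above can be expressed as a simple lookup. The sketch below uses the 2006 Critical Reading cut scores from Table 1; the function and list names are illustrative, not part of the reporting system:

```python
import bisect

# 2006 Critical Reading achievement-level ranges (Table 1):
# does not meet 200-360, partially meets 370-450,
# meets 460-610, exceeds 620-800.
CUTS = [370, 460, 620]   # lowest SAT score of each level above "does not meet"
LEVELS = ["does not meet", "partially meets", "meets", "exceeds"]

def achievement_level(sat_score: int) -> str:
    """Classify an SAT Critical Reading score into its achievement level."""
    return LEVELS[bisect.bisect_right(CUTS, sat_score)]
```

For example, a score of 450 falls in "partially meets" and a score of 460 in "meets", matching the boundaries reported in Table 1.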
Table 5
MHSA 2006-07 Mathematics Raw Score to Scaled Score Conversion Chart

Raw Score Range
Scaled Score     Min       Max       Achievement Level
1100        -15.5     -13.25   does not meet
1102         -13      -12.75   does not meet
1104        -12.5     -12.25   does not meet
1106         -12      -11.75   does not meet
1108        -11.5       -11    does not meet
1110       -10.75     -10.25   does not meet
1112         -10        -9.5   does not meet
1114        -9.25      -8.25   does not meet
1116          -8         -7    does not meet
1118        -6.75       -5.5   does not meet
1120        -5.25      -3.75   does not meet
1122         -3.5      -1.75   does not meet
1124         -1.5       0.75   does not meet
1126           1        3.25   does not meet
1128          3.5       6.25   does not meet
1130          6.5       9.25   does not meet
1132          9.5       15.5   does not meet
1134        15.75       16.5   partially meets
1136        16.75      20.25   partially meets
1138         20.5        24    partially meets
1140        24.25      29.75   partially meets
1142          30        31.5   meets
1144        31.75      35.25   meets
1146         35.5       38.5   meets
1148        38.75      41.75   meets
1150          42        44.5   meets
1152        44.75      47.25   meets
1154         47.5      49.75   meets
1156          50         52    meets
1158        52.25        54    meets
1160        54.25      56.75   meets
1162          57       57.75   exceeds
1164          58       59.25   exceeds
1166         59.5       60.5   exceeds
1168        60.75      61.75   exceeds
1170          62       62.75   exceeds
1172          63       63.75   exceeds
1174          64        64.5   exceeds
1176        64.75      65.25   exceeds
1178         65.5      65.75   exceeds
1180          66         72    exceeds
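A raw-to-scaled lookup over Table 5 can likewise be sketched by searching the row minimums. Only a few illustrative rows are carried below; the names are hypothetical, and a full mapping would include all 41 rows of the table:

```python
import bisect

# (minimum raw score, scaled score) pairs from Table 5, sorted by raw score;
# illustrative subset only -- the full table has 41 rows.
ROWS = [
    (-15.5, 1100),
    (9.5, 1132),    # last "does not meet" row
    (15.75, 1134),  # first "partially meets" row
    (30.0, 1142),   # first "meets" row
    (57.0, 1162),   # first "exceeds" row
    (66.0, 1180),
]

def scaled_score(raw: float) -> int:
    """Map a formula raw score to its MHSA scaled score via row minimums."""
    mins = [r[0] for r in ROWS]
    idx = bisect.bisect_right(mins, raw) - 1
    return ROWS[idx][1]
```

Searching on minimums suffices because every observed raw score falls inside one row's range; the quarter-point gaps between adjacent rows (e.g. 15.5 to 15.75) never contain a reported score.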

```