A Video Ethnography of the Implementation of a Highly Rated Science Curriculum
in a Diverse Middle School Classroom
National Setting: Diversity and Curriculum Reform
Despite the best of intentions for equity, the science education reform movement has failed to promulgate
curricula that respond adequately to the diversity of the U.S. student population. In an effort to reform
U.S. science curricula, the AAAS’ Project 2061 has developed a rating system that identifies promising,
extant standards-based curricula. Curricula that are highly rated according to these Project 2061 criteria
must adequately convey a sense of lesson purpose; take account of student ideas; engage students with
relevant phenomena; develop and use scientific ideas; encourage students to make evidence-based
arguments, inter alia (Roseman, Kesidou, & Stern, 1996, 2001). Inasmuch as they imply recognition of
individual differences among students, these criteria would seem to elicit instruction sensitive to
diversity issues.
Research Focus, Design, and Methods
While a curriculum unit that has been highly rated according to the 2061 criteria may be intended to
“instantiate” certain qualities, it is quite another matter whether students “get” them as the unit is
implemented. The current project thus investigates this gap between intention and implementation by
asking how a highly rated curriculum entitled Chemistry that Applies (CTA) (State of Michigan, 1993) is
actually functioning in a demographically diverse public middle school classroom. The methodology used
to answer that question is a comprehensive video ethnography.¹ Accordingly, the speech of four children
of diverse ethnic backgrounds was recorded, transcribed, and digitally linked to video. It is currently being
coded for evidence of the enactment of the Project 2061 criteria as students work through the curriculum.
We explore the ways in which the children interpret this curriculum unit and respond to its requirements
through their questions, their actions, and their orientations toward one another, the phenomena they
encounter in the labs, and the text.
Sample Data & Analysis
The following vignette and discussion illustrate our proposed methodology for analyzing the functioning of
the CTA curriculum vis-à-vis the Project 2061 criteria. In this scene, Kim, Mike, Rafael, and Angelique²
are preparing to write up the results of an experiment in which Alka-Seltzer and water were weighed before
and after they were mixed in a closed environment. Suddenly, Angelique asks whether she can weigh the
bottle again even though this experiment was not part of the lab instructions.
Mike: It didn’t lose or gain
Kim: It didn’t gain or it lose…
Rafael: So what do we do with this? [gestures towards a bottle]
Kim: Not when it stayed the same. The weight stayed the same…because…
Angelique: Can I weigh the bottle again?
Kim: Yeah but now the gas ran out.
Angelique: I know. I want to weigh it now.
[For the next 45 seconds Angelique weighs the bottle and records the results]
Angelique: It lost weight.
Mike: No it didn’t. It stayed the same.
Kim: It didn’t lose or gain.
Angelique: No, right now.
Mike: No. It’s the same.
Angelique: No it’s not.
Mike: I mean the end result’s the same. Gas doesn’t have weight.
Angelique: I’m talking about this.
¹ This project is part of an ongoing NSF/IERI grant entitled “Scaling up Curriculum for Achievement,
Learning, and Equity Project” (SCALE-uP) (Lynch, Kuipers, Pyke, & Szesze, 2003).
² All names are pseudonyms.
In this complex interaction, we see several diverse strategies for interpreting the curriculum and building
conceptual understandings. In the case of Mike, a white child, an important focus of his interpretive energy
is determining and responding to a lesson’s purpose, the first of the Project 2061 criteria – in this
instance, the conservation of matter. In the face of contradictory input, Mike adopts the strategy of
producing answers that take the rhetorical form, “it stays the same,” and he is frequently correct. Mike
frames his (false) conclusion using abstract, stylistically “marked” scientific terms (cf. 2061, criterion #4)
stating that “gas doesn’t have weight.” For Angelique, an African-American girl, the table presents many
opportunities for verbal and manual engagement with relevant phenomena (cf. 2061, #3). Though she
seldom uses scientific terms in her observations (cf. 2061, #4), Angelique initiates her own experiment and
apparently engages in scientific thinking (cf. 2061, #5). Rafael is a former ESL student who accesses
lesson purpose not by internalizing the rhetoric of the curriculum or teacher as Mike does here, nor by
engaging in independent experimentation like Angelique, but by concentrating on procedures and
sequences (cf. 2061, #1). Rafael’s orientation to manual participation in the experiment (cf. 2061, #3) may
be a way to minimize potential loss of face should he misunderstand the procedures due to a lack of facility
with English. Kim, an Asian-American student, is oriented flexibly to the table practices, the unit purpose,
the textbook and the teacher, consulting all four for evidence. She has internalized the logic of the lesson,
applies it to the new situation proposed by Angelique’s second experiment, and engages in scientific
thinking by making a correct prediction (cf. 2061, #5).
Preliminary ethnographic analysis of such vignettes suggests significant differences among the four
students at the lab table. We emphasize that no one student is intended to represent a demographic group.
However, each child does show a distinctive pattern of interaction with his/her peers, the curriculum, and
the teacher. Qualitative observations have been extended through quantitative analyses of the entire video
corpus of 18,771 utterances. Our initial results reveal that these distinct interpretive strategies are
systematic across the data. Current research efforts are focused on three areas which afford us an
ethnographic window into how this diverse group of students is responding to the curriculum. Qualitative
and quantitative analyses are performed on (1) students’ use of scientific terminology; (2) clarification
episodes, i.e., conceptual, procedural, or linguistic “trouble” (Jefferson, 1980) coupled with peer, self,
teacher, or text remedies; and (3) instances of object manipulation.
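As a minimal sketch of analysis (1), per-speaker tallies of scientific-term usage can be computed over a transcribed corpus. The term lexicon, transcript format, and tokenization below are illustrative assumptions for exposition; they are not the project’s actual coding scheme, which is considerably more elaborate.

```python
from collections import Counter

# Hypothetical lexicon of "scientific" terms for this unit (illustrative only).
SCIENTIFIC_TERMS = {"gas", "weight", "matter", "dissolve"}

# Transcript as (speaker, utterance) pairs, echoing the vignette above.
transcript = [
    ("Mike", "I mean the end result's the same. Gas doesn't have weight."),
    ("Angelique", "I'm talking about this."),
]

def term_counts(utterances):
    """Count scientific-term tokens per speaker across a list of utterances."""
    counts = Counter()
    for speaker, text in utterances:
        # Crude normalization: lowercase, strip periods and apostrophes.
        tokens = text.lower().replace(".", " ").replace("'", " ").split()
        counts[speaker] += sum(t in SCIENTIFIC_TERMS for t in tokens)
    return counts

print(term_counts(transcript))  # Mike uses two lexicon terms here; Angelique none
```

Such counts are only the starting point; as discussed below, the analytic interest lies in interpreting the variation, not the raw frequencies.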
Patterns of scientific term usage reveal that Mike, for example, uses scientific terms most often, and
Rafael least. Our interest, however, lies in relating this variation to students’ interpretive
orientations toward the curriculum and the 2061 criteria. That is, Mike’s frequent use of scientific
terminology may signal his self-identification with the goals and lesson purpose, and a corresponding
diminution of his identification with the colloquial orientation of the table activity system. Clarification
episodes likewise provide an ethnographic window into the communicative processes by which the students
are interpreting a lesson’s purpose (cf. 2061 #1), and resolving misunderstandings about it. Acts of object
manipulation can be viewed as ethnographic indicators of 2061 criterion #3: engaging students with
relevant phenomena.
One of the criticisms often leveled at interactional research in education is that it is “micro” ethnography
and therefore not generalizable (McDermott & Roth, 1978). The scope of our video corpus, however,
demonstrates the robustness of our preliminary qualitative hypotheses. The hands-on learning strategies of
Angelique and Rafael, for example, can be documented across the data. Our shareable, permanent
ethnographic record is a database that permits large-scale searches, specific retrieval, and analytically
driven coding and comparison (Stigler, Gallimore, & Hiebert, 2000). With improved and culturally informed
sampling design, the analysis may become efficient enough that it no longer merely illustrates statistical
realities but actually helps teachers understand the cultural context of science education for diverse
learners. Thus, by marrying the insights of qualitative and quantitative research, this project aims to
offer a clearer understanding of how the 2061 criteria are being
enacted in the classroom. This will clarify which parts of this highly rated curriculum are working (and
which are not) for which subgroups of students, and, by extension, how we can improve the curriculum to
best teach all students.
References
Jefferson, G. 1980. On “trouble-premonitory” response to inquiry. Sociological Inquiry 50: 153-85.
Lynch, S., Kuipers, J., Pyke, C., & M. Szesze. 2003. Examining the effects of a highly rated science
curriculum unit on diverse students: Results from a planning grant. Paper Presented at the Annual
Meeting of the American Educational Research Association, Chicago, IL.
McDermott, R. & D. Roth. 1978. The social organization of behavior: Interactional approaches. Annual
Review of Anthropology 7: 321-45.
Roseman, J.E., Kesidou, S., & L. Stern. 1996. Identifying curriculum materials for science literacy: A
Project 2061 evaluation tool. Paper presented for the National Research Council Colloquium,
“Using the National Science Education Standards to Guide the Evaluation, Selection, and
Adaptation of Instructional Materials,” Washington, DC.
Roseman, J., Kesidou, S., & L. Stern. 2001. Identifying Curriculum Materials for Science Literacy: A
Project 2061 Evaluation Tool.
State of Michigan, Michigan Department of Education. 1993. Chemistry That Applies. Lansing, MI.
Stigler, J., Gonzales, P., Kawanaka, T., Knoll, S., & A. Serrano. 1999. The TIMSS videotape classroom
study: Methods and findings from an exploratory research project on eighth-grade mathematics
instruction in Germany, Japan, and the United States. Education Statistics Quarterly 1(2): 109-12.