Inquiry Stages

Gathering appropriate data

Each phase of inquiry begins with the collection of data that can be used to construct rich descriptions of what students/teachers/school leaders/ISTEs currently know, believe, and can do. Care should be taken with ethical considerations when gathering data.

Three steps are embedded within the data collection process:

1. Deciding on the information that is needed
What aspects of the students'/teachers'/school leaders'/ISTEs' current knowledge, beliefs, and skills do we want to understand? Do we want to make comparisons to the national or international picture? Will this data enable us to find out what we need to focus on in our practice and our professional learning?

2. Choosing appropriate tools
What tools will give us the information that we require? Do we have the knowledge to use these tools wisely? If not, is there enough supporting information with the tools?

3. Using the tools
When do we use the tools? Who should administer them? Why? Who should record the data? Why?

The text below describes the following sources of evidence and some potential approaches to collecting them: using observations and audio/video recordings; drawing on related research and literature; using data on student outcomes; responding to student voice; creating written reflections.

Using inquiry approaches to improve practice

If you find the tables in this section helpful, you may like to build up a personal or shared resource by drawing up similar tables for some inquiry approaches that you have used, illustrating each approach with examples from your practice.

The Inquiry and Evidence-based Practice chapter adopts Reid's (2004) metaphor of an "inquiry tool box" to describe the development of a suite of inquiry approaches, techniques, and skills. By growing their "tool boxes" thoughtfully and systematically over time, educators can draw from a range of approaches to inquire into the specific learning needs of people within a particular context.
The fundamental purpose of all these inquiry approaches is to enable educators to evaluate the adequacy of their theories (what they know and believe) and their practice (what they do) in terms of the outcomes they want to achieve. This section introduces some of the inquiry approaches that an ISTE may take.[1] The approaches are organised under three headings that indicate key considerations for an inquiry: gathering appropriate data; critically analysing data; selecting a collaborative process and activities to scaffold learning. After a brief discussion of each of these considerations, the text uses tables to present examples of some useful inquiry approaches. The tables also provide examples of how some of these approaches were used in the learning cases and suggest sources for more information.[2]

[1] As with other chapters in these learning materials, users are not expected to read this section in one sitting; rather, it is envisaged that they will refer to relevant approaches while exploring specific questions within their practice.
[2] Note that the text does not attempt a comprehensive introduction to all inquiry approaches or to all the approaches used within the learning cases.

Critically analysing data

The next step within each phase of the inquiry and knowledge-building cycle is to critically analyse the data that has been gathered, making inferences about what it reveals about the strengths and needs of students and educators and about the adequacy of educators' current theories of practice. This sense-making process is vital in the transformation of data into evidence. Educators need some sort of analytical framework to guide them through their analysis.
The framework should enable them to focus on the purpose of their interactions, to investigate how well their practice matches the values and beliefs they espouse, and to evaluate the degree to which they are achieving their intended outcomes. It should also allow educators to explore the coherence and connections between data that relates to students, to teachers, and to ISTEs.

Three steps are embedded within critical analysis of data:

1. Drawing initial inferences on the basis of our expectations
What are our expectations for all of these students? Are our expectations being met by the whole group and by particular subgroups? (Subgroups could include students who belong to particular ethnic groups or who are having difficulties in a specific aspect of their learning.) How do our results compare to the national picture? Are we satisfied with these results? What does the data tell us about the students' strengths and needs? What are our expectations for teachers/school leaders? What are the strengths and needs of the teachers/school leaders? What are our expectations of ourselves as ISTEs? What are our strengths and needs? What have we contributed to the school outcomes?

2. Asking deeper, more complex questions in order to make sense of the data
How can we invite others into the analysis in order to include a variety of perspectives? What do we need to know about the educators' content knowledge, pedagogy, and theories of practice if we are to explain the results we have? What tools will give us this information? What are the patterns and links we can see when we look at all of our data analyses? Do we need to disaggregate some of the data?

3. Making decisions about where to go next
What does this information tell us about the priorities we should be setting? Based on these priorities, what are our targets for ourselves and the students, teachers, and school leaders? What do we need to learn to do to promote these targets?
How can we align the needs of the students, teachers, and school leaders with our needs as ISTEs? What should we expect to notice if our changed practice is having an impact?

The text below describes two approaches that educators find useful when critically analysing and learning from data: using a framework to analyse practice; adopting the Model I – Model II framework.

Using a framework to analyse practice

When inquiring into their ways of working, educators generally need to select or create a framework for analysing evidence (such as transcripts or observation records) of their practice. When working collaboratively, an agreed framework is an especially valuable tool for establishing common expectations and understandings. The development of a framework ensures that the principles and theories underpinning practice are articulated and agreed. The framework also often determines the data and evidence of practice that are collected. Using the framework to analyse that evidence provides a common frame of reference with which to evaluate the adequacy and impact of practice. The analysis may demonstrate coherence between theory and practice, or it may reveal dissonance between them.

Adopting the Model I – Model II framework

Argyris and Schön (1974) explain that the concepts of "Model I" and "Model II" represent the behaviour of people with contrasting theories-in-use. People who operate according to a Model I theory-in-use tend to take a competitive and defensive stance to the world. People who operate according to a Model II theory-in-use tend to take a more collaborative and less defensive stance. The shift from Model I to Model II behaviour requires "double-loop learning" – learning that involves the questioning of basic assumptions and values. This enables people to shift from reasoning that is characterised by defensiveness to what Argyris (1990) calls "productive reasoning". (See pages 133–137.)
New Zealand ISTE Eileen Piggott-Irvine (2003) suggests four stages for helping participants in action research make the shift from Model I to Model II behaviour:

1. Map the problem and how it is dealt with, by examining and exposing participants' espoused theories and their theories-in-use.
2. Diagnose the extent to which participants themselves create and maintain problems. (The dissonance this creates can be a catalyst for change.)
3. Take productive reasoning from an espoused theory to a theory-in-use so that participants learn to conduct conversations that are simultaneously critical and collaborative.
4. Reinforce the practice in new learning situations with support from peer coaching.

Argyris (2000) suggests a number of methods for helping people to understand how they might espouse Model II behaviour but practise Model I behaviour. These require people to review their interactions to discover areas in which Model II behaviour would have produced better outcomes.

In the "left-hand and right-hand case exercise", the method is to divide a page into two columns. In one column, the inquirer recounts a frustrating conversation as he or she remembers it. In the other column, the inquirer writes the corresponding thoughts and feelings that he or she didn't express at the time.

Argyris (2000) also advocates recording interactions. A recording provides an unbiased replay of a meeting or other interaction. Members of a learning community can return to it on repeated occasions to note places in which Model I behaviour was obvious and produced negative effects. ISTEs who use electronic recording tools find that as they become more practised in their use, they move from observing and commenting on the superficial aspects of interactions to digging deeper into the theories-in-use that their recordings reveal. This can be an uncomfortable process, even when the ISTE has support from a colleague or critical friend.
However, this process can enable the double-loop learning associated with substantive change.

Case 3: Effective Communication within Learning Interactions

Catherine's inquiry focused on how she could conduct conversations with teachers so that each participant felt able to openly and respectfully discuss challenging issues. Her thinking about communication was underpinned by the Model I – Model II framework.

Catherine felt that Jack was not engaging in the learning she was facilitating and was really just paying lip service to the concepts they'd been discussing. She shared her problem with her colleagues, Michael and Allan, and enlisted their help in devising a strategy for analysing her interaction with Jack. She began by reconstructing the dialogue from memory and annotating it, in line with Argyris's left-hand and right-hand case exercise.

See: Screen 4 transcript of conversations with Jack.doc

Then, through a series of role plays and practice conversations, Michael and Allan helped Catherine to deconstruct her original conversation, surface the theories-in-use that had led to its ineffectiveness, and both shape and practise a more effective model of communication. For example, in the analysis of one moment from the conversation, they explore Catherine's reluctance to "check in" with Jack as the conversation proceeds (see video Clip 10). They then role-play an alternative dialogue that is more in keeping with Catherine's belief in the value of Model II behaviour (see video Clip 11).

See also the learning stories: "Building a culture of inquiry", page 90; "Being a critical friend", pages 128–129; "More haste, less speed", page 138.

What forms does inquiry take?

Over the course of their professional careers, educators are likely to develop a range of approaches to inquiry, each of which is designed to facilitate critical reflection.
In Case 3, an ISTE uses role play with two colleagues to critically interrogate the assumptions and beliefs underpinning her practice and to develop improved ways of communicating in challenging situations. See video Clip 10.

Reid (2004) suggests that the approaches might include:

action research, where the educator identifies an issue/puzzle/contradiction, gathers data in relation to the issue, draws on research, analyses the data, theorises a strategy, acts, and reviews;

critical dialogue, where a group of educators meets regularly and engages in a form of critical discussion, typically involving one member describing a practice or a dilemma in his/her teaching and the group interrogating the assumptions and beliefs about learning upon which that practice is based. This often leads to new strategies or approaches (e.g., Smith-Maddox, 1999);

classroom/workplace observations, where individuals, pairs, or groups observe each other teaching as part of the process of collaboratively exploring an issue. They might describe what they see (in written form or orally) and then analyse and interpret these observations through reflection and critical discussion, in order to develop new strategies in relation to the issues/problems identified;

journals, where educators write regularly in journals about their work, recording their criticisms, doubts, questions, successes, and joys. Looking over these at intervals can often reveal rhythms or irregularities that are not picked up when the focus is on individual events or practices;

critical data analysis, where educators interrogate data (gathered by them or by the system), seeking to reveal issues or interesting observations that might form the focus of further inquiry;

appreciative inquiry, where educators gather data about successes and try to understand the factors that promote these, rather than focusing on problems.
This form of inquiry starts with the assumption that whatever you want more of already exists in an organisation. It is a matter of examining the whole, not looking at the separate parts of a system that are not working;

portfolios, where an educator compiles evidence of successful development in his/her work. Portfolios foster reflection because they require the educator to identify professional strengths and weaknesses;

writing, where educators use various approaches to reflect on their work, including narrative inquiry (involving story-telling) and proposal writing (involving research and the development of a reasoned argument, as these materials do);

text analysis, where educators analyse policy and other texts in order to unearth assumptions and theories and to subject these to critical analysis;

program evaluation, where educators seek to assess the outcomes of particular activities, using approaches that range from goal-based evaluations to those that are open-ended and responsive.

Reid emphasises that each approach: can be used in different ways, for different purposes, and with different starting points; is supported by a body of research literature; and requires a number of action-oriented skills and techniques.

He suggests that educators need to develop a suite of inquiry techniques and skills – a kind of "inquiry tool box" – and that these include the ability to: clarify meaning; identify issues/problems/dilemmas/puzzles/successes; develop inquiry questions; collect data (e.g., through observation, documentation analysis, photographs, audio or video recording, quantitative data, interviews, questionnaires); locate and draw on research; critically interrogate practice and data; analyse, interpret, and theorise about quantitative and qualitative data; develop and implement strategies to enhance student learning outcomes; and assess the extent to which strategies or actions have improved learning or the learning environment.
Because these skills are so complex, it is important that people be allowed time to develop, monitor, and improve their "tool boxes" gradually. The development of an inquiry tool box might involve selecting an approach, reading about it, talking with people who have used it, experimenting with it and documenting experiences, and reflecting on the approach itself as well as on the focus of the inquiry in which it has been used. No education system or single institution should simply exhort people to engage in inquiry without acknowledging that inquiry skills need to be built thoughtfully and systematically.

Recommended reading

Argyris, C., Putnam, R., & McLain Smith, D. (1985). Action Science: Concepts, Methods, and Skills for Research and Intervention. San Francisco: Jossey-Bass.
You can also link from the website to an Action Science forum, where you can participate in discussion on the book and on action science more generally.

Piggott-Irvine, E. (2003). "Facilitating Openness and Learning Partnerships in Action Research". Paper presented at the Action Learning, Action Research, and Process Management (ALARPM) 6th World Congress, University of Pretoria, South Africa, 21–24 September.
New Zealand ISTE Eileen Piggott-Irvine describes a process based on the Model I – Model II framework that can be used to help educators develop the high-trust, open relationships that enable problems to be discussed and resolved.