
An Instructional Innovation Proposal For Summer 2005

Mark Keil Professor Department of Computer Information Systems Phone: (404) 651-3830 e-mail:

I have reviewed the proposal and I support the request for a course release/cost reimbursement for the project. _________________ Department Head __________ Date

LONGITUDINAL SELF-ASSESSMENT OF COURSE CONTENT PROFICIENCY: A NEW APPROACH FOR STUDENT EVALUATION OF COURSES

PROBLEM STATEMENT

For as long as most of us can remember, the Robinson College of Business has relied on the Student Evaluation of Instructor Profile (SEIP) as the primary (some would argue sole) source of feedback to faculty on their teaching performance. While SEIPs can be immensely helpful in assessing dimensions of teaching performance such as "presentation/clarity," they are not designed to gauge actual learning or to capture the extent to which students' understanding of key concepts changed over the course of the semester. Some of the limitations of the existing SEIP instrument are more subtle. For example, numerous cases have been reported in which the same faculty member teaching two sections of the same course on the same campus has obtained SEIP scores that are radically different. There is no doubt that we are measuring something with the current SEIP, but knowing what we are measuring is a different matter entirely. Based on 14 years of teaching at GSU and countless SEIP administrations, I am convinced that SEIP scores can be highly variable, and that what they truly measure, or reflect, is the "chemistry" that develops between an instructor and the students in the class. Another key limitation stems from the fact that SEIPs are administered at the end of the semester; thus there is no baseline for evaluating such factors as "overall value of the course" in terms of the contribution to student learning. For these and other reasons, many faculty have become frustrated with the limitations of the SEIP instrument and long for an alternative means of obtaining student feedback that would be more helpful in assessing student learning. I am not suggesting that SEIPs aren't useful or that we should get rid of them.
I do believe, however, that their value could be increased if we stepped back and considered other measures that could be used to supplement them.

PROPOSAL OBJECTIVES

The proposed innovation focuses on the development and implementation of a new approach for assessing student learning. Innovations in this area are important, as institutions of higher learning are being pressed by outside accreditation bodies to do a better job in the area of assessment (Dary, 1991; Shepard, 2000). The approach taken here involves administering a survey instrument both at the beginning and at the end of the course to assess students' self-reported levels of understanding of the key concepts and techniques that the course is designed to cover. Thus, unlike the current SEIPs, the approach is oriented toward assessing content proficiency and is longitudinal in nature. Consistent with current thinking in the area of assessment, the proposed innovation will not be aimed at providing "critiques of the teacher, the teacher's performance, and of teaching methods that are unrelated to student estimates of what they have gained from them" (Seymour et al., 2000, p. 1). Instead, it will be oriented toward assessing what the student perceives that s/he has actually learned as a result of the course. The longitudinal nature of the proposed assessment innovation is consistent with Gill's (2005) critique of current assessment approaches, which are often single-shot assessments. Such traditional assessments are not constructed with the scientific rigor that would be employed if we treated every course as an experiment designed to test the hypothesis that students actually learned something. While there is limited evidence in the literature in support of a pre- and post-course self-assessment model, such an approach has not, to my knowledge, been applied in a business school context (Dooley and Lindner, 2001). An underlying assumption behind the proposed innovation is that students' evaluation of a course is influenced by the extent to which they feel they have actually learned something of value. Thus, part of the aim here is to investigate the extent to which this new approach influences how students perceive the overall value of a course. If this proposal is funded, students and faculty will be expected to achieve the following objectives:

Student objectives
1. Obtain a baseline measure of course content proficiency at the beginning of the course. This will give the student a pre-course understanding of where s/he stands in terms of having acquired the knowledge that the course is designed to teach.
2. Obtain a post-course measure of content proficiency at the end of the course. This will give the student a post-course understanding of where s/he stands in terms of having acquired that knowledge.
3. Obtain a report that visually indicates the change in content proficiency that has occurred during the semester. This will give the student an indication of how effective the course was in increasing his/her knowledge of specific topics relating to the course content.

Faculty objectives
1. Develop an instrument that can be customized for any particular course and used to assess student learning.
2. Obtain better baseline information at the beginning of a course concerning students' proficiency with key course concepts and techniques. Such information will enable faculty to make adjustments to the course as needed (e.g., spending more or less time on certain concepts) in order to maximize its value for students.
3. Obtain a post-course measure of students' content proficiency.
This will enable the instructor to understand the extent to which s/he has communicated key course concepts and techniques in a manner that "sticks" with the students.
4. Write up the innovative approach as a journal article targeted to the Academy of Management Learning and Education (AMLE). AMLE is considered an excellent vehicle for disseminating the scholarship of teaching within the management/business school community. The innovation should have broad applicability within RCB.
5. Present the innovation to the RCB faculty in the form of an FDC workshop.

METHOD

The basic approach behind this proposal is simple and uncomplicated. A survey instrument would be developed during summer 2005 and administered during fall 2005. Data would be analyzed, and a report would be written up and made available to the Faculty Development Committee. The specific steps needed to accomplish the objectives put forth in this proposal include:
1. Conduct a literature review focusing on course and learning assessments.
2. Develop an instrument that can be customized for any particular course and used to assess students' proficiency with respect to key concepts and techniques.
3. Administer the instrument (both pre- and post-course).

4. Provide feedback to students.
5. Assess the overall worthwhileness of the course.
6. Collect and analyze the data.
7. Write up the results and submit to a journal.
8. Plan and organize an FDC workshop to disseminate the innovation within RCB.
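To make the student-facing deliverable concrete, the per-topic change report described in the objectives and in steps 3-4 above could be computed roughly as sketched below. This is an illustrative sketch only, not part of the proposed instrument: the topic names and the 1-5 self-rating scale are hypothetical assumptions.

```python
# Sketch: compute per-topic change in self-reported proficiency
# from hypothetical pre- and post-course survey responses.

def change_report(pre, post):
    """Return per-topic change in self-rated proficiency (post minus pre)."""
    return {topic: post[topic] - pre[topic] for topic in pre}

def format_report(report):
    """Render a simple text report a student could receive,
    largest gains listed first."""
    lines = []
    for topic, delta in sorted(report.items(), key=lambda kv: -kv[1]):
        sign = "+" if delta >= 0 else ""
        lines.append(f"{topic:<24} {sign}{delta}")
    return "\n".join(lines)

# Hypothetical topics, self-rated on a 1 (novice) to 5 (expert) scale.
pre = {"Risk management": 2, "Project escalation": 1, "IT outsourcing": 3}
post = {"Risk management": 4, "Project escalation": 4, "IT outsourcing": 3}

print(format_report(change_report(pre, post)))
```

In practice the report would presumably be rendered visually (e.g., as a bar chart per topic), but the underlying calculation is simply the post-course rating minus the pre-course baseline for each concept the instrument covers.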

EVALUATION

During Fall Semester 2005, I plan to conduct an experiment in my CIS 8150 class. Students will be randomly assigned to either a treatment or a control group. The control group will fill out a short survey at the end of the course designed to measure only the overall worthwhileness of the course. Subjects in the control group will not be asked to complete pre- or post-course measures of content proficiency. Subjects in the treatment group will be asked to complete both pre- and post-course measures of content proficiency and will be presented with a report summarizing how their content proficiency has changed during the semester. At the end of the post-course measure, they will receive the same short survey that the control group receives, which is designed to measure only the overall worthwhileness of the course. In addition to the quantitative data, interviews will be conducted with a subset of the subjects in the treatment group to obtain qualitative information on their reactions to the new assessment approach. The teaching innovation will be assessed as described and written up for publication in the Academy of Management Learning and Education (AMLE) so that other faculty can learn from the experiment conducted at Georgia State.

BUDGET AND MEDIA/CLASSROOM

One course release is requested for Dr. Keil in order to provide the time needed to experiment with the approach described in this proposal, to develop the described assessment materials, and to test the instructional innovation in a classroom setting. All materials have been acquired, and no special classroom is needed to experiment with the teaching innovation.

REFERENCES
Dary, E.T., Assessing Student Learning and Development: A Guide to the Principles, Goals, and Methods of Determining College Outcomes, San Francisco: Jossey-Bass, 1991.

Dooley, K.E. and Lindner, J.R., "Competencies for the Distance Education Professional: A Self-Assessment Model to Document Learning," Proceedings of the 28th Annual National Agricultural Education Research Conference, December 12, 2001, pp. 171-182.

Gill, G., "The Cruelest Experiment," Working Paper, University of South Florida, 2005.

Seymour, E., Wiese, D.J., Hunter, A., and Daffinrud, S.M., "Creating a Better Mousetrap: On-line Student Assessment of their Learning Gains," Working Paper, University of Colorado, Boulder, 2000.

Shepard, L.A., "The Role of Assessment in a Learning Culture," Educational Researcher, vol. 29, no. 7, 2000, pp. 4-14.

