Assessment Summary Example
Shared by: TGuiliani
Assessment Summary: Graphic Design & Illustration, 2002–03

GDI Mission Statement:
The mission of the Graphic Design and Illustration Department is to facilitate student learning and meet the needs of the business community by providing a relevant, current curriculum based on sound educational principles. The department is committed to using learning-centered strategies, making effective use of instructional resources, and continuously assessing student academic achievement for the purpose of ongoing improvement.

Intended Learning Outcomes:
1. Demonstrate knowledge of software programs and traditional production tools related to the design industry
2. Prepare professional presentations of artwork to effectively market oneself
3. Apply the principles and elements of design
4. Apply time-management and multi-tasking skills

Outcomes Assessed During 2002–03:
• Demonstrate knowledge of software programs and traditional production tools related to the design industry
• Apply the principles and elements of design

Benchmarks:
Assessment during the 2002–03 academic year was limited to GDI 101, Introduction to Visual Communications. Because it is recommended as a first-semester course in the curriculum, GDI 101 is probably not the best source for checking the department's final outcomes. It does, however, serve an important role in this process: it is an excellent benchmark when assessing similar outcomes in more advanced courses. See the "Suggested Changes" section at the end of this summary for more on linking this assessment to more advanced courses.

Assessment Methods Used:

Student population assessed
As noted above, the students assessed during the 2002–03 academic year were limited to both sections of GDI 101, Introduction to Visual Communications, during both terms of the year. Unfortunately, at the time of this writing, statistics were unavailable for the Fall 2002 term.
Eighteen students in total were assessed during the Spring 2003 term.

Assessing body
Jeffrey C. Derksen, a professional designer and adjunct instructor at the University of Denver, was contracted to assess a project students routinely complete in GDI 101: a combination mark design.

Description of the process
As one of the course projects, students are asked to construct a combination mark (informally known as a logo) for one of four companies. Below is the project description students are given at the time the project is assigned:

Objective: To communicate the nature of a company through a design that appeals to a particular audience, and to apply Gestalt principles to the development of a mark that is more than the sum of its parts.

Procedure: Design a combination mark for one of the companies listed below.
• Manard – a national heavy-equipment manufacturer that specializes in tractors, end loaders, and so on.
• Antique Oak – a trendy eatery on Chicago's North Side that caters to young professionals.
• Aurora – a manufacturer of hiking clothing, tents, and camping equipment for retail sale.
• Quadrata – an interior design firm that specializes in corporate accounts.

Using each of the Gestalt principles discussed in class and in your text, create a number of thumbnails. Make full-sized roughs of the best two, then create a finished design of the strongest one. You may use paint, ink, cut paper, etc., but limit your design to one color (tints and shades are acceptable). The minimum size for the design is 8 x 10 inches. Use at least two Gestalt principles in your final design and be prepared to discuss them in class.

Consider the company and how it can be represented with an accurate and positive image. Be prepared to discuss why your design suits the company. Keep the audience in mind: what will appeal to them?

Mount your design on black presentation board. Erase any pencil lines or guides when you have finished.
To protect your artwork, attach a cover sheet of tracing paper or layout paper. It is also helpful to place your projects in a large envelope prior to turning them in. Remember to include your name (in pencil) on the back of your work.

Mr. Derksen was provided with the same rubric the instructor used to grade the project; theoretically, additional evaluators could be brought on to assess the same project using this instrument. The rubric scores each category at one of four levels: Beginning (1), Developing (2), Accomplished (3), or Exemplary (4). The rubric follows:

Technical Ability
• Beginning (1): Demonstrates little or no knowledge of the techniques, materials, and tools needed to complete the project. The project lacks neatness and good craftsmanship.
• Developing (2): Demonstrates some knowledge of the techniques, materials, and tools needed to complete the project. The project has moderate problems with neatness and craftsmanship.
• Accomplished (3): Demonstrates adequate knowledge of the techniques, materials, and tools needed to complete the project. The project has only minor problems with neatness and craftsmanship.
• Exemplary (4): Demonstrates extensive knowledge of the techniques, materials, and tools needed to complete the project. The project is executed with professional-quality craftsmanship.

Design/Composition
• Beginning (1): Lacks planning. Makes little or no use of design elements or principles. Includes few or none of the characteristics of the Gestalt principles: figure/ground, continuation, similarity, proximity, closure.
• Developing (2): Some evidence of planning. Moderate use of design elements or principles. Includes some characteristics of the Gestalt principles.
• Accomplished (3): Evidence of good planning. Good use of design elements or principles. Makes use of a number of the characteristics of the Gestalt principles.
• Exemplary (4): Effective use of design elements and principles. Shows careful planning and incorporation of the characteristics of the Gestalt principles.

Presentation
• Beginning (1): Visual appearance of the project and/or the student's presentation doesn't support the solution to the design problem. The student cannot support his/her reasoning.
• Developing (2): Visual appearance of the project and the student's presentation are barely adequate to support the design solution. The student can raise some points that support his/her reasoning.
• Accomplished (3): Visual appearance of the project and the student's presentation are consistent with the solution to the design problem. The student can support his/her reasoning.
• Exemplary (4): Visual appearance of the project and the student's presentation complement the solution to the design problem. The student can effectively support his/her reasoning and educate the client.

Concept/Effectiveness
• Beginning (1): Project fails to solve the design problem. The student uses obvious or unoriginal devices in the design. Few or none of the specifications and project parameters are followed.
• Developing (2): Project solves some aspects of the design problem. The student uses some creative, unique, and/or individual devices in the design. Some of the specifications and project parameters are followed.
• Accomplished (3): Project solves some aspects of the design problem. The student uses some creative, unique, and/or individual devices in the design. Most of the specifications and project parameters are followed.
• Exemplary (4): Project effectively solves the design problem. The student integrates creative, unique, and/or individual devices to ensure the design's effectiveness. The specifications and project parameters are followed.
Using a separate scorecard for each course section, Mr. Derksen marked the level at which he judged each student in the Technical Ability, Design/Composition, and Concept/Effectiveness categories. Two of the rubric categories evaluated correlate directly with the outcomes measured during 2002–03: Technical Ability (demonstrate knowledge of industry tools) and Design/Composition (apply the principles and elements of design). Note that Mr. Derksen did not assess the Presentation category, since he judged the student projects after they had been presented. After scoring the projects, he returned them with the scorecards to the program chair, who tabulated the scores for comparison.

The results
The results are displayed first for each section of GDI 101 individually, then as the average of both sections; a summary then examines any trends.

Section L01 results are as follows:

Technical Ability
• Beginning level – 8.3%
• Developing level – 33.3%
• Accomplished level – 33.3%
• Exemplary level – 25%

Design/Composition
• Beginning level – 8.3%
• Developing level – 50%
• Accomplished level – 33.3%
• Exemplary level – 8.3%

Presentation
Not evaluated

Concept/Effectiveness
• Beginning level – 8.3%
• Developing level – 41.7%
• Accomplished level – 41.7%
• Exemplary level – 8.3%

Section L02 results are as follows:

Technical Ability
• Beginning level – 8.3%
• Developing level – 41.7%
• Accomplished level – 41.7%
• Exemplary level – 8.3%

Design/Composition
• Beginning level – 8.3%
• Developing level – 50%
• Accomplished level – 41.7%
• Exemplary level – 0%

Presentation
Not evaluated

Concept/Effectiveness
• Beginning level – 8.3%
• Developing level – 33.3%
• Accomplished level – 41.7%
• Exemplary level – 16.7%

Average of both sections:

Technical Ability
• Beginning level – 8.3%
• Developing level – 37.5%
• Accomplished level – 37.5%
• Exemplary level – 16.7%

Design/Composition
• Beginning level – 8.3%
• Developing level – 50%
• Accomplished level – 37.5%
• Exemplary level – 4.2%

Presentation
Not evaluated
Concept/Effectiveness
• Beginning level – 8.3%
• Developing level – 37.5%
• Accomplished level – 41.7%
• Exemplary level – 12.5%

Observations about the scores:

Technical Ability: The distribution of scores indicates that the majority of students fall in the mid-range, with equal numbers (37.5% each) displaying developing and accomplished skills. Students displaying exemplary skills followed at 16.7%. This is probably in line with expectations for a beginning freshman class, but increased emphasis on technical ability in class instruction would likely move some students from the developing category to accomplished, and possibly increase the presence of students in the exemplary category as well.

Design/Composition: The distribution of scores indicates that the majority of students fall in the mid-range, with a full half displaying developing skills. Students displaying accomplished skills followed at 37.5%. The share displaying exemplary skills (4.2%) was about half that of those displaying beginning skills (8.3%). With half the students exhibiting developing skills, there is a clear need for increased instructional emphasis on design and compositional skills.

Presentation: The evaluator did not feel comfortable judging these skills based solely on the physical properties of the projects, so some method must be found for providing access to the verbal presentations if a meaningful assessment in this category is to be obtained. Options include inviting the evaluator to attend classroom presentations, or videotaping class presentations for the evaluator to review at a later date.

Concept/Effectiveness: The distribution of scores indicates that the majority of students once again fall in the mid-range, with slightly more at the accomplished level (41.7%) than at the developing level (37.5%).
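The tabulation step behind the distributions reported above can be sketched in a few lines of Python. The raw tallies below are illustrative assumptions only, not the actual 2002–03 scorecards; note that when both sections score the same number of projects, pooling the tallies is equivalent to averaging the two sections' percentage distributions.

```python
# Illustrative sketch of tabulating rubric scorecards into percentage
# distributions. Tallies are ordered [Beginning, Developing, Accomplished,
# Exemplary] and are hypothetical examples, not the real 2002-03 data.

def to_percentages(counts):
    """Convert per-level tallies to percentages of the section total."""
    total = sum(counts)
    return [round(100 * c / total, 1) for c in counts]

def pooled(section_a, section_b):
    """Combine two equal-sized sections level by level, then convert.

    With equal section sizes this equals averaging the two percentage lists.
    """
    return to_percentages([a + b for a, b in zip(section_a, section_b)])

# Hypothetical tallies for one rubric category in each course section
l01 = [1, 4, 4, 3]
l02 = [1, 5, 5, 1]

print(to_percentages(l01))  # per-section distribution
print(pooled(l01, l02))     # combined distribution for both sections
```

A spreadsheet would serve equally well; the point is only that each reported figure is a per-level count divided by the section total, rounded to one decimal place.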
The shares of students displaying beginning and exemplary skills were very close, at 8.3% and 12.5% respectively. Increased instructional emphasis on conceptualizing and communicating effectively would likely move students from the developing category to the accomplished category and, it is hoped, produce more students with exemplary skills.

Other Assessment Methods Used Within the Department:

Portfolio Show Capstone
Students completing an Associate of Applied Science in Graphic Design and Illustration are required to complete and publicly show their student portfolio. Prior to the show, a panel of two or three professional designers meets with each student to discuss the portfolio's strengths and weaknesses. Since the field of graphic design and illustration has no formal licensure process, a designer's career rests on his or her portfolio.

Completing the Loop: Use of Results

Sharing the results
The results will be shared with the instructors who teach GDI 101, as well as those who teach the advanced courses that will be compared with GDI 101 and with each other. By making these instructors aware of the initial measurement tool and its results, it is hoped that the 2003–04 assessment will measure a broader student population and use a more sharply focused rubric.

Suggested changes
The 2002–03 assessment was fairly one-dimensional: it measured one project from two sections of a beginning course. It did, however, partially measure two of the four stated outcomes. (I say partial measurement because only traditional tool use was measured, not software skill.) For the 2003–04 academic year, the project could be used to measure another outcome as well, professional presentation, by adding videotaping to the process.
Additionally, since the project is a standard within the department, more advanced classes could complete it without major changes to the curriculum, and a comparison could be drawn between classes to see whether students gradually improve on the stated goals. As a group, the instructors for these courses would assess all the projects across courses. The multi-instructor strategy would benefit the assessment process in two ways:
• With a number of evaluators involved, a flaw in the 2002–03 assessment would be overcome: a single, though experienced, evaluator was measuring the subjective field of design. A larger group of evaluators would iron out any individual bias within the group, producing a more even-handed evaluation.
• A possible correlation between students' progression through the program and their skill level could be examined. It is hoped that a strong correlation would appear: more advanced students should do better on the same project than those new to design. This approach is not bulletproof, though, since different students are measured at different stages rather than the same student being measured multiple times during his or her progression through the program. A future assessment should have the same individuals complete the same project at different points in the program to measure individual growth.
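The proposed cross-course comparison could be tabulated along these lines. This is a minimal sketch under assumed data: "GDI 201" is a hypothetical later course used purely for illustration, and all tallies are invented; the idea is simply to collapse each course's per-level tallies into a mean rubric score and check that it rises with course level.

```python
# Hypothetical sketch of the proposed cross-course comparison. Each course's
# rubric tallies ([Beginning, Developing, Accomplished, Exemplary]) are
# collapsed into a weighted mean score on the rubric's 1-4 scale.

def mean_score(counts):
    """Weighted mean rubric score (1-4) from per-level tallies."""
    return sum(level * n for level, n in zip((1, 2, 3, 4), counts)) / sum(counts)

# Course names and tallies are illustrative assumptions, not actual data;
# "GDI 201" stands in for whatever more advanced course repeats the project.
courses = [
    ("GDI 101", [2, 9, 9, 4]),
    ("GDI 201", [1, 6, 10, 7]),
]

scores = [(name, mean_score(tallies)) for name, tallies in courses]
improving = all(a[1] <= b[1] for a, b in zip(scores, scores[1:]))
```

As the summary notes, a rising mean across course levels would only suggest, not prove, individual growth, since different students are measured at each stage; tracking the same students over time would require keying scores to student identity rather than to course.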