Things to Consider When Undertaking a Program Evaluation
I. Key Questions to Ask Before Undertaking an Evaluation

Key questions to ask yourself or others when considering an evaluation:

(1) Who needs what information, and why? In other words, who wants the information? What do they want to know? Why do they want it? What questions do people want answered about the program (e.g., for an outreach program: are customers aware of the program, and are they getting the information; for other programs: what are the barriers, what works and what doesn't, what is going on out there)? Develop a list of potential stakeholders with their questions, information needs, and interests in the evaluation. This will help determine the purpose of the evaluation and build buy-in.

(2) What is the "program" that you have been asked to evaluate? Define or describe it using methods like logic modeling to determine inputs, activities, outputs, and outcomes. This will help organize your stakeholder questions and information sources around key program components.

(3) What issues or problem areas have already been identified? Is there already sufficient information available to answer many of the questions? Do a quick literature search or interview program stakeholders. You may decide that many questions have already been answered and the evaluation is not needed.

(4) What information sources exist? Can they be used to answer client and stakeholder questions? If readily available sources do not exist (as they often do not for behavioral or environmental outcomes), how feasible is it to obtain them? Estimate the time and costs. This will help you design the evaluation methodology.

(5) What will the program do with the information or results once it receives them? What decisions need to be made, or what decision-making process will the results feed into (developing a strategy or plan, beginning rulemaking, shifting a policy or management approach, developing performance measures)?
In other words, is the time right for the project?

(6) Who is the principal owner of the evaluation? Does that person have the capability or authority to make effective use of the evaluation information? Someone will need to adopt the results and make sure they are integrated into the program and followed up on.

II. Evaluation Projects to Avoid

These are some of my thoughts on evaluation projects to avoid, based on bitter experience (I apologize if these sound obvious, but they weren't always obvious to me):

- Avoid efforts that are not evaluation projects. Examples include state oversight (e.g., permit quality reviews) and regional "data pulls" (e.g., all uncompleted hazardous waste clean-ups). (See the definition of program evaluation attached to the OPEI/OPAA program evaluation competition memo.)

- Avoid efforts to characterize and promote "success stories." Examples of successful practices or processes may be an important outcome of the evaluation, but if you start with that mindset, it will bias the results.

- If results are needed within a six-month time frame, avoid evaluation designs that require an Information Collection Request and OMB approval; it takes too long. If you have a year or more, go ahead, but build the approval time into the project schedule up front.

- Avoid program areas that have been evaluated several times before ("burned over"). If GAO, the IG, and others have already been there, you are unlikely to find anything new. If you must, ask new questions or use different information sources. If evaluations are going on concurrently, coordinate as best you can to avoid duplication of effort. (Example: When I recently conducted an evaluation of the water quality standards process, I discovered that the IG was auditing the same program at the same time. I coordinated with them, avoided asking similar questions or went deeper on some topics, and avoided their information sources.)
- Avoid evaluations of new initiatives or "buzz words." Some "programs" or initiatives are too new to have developed a track record or sufficient data to draw on. The problem with buzz words or a popular new Administrator's initiative is that everybody will claim they are doing it, and the outline of the "program" will be vague and hard to define (example: technology transfer). My advice: don't go there. But if you have to, develop strict operating definitions of the program for the evaluation and stick to them.

- Avoid evaluation projects that no one seems really interested in but YOU. No matter how interesting or intellectually stimulating you find it, it will generally be a waste of time; unless, of course, you have brilliant marketing skills and are confident you can sell the results to anyone.