					Appendixes to the report of AEA’s International Listening Project
  Appendix A: Original AEA Request for Proposals...................................................................................... 2
  Appendix B: Proposal submitted by Jim Rugh ....................................................................................... 10
     Addendums to Jim Rugh’s International Listening Project proposal to AEA (revised June 3) ............ 15
  Appendix C: Methodology and process timeline used for this International Listening Project ............ 19
  Appendix D: Questionnaire used for survey .......................................................................................... 26
  Appendix E: CV of Jim Rugh .................................................................................................... 29




    Appendix A: Original AEA Request for Proposals




AEA's International Listening Project
Request for Proposals
Proposal Due Date: Monday, May 2, 2011

Introduction
In November of 2010, a task force was established by the AEA Board of Directors to develop strategies
for learning what appropriate roles might be for AEA in the international community. The AEA
International Listening Project Task Force seeks to learn how AEA can engage in ways that are mutually
beneficial to the association and entities internationally. AEA currently designates a representative to the
International Organization for Cooperation in Evaluation (IOCE), provides joint membership with the
Canadian Evaluation Society (CES), and supports an International and Cross-Cultural Topical Interest
Group. In 2010, about 15%, or about 900 AEA members, resided outside the United States, and in the
same year, AEA welcomed about 8% or 220 international conference registrants to its annual meeting.
The goal of the International Listening Project is to develop AEA policies that clearly articulate the role
the association wants to play in the international community of evaluators and evaluation users in
accordance with AEA's Goals Policies (see Appendix A: 2010 Goals Policies Section 1.4).

Purposes
This activity, AEA's International Listening Project, is aligned with AEA's mission. Specifically, the activity
reflects AEA's commitment to valuing a global and international evaluation community and understanding
of evaluation practices (Value iii). The project also builds on previous work conducted by AEA's
International Committee (see Appendix B: Proposed Relationships and Responsibilities - Goals and
Policies Specific to International Issues, 2009). The goal of this project is to advance AEA's development
of policies that are in accordance with AEA's mission and values.

This project should embody AEA's intention to "listen" to others among international communities. AEA
values its relationship with partners in a global community and believes that AEA will only benefit by
listening well to the perspectives, experiences, and needs of its partners. Finally, we anticipate that in the
process of developing policy guidance, the project will also identify specific activities. These should be
captured and will be shared with AEA's Executive Director. The policy guidance should be in the form of
key principles, themes, and criteria for future internationally-focused work.

What follows are suggestions for the shape and content of this project, based on current understandings
of the international evaluation domain by the AEA Board. Other approaches to this project are welcomed.

Participants
The project will solicit the perspectives and advice of a wide range of AEA stakeholders. The individuals
and organizations to be consulted include, in priority order:

         1.   Leaders of other evaluation associations
         2.   Members of AEA who live outside the United States
         3.   Members of AEA who live in the United States but do international work
         4.   Leaders responsible for evaluation in multilateral and bilateral agencies



         5. Representatives of foundations interested in evaluation, evaluation capacity building, and
            evaluation in international contexts
         6. Members of AEA who only work within the United States

A preliminary list of likely organizations to engage is provided below. Additional associations, organizations,
and funding agencies should be identified and included.

                African Evaluation Association (AfrEA)
                Australasian Evaluation Society (AES)
                Canadian Evaluation Society (CES)
                Community of Evaluators (COE) (South Asia)
                European Evaluation Society (EES)
                International Development Evaluation Association (IDEAS)
                International Organization for Cooperation in Evaluation (IOCE)
                International Program Evaluation Network (IPEN) (Russia, NIS)
                Network of Networks of International Evaluators (NONIE)
                Latin American Evaluation Network (EvalNet)
                Organization for Economic Co-operation and Development (OECD)/Development Co-
                operation Directorate (DAC)
                Red de Seguimiento, Evaluación y Sistematización en América Latina y el Caribe (ReLAC)
                United Nations Evaluation Group (UNEG)
                United States Agency for International Development (USAID)
                World Bank Evaluation Cooperation Group (ECG)
                The Ford Foundation
                The Bill & Melinda Gates Foundation
                The David & Lucile Packard Foundation
                The Rockefeller Foundation

Activities

1. Planned interviews with specific individuals
Specific individuals will be selected to provide a range of views and to ensure relevant organizations are
engaged. These individuals will include representatives of other associations, organizations, and funding
agencies. Interviews may include follow-up with individuals who post insights or comments on the AEA
websites (see below). A set of semi-structured interview questions should be developed based on a list of
possible AEA goals. The interviews would be conducted by phone, virtually, or in person where possible.

2. Web-based feedback
The project should facilitate generating feedback through a web-based platform; feedback should also be
made transparent as much as possible. There are at least two options: 1) Visibly sharing the feedback, as
occurs in comments on a blog that allows synergies to emerge between ideas and issues. (This option
could be facilitated with the assistance of AEA's staff via a blog.) 2) Soliciting feedback which is only
shared when it has all been received (similar to approaches used for reviewing AEA policy statements).

3. Other
Respondents to the RFP may offer suggestions of other effective and economical ways to solicit feedback
in accordance with the overall purposes of the project.

The information gathering and analysis processes should accommodate other languages (French,
Spanish) besides English.




Analysis
Basic descriptive analysis of interview data and documentation may be conducted. The goals listed in
previous documentation (see Appendixes) may provide an initial set of categories for clustering the data,
and should be revised as needed. Data in these categories could then be summarized, with analysis of
any patterns in the responses in terms of sources of comments or positions on particular issues, and
illustrative comments.
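
As one concrete (and purely illustrative) reading of the analysis suggested above, the short Python sketch
below clusters coded comments into categories and cross-tabulates them by source of comment. The category
names and respondent groups are assumptions made for the example; they are not specified in this RFP.

    # Minimal sketch (Python): summarizing coded comments by category and by source.
    # Category names and respondent groups are illustrative assumptions only.
    from collections import Counter

    # Each entry: (source of comment, category the comment was coded under)
    coded_comments = [
        ("association leader", "professional development"),
        ("AEA member outside US", "evaluation capacity building"),
        ("foundation representative", "professional development"),
        # ... one entry per coded comment
    ]

    category_totals = Counter(category for _, category in coded_comments)
    by_source = Counter(coded_comments)   # counts per (source, category) pair

    for category, total in category_totals.most_common():
        print(f"{category}: {total} comments")
        for (source, cat), n in by_source.items():
            if cat == category:
                print(f"    {source}: {n}")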

Product
The project will result in a recommended set of principles, strategic directions, and goals for AEA's
international presence and roles that AEA could pursue. One principle, for example, could be "AEA should
partner with at least one other organization in all of its international activities." And one strategic
direction could be "AEA should enhance the professional development opportunities for evaluators around
the globe, especially in areas under-served by extant opportunities." The recommendations should
represent highest priorities, rather than an exhaustive catalogue of possibilities, and may reflect the
standpoints of targeted individuals, organizations, or other entities. All recommendations should be
congruent with AEA's mission, vision, and values. Advantages and limitations of each recommended
principle, strategic direction, and goal should be identified, including benefits to the AEA as an
association, benefits to partnering entities, political consequences, industry and peer trends, technology,
costs, and financial sustainability. Ideas for specific activities may emerge during the data collection
process, and these can usefully be summarized; however, operational-level activities are not the focus of
this project.

The final report from this project will be used primarily by the AEA Board of Directors to determine the
association's future priorities in the international domain. A written narrative report (20 pages maximum)
will be the major expected product. Presentation slides or some alternative format for communicating
findings will be the second product.

Who will conduct this "listening project"?
An evaluation researcher will be contracted to conduct the project. The researcher should be well-
positioned to represent AEA, should have knowledge of and contacts with relevant societies and
associations, and should have access to a wide variety of partners. Experience in interviewing and
qualitative analysis is also required.

The International Listening Task Force (Patricia Rogers, chair, Tristi Nichols, and Victor Kuo) will appoint
an International Listening Project Oversight Task Force from the membership of AEA (3-4 persons). The
Oversight Task Force will be responsible for reviewing proposals and recommending one to the board, for
direct communications with the contracted researcher during the project, and for reviewing a draft of the
final report. One or more members of the Board's International Listening Task Force will also serve on
the Oversight Task Force to ensure common understandings are enacted and to facilitate
communications. The International Listening Task Force more broadly will serve as the liaison between
the contracted researcher and the Board. The AEA Board is ultimately responsible for approving both the
contracted researcher and the final report.

Timeline
    May 2, 2011: Proposals Due
    May 25, 2011: Consultants reviewed and selected (Board approval required)
    August 15, 2011: Draft report and presentation materials due to Board
    September 1, 2011: Board review and approval of draft materials due back to Consultant
    September 30, 2011: Project Completion
    November 2-5, 2011: AEA Think Tank Session (proposed by AEA Task Force)

Budget


The maximum expendable under this contract is $8,000.

Instructions for Submitting Proposals
Proposals are due by Monday, May 2, 2011. Submit an electronic version of the proposal, as a single file
including CV(s), to Heidi Nye at the American Evaluation Association: heidi@eval.org

Proposals should be no longer than 5 pages and include:
     Brief narrative describing approach and methods
     Task plan and timeline
     Budget with budget narrative

CVs of key consultants may be included as appendices in addition to the 5-page narrative. No other
appendices or approaches to extending the length of the proposal will be accepted.

Proposals that do not follow the submission instructions and length restrictions will not be reviewed.

Questions
Questions regarding the substance of this proposal should be directed to AEA President Jennifer
Greene at jcgreene@illinois.edu; questions regarding the submission process should be addressed to
Heidi Nye in the AEA office at heidi@eval.org.

Appendix A-1: AEA Goals Policies 2010-2011, Section 1

I. GOALS POLICIES - AS APPROVED JULY 2010:

1. Evaluators: AEA will provide and increase access to resources, and contribute to communities, that
enable and support evaluators to:

A. become knowledgeable, effective, culturally competent, and ethical professionals;
       i. provide resources that support rigorous education regarding history, methods, and theories of
       evaluation
       ii. provide resources for teaching evaluation standards, ethics and culturally responsive practices

B. use a multicultural lens to inspire excellence and rigor in evaluation theories, methods, applications,
and practices, specifically to:
        i. expand our understanding of multiculturalism and enhance our ability to confront oppression in
        various forms.
        ii. promote international solidarity among evaluators and openness to additional diverse social,
        cognitive, and political perspectives that can influence how we think about and practice
        evaluation.

C. develop, disseminate, and transfer knowledge about evaluation;

D. engage diverse communities in evaluation practice;

E. use culturally responsive evaluation models to contribute to inclusiveness in society and to enhance
social justice and equity for persons of color and others from underrepresented groups;

F. contribute to building evaluation capacity within the communities and organizations in which they
work;

G. facilitate meaningful feedback mechanisms from evaluation consumers/public at all stages of
evaluation;


H. engage in the field and profession of evaluation and in the life of the association;

I. engage in other fields and associations that are related to or aligned with the field of evaluation.

Specifically, AEA has taken steps towards achieving the above Goals Policies through the following
activities and programs:

        1.1. Annual Conference: AEA will have an Annual Conference

                 1.1.1 Content for the Annual Conference is primarily identified through a member-driven
                 peer review process
                 1.1.2 Content for the Annual Conference is also identified by the President or her or his
                 Designee for a thematically-focused Presidential Strand

        1.2. Awards: AEA will have an Awards Program

                 1.2.1 The AEA Awards Program will recognize excellence in the field of evaluation.

        1.3. Diversity Focused Programs: AEA will offer diversity-focused programs and services

                 1.3.1 AEA's diversity-focused programs will be aimed at expanding the diversity of the
                 membership and those practicing and teaching in the field.
                 1.3.2 AEA will offer a Graduate Education Diversity Internship Program
                 1.3.3 AEA will offer a range of other diversity-focused programs for students and
                 professionals

        1.4 Internationally Focused Programs: AEA will offer internationally-focused programs and services

                 1.4.1 AEA's internationally-focused programs will be aimed at supporting our
                 international members, evaluators working in international contexts, AEA's
                 networking with other associations, and the field as it expands internationally

                 1.4.2 AEA will designate a representative to a major international organization
                 1.4.3 AEA will offer a joint membership program with the Canadian Evaluation Society
                 1.4.4 AEA will offer a range of other internationally-focused programs

        1.5 Journals and Peer-Reviewed Content: AEA will offer peer-reviewed content

                 1.5.1 AEA will offer New Directions for Evaluation as part of membership
                 1.5.2 AEA will offer the American Journal of Evaluation as part of membership
                 1.5.3 AEA will take a variety of approaches to increasing access to other peer-reviewed
                 content.

        1.6 Practitioner-focused content: AEA will offer practitioner-focused content

                 1.6.1 AEA will offer association-sponsored publications with practitioner-focused content
                 1.6.2 AEA will increase access to other materials including electronic resources for
                 evaluation practitioners

        1.7 Professional Development: AEA will offer professional development




                 1.7.1 AEA will offer professional development workshops as part of the annual
                 conference,
                 1.7.2 AEA will offer an Evaluation Institute
                 1.7.3 AEA will offer a range of other professional development opportunities.

         1.8 Standards and Principles for the Field: AEA will serve as a leader in the setting and vetting of
         Standards and Principles for the field

                 1.8.1 AEA will develop and maintain Guiding Principles for Evaluators
                 1.8.2 AEA will send a Representative to the Joint Committee on Standards for
                 Educational Evaluation (JCSEE)
                 1.8.3 AEA will be involved in the vetting process for new standards in the field
                 1.8.4 AEA will also support other programs that promote and advance ethical practice in
                 the field.

Appendix A-2: AEA International Committee Report

AEA International Committee's proposed statement for consideration by new AEA
committees related to RELATIONSHIPS AND RESPONSIBILITIES - GOALS AND POLICIES
SPECIFIC TO INTERNATIONAL ISSUES[1]
INTERNATIONAL GOALS

AEA commits to promoting and supporting, in the United States and internationally:
1. recognized evaluation principles, concepts, methods, and techniques;
2. improvements and refinements of existing and emerging evaluation concepts, methods, and
techniques;
3. mutual learning related to the theory and practice of evaluation for multiple purposes in a diversity
of contexts;
4. strong international evaluation organizations and networks; and
5. the needs of the AEA's international membership.

DESIRED RESULTS

These goals are intended to ensure:
        mutual learning and sharing of knowledge and experiences;
        professional and personal development;
        evaluation capacity building and training;
        business opportunities and consulting;
        fostering and developing relationships;
        networking and collaborating;
        the needs of AEA's international members are considered and reflected in AEA's policies and
         practices
        international organizations' growth and development; and
        fun!

These goals will be demonstrated by significant progress in the following areas:

1. Professional Development of Evaluators:

AEA members within the United States who:



         a) are more knowledgeable about evaluation principles and practices outside the United
         States;
         b) create and participate in opportunities for mutual learning and sharing of knowledge with
         evaluators and evaluation organizations globally;
         c) support the creation or strengthening of evaluation organizations or groups in other
         countries; and
         d) are aware of individuals and organizations outside the United States with evaluation
         expertise that can be useful to their professional needs and facilitate communication with such
         individuals/organizations.
AEA members from other countries and other evaluators throughout the world who:
         a) are even more competent, effective, and ethical;
         b) develop, disseminate, and transfer knowledge about evaluation with evaluators in the
         United States and other countries;
         c) engage with and contribute to inclusive, diverse, and international networks and
         communities of evaluation practice;
         d) understand their role in building evaluation capacity in their nations, communities, and the
         organizations within which they work; and
         e) access and use high-quality resources, including those of AEA.

     The AEA's contribution will be to:
maintain, develop, facilitate and support relationships among members of the AEA, in the United States
and elsewhere, that promote collaboration, partnership, experience and understanding of evaluation at
a global level.

2. Institutional Capacity

    Decision-makers and policy-makers in the United States and elsewhere will:
          a) have a basic knowledge about multiple ways of doing and using evaluation;
          b) hold positive views about the role evaluation can play;
          c) have evaluation policies in place;
          d) use well-known, respected resources, including those of AEA, to support their efforts; and
          e) make evaluation a standard business practice in their organization's operations.

      The AEA's contribution will be to:
i. maintain, develop, facilitate, and support relationships with and among a variety of organizations
(international, national and regional membership associations, consortia of organizations) that promote
high-quality evaluation standards and practices; and
ii. assist these organizations, as appropriate, to develop or promote effective evaluation and its
continual improvement within their spheres or sectors.

3. Public Awareness:

The general public in the United States and elsewhere are:
          a) aware of the concept of evaluation;
          b) understand the value of evaluation;
          c) understand that evaluation can improve programs and policies in a variety of settings
          around the world;
          d) expect/demand that programs be evaluated; and
          e) be better consumers of evaluation information.

     The AEA's contribution will be to:


          i. provide access to educational materials and in other ways make available to the public
          opportunities to learn about sound evaluation principles, concepts, methods, and techniques;
          and
          ii. offer access to individuals and organizations available for networking.


[1] This is based on the AEA Board draft policy statements contained in the document Policy-Based
Board Governance pages 8 & 9. Input for this statement was received from participants in two sessions
during the AEA conference in Denver, 2008. It was submitted to the AEA Board in January 2009.




Appendix B: Proposal submitted by Jim Rugh

                                                Jim Rugh
                                         451 Rugh Ridge Way
                                      Sevierville, TN 37876 USA
                      Phones: Mobile: +1 (865) 696-0401 H/O: +1 (865) 908-3133
                        e-address: JimRugh@mindspring.com Skype JimRugh
                                     Monday, December 19, 2011

To AEA International Listening Task Force
c/o Heidi Nye

Dear Heidi and other colleagues,

I am writing to express my interest in serving as the Coordinator of the implementation of
AEA’s International Listening Project.

I’ll begin this response to the RFP by attempting to make the case for why I am especially well
qualified to lead this undertaking on behalf of AEA, then describing how I would propose to
coordinate the process.

My interest and qualifications:
Shortly after I arrived in San Antonio last November several AEA Board members approached
me to let me know that the Board had decided to form the International Presence Task Force
(IPTF) and launch the International Listening Project. (Evidently the name of the TF has now
been changed to the International Listening Task Force.) I was obviously pleased and excited to
hear this, because I’ve been advocating for some time for AEA to be more engaged
internationally. I was invited to serve as an advisor to the IPTF, during which time I shared the
policy proposal that the International Committee had worked on for two years (but then
seemed to have been ‘lost between the cracks’ during the Board transition), as well as my
suggestions for how to structure the International Listening Project (ILP). I recognize many of
my ‘fingerprints’ in the RFP that was circulated on April 13.1

When it became clear that the IPTF wanted to outsource (contract out) the implementation of
the ILP I decided that it would be best for me to disassociate myself from the Task Force, since I
was keenly interested in being directly involved in the ILP process itself. I was not interested in
simply sitting back and waiting for some consultant to provide a summary of many interviews
and a synthesis proposal to the Board. As someone who is personally acquainted with most of
the ‘audience’ of the ILP I wanted to listen directly to what they have to say! I want to
participate in the conversation.


1 Documented evidence available on request.

As is mentioned in my CV (copy appended below, page 29 ff) and in the recognition I received
when I was awarded the 2010 Alva and Gunnar Myrdal Practice Award, I’ve been involved in
international development for 47 years, and as a specialist in the evaluation of international
development programs for 31 years (since getting an MPS degree in the subject from Cornell
University). As you also know, during the past 3+ years I have been the AEA Representative to
the IOCE. In that role I have been privileged to strengthen and broaden my networks and have
gotten to know and work closely with my professional colleagues around the world, including
the leadership of all of the 117 evaluation associations we have made contact with. (On behalf
of the IOCE Board I am the one who maintains that database.)

So when I submitted to the IPTF the list of associations that should be proactively targeted for
the ILP (see page 2 of the RFP) I was referring to organizations and leaders with whom I already
have direct personal working relationships. If I am chosen to be the Coordinator of the
International Listening Project I will not be conducting cold-call interviews with strangers –
rather, I’d be conducting conversations with colleagues who already know me and my
reputation as (if I may be so un-humble to say so) a recognized leader in the global networks of
international evaluation professionals.

Basic concept of the proposed process:
Note: Most of what was included in the RFP was based on what I had already proposed to the
IPTF. That includes the modalities for soliciting input, target audiences for proactively
conducting interviews, etc. I’ll elaborate a little more here, including some suggestions for how
I’d like this to be a process that actively engages a wide range of colleagues in helping us,
collectively, to not only come up with ideas but to engender commitment to carrying through
prioritized initiatives.

First of all, let me provide two visuals that depict what I envision as a democratically engaging
process. The diagram to the left depicts what might have been presumed to be the expected
process of the consultant collecting input from multiple sources, distilling it, selecting priorities
and then submitting a synthesized proposal to the AEA Board. After the Board decides what
policies it wants to approve, it instructs the AEA secretariat (Susan Kistler’s team) what to
implement. The graphic on the right doesn’t completely replace that process, but it gives
greater emphasis to an interactive, participatory, synergistic process of transparent sharing
(crowd sourcing) among many interested participants.




The reason I’d prefer to add a process like that depicted on the right is that I would want this to
be a process that solicits buy-in from a broad cross-section of AEA members and international
partners that share ideas with each other – thus promoting the synergy of creatively bouncing
ideas off of each other. In other words, I envision this process to be much more than
individuals submitting ideas individually (one-way) into a central clearinghouse (black box) that
then unilaterally submits a proposal to the Board.

I propose a process that would be guided by the principles of Reflective Practice (see Patton
2010). I’ll elaborate with more details in the next section.

A more detailed proposed process:
Step 1: Disseminate the invitation to join in this International Listening Project
         I assume that the RFP was disseminated to all AEA members. It was also forwarded to
those on the IOCE EvaLeaders listserv, which includes leaders of many of the other associations
around the world. So many if not most in our community have already received an introduction
to this project. Nevertheless, when we are ready to launch the active learning phase of this
initiative I’d send an ‘Invitation to Participate’, based on a modified version of the RFP by way of
introduction, along with the relevant excerpts from AEA’s Goals and Policies and the
International Committee’s proposed policies, plus, of course, details giving modalities
(channels) for submitting suggestions and participating in other ways.
         I would ask you (Heidi) or Susan to send the ‘Invitation to Participate in the International
Listening Project’ to all AEA members. In addition I would post it to the IOCE EvaLeaders
listserv (67 leaders of national, regional and global professional associations), the XCEval listserv
(986 subscribers all around the world) and the IDEAS listserv (360 individual members, most of
whom participated at one time or another in the IPDET training.)
         That’s the proposed process for general dissemination – inviting input from whomever
(‘domestic’ or international members of AEA, and other evaluators around the world whether
or not they are AEA members). In addition I’d send personal invitations to targeted leaders of
the groups identified in the RFP (based on the list I submitted), proactively soliciting their input
into this process.

Step 2: Modalities for collecting input



         2.1 Conduct direct interviews with targeted leaders of the organizations listed on page 2
of the RFP (professional associations, international organizations, foundations, etc.). In addition
to conducting such interviews via phone or Skype or email, I will have opportunities to meet
with many such leaders in person during conferences I’ll be attending during this timeframe in
Brazil, Sri Lanka, Atlanta and Washington. (I already had preliminary discussions during
AEA/San Antonio and NONIE/Paris conferences and other occasions.)
         2.2 Emailed suggestions from any colleague, submitted to the ILP Coordinator (note:
some on the targeted interview list may opt to communicate their input in written form via
email);
         2.3 Postings to the web-based blog.2 As suggested by the graphic on the right side of
page 2 above, this modality will allow for and promote interactive synergy. For example, the
posting by one person might stimulate creative ideas by others. Cognizant that this process
could generate a multitude of miscellaneous ideas, I’d like to see a Facebook-type function
where readers could click on a ‘Like’ or ‘Dislike’ button to get some measure of affirmation or
disapproval of an idea. I’d go even further, adding a 3rd response option: “I like that idea so
much I volunteer to be part of a task force to implement it!” (See more about this below.)
         Note: though this will be an open, anyone-who-is-interested process, I will do what I can
to proactively solicit input from the range of targeted participants listed on the bottom of page
1 and top of page 2 of the RFP. We might also look for ways to potentially disaggregate the
data by those types of participants. That is, what are the ‘demographic’ characteristics of those
predominantly proposing certain policies and initiatives?
         Further note: In addition to inviting postings in English, we will welcome contributions in
French or Spanish or Portuguese or Russian (or any other language supported by MS Word’s
translation software) from international colleagues who may not be fully comfortable
expressing their ideas in English. Though there could be sub-conversations that continue in
those languages (as has been done on the AfrEA listserv), I’d want to at least use software-
based translation programs to roughly translate the non-English postings into English so that
others could get the gist of what has been posted.

Step 3: Phases of reflective practice (modified from Patton 2010 pp. 265-269)
        3.1 The AEA Board has already identified the main purpose for this inquiry: “ to develop
strategies for learning what appropriate roles might be for AEA in the international community,” or “to
develop AEA policies that clearly articulate the role the association wants to play in the international
community of evaluators and evaluation users in accordance with AEA's Goals Policies.” (Another way to
express that might be “To develop strategies that would guide AEA as it seeks ways to more actively and
constructively engage with the international evaluation community.”) I’d like to add a sub-objective: In
addition to informing AEA policy decisions, let’s use this listening process to gauge interest by individuals
and sister organizations to actually participate in activities consistent with policies and priorities agreed by
the AEA Board.
         3.2 Turn the concept into an experiential inquiry question: Another way of formulating our main
question is “In what ways would you want AEA to be more engaged in promoting evaluation
internationally?” I anticipate that some of the responses we’ll get will be in the form of broad ideas that



2
 Note: in addition to many if not most of the ideas in the RFP, the suggestion for web-based feedback was also
mine.

could be informative for policy-level considerations. Other responses will likely be very focused on
specific proposed activities.
         3.3 After what Michael Quinn Patton (MQP) refers to as “the stories” have been told (or, in this case, suggestions have
been submitted, however articulated), invite colleagues to help identify patterns and themes, and
consider the implications (perhaps for further discussion). In other words, in addition to and more than
the ILP Consultant sorting ideas into main and sub-categories and then synthesizing them on his own,
welcome others’ input into the process.
         3.4 Generate action agreements. In this case synthesize and prioritize the issues and ideas that
have been collected through this listening (reflective practice) process. We would make this process
consistent with what’s stated in the RFP, i.e. the ensuing policy guidance submitted to the Board for
approval should be in the form of key principles, themes, and criteria for future internationally-focused
work.

        Note: though I’m proposing that this process be as open, participatory and transparent as
possible, I also acknowledge that it will still be the main responsibility of the ILP Coordinator, in frequent
communication and collaboration with the International Listening Project Oversight Task Force, to gather
the input (proactively facilitating even the “crowd-sourced” blog-based interactive process), lead the
synthesis and prioritization, and, of course, submit the resulting recommendations to the Board.
Nevertheless, rather than the closed, one-way “black box” process implied in the left graphic on
page 2, my desire is that the listening process itself promote a high level of active involvement in
generating ideas and stimulating interest by many in actively participating in whatever projects we agree
to undertake. One further note: though the “crowd-sourced” open sharing via the web likely will
generate its own categories, I may need to guide the process by suggesting categories of themes as they
evolve in order to provide an easy-to-follow organization of the ideas. To get the process started and
provide some initial structure, as suggested in the RFP, I would suggest some of the categories that
come from the International Committee’s policy proposal – which itself was generated from more than
two brainstorming sessions during AEA conferences plus more than a year of communications among IC
members and others.

Step 4: Final report and next steps
        As alluded to above, in addition to the main product, a set of recommended policies related to
international engagement for the Board to consider, I’d want to capture many
of the detailed suggestions for how the priority initiatives eventually agreed upon by the Board could be
implemented, and the names of colleagues who’ve shown their interest in participating in such
undertakings.

Timeline and budget
        The RFP projects the starting date for the actual ‘listening’ process to begin sometime
after May 25 when the Board is to have agreed on the selection of the consultant. The initial
draft report is due by August 15. That’s a 12-week window. Assuming that much of the early
drafting of the report can take place while the data collection and participatory categorization
and prioritization processes are going on, I’d want to submit the 1st draft report to the ILP
Oversight Task Force by August 3. Giving them a week to provide feedback to me, that would
take us up to August 10, giving me 5 more days to prepare the final version of the report for
submission to the Board. That’s a tight schedule (barring disruptions such as as-yet-unforeseen
travel assignments), but I’d hope to be able to do it that way. Otherwise the active listening
window would be even less than 10 weeks – assuming we’ll be able to launch it shortly after
the May 25 decision is made by the Board, the contract is prepared and signed, and whatever
time it might take for Susan and her team to set up the web-based blog-type data collection-
and-sharing system.

        I’m not sure I understand what is to take place between Board approval of the report on
September 1 and the ‘project completion date’ of September 30, unless it would be to make
final revisions to the report, based on feedback from the Board (in addition to ongoing
feedback from the Oversight TF), before disseminating it to the general public. That could also
be the time to launch specific initiatives consistent with the policy implications the Board will
have approved by September 1.
        As to budget: the RFP states that the maximum amount that can be contracted for this
ILP is $8,000. At my usual rate of compensation that would cover 10 days. I guess I’ll just have
to stop counting ‘billable days’ after that! (I’ll add the rest to everything else I do for AEA and
IOCE on a pro-bono basis.)



Addendums to Jim Rugh’s International Listening Project proposal to AEA3 (revised June 3)

1) A preliminary list of named contacts and sampling plan that ensures diverse perspectives

As stated in the RFP (picking up on the list I had submitted to the International Presence Task Force last
December), the International Listening Project (ILP) will solicit the perspectives and advice of a wide
range of AEA stakeholders. The individuals and organizations to be consulted include the following
categories:4

           1.    Leaders of other evaluation associations
           2.    Members of AEA who live outside the United States
           3.    Members of AEA who live in the United States but do international work
           4.    Leaders responsible for evaluation in multilateral and bilateral agencies
           5.    Representatives of foundations interested in evaluation, evaluation capacity building, and
                 evaluation in international contexts
           6.    Members of AEA who only work within the United States
           7.    Members (but not necessarily leaders) of evaluation associations in other countries who
                 are not members of AEA

I propose using three communication modes for soliciting input from these audiences:

       A) I plan to send to the following persons an invitation via e-mail to participate in an online survey
          questionnaire:
               a. All AEA members (via the AEA Management Office)
               b. AEA ICCE TIG (via AEA)5
               c. IOCE-EvaLeaders listserv (68 leaders of evaluation associations around the world)


3 In response to questions asked by ILP Oversight TF
4 Though the various modes of communication mentioned below are not specifically targeted at people in one or
another of these categories, I will try to identify which category / categories respondents identify with so that we
can disaggregate the data accordingly. Note: I’ve added the 7th category.
5 Even though this will probably be duplication (cross-posting), I want to be sure International and Cross-Cultural
Evaluation TIG members feel especially invited to participate in this survey.

    B) I will send a more personalized invitation to selected leaders of each of the following partner
       associations, agencies, foundations, etc., with the questionnaire, but also welcoming the
       opportunity to personally interview them via phone or Skype if they would be interested:

               African Evaluation Association (AfrEA)
               Australasian Evaluation Society (AES)
               Canadian Evaluation Society (CES)
               European Evaluation Society (EES)
               International Development Evaluation Association (IDEAS)
               International Organization for Cooperation in Evaluation (IOCE)
               International Program Evaluation Network (IPEN) (Russia + Newly Independent States)
               Network of Networks of International Evaluators (NONIE)
               Organization for Economic Co-operation and Development (OECD)/Development Co-
                operation Directorate (DAC) EvalNet
                Red de Seguimiento, Evaluación y Sistematización en América Latina y el Caribe (ReLAC)
               South Asian Community of Evaluators (COE)
               United Nations Evaluation Group (UNEG)
               United States Agency for International Development (USAID)
               World Bank Evaluation Cooperation Group (ECG)
               The Ford Foundation
               The Bill & Melinda Gates Foundation
               The David & Lucile Packard Foundation
               The Rockefeller Foundation

    C) I propose to send an open invitation to an even wider range of colleagues to contribute their
       ideas and suggestions via a web-based blog-type program. I guess this is called crowd-sourcing.
       In addition to those listed above, I propose to send such an invitation to the 1,000 subscribers
       to the XCEval listserv (originally set up for the ICCE TIG, but now includes individuals way beyond
       AEA). I would also send it to the 361-member IDEAS listserv, as they are specialists in the
       evaluation of international development programs.

Note: As this process unfolds I will periodically ascertain what categories of persons we’ve received
responses from. If it appears that some groups seem underrepresented I’ll more proactively reach out to
them to be sure their voices are heard.

2) Additional description of analytic methods (proposal sections 3.3 and 3.4) to ensure
systematic analysis of qualitative data

        I envision using a process similar to that used by the AEA leadership in preparation for the June
        15-16 Board-plus-PAT retreat. I.e. my final report to the Board and others in AEA would look
        somewhat like the report of the results of the pre-retreat survey. This begins with a summary of
        the main themes and sub-themes of policy implications, backed by actual quotes from what
        survey responders submitted. More than that, the full version of my report will contain annexes
        with more details on the range of suggestions received, analysis of the level of support for them
        – disaggregated by the categories of stakeholders identified above – and more extensive
        quotations (stories in their own words).


           I propose using a modified dual Delphi technique6 for processing the input we will receive.

           A) After collecting either written or transcribed (from oral interviews) input from the surveys
              and interviews from communications methods A and B mentioned above, I’ll code and
              cluster issues using ATLAS.ti qualitative analysis software. Then share the initial findings –
              including policy ideas emerging from the data – with key stakeholders (described below), to
              solicit their input into further refining and clustering the policy implications.
           B) Take advantage of the “crowd-sourcing” functionality of the web-based blog medium to
              promote what Turoff and Hiltz and Bolognini7 describe as web-based Delphi methods
              involving large numbers of people.

           In other words, though I will be personally involved in and responsible for much of the initial
           process of collecting and analyzing the data obtained from the survey responses and personal
           interviews, rather than deciding on my own what policy-relevant themes should be prioritized
           and subsequently proposed (unilaterally) in my report to the Board, I would prefer utilizing
           various means to solicit involvement in this process by interested colleagues (key stakeholders).
           I will invite colleagues who have shown a special level of interest in the ILP to join in the 2nd
           round of the Delphi process of synthesis and prioritizing the input that has been received.8
           Perhaps we could name them the ILP Support Team.

           In parallel with the primary data collection via questionnaires and interviews, the open, crowd-
           source web-based blog discussions will, in effect, be an on-going Delphi process or conversation,
           wherein as participants see what others have posted, their own ideas may evolve, and they’ll
           comment on others’ comments. It will be interesting to see what different ideas will come out
           of this process, compared to the more traditional data-gathering process mentioned above. I
           anticipate that both will have valuable contributions to make to both the final report to the
           Board on policy implications, as well as a collection of more action-specific ideas that will be
           useful as the International Listening Project evolves into a range of International Action Projects.



3) A timeline of activities and deliverables (e.g., Gantt chart) clarifying page 5 of the proposal

           May 24            Message received from Lennise Baptiste tentatively offering ILP consultancy
                           contract
           May 24            I accepted
           May 27            This addendum to my RFP proposal submitted to Oversight TF
           June 2            Baptiste and Kuo have reviewed and decided on the amended proposal (with
                           request for slight changes)


6 See http://en.wikipedia.org/wiki/Delphi_method
7 See within the Wikipedia article mentioned in the previous footnote.
8 Note: Apart from being aware of the earlier policy recommendations that were collected by the AEA International
Committee based on ideas that were generated during several brainstorming sessions at AEA conferences, I do
not have any preconceived ideas of what policy recommendations will
come out of this ILP process. I am completely open to seeing what emerges, and definitely do want to invite other
colleagues to help in the process of analyzing the data and prioritizing the themes.


           June 10          Date by which Jim to be notified of approval of contract (or not)
           June 10+       Negotiate, come to agreement on and sign AEA-Rugh contract
           June 10+       Begin pre-testing invitation letter, online survey instrument9 and blog website
           June 15-16     Discussions with Board and Oversight Task Force members during AEA PAT
                          retreat in Atlanta, including soliciting further input into and confirmation of
                          approaches for conducting the ILP. Also discussions with Susan Kistler and her
                          team with regard to sending survey to full AEA e-mailing list, and setting up the
                          web-based blog discussion. (We’ll begin this process via e-mail ahead of time.)
           July 2         Revised survey beta test sent to TF members
           July 8         Launching of the ILP itself, including sending out the invitation to the online
                          survey and inviting key targeted persons to respond in writing or set up time for
                          phone or Skype interview. Hopefully web-based blog system will have been set
                          up by then as well.
           August 1       End of first round of open input solicitation
           August 8       Preliminary compilation and analysis of data completed and shared with ILP
                          Support Team and other key stakeholders who have volunteered to take part in
                          this process. (Delphi phase 2.)
           August 15      Feedback expected from those stakeholders
           August 29    Draft report and presentation materials due to Board
           September 15 Board review and approval of draft materials due back to Consultant
           October 15   Project Completion; report disseminated to AEA audience plus all who
                       participated in the listening project.
           November 2-5 Discussion of results of ILP and follow-up action ideas with others during Think
                       Tank Session at AEA 2011 conference (proposed by ILP Task Force)

           Comment on this timeline: In the RFP the draft was to be ready by August 15, the Board’s
           feedback to the consultant by September 1, and the project was to be completed by September
           30. However, due to the slippage of the start date, and recognizing that this would squeeze the
           window for soliciting input down to just over a month (June 20 – July 25), the Oversight Task
           Force has agreed with the Consultant to allow an additional two weeks for the data collection
           and synthesis processes.

           According to this revised schedule, the Board will receive the draft report by August 29 and
           provide its feedback to the Consultant by September 15. Though the “project completion” date
           has been pushed back to October 15, the Consultant anticipates that the final version of the
           report will be completed and ready for dissemination before then. Certainly well in time for
           interested persons to have time to read it before we gather in Anaheim for the 2011 AEA
           conference.




9 I could use SurveyMonkey unless AEA has a better survey software program.


Appendix C: Methodology and process timeline used for this International
Listening Project

The methodology proposed for this ILP was already described in the proposal copied above (pages 11-
14). To briefly summarize, an online survey questionnaire (see page 26 ff) was set up on SurveyGizmo
(still visible at http://appv3.sgizmo.com/projects/editor?id=574552). Invitations to participate in that
survey were sent through a variety of mailing lists and listservs to colleagues around the world (see page
22). Customized invitations were also sent to 44 key leaders of international agencies, foundations, and
other evaluation associations (see lower down on page 22). Some of them opted to share their ideas via
the online or Word version of the survey; 8 agreed to be interviewed by me.

A total of 362 persons from 57 countries contributed to a rich collection of qualitative data. It should be
noted that these were open-ended questions, so respondents were sharing what came to their own
minds, not selecting from a pre-selected list of topics.

I spent a considerable amount of time reading through those responses, coding them according to an
emerging set of topics or issues. I then grouped all the responses that mentioned each topic (some
responses touched upon more than one topic), counting how many mentioned that topic, and selecting
a few that best articulated each of those topics. Those clustered topics were then presented in the 1st
draft of the ILP report.
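
To make the grouping and counting step above concrete, here is a minimal sketch, in Python, of how coded
responses can be tallied by topic and a few illustrative quotations pulled out for each. The record structure
and topic codes below are hypothetical placeholders for illustration only; the actual coding for this project
was done by hand while reading the responses.

    # Minimal illustrative sketch (Python) of the grouping/counting step described above.
    # The response records and topic codes are hypothetical placeholders.
    from collections import defaultdict

    responses = [
        {"text": "AEA should partner with regional evaluation associations.", "topics": ["partnerships"]},
        {"text": "Offer more professional development webinars internationally.",
         "topics": ["professional development", "partnerships"]},
        # ... one record per survey response, with the topic codes assigned to it
    ]

    grouped = defaultdict(list)
    for response in responses:
        for topic in response["topics"]:   # a response may touch upon more than one topic
            grouped[topic].append(response["text"])

    # Count how many responses mentioned each topic, most-mentioned first,
    # and print a few candidate quotations per topic.
    for topic, texts in sorted(grouped.items(), key=lambda item: len(item[1]), reverse=True):
        print(f"{topic}: mentioned in {len(texts)} responses")
        for quote in texts[:3]:            # in practice the best-articulated quotes were chosen by hand
            print("   -", quote)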

In addition to sharing the draft report with members of the ILP Task Force, all the responses to each
main question were posted to the Wikispace website at http://aea-
internationallistening.wikispaces.com. Participants in the survey were invited to see the results there.
There were 702 clicks on that website, indicating that many colleagues at least went to it (though I have
no measure of how much they read). Though the original intention was to generate a discussion on that
Wikispace website, very few left comments there. So the ‘crowd sourcing’ or even peer review I was
counting on for the 2nd Delphi phase (with others helping with the process of identifying key themes
coming out of the data) did not work out as I had hoped.

Question #14 was in the form of appreciative inquiry, asking respondents what they appreciate about
what AEA has already done to promote evaluation on a global level. I found 11 main themes coming
from the responses to that question. They are identified in Table 1 on page 7 of the report, followed by
illustrative quotations in the rest of section IV.

Even though one of the questions (#15) specifically asked for suggestions of policies AEA should adopt,
most of the replies identified the kinds of activities they would hope such policies would allow for or
actually proactively promote. Since the responses to questions #15, #16 and #19 were so similar, I
grouped them together in the sorting and clustering process. This produced the 18 clustered themes
identified in Table 2 on page 12 of the report. The rest of section V provides illustrative quotations for
those themes. The reader might note that, in keeping with what I thought the main purpose of a
“listening project” to be, I refrained from rewording things in my own words, preferring to let the
respondents’ voices do the ‘talking’. Indeed, many of them captured the meanings behind those themes
very well.

The next step was to convert those clusters of themes into proposed policy statements to be considered
by the AEA Board. Though I had been offered assistance in converting those activities into policy

statements, that assistance did not materialize. Instead I looked at the existing policies on the AEA
website (http://www.eval.org/policies.asp), and decided that I would propose a set of policies to expand the few
points within the relevant section (1.4) there, using similar language. This process, the list of proposed
policies, and my discussions about them are included in Chapter VI of the ILP report.



The table below provides a description of the actual steps taken during the implementation of this
project, with dates and commentary.

Date          Process                                           Commentary
April 13      RFP issued by AEA                                 See Appendix A
April 25      I submitted my proposal to undertake ILP          See Appendix B
May 24        ILP Oversight Task Force (OTF) informed me        Asked for a list of key contacts, more
              that I’d been selected to lead ILP                detail on methodology and timeline
May 25        I submitted amendments to my proposal             Within Appendix B, above
June 8        Letter of Agreement (contract) awarded to
              me by AEA
June 18       Initial draft invitational letter and             Constructive feedback received from
              questionnaire shared with members of OTF          Patricia, Victor, Tristi and Jennifer
              and Board Liaison TF                              (especially on the invitational letter)
June 26       Beta test of survey questionnaire on              Susan provided useful advice on setting up
              SurveyGizmo as well as Wikispace website          SurveyGizmo and Wikispace. Feedback
              shared with both TFs                              received from Michael B., Victor, Lennise,
                                                                and Mike H.
Between       Invitations sent to the full AEA mailing list,    357 persons responded to the survey on
July 7 - 31   XCEval and IDEAS listservs, IOCE-EvaLeaders       SurveyGizmo, plus 5 others via a Word
              and all e-addresses in the IOCE database, ICCE    version of the questionnaire = total 362
              TIG members, AEA365, LinkedIn AEA Group,          respondents.
              EvalTalk and AEA e-newsletter.
July 11       Personal invitations sent to 44 key leaders of    18 of the personally invited persons
              foundations, international development            responded to the survey; 8 agreed to be
              agencies, leaders of other associations, etc.     interviewed.
July 31       ILP progress report included in monthly AEA       Deadline for survey responses extended
              e-newsletter                                      to Aug. 5 in case some respond to
                                                                reminder in newsletter
August 7      Initial report of responses, including raw data   They were also invited to take part in the
              on Wikispace, shared with all who indicated       Thought Leaders’ Forum. There were 702
              on the survey that they were interested in        clicks on the Wikispace summary page.
              seeing the report.
August 7-     AEA Thought Leaders discussion on ILP             Set up with help by Susan. Participants
13                                                              included Victor Kuo, Jennifer Greene,
                                                                Leslie Cooksy, Irina Agoulnik, Stephen
                                                                Maack, Ricardo Furman, Cecilia Hegamin-
                                                                Younger, Teresa Derrick-Mills, Monica
                                                                Oliver and Valerie Caracelli.
August 17     Phone conversation with Victor and Jennifer       They offered to help translate activity
              about next steps in the ILP.                      ideas into policy language, also involving
                                                                Leslie. (However, that proposed Policy
                                                                Interpretation Conversation didn’t take
                                                                place.)
August 26     1st draft of ILP report sent to TF members        No direct feedback on the content of the
                                                                report was received.
September     2nd draft of ILP report, with proposed policy   Ditto
21            statements, sent to TF members
September     Phone call from Jennifer Greene                 In response to which my letter and these
29                                                            additional details were written.
October 11    Memo from Jennifer                              Saying AEA will seek other assistance in
                                                              providing guidance to the Board on
                                                              interpreting policy recommendations, but
                                                              accepting the work I have completed to
                                                              date
October 15    Report shared with others                       As promised in original plan
November      Think Tank at AEA 2011 conference in            Opportunity for others interested in this
2             Anaheim                                         topic to join in discussions re. next steps,
                                                              including policy implications, based on
                                                              the ILP report.




       Mailing lists to which the ILP invitation was sent

                      Date sent   Mailing list               Approx. number of recipients
                         7-Jul    All AEA members                           6,838
                         7-Jul    XCEval                                    1,021
                         7-Jul    IDEAS                                       366
                         8-Jul    IOCE contacts                               202
                        28-Jul    ICCE TIG                                    926
                        18-Jul    AEA365                                    2,200
                        22-Jul    LinkedIn AEA Group                        4,500
                        24-Jul    EvalTalk                                      ?
                        26-Jul    IOCE-EvaLeaders                              69
                        31-Jul    AEA e-newsletter                          6,838

                         6-Aug    Survey responses: 357    Word questionnaires: 5    Interviews: 8




List of individuals to whom personal invitations were sent
(Parenthetical notes record the date each person was interviewed in person and/or that they responded
to the survey.)

Victor Kuo's list:
  1. Elvis Fraser, Gates Foundation – Elvis.fraser@gatesfoundation.org (interviewed 21-Jul; responded to survey)
  2. Tom Garwin, Consultant to Rockefeller – tom.garwin@gmail.com
  3. Nancy MacPhearson, Rockefeller Foundation – NMacPherson@rockfound.org
  4. Gail Berkowitz, Mastercard Foundation – gberkowitz@mastercardfdn.org (interviewed 27-Jul; responded to survey)
  5. Cyrus Driver, Ford Foundation – C.Driver@FordFound.org (responded to survey)

Jennifer Greene's list:
  6. Christina Magro, independent consultant, IDEAS, Brazil – magro.christina@gmail.com
  7. Cindy Clapp-Wincek, Director of Evaluation, USAID, USA – ccwincek@aol.com (interviewed 27-Jul; responded to survey)
  8. Thomas Widmer, EES + Swiss Society, Switzerland – thow@ipz.uzh.ch (responded to survey)
  9. Helen Morrell, President, UKES, UK – ukes@profbriefings.co.uk
 10. Kristin Amundsen, Deputy Director General, Auditor General of Norway, Norway –
     kristin.amundsen@riksrevisjonen.no (responded to survey)

Tristi Nicole's list:
 11. Jan San Sorensen – jansandsorensen@gmail.com
 12. Riselia Bezerra – riselia@scanteam.no
 13. Elliot Stern – crofters@clara.net

Jim's additional list:
 14. Penny Hawkins, Evaluation Department, Rockefeller Foundation, New Zealand – phawkins@rockfound.org
     (responded to survey)
 15. Florence Etta, President, AfrEA, Nigeria – feanywhere@yahoo.co.uk (responded to survey)
 16. Issaka Traore, Liaison to other organizations, AfrEA, Burkina Faso – issakatraore@yahoo.com
     (responded to survey)
 17. Alan Woodward, President, AES, Australia – Alan.Woodward@lifeline.org.au; aes@aes.asn.au
 18. Scott Bailey, AES Rep. to IOCE, Australia – scottbayley56@yahoo.com.au (responded to survey)
 19. Martha McGuire, President, CES, Canada – president@evaluationcanada.ca (responded to survey)
 20. Francois Dumaine, CES Rep. to IOCE, Canada – francois_dumaine@yahoo.ca
 21. Ian Davies, President, EES, UK – idavies@capacity.ca (responded to survey)
 22. Murray Saunders, EES Rep. to IOCE, UK – m.saunders@lancaster.ac.uk (responded to survey)
 23. Ray Rist, President, IDEAS, USA – rrist@worldbank.org
 24. Linda Morra-Innas, IPDET Organizer, IDEAS, USA – lmorra@ifc.org (responded to survey)
 25. Denis Jobin, Listserv coordinator, IDEAS, Nigeria – denis_jobin@yahoo.ca (responded to survey)
 26. Soma de Silva, President, IOCE, Sri Lanka – somadesilva@gmail.com (responded to survey)
 27. Nermine Wally, Secretary, IOCE, Egypt – nerminewally@gmail.com (responded to survey)
 28. Natalia Kosheleva, IPEN Rep. to IOCE, Russia – natalia@processconsulting.ru (interviewed 26-Jul;
     responded to survey)
 29. Jocelyne Delarue, Coordinator of 2011 conf., NONIE + AFD, France – delaruej@afd.fr
 30. Nick York, Chair of NONIE, NONIE + DfID, UK – n-york@dfid.gov.uk
 31. Hans Lundgren, Head of Evaluation Unit, OECD/DAC EvalNet, France/Sweden – Hans.Lundgren@oecd.org
 32. Pablo Rodriguez-Bilella, ReLAC Rep. to IOCE, Argentina – pablo67@gmail.com (responded to survey)
 33. Marcia Paterno Joppert, ReLAC Rep. to IOCE, Brazil – marciapaterno@agenciadeavaliacao.org.br
     (interviewed 19-Jul; responded to survey)
 34. Anzel Schonfeldt, SAMEA Board, South Africa – anzel@enhancesi.co.za (interviewed 15-Jul; responded to survey)
 35. Peeradet Tongumpai, founder, TEN (Thai Evaluation Network), Thailand – peeradet@trf.or.th
     (interviewed 11-Jul; responded to survey)
 36. Katherine Hay, COE organizer, IDRC + South Asian COE, India – khay@idrc.org.in (responded to survey)
 37. Fred Carden, Head of Evaluation Unit, IDRC, Canada – fcarden@idrc.ca (interviewed 22-Jul; responded to survey)
 38. Belen Sanz Luque, Chairperson, UNEG, USA – belen.sanz@unwomen.org (responded to survey)
 39. Marco Segone, Senior Evaluation Specialist, UNICEF + UNEG, Italy – msegone@unicef.org
 40. Gerry Britan, Senior Evaluation Specialist, USAID, USA – gbritan@usaid.gov (responded to survey)
 41. Osvaldo Feinstein, former Director of Evaluation, World Bank, Spain – ofeinstein@yahoo.com
     (responded to survey)
 42. Patrick Grasso, Independent Evaluation Group, World Bank, USA – pgrasso@worldbank.org
 43. Cheryl Gray, World Bank, USA – cgray@worldbank.org [BOUNCED]
 44. Keith Mackay, World Bank, UK – kmckay@worldbank.org

Totals: interviewed in person = 8; responded = 27




Appendix D: Questionnaire used for survey


AEA International Listening Survey

Greetings. Thank you for agreeing to take this survey. As Jennifer Greene and I (Jim Rugh)
wrote in the introductory e-mail message inviting you to participate in this International
Listening Project, its purpose is to help AEA determine if and, if so, how it should become
more involved with evaluation internationally.

This survey contains 22 questions, but depending on some of your responses, you'll be able
to skip some of them. Many of the questions are open-ended, asking for your suggestions in
your own words. Near the end of the survey, you'll be able to indicate whether or not you
want your responses to be confidential. We'll certainly honor whatever you prefer.

Please note, too, that you need to reply before August 1 at the very latest. Thanks in
advance for sharing your opinions. You're helping to inform AEA's policies related to this
important issue.

1.) Your name [note: if you'd prefer that your comments be kept confidential, you will have that option
at the end of this survey] ____________________________________________

2.) Your e-address ____________________________________________

3.) What is your country of residence? [Pull-down menu provided by SurveyGizmo]

4.) Are you a member of AEA? [Your input is welcome, whether or not you are a member of AEA.]
         ( ) Yes
         ( ) No
5.) To what extent has your practice included conducting evaluations outside the United States? [We are
definitely interested in your perspectives, whether or not you have been engaged internationally.]
         ( ) Never
         ( ) Once
         ( ) Occasionally
         ( ) Frequently
         ( ) All the time

6.) Of what other professional evaluation association(s) are you a member? [Please give the acronym and
spell out the full name of the association/society/network.]
        1: _________________________
        2: _________________________
        3: _________________________
        4: _________________________
        5: _________________________




7.) What is your position in the association(s) identified above?
                                                         member of
                                                      board/governing         active          passive
                                         officer           council          volunteer         member
1st association listed above               ()                ()                 ()               ()
2nd association listed above               ()                ()                 ()               ()
3rd association listed above               ()                ()                 ()               ()
4th association listed above               ()                ()                 ()               ()
5th association listed above               ()                ()                 ()               ()

8.) What is your work setting?
       ( ) Freelance/independent consultant
       ( ) Private consultancy firm
       ( ) University/research institution
       ( ) Government agency
       ( ) International organization
       ( ) Non-profit organization
       ( ) Other:

9.) If other work setting, please describe. ____________________________________________

10.) The name "American Evaluation Association" could imply that its main focus should be on
Americans and evaluation policies and practice within the United States. To what degree do you think
AEA should expand its engagement internationally?
Please select one of the following responses.
        ( ) None: AEA should be restricted to American organizations and issues.
        ( ) A little: AEA's focus should be mostly American but could also include limited international
        involvement.
        ( ) Quite a bit: While retaining its focus on Americans, AEA should strengthen its international
        partnerships.
        ( ) A lot: I feel AEA should be actively engaged internationally.

11.) [If you replied “None” to Question #10] I think AEA should restrict its scope to the USA. And here is
my explanation of why:

12.) [If you replied “A little” or “Quite a bit” to Question #10] My response to question #10 was one of
the two middle options. Here is my explanation of why I feel that way:

13.) [If you replied “A lot” to Question #10] My response to question #10 was that AEA should be
actively engaged with the international community. Here is my explanation of why:

14.) What do you appreciate about ways AEA has already contributed to the evaluation profession
internationally? In general and/or with specific examples:

15.) What other ways would you suggest that AEA do more to promote evaluation at the international
level, in terms of recommended policies?

16.) What other ways would you suggest that AEA do more to promote evaluation at the international
level, in terms of specific activities?

17.) If these activities were undertaken by AEA, would you be willing to be actively involved (e.g. serving
on a task force, volunteering)?
          ( ) Yes
          ( ) No

18.) Please identify suggested activities and specify how you would want to be involved.

19.) More broadly we are interested in knowing ways all of us (organizations and individuals) can
collaborate more effectively to promote evaluation internationally. What are some broad approaches
or specific activities that evaluation organizations or individual evaluators in other countries can
undertake to partner with AEA and/or contribute to AEA as an association and to its members?

20.) We welcome any other suggestions or comments you would like to share with us:

21.) Would it be all right to associate your name with the suggestions/comments you have made in
response to this survey? (If so, we may attribute selected quotes to you by name. If not, we will not cite
your name in the synthesis report.)
       ( ) Yes
       ( ) No

22.) Would you be interested in receiving a copy of the report synthesizing the input to this International
Listening Project from you and other colleagues?
         ( ) Yes, please send a copy to the e-address I gave above
        ( ) No thank you


Thank You!
Thank you for taking our survey. Your responses are very important to us.
Please save and return this Word version of the survey to JimRugh@mindspring.com.
You can follow and contribute to the evolving discussions at http://aea-
internationallistening.wikispaces.com/ and/or by writing directly to Jim Rugh at the above e-address.




Appendix E: CV of Jim Rugh
(officially James W. Rugh)

CONTACT INFORMATION:                     451 Rugh Ridge Way
                                         Sevierville, TN 37876 USA
                                         Mobile phone: +1-865-696-0401
                                         Home phone: +1-865-908-3133
                                         JimRugh@mindspring.com
                                         Skype ID = JimRugh

EDUCATION:                               Master of Professional Studies in International
                                         Agricultural and Rural Development, Cornell University
                                         (Rural Sociology & Adult Education; thesis published as
                                         practical guidelines for participatory evaluation)
                                         BS and MS, Agricultural Engineering, University of
                                         Tennessee
LANGUAGES:                               English, French, Hindi, Urdu

CITIZENSHIP:                          USA
COUNTRIES OF                          Multi-year residency in India, Senegal, Togo and the
WORK EXPERIENCE:                      USA, plus short assignments in many other countries
Experience Summary:
       Jim has been professionally involved for 47 years in rural community development in
Africa, Asia, Appalachia and other parts of the world. For the past 31 years he has specialized
in international program evaluation. He served as head of Design, Monitoring and Evaluation
for Accountability and Learning for CARE International for 12 years, responsible for promoting
strategies for enhanced evaluation capacity throughout that world-wide organization. He has
also evaluated and provided advice for strengthening the M&E systems of a number of other
international agencies. He is recognized as a leader in the international evaluation
profession. He currently serves as the AEA (American Evaluation Association) Representative to
the IOCE (International Organization for Cooperation in Evaluation), the global umbrella of
national and regional professional evaluation associations, where, as Vice President, he is an
active member of the Executive Committee. He has also been involved for many years in
AEA’s International and Cross-Cultural Evaluation Topical Interest Group, as well as
InterAction’s Evaluation Interest Group. He co-authored the popular and practical RealWorld
Evaluation book (published by Sage in 2006) and has led numerous workshops on that topic
for many organizations and networks in many countries. In recognition of his contributions to
the evaluation profession he was awarded the 2010 Alva and Gunnar Myrdal Practice Award
by AEA (announcement annexed below).
       In addition to his M&E expertise at the HQ level of INGOs, Jim brings experience in
community development and in evaluating and facilitating self-evaluation by participants in
such programs. He is committed to helping the poor and marginalized work toward self-
empowerment and development, and to encouraging appropriate assistance being offered to them.
He brings a perspective of the "big picture," including familiarity with a wide variety of
community groups and assistance agencies in many countries, plus an eye to detail and a
respect for inclusiveness and the participatory process.
Professional Experience:

2007–present. Retired, but still available for occasional short-term consultancy
assignments. Voluntary service as Representative of the American Evaluation Association
(AEA, see www.eval.org) to the International Organization for Cooperation in Evaluation
(IOCE, see www.IOCE.net), promoting the development and strengthened capacity of
associations/societies/networks of professional evaluators around the world. Still doing
many training workshops on RealWorld Evaluation and other topics (see
www.RealWorldEvaluation.org). Some of the short-term consultancies have included leading
workshops for graduate students at Georgetown University and The Fletcher School at
Tufts University, a workshop for the staff of Swedish Sida, a presentation at a conference
on impact evaluation organized by Norwegian Norad, a plenary presentation to the NONIE
conference in Paris (www.nonie2011.org) on the need for more holistic perspectives on
impact evaluation, and many professional development workshops at AEA (for 9 years),
CES (Vancouver, Toronto and Victoria), EES (London, Lisbon, Prague), IPEN (Kiev), AfrEA
(Nairobi, Cape Town, Niamey, Cairo), SAMEA (Johannesburg), SLEvA (Colombo, Sri
Lanka), ReLAC (San José, Costa Rica), South Asia COE (New Delhi, India) and other
conferences, often leading RealWorld Evaluation workshops. See more details below:

February and April 2011. Part of a team of Social Impact consultants providing training to
USAID evaluation experts, consistent with the new USAID Evaluation Policy.

February 2011. As part of the follow-up to a major 2008 evaluation of its transformational
development indicators, served as a member of a technical team providing advice to World
Vision International as it develops global indicators to measure its impact on child
wellbeing.

January 2011. Facilitated a two-day seminar for staff of the United Nations Department of
Economic and Social Affairs on producing ToRs for evaluations and utilizing evaluation
reports.

November 2010. Led a RealWorld Evaluation professional development workshop at the
AEA conference in San Antonio. Also had roles in 7 other sessions during the conference.

October 2010. Participated in sessions of the European Evaluation Society (EES) conference in
Prague. Later led a RealWorld Evaluation workshop plus other sessions at the South Asian
Community of Evaluators’ Conclave in New Delhi, India.

July 2010. Led a RealWorld Evaluation workshop in English with Spanish PowerPoint and
interpretation at ReLAC regional conference in San José, Costa Rica.

June 2010. Led two short workshops during AEA/CDC Summer Evaluation Institute in
Atlanta.

May 2010. Led a 5-day M&E workshop for regional staff of The Asia Foundation in
Jakarta, Indonesia.

May 2010. Facilitated a professional development pre-session RealWorld Evaluation
workshop as part of the Canadian Evaluation Society (CES) conference in Victoria, BC.

April 2010. Led a 2-day RealWorld Evaluation workshop for Women For Women
International and other Hilton Honoree INGOs in Washington, DC.

November 2009. Facilitated a workshop in Bangkok to introduce staff of the American Bar
Association’s Asia regional Rule of Law Initiative to the basics of how international
development programs are monitored and evaluated.

August 2009. Led professional development workshops on RealWorld Evaluation for the
conference of SAMEA (South African Monitoring and Evaluation Association) in
Johannesburg.

June 2009. Facilitated a 5-day workshop on monitoring, evaluation and learning for 40
government officials and staff of agricultural and natural resource management projects in
India funded by the World Bank.

May 2009. Facilitated a 4-day workshop on impact evaluations for CARE Sierra Leone
and partner INGO staff in Freetown.

March 2009. Led workshops and sessions during the International Impact Evaluation
conference organized by NONIE, 3ie, DfID, AfrEA and others in Cairo, Egypt.

March-October 2008. Led a 12-member team to evaluate World Vision International’s
system of Transformational Development Indicators. Interviewed 107 WVI staff. Major
recommendations have been accepted by WVI leadership; this has the potential to make
a major impact on how this huge INGO measures and reports on its impact on the wellbeing
of children, families and communities.

February-May 2008. Helped with basic design of an impact evaluation of UNICEF’s
rehabilitation and development programs in the tsunami-affected areas of Sri Lanka, The
Maldives and Indonesia.

June-December 2007. Conducted a major evaluation of Catholic Relief Services’ M&E
policies, systems and capacities. Included interviews with 68 CRS staff around the world.
Made specific recommendations on strategies for further enhancing CRS’s M&E and
effectiveness over the next five years.

1995–2007. Coordinator of Design, Monitoring and Evaluation for Accountability and
Learning (DMEAL), CARE. Established the CARE USA Atlanta headquarters-based evaluation
unit responsible for helping CARE and partner staff around the world improve logical project
design, systematic monitoring systems, and the quality, credibility and utility of program
evaluations. This included leadership in the establishment of CARE International evaluation
policies and standards, production of guidelines, training of trainers, strengthening project-
and global-level monitoring systems, promoting more professional evaluation
methodologies, and conducting biannual global meta-evaluations, all through a variety of
forms of coordination and innovation. Was responsible for overall coordination of a network
of over 80 “CARE DME Cadre” in the 70 countries where CARE works, plus the 12 CARE
International Members. Many of the core CARE documents relevant to program quality are
accessible on the Program Quality Digital Library at http://pqdl.care.org. Also set up a
publicly accessible e-library of program evaluations at www.careevaluations.org.

1984-95. Independent evaluation consultant. Community-Based Evaluations, Sevierville,
Tennessee. Multiple short-term evaluation consultancies with international NGOs.

1987-93. Stewardship Coordinator, Commission on Religion in Appalachia (CORA),
Knoxville, Tennessee. Helped channel assistance from 17 church denominations to
community groups throughout Appalachia.

1973-84. Area Representative for West Africa then Director for Africa, World
Neighbors, Oklahoma City, Oklahoma. Established World Neighbors programs in 7
countries of West and Central Africa while based in Lomé, Togo; later supervised programs
throughout Africa.

1966-69. Associate Director, Peace Corps, North India. Responsible for PCVs in the
state of Uttar Pradesh.

1964-66. Peace Corps Volunteer in Senegal. Rural community development.
Constructed a school, wells and latrines.


SELECT PUBLICATIONS:

RealWorld Evaluation: Working under Budget, Time, Data and Political Constraints, a
comprehensive 468-page textbook co-authored with Michael Bamberger and Linda Mabry.
SAGE, 2006. More information available at www.RealWorldEvaluation.org.

“RealWorld Evaluation: Conducting Evaluations Under Budget, Time, Data and
Political Constraints” chapter in Country-led Monitoring and Evaluation Systems: Better
evidence, better policies, better development results. UNICEF. 2009.

“Une stratégie pour composer avec les contraintes inhérentes à la pratique”, M.
Bamberger et J. Rugh, chapitre dans Approches et pratiques en évaluation de programme
sous la direction de Valéry Ridde et Christian Dagenais. Collection « Paramètres ». Les
presses de l’Université de Montréal. 2009.

“Shoestring Evaluation: Designing Impact Evaluations under Budget, Time and Data
Constraints” article in American Journal of Evaluation, Vol. 25, No. 1, Spring 2004, co-
authored with Michael Bamberger, Mary Church and Lucia Fort.

“The CARE International Evaluation Standards”, one of a set of articles on “International
Perspectives on Evaluations” in the periodical New Directions in Evaluation, Number 104,
Winter 2004; a publication of Jossey-Bass and the American Evaluation Association.

Edited CARE Impact Guidelines, 1999; coordinated the development of the CARE
International Evaluation Policy, Project Standards, as well as many other official CARE
documents accessible on the CARE Program Quality & Learning Digital Library
(http://pqdl.care.org/) and a publicly accessible collection of CARE evaluation reports at
http://www.careevaluations.org.

“Can Participatory Evaluation Meet The Needs of All Stakeholders? Evaluating the
World Neighbors’ West Africa Program” in Practicing Anthropology Vol. 19, No. 3,
Summer 1997.

Self Evaluation: Ideas for Participatory Evaluation of Rural Community Development
Projects, published by World Neighbors in 1986 (and subsequent reprints), based on his Master
of Professional Studies (MPS) thesis at Cornell University.

AWARD:

2010 Alva and Gunnar Myrdal Practice Award presented by the American Evaluation
Association to an evaluator who exemplifies outstanding evaluation practice and who has
made substantial cumulative contributions to the field of evaluation through the practice of
evaluation and whose work is consistent with the AEA Guiding Principles for Evaluators.
(See announcement copied below.)




AEA is proud to announce the winners of its 2010 Awards. Honored this year will be recipients in four
categories who have helped heighten international evaluation efforts, spearhead a groundbreaking new
journal, influence a health initiative that impacted the lives of children and families in five urban
communities and influence a new generation of evaluators who offer greater
diversity both within the field and the association. Join us as we recognize our
2010 honorees at AEA's annual conference in San Antonio.
"This year's winners demonstrate the vitality of our profession," says AEA's 2010
President Leslie Cooksy, "and we are all very fortunate to have such talented people
doing such important work." …
Veteran evaluator Jim Rugh is AEA's representative to the International Organization for
Cooperation in Evaluation (IOCE), where he serves on the Executive Committee. He was
one of the early leaders of AEA's International and Cross-Cultural Evaluation Topical
Interest Group, now one of its largest with 800-plus members, and is co-author of the
popular and practical RealWorld Evaluation book. An independent consultant based in
Sevierville, Tennessee, Rugh will be presented AEA's Alva and Gunnar Myrdal Evaluation Practice Award.

FROM THE MOUNTAINS OF NEPAL TO THE PLAINS OF AFRICA, VETERAN EVALUATOR
RECOGNIZED BY AEA

An international evaluator with extensive experience on the ground and in the community will be honored
by the American Evaluation Association at its annual conference this year in San Antonio. Jim Rugh, a
veteran evaluator who helped found AEA’s International and Cross-Cultural Evaluation Topical Interest
Group – now one of its largest with 800-plus members – and who serves as AEA’s Representative to the
International Organization for Cooperation in Evaluation (IOCE), will be honored on Friday, Nov. 12 as the
recipient of AEA’s Alva and Gunnar Myrdal Practice Award.

“Among the many factors recognized were his two decades of sustained service to the American
Evaluation Association and to our field; his exceptional leadership in developing guidelines for helping
communities to become more involved in evaluation; for developing evaluation capacity through his
practice and for encouraging transparency and stakeholder involvement in his work,” says Tarek Azzam,
chair of AEA’s Awards Committee.

AEA is an international professional association of evaluators devoted to the application and exploration
of program evaluation, personnel evaluation, technology, and many other forms of evaluation. Evaluation
involves assessing the strengths and weaknesses of programs, policies, personnel, products and
organizations to improve their effectiveness. AEA has more than 6,000 members representing all 50 U.S.
states as well as over 60 foreign countries.

“It is truly an honor to be chosen to receive the 2010 Alva and Gunnar Myrdal Practice Award,” says
Rugh, an author, trainer and longtime evaluation practitioner who currently resides in Sevierville,
Tennessee. “It is especially gratifying to receive this recognition for my career-long involvement in helping
to promote evaluation capacity and evaluation organizations around the world – a cause that has been,
and remains, not only my profession but also my passion.” Rugh is a graduate of the University of
Tennessee as well as Cornell University and is fluent in English, French and Hindi.

“We all come to the evaluation profession through different doors,” says Michael Hendricks, an AEA
board member who nominated Rugh for the award. “Jim’s path was through involvement in rural
community development in Africa, India, and Appalachia. He was raised in India, where his parents
served for 38 years as missionaries. (His father’s PhD dissertation, perhaps not coincidentally, was on
evaluating church-related schools in India.) After he served as a Peace Corps volunteer in Senegal, the
Peace Corps asked Jim to go back to India in the role of Associate Director. After getting an MS in
Agricultural Engineering, the aid organization World Neighbors asked Jim to go to West and Central
Africa to identify and offer support to community groups in several different countries. After six years of
extensive involvement with these grass-roots programs, Jim took a sabbatical to study rural sociology and
adult education at Cornell University. His thesis project developed guidelines for participatory evaluation
by community development groups. That practical manual was published, became very popular, and
launched Jim into evaluation.”

Rugh has since been professionally involved for more than 45 years in rural community development in
Africa, Asia, Appalachia and other parts of the world and he has specialized in international program
evaluation for 30 years, including 12 years as head of Design, Monitoring and Evaluation for
Accountability and Learning for CARE International where he was responsible for promoting strategies for
enhanced evaluation capacity throughout that worldwide non-governmental organization (NGO). He has
also evaluated and provided advice for strengthening the monitoring and evaluation systems of a number
of other international NGOs and is recognized as a leader in the international evaluation profession.
Along with Michael Bamberger and Linda Mabry, Rugh was a coauthor of the popular book RealWorld
Evaluation: Working under Budget, Time, Data and Political Constraints, published by Sage in 2006.

“While Jim has dealt regularly with world leaders, heads of prestigious and important agencies, both
domestically and internationally, and powerful people across many sectors of development, his enduring
focus has been on helping the poor, disadvantaged, and less powerful. Time and again in evaluation
conferences,” notes colleague Michael Quinn Patton, “as evaluation methodologists and model-builders
have immersed themselves in academic debates, I’ve seen Jim quietly but forcefully make an observation
or offer a comment that moved the discussion back to the issue of how evaluation would actually
contribute to improving the lives of people in need. This has been his constant, unwavering commitment.
It is the basis of his work on and passion about RealWorld Evaluation.”

“On various occasions Jim and I have had to select photographs from our evaluation work to use in
evaluation workshops and publications,” notes fellow evaluator and author Michael Bamberger. “Jim
always produces a vast array of his own photographs from the mountains of Nepal or the plains and
deserts of Africa showing community groups, often sitting on the ground in the open air, discussing the
lessons they draw from the programs in their villages – very often without the external evaluator even
appearing in the photos. He truly believes in and practices participatory and empowerment evaluation.”

“Jim has had a long-standing interest in sharing evaluation with those in the developing world, and he has
an extensive, impressive record of working with many people and organizations in developing countries
and regions to introduce and “grow” evaluation around the world,” says Ross Conner, Professor Emeritus
and Former President of both AEA and IOCE.

“In important ways, Jim has been a global ambassador for evaluation for much of his career,” says
Jennifer Greene, AEA’s incoming President. “Building on his early work in international development, Jim
has been instrumental in establishing and promoting evaluation in this sector for the last two decades. His
work at CARE is especially notable, as it involved the development and dissemination of systems of
monitoring and evaluation, to include staff training, meta-evaluation, and ongoing promotion of evaluation
itself. Jim must have developed and nurtured strong, respectful relationships as well as credibility with his
international partners during this time period. For, since then, Jim has been in high demand as an
international evaluation practitioner and also a trainer. And since then, Jim has garnered respect and
renown as an advocate for strong evaluation.”




Though he retired from full-time work in 2007, Jim Rugh continues his active involvement in the
evaluation profession. This includes frequent international travel to lead training workshops and
consultancies with a variety of international development agencies. And he will soon be entering his
fourth year as the AEA Representative to the IOCE (the collaborative network that Rugh refers to as “the
UN of professional evaluation organizations”), where he also serves as Treasurer and an active member
of the IOCE Executive Committee.



