AGEP Evaluation Capacity Meeting 2008
Yolanda George, Deputy Director,
Education & Human Resources Programs
Identifying methods and questions for evaluation
studies related to STEM graduate student
progression to the PhD and the professoriate,
including admissions/selection, retention/attrition,
PhD completion, and post-doctoral experiences, and
the collection of quantitative and qualitative data
Identifying methods and questions for Alliance
evaluations, particularly in terms of progression to
the PhD and the professoriate. What can AGEPs learn
from cross-institutional studies?
As you listen to presentations….
What research informed the design of the study?
What types of data were collected? What was the
rationale for deciding to collect these data?
What methods were used? What was the rationale for
selecting them?
How were comparison groups constructed? What
are the reporting limitations with regard to the
construction of the comparison groups?
Another Objective for this AGEP Meeting
Developing and writing impact statements or
highlights (nuggets) that include data for use in:
AGEP NSF Annual Reports Findings section
AGEP Supplemental Report Questions
Brochures and Web sites
The poster should include quantitative and qualitative data that
provides evidence of:
Graduate student changes, for selected STEM fields or for all fields
Infrastructure changes. This can include changes in
institutional or departmental policies or practices
Alliance impact. This can include changes in institutional
or departmental policies or practices related to graduate
school affairs, postdoctoral arrangements, or faculty
Stories and pictures are welcome, but the major emphasis
must be on quantitative and, as appropriate, qualitative data
Program descriptions need to be kept to a minimum and put
in the context of the data behind decisions to keep or
eliminate strategies. The focus can be on what works and
what doesn't, as long as the emphasis is on the data that
showed whether different strategies worked or not.
Impact Evaluations and Statements
An impact evaluation measures the program's effects and the
extent to which its goals were attained. Although many evaluation
designs may produce useful information about a program's
effectiveness, some produce more useful information than others:
For example, designs that track effects over extended time
periods (time series designs) are generally superior to those
that simply compare periods before and after intervention (pre-post designs);
Comparison group designs are superior to those that lack any
basis for comparison; and
Designs that use true control groups (experimental designs)
have the greatest potential for producing authoritative results.
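The value of a comparison group can be made concrete with a small sketch. The numbers below are hypothetical, not from any AGEP study: they show how a simple pre-post design can overstate a program's effect when completion rates are rising everywhere, and how subtracting the trend in a comparison group (a difference-in-differences estimate) corrects for that.

```python
# Hypothetical completion rates, before and after the intervention.
agep_before, agep_after = 0.40, 0.55              # AGEP departments
comparison_before, comparison_after = 0.40, 0.48  # similar non-AGEP departments

# A pre-post design attributes the entire change to the program.
pre_post_estimate = agep_after - agep_before

# A comparison-group design subtracts the background trend observed
# in comparable departments (difference-in-differences).
background_trend = comparison_after - comparison_before
did_estimate = pre_post_estimate - background_trend

print(f"Pre-post estimate of program effect: {pre_post_estimate:.2f}")
print(f"Difference-in-differences estimate:  {did_estimate:.2f}")
```

With these illustrative numbers the pre-post design credits the program with a 15-point gain, while the comparison-group design attributes only 7 points to the program and 8 points to the background trend.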
Strategies that Matter for
Graduate Student Retention & Progression to the PhD
Student admissions/selection criteria
Financial aid packages that reduce debt burden
Mentoring (Faculty and staff)
Supplementary academic support in writing, statistics, and
other skills
Social integration into department
Early intellectual integration into research projects
Research productivity (posters, papers, etc.)
Attention to PhD milestones
Attention to family/work balance
Institutional and departmental programs and practices
Given limited evaluation budgets:
Use evaluators to conceptualize and design the evaluation
Don’t evaluate every component of the program
Look for natural opportunities to conduct an
evaluation. Make evaluation a part of the program
Use electronic student systems
Involve all faculty and staff in data collection and analysis
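Electronic student systems make the milestone counts above cheap to compute. The sketch below uses a hypothetical record layout (`cohort_year`, `enrolled`, `passed_quals`, `completed_phd`); an actual export from your institution's system would have its own field names.

```python
# Hypothetical student records, as might be exported from an
# electronic student system (field names are illustrative).
records = [
    {"cohort_year": 2003, "enrolled": True,  "passed_quals": True,  "completed_phd": True},
    {"cohort_year": 2003, "enrolled": True,  "passed_quals": True,  "completed_phd": False},
    {"cohort_year": 2003, "enrolled": False, "passed_quals": False, "completed_phd": False},
    {"cohort_year": 2004, "enrolled": True,  "passed_quals": False, "completed_phd": False},
]

def rate(cohort, milestone):
    """Fraction of a cohort reaching a given milestone."""
    if not cohort:
        return 0.0
    return sum(r[milestone] for r in cohort) / len(cohort)

cohort_2003 = [r for r in records if r["cohort_year"] == 2003]
print(f"2003 cohort retention:      {rate(cohort_2003, 'enrolled'):.0%}")
print(f"2003 cohort passed quals:   {rate(cohort_2003, 'passed_quals'):.0%}")
print(f"2003 cohort PhD completion: {rate(cohort_2003, 'completed_phd'):.0%}")
```

The same milestone rates computed for a comparison cohort (non-AGEP students, or pre-AGEP cohorts) supply the comparison groups the workshop questions below ask about.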
1. What types of studies and evaluations are you already
doing to measure retention/attrition or progression to
the PhD? What types of comparison groups are you
using in these studies?
2. What types of studies and evaluations are you
already doing to measure institutional impact? What
types of comparison groups are you using in these studies?
Lead Alliance Leaders
3. What types of studies and evaluations are you already
doing to measure Alliance impact? What types of
comparison groups are you using in these studies?
Work Groups Continued (All Groups)
4. What types of studies and evaluations are you already
doing to measure progression and retention in the
professoriate? What types of comparison groups
are you using in these studies?
5. What are other natural opportunities for collecting
evaluation data? What types of comparison groups
would you use in these studies?
6. What are some solutions to IRB challenges?
Write an impact statement about graduate student
changes, as a result of AGEP.
Write an impact statement about your institutional
changes, as a result of AGEP.
Write an impact statement about your Alliance, as
a result of AGEP.
Write an impact statement about progression and
retention in the STEM professoriate, as a result of
AGEP.
In summary, evaluation is:
Examination of something in order to judge its value,
quality, importance, extent, or condition
Part of the ongoing program implementation
Meaningful activity among the entire project team,
including faculty and administrators