CTSA Evaluation

Shared Resources Working Group

November 9, 2008
Jodi Segal, MD, MPH
Working Group
Donna Jeffe, Washington University
Jan Hogle, University of Wisconsin
Sunday Clark, University of Pittsburgh
Harold Pincus, Columbia
Don Yarbrough, University of Iowa
Fred Wolf, University of Washington
Boyd Richards, Columbia
Ann Dozier, Rochester
Christine Weston, Johns Hopkins
Debra Stark, UT San Antonio
Zeanid Breyer, UCSF
Shane Thielman, Scripps
Sheila Kessler, Northwestern University
Cath Kane, Weill Cornell
Goals
• Identify “resources” that can be shared

• Provide a mechanism by which these resources can be shared

• Create a community of committed individuals willing to
  discuss and share processes and tools
Process
• Monthly open conference calls on a pre-specified theme
• Wiki, maintained by Cath Kane and Jan Hogle, as a site for posting resources
• Listserv for making requests for resources and disseminating information about upcoming “themed” calls
Themed Conference Calls
• Pilot program assessments
• Meetings with Program (Key Function) Directors
• CTSA needs assessment/investigator survey
Themed Conference Calls: Upcoming
• Pilot program assessments (revisited)
• Training and Education
• Community Engagement
• IRB
• Lessons learned

We plan to survey group members to prioritize these topics.
Domains Identified Last February
TOOLS and INSTRUMENTS
Software tools
Social Network Analysis tools
Milestone tracking tools
Tools for measuring changes in capacity
Social network methodology for assessing relationships among core directors and other stakeholders
Online survey tools (e.g., experience with SurveyMonkey, Zoomerang, in-house programs)
Web-based needs assessment
Collaboration assessment via bibliometric analysis
Analysis techniques (publication citation analyses)
Systematic review methods
Training: course satisfaction instruments
Instruments to assess perceived safety of clinical trials
Instruments to assess barriers faced by investigators
Domains Identified Last February
RESULTS
Examples of results from different resources
Lessons learned in using selected software (e.g. network analysis)
Feedback about which tools work best for tracking progress against goals
What works and what doesn’t with tools currently in use?


COMMUNITY
Identifying ‘communities’
Assessment of community engagement
Assessing health of community
Community Impact
Domains Identified Last February
OVERALL EVALUATION
CTSA dashboard for monitoring overall progress
Communication strategies within a CTSA and institution
Assessing progress on milestones and format for milestone assessment
Measuring indirect effects of CTSA
Collaboration assessments
How do you leverage existing data? What types of data? How do you use it?
Data sources for Social Network Analysis (not software)
Attribution assessment (degree to which an outcome can be attributed to CTSA)
Accessibility of core resources
Reduction in redundant infrastructure
Increase in related grants and contracts
Perceptions of ease of access, as distinct from satisfaction (convenience): how easily do people access what they need?
Systems for tracking resource utilization within CTSA
Domains Identified Last February
METHODS
For establishing baselines for key long-term indicators
For setting goals
Strategies to collect publications that the CTSA supported
Guides about qualitative questions


SHARED FORMS
Common consent forms
Confidentiality forms (federal certificate of confidentiality)
IRB guidelines
Wiki
Repository of shared resources (12), including:
– 2007 Cancer Center Survey (Donna Jeffe)
– ITHS Environment Survey (Fred Wolf)
– Key informant interview question guide (Jan Hogle)
– Pilot Proposal Reviewer Survey (Ann Dozier)
Wiki
Pilot Program Information Table (10)
– Definitions of T1 and T2 translational research used by the pilot programs
– Information on the number of pilots funded, size of grants, and funding criteria
– Samples of RFPs and applications
We Started with a Main Repository

We began to hold discussions and group resources by two themes:
  1) Key Function
  2) Method
Grouping by Method: Staff Interviews

All the shared tools are stored as attachments with live links in each table.
Grouping by Key Function: Use of “Translational” in Pilots

This could also eventually be linked with Def Ops.

Plans
• Reorganize the Wiki – develop a more formal structure (catalogue) for posting our resources

• Post “bios” of evaluation team members describing their backgrounds and strengths (as a resource)

• Plan a descriptive manuscript about our approaches to evaluating selected key functions

• Increase interaction with other workgroups
