Presenting Data Analysis (11.05)


Presenting a critical analysis of data is a requirement of the Standards related to program
self-evaluation. This document has been built from actual comments from reviewers and
submissions from programs. Each example is followed by a comment in red that may
suggest changes to improve the example.

Analysis Defined

Study of compiled or tabulated data, interpreting cause-and-effect relationships and trends,
with the subsequent understanding and conclusions used to validate current practices or
make changes as needed for program improvement.

Comments to the ARC-PA from Reviewers of the Self Study Report (SSR)

The most common comment by far related to SSR reports:

       “The self-study was descriptive and did not contain outcome data analysis.”

Similar comment:

       “The SSR exceeds the page limit. It is largely descriptive, and very little data is offered.
       Great length is given to describing how student, graduate, and employer surveys are
       sent out, but no data was included in the SSR. It was stated that this information
       would be presented to the site visit team.”

The type of comments programs hope to receive:

       “They presented an excellent outcome data analysis of PACKRAT and PANCE scores,
       content areas of the PANCE, including comparison of student perceptions of their
       preparation for PANCE content areas and their area scores, and graduate practice…”

       “The program states on page X of the SSR that the periodic self-assessment builds upon
       materials presented in the 2XXX SSR. The time frame for self-assessment covers the
       period (2XXX) through Month 2XXX. A series of 7 meetings were held from Month
       2XXX – Month 2XXX to compile data and drafts of the SSR. The process involves data
       that is gathered throughout each year (includes student, employer and graduate
       surveys, course and instructor evaluations) and then a statistical evaluation is
       performed. A table in the self-study [note: sample of such a table is included on next
       page] reveals that the process is a year-round event and has 19 critical elements that
       are evaluated. If followed as outlined, the process should certainly be strong.”

The following is a tabular summary of the data collection and review that comprises the
Program continuous self-assessment. (Some sort of graphic presentation like this is required as
a component of the Self Study Report.)

      Data Source or Committee                  J   F   M   A   M   J   J   A   S   O   N   D
      Strategic Planning Meetings                           X                       X
      Brown Bag Student/Faculty Mtgs.           X   X   X   X   X   X   X   X   X   X   X   X
      Faculty Meetings                          X   X   X   X   X   X   X   X   X   X   X   X
      Admissions Committee                      X   X   X   X                           X   X
      Faculty self-assessments                  X                       X
      University Assessment Years                                   X
      Student Course Evaluations                            X               X               X
      Student Program Evaluations                           X                   X
      Faculty Course Evaluations                            X               X               X
      Faculty Program Evaluation                X   X   X   X   X   X   X   X   X   X   X   X
      Graduate Surveys- end 1 yr.                           X   X
      Graduate Surveys- all (every 3 years)                     X
      Employer Surveys- most recent graduates                   X   X
      Clinical Site Visits                      X   X   X   X   X   X   X   X   X   X   X   X
      Student Evaluation of Clinical Sites      X   X   X   X   X   X   X   X   X   X   X   X
      Preceptor Evaluation of Students          X   X   X   X   X   X   X   X   X   X   X   X
      Summative Evaluations                     X   X                                       X
      Exit Interviews w/ Graduates                          X
      NCCPA Results                             X               X               X
      PACKRAT Scores                                X                                   X

What is Descriptive vs. What is Analytical?

Example of an excerpt from SSR that is descriptive:

       “The curriculum of the PA program at Pretend is well designed and sequenced to
       provide a strong foundation in the basic medical and behavioral sciences, with
       subsequent construction of clinical knowledge and experience upon the
       foundation. There is exceptional diversity of experience in learning styles from
       PBL to standard lecture to core competencies (procedures) to simulated patients
       and finally clinical clerkships. Student performance on PACKRAT scores and on
       the NCCPA examination externally validates the adequacy of methods of
       instruction. Additional validation is received via evaluation by clinical preceptors
       on monthly evaluations of students on clinical clerkships.” Note: the above
       descriptive paragraph would have been a good introduction to the analysis
       of data that should also have included a tabular/diagrammatic presentation
       of the data alluded to in the paragraph:
           • Representation of diversity of experience (not to duplicate what may have been in
           • PACKRAT performance data for several years, showing trends
           • NCCPA performance data for several years, showing trends
           • Preceptor evaluation summaries, numeric and comments

Example of an excerpt from SSR that is descriptive:

The curriculum has undergone a number of changes in the past two years. Addition of PBL
sessions into the Clinical Medicine sequence of courses has corrected the perceived deficiency
among the clinical faculty that the students were entering the clinical clerkships ill-prepared to
begin their clinical education. The mid-rotation evaluations have borne this out, as more
students are now rated as "above average" (the highest rating on the form) by their clinical
preceptors and virtually no student has been identified for remediation; this represents a
dramatic change from previous years. [Note: The above paragraph needs supportive
tabular data, such as the change in the number of PBL sessions and an actual data
summary from the mid-rotation evaluations.]

The following paragraphs are an example of an excerpt from an SSR submitted as
an analysis of personnel that is largely descriptive:

       Personnel Analysis

       The qualifications of the faculty, including the Program Director and Medical Director, are
       sufficient to meet the needs of the Program. The faculty has extensive depth and
       breadth of clinical experience and experience in teaching in clinical settings. While the
       core faculty that is clinically trained is relatively new to academia, extensive effort has
       been expended in professional development to develop and enhance their educational
       skills. [Note: this paragraph is considered descriptive]

       Each clinically trained faculty member is given one day per week release time and is
       expected to practice clinically during that time. The faculty member keeps all
       compensation obtained from these clinical activities. The Program Director is released
       one day per week for research, consulting, or patient care activities. The Research
       Coordinator is released at 30% to pursue primary research and teaches courses in other
       Departments or schools and is approximately at a 0.5 FTE (8 credits Program and SHP
       courses plus 4 credits for research committee chairpersonships/24 expected credits
       instruction/9 month contract). [Note: this paragraph is considered descriptive]

       Using the above information, the number of faculty currently teaching in the Program as
       core faculty is as follows:

       Current Core Faculty FTE                     Total   5.1 FTE

        Program Director 0.8                      Medical Director 0.8 (vacant)
        Clinical Director 0.8                     Academic Coordinator 0.8
        Research Director 0.5                     Full-time Faculty 0.8 (vacant)
        Part-Time Faculty 0.6

       In addition, the Program receives support from faculty in other departments of the
       University who instruct the Program’s students. Total credit hours taught by other-
       department faculty total 18 against an expected faculty load of 36 credit hours per
       three-semester contract, equating to a 0.5 FTE contribution. Adjunct faculty also
       assist in the Program instructional process as noted above. Calculated on a
       classroom-hour basis, their contribution equates to 0.5 FTE. The total faculty FTEs
       thus available to the Program is 5.45. Using the 90-student rated capacity of the
       Program, the student-faculty ratio is 16.5:1. [Note: this begins to be analytical and
       would be helped with a table comparison to other departments in the school]
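
       As a cross-check, the ratio arithmetic above can be reproduced in a few lines
       (a sketch using only the figures stated in the example):

       ```python
       # Student-faculty ratio from the figures stated in the example above.
       total_faculty_fte = 5.45   # total FTE stated as available to the Program
       rated_capacity = 90        # rated student capacity of the Program

       ratio = rated_capacity / total_faculty_fte
       print(f"{ratio:.1f}:1")    # 16.5:1
       ```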

       Using data from the “whatever Annual Report on Physician Assistant Educational
       Programs in the United States, 2XXX-2XXX” as published by APAP, the average
       physician assistant studies program nationally has total personnel of 8.45 FTE,
       excluding adjunct and part-time <0.5. The average Program is 25 months in length and
       has 79 students; the Pretend University Physician Assistant Program exceeds the
       average on both counts. [Note: this paragraph represents some analysis,
       comparing to a national standard]

       The previous self-study report reported 8.25 FTE total personnel. However, direct
       comparison to prior reports is not applicable, as the site visit team from 2XXX pointed
       out that release time and other departmental teaching responsibilities subtract from
       the FTE status assigned to the Program. If one calculates the deficit of Program
       faculty versus the national average, it equals 1.5, as the national averages omit
       adjunct support; the Program is 1.0 FTE below the national average, excluding
       factors such as the increased length of the Pretend Program and its higher number
       of students. Perhaps most important, the Program’s faculty at a winter retreat
       proposed that 2 new full-time faculty be acquired. [Note: what would be beneficial
       here is a conclusion about this analysis…, thus…, and therefore…]

The following pages present several examples of nicely displayed data and
analysis of data.

Example 1

Individual test scores have only been available for the PackRat examination for the past two
years. Scores in the area of hematology have been consistently low. The entire faculty was
made aware of this problem and each instructor who teaches hematology topics was
encouraged to evaluate his or her learning objectives to attempt to improve students’ learning in
this area.

The class that graduated in August 2003 averaged 46% on the hematology section. The class
that is to graduate in August 2004 has slightly improved their scores in this area (49%).

What is not known is if performance on the PackRat accurately predicts performance on the
NCCPA examination. Review of the available data for this program indicates that all of those
graduates who have failed the PANCE at least once did score low on the PackRat (the
group average was 127).

                                 PackRat Score    PANCE Result     PANCE Score
                                    131.00            F                    342
                                    126.00            F                    293
                                    123.00            F                    249
                                    118.00            F                    187
                                    139.00            F                    341
                                    124.00            F               Unknown
                      Average:          126.83                          282.40

There were some graduates who had scores below 126 and still passed the PANCE. To date
only one person has scored above 131 and not passed the national certifying examination.

Each of the sections of the PackRat will continue to be followed and attempts to correlate the
results with performance on the national certifying examinations will be made.
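
One simple way such a correlation could be computed, once enough paired scores are
available, is a Pearson coefficient. The sketch below uses hypothetical score pairs, not the
program's actual data:

```python
# Pearson correlation between paired PackRat and PANCE scores.
# The score lists below are hypothetical placeholders, not program data.
from statistics import mean

def pearson(xs, ys):
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

packrat = [118, 123, 126, 131, 139, 150, 162, 175]  # hypothetical
pance   = [187, 249, 293, 341, 342, 410, 455, 520]  # hypothetical

print(round(pearson(packrat, pance), 2))
```

A coefficient near 1 would support using PackRat scores to flag students at risk on the
PANCE; values near 0 would suggest the exam has little predictive value for this cohort.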

The results on the Physician Assistant National Certifying Examination (PANCE) are evaluated
each year and/or when they become available. The first indicator is the absolute pass/fail rate
of our graduates.

Of 55 graduates, a total of 7 students have failed the PANCE at least once (87% pass rate).
Prior to 2002, individual scores were not available and thus it is not possible to accurately
determine how significant the problems were. By year of graduation, the results break
down as follows:

        Year of      # Taking Exam    # of Students    % Pass   Our    National   % of Nat'l
       Graduation     for 1st Time       Passing        Rate    Mean    Mean        Mean
            2000            9               8           88.9%    427     498        85.7%
            2001           14               13          92.9%    460     489        94.1%
            2002           14               10          71.4%    377     491        76.8%
            2003           18               17          94.4%    498     487       102.3%

As the table above indicates, the class with the lowest pass rate (2002) had the overall lowest
mean score, and in consequence the worst performance against the national mean.
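
The derived columns in the table can be recomputed directly from the raw counts and
means (a short sketch using the table's own figures):

```python
# Recompute the pass-rate and percent-of-national-mean columns
# from the raw figures reported in the table above.
cohorts = {
    2000: {"taking": 9,  "passing": 8,  "our_mean": 427, "national_mean": 498},
    2001: {"taking": 14, "passing": 13, "our_mean": 460, "national_mean": 489},
    2002: {"taking": 14, "passing": 10, "our_mean": 377, "national_mean": 491},
    2003: {"taking": 18, "passing": 17, "our_mean": 498, "national_mean": 487},
}

for year, c in cohorts.items():
    pass_rate = 100 * c["passing"] / c["taking"]
    pct_of_national = 100 * c["our_mean"] / c["national_mean"]
    print(year, f"{pass_rate:.1f}%", f"{pct_of_national:.1f}%")
    # e.g. 2000 88.9% 85.7%
```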

When the data was analyzed using only the students who did pass, the percent of the
national mean for those who passed was closer to previous years, at 87%:

                                    Student   Score
                                       1        350
                                       2        352
                                       3        361
                                       4        379
                                       5        383
                                       6        392
                                       7        396
                                       8        510
                                       9        532
                                      10        593

The complete historical data available for analysis is presented on the following page.

Example 2

The second survey, which is conducted during the year after graduation, is a “Graduate
Employment Survey.” This survey is designed to determine the employment status of the
graduates.

The third survey is an “Employer Survey” which is designed to collect data on recent
graduate employment settings, scope of practice, graduate competence, and
suggestions for curriculum improvement.

Results of these surveys will be available to the on-site team during their visit and the
outcomes will be discussed at that time. [Note: It would be better to have a summary
of the survey results and outcomes included here in the SSR]

The Post-NCCPA Exam Survey is designed to ascertain the graduate’s opinion of how
well the program prepared them for the examination. The survey examines 30 different
areas of preparation and asks the graduate to rate his or her opinion about how well the
program did. The return rate for the survey has averaged 56% since the beginning of the
program. The most recent year showed a drop in the response rate to 40%.

The small number of graduates and small class sizes make interpretation of the data
difficult. To facilitate analysis, graphs were produced using Microsoft Excel for each of
the areas being examined.

The data shows that for each of the thirty areas examined, the majority of the students
have either agreed or strongly agreed that they were well prepared. Certain areas were
reported to be stronger than others. The data to date is as follows:

When the overall average of graduates who either “Strongly Agreed” or “Agreed” is less
than 70%, the area may have significant difficulties and must be examined more closely.
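
The 70% screening rule described above can be expressed as a short filter (the survey
areas and percentages below are hypothetical placeholders):

```python
# Flag survey areas where the combined "Strongly Agree"/"Agree" rate
# falls below the 70% screening threshold. Area names and percentages
# are hypothetical, for illustration only.
THRESHOLD = 70.0

agree_rates = {
    "Pharmacology": 62.5,
    "Cardiology": 88.0,
    "Hematology": 55.0,
    "Physical Diagnosis": 91.0,
}

areas_of_concern = sorted(
    area for area, pct in agree_rates.items() if pct < THRESHOLD
)
print(areas_of_concern)  # ['Hematology', 'Pharmacology']
```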

The areas of concern are identified below:

Example 3

Pharmacology: The overall satisfaction with Pharmacology has generally been good to
excellent with the exception of the class that graduated in 2002. There were personality
issues with the instructor that contributed to this negative evaluation. The Program
Director was aware of the problems during the year and the issues were investigated at
the time. The program director attended several of the lectures throughout the semester
and found them to be well done and clear. The instructor in question was a PharmD and
had been teaching at a school of pharmacy for over 20 years. The instructor was
determined to be highly competent and the issues were mainly on the student side. The
instructor declined to return the following year.

Because the poor ratings all essentially occurred in one class, no action is deemed
necessary until further data points can be collected. Data will be collected for each class
from this point forward.

Example 4

Graduate Employment Survey Data

The areas in which our graduates are practicing (as far as is known) are well distributed
among the types listed above.

There seems to be a trend towards working in private offices, but there are not enough
data points to validate that conclusion. The mission of the PA Program is to prepare
primary care providers and the data available suggests that the mission is being met.
This data will continue to be followed yearly.
