FLORIDA DEPARTMENT OF JUVENILE JUSTICE
Charlie Crist, Governor; Frank Peterman, Jr., Secretary
Quality Assurance Process Improvement
November 30, 2010
I. Members Present

Beth Davis, Director Program Accountability
Jeff Wenhold, Bureau Chief Quality Assurance
Mary Gaiser, Quality Assurance
Christy Daly, DJJ Interim Chief of Staff
Michael Baglivio, DJJ Programming and Technical Assistance
Joan Wimmer, Residential Services
Steve Bushore, Program Administrator Quality Assurance – Area One
Mike Marino, Program Administrator Quality Assurance – Area Two
Donna Connors, Program Administrator Quality Assurance – Area Three
Peter Plant, G4S
Sheila Asson, Eckerd
Rosemary Haynes, Office of Health Services
Gayla Sumner, Office of Health Services
Dr. Lisa Johnson, Office of Health Services
Amy Johnson, DJJ Contracts
Frank Alarcon, Henry and Rilla White Foundation
Adrianna Sekula, PACE Center for Girls
Deborah Moroney, PACE Center for Girls
Matt Hefelfinger, DJJ Detention
Sonny Peacock, DJJ Probation
Donnie Read, Twin Oaks Juvenile Development Center
Terri Eggers, DJJ Education
Jacki Malone, Eckerd
Robert Patterson, Eckerd
Dorothy Xanos, Youth Services International
Gerri Dolan, Bay Point Schools
Mike Thornton, Associated Marine Institute
Elaine Woods, Eckerd
Stephen Brown, DJJ Residential
Terri Buckley, DJJ Residential
Howland Ellis, Psysolutions
Bill Jordan, Desoto Juvenile
II. Welcome, Roll Call and Opening Remarks (Jeff Wenhold)
2737 Centerview Drive Tallahassee, Florida 32399-3100 (850) 488-1850
The mission of the Department of Juvenile Justice is to increase public safety by reducing juvenile delinquency through effective
prevention, intervention, and treatment services that strengthen families and turn around the lives of troubled youth.
III. Integration of TIER I and TIER II (FY11-12)
Draft Copies (Placed on the DJJ Website):
1.10 Youth Records-Administration
1.11 Community Partnerships-Administration
2.10 Staff Characteristics
2.11 Delinquency Programming
Michael Baglivio stated he and Joan Wimmer reviewed the existing Tier II Standards in order to
find those that are explicitly supported by the Residential Rule. The elements in Tier II which
were not supported by the Residential Rule are not in the draft indicators.
Joan Wimmer stated a couple of points in the Residential Rule will be reviewed for amendments.
The acceptance of these amendments will be a lengthy process. Beginning in 2011-12, we will use
these four drafts we have in front of us today. The aforementioned amendments will not affect
these four drafts. It was also stated that these four draft indicators are very preliminary and could
change before the FY11-12 standards are finalized.
Michael Baglivio and Joan Wimmer were given the opportunity to answer questions regarding these drafts.
1.10 Youth Records-Administration
Peter Plant asked why “Youth Records” is included. Is this a compliance issue? The intent is the
information, not necessarily the order. Michael Baglivio stated the research and intent show that
programs with better information were performing better. The quality of the records comes into
play; they do not necessarily need to be in a particular order, but they must be easily accessible.
Peter Plant asked how the reviewers were trained to review youth records. A clearer objective of
this exercise is needed here. He said it would be good to keep these records within one indicator.
Jeff Wenhold said, at this time, the way the Residential Rule is written, it makes it difficult to
review “youth records” without holding programs accountable for the specific rule requirements.
Peter Plant stated the Medical and Mental Health Rule speaks of “working files”. Will these be
allowed? When are records acceptable to be in the working file?
Discussions will follow about re-writing the indicator or holding programs accountable for the
information contained in the files under the appropriate existing indicators.
1.11 Community Partnerships
Peter Plant asked, “Do attempts to recruit apply?” Sometimes a victim’s advocate is not
available to recruit.
Michael Baglivio stated that a victim’s advocate does not have to be the person recruited. Any
community members with juvenile or criminal justice backgrounds can be recruited as board
members. Specific people are not indicated. A good faith effort to recruit, with documented
attempts, will suffice.
Jacki Malone stated the Residential Rule reads “shall recruit”.
Beth Davis asked if the indicator should read “attempts or demonstrates an attempt to recruit.”
The group discussed the meaning of “shall” and “recruit.”
The consensus of the group was, as long as the program can demonstrate their good faith efforts in
recruiting a victim, victim advocate, etc., they would be meeting the intent of the rule requirement.
2.10 Staff Characteristics
The “Director of Programming” has multiple functions and depends on the size of the program.
Programs are not required to have staff with this specific title.
Peter Plant said eleven separate requirements have been a historical problem. Joan agreed these
need to be broken into separate indicators. These factors have the most impact on our youth.
Initially there were 23 requirements for staff characteristics.
Joan Wimmer, Mike Baglivio, and Jeff Wenhold will explore ways to separate the requirements.
2.11 Delinquency Programming
Joan stated this was putting staff into place for positive movement of youth.
Dorothy Xanos asked if these were the only four we are looking at from Tier II. Joan stated this
hits the integrity of the Tier II Standards. A couple of standards do not roll up into the Residential
Rule. The existing Tier II standards will no longer exist after FY10-11. These four will be
broken down into more indicators.
The group was instructed to direct all comments regarding suggested changes to these four drafts
to Joan Wimmer at Joan.Wimmer@djj.state.fl.us.
IV. Overview of Quality Assurance Review Cycle
Everyone may not be aware of the new Quality Assurance policy that became effective October 25,
2010. The new policy took the original four QA policies and combined them into one. The key
element of this policy is the elimination of Conditional Status. Failed standards will now be
re-reviewed within six months. If a program is reviewed January through June and fails a standard,
its re-review will fall into the next fiscal year. The original score will not change. The failed standards
information from the re-review will be included as an addendum to the original report. Six to nine
months later, the program will receive a full review and a new score for that fiscal year. This new
policy was a recommendation from the Miami-Dade Grand Jury report. Programs reviewed prior
to October 25, 2010 that failed a standard, but not the overall review, were placed on Conditional Status.
Jeff provided the group with QA data from FY09-10 and FY10-11. A one-page document will be
provided to group members containing the data.
Quality Assurance is exploring new ways to review all program types every year. We will have a
final decision after the Legislature meets.
Jeff expressed appreciation and congratulated all work group members. The efficiency and
effectiveness of the QA process are a result of the efforts of this workgroup.
Donnie Read said that during two reviews he received this year, he was told certain scores were
given because the team knew a challenge would take place. He said he does not want to challenge,
but this comment makes him think a fair and just score was not given. Jeff reiterated that QA teams
score programs based on the rating definitions, and whether or not the score is going to be
challenged is irrelevant.
Jeff said the workgroup established a new scoring system that has considerably reduced the number
of challenges Quality Assurance is currently receiving.
V. Considerations for FY11-12
A list of considerations will be developed for FY11-12. These will be topics for discussion during
QA Process Improvement Meetings. These will include any issues with the current QA Process.
1. Donnie Read would like to discuss the role of the Program Monitors. He thought they
played the role of a two-way advocate during QA reviews. He was not aware they were
given a standard to review.
Sion Doman, Program Monitor, indicated he reviewed Mental Health at RAMC because
his background is in Mental Health.
Jeff indicated they were being utilized as a QA Team Member and should participate
in the final scoring.
Joan stated some standards are more time consuming than others. In previous
workgroup discussions, it was decided that program monitors will be given shorter
standards to review.
Steve Bushore indicated the Program Monitor is always asked, in advance, what they
would like to review. They are almost always given a standard in which they are
educated. Usually monitors are not given medical or mental health, but in Sion’s
case, because he is a Licensed Mental Health Counselor, he was assigned Mental Health.
2. Howland Ellis would like to readdress the score of “7”. Howland would like to see
continuity improved with ratings among different facilities. Lessening subjectivity
among the review team is needed.
Peter Plant said he thought there was a significant issue with this score because, if
one item was missing, his program received a 7. Exceptions are not scored equally
for different programs in different areas throughout Florida.
Jeff stated the review team is to use professional discretion with exceptions. For
example, a missing signature or date is not weighted as much as an assessment not
completed at all.
In most cases, an acceptable (7) rating is supported with more than one exception.
There have been cases when the report was changed at headquarters because the one
exception was insignificant and the program met the intent of the indicator.
Beth would like to see reports like those mentioned above for further discussion.
Everyone is always quick to mention areas that scored one way or another, but unless
we can see the reports, we cannot remedy these issues effectively.
3. Peter Plant would like to revisit the definition of “Acceptable”. In the definition there
are procedural requirements, such as with escapes. If the program completes drills,
would that be counted as an occurrence? Do you have the ability to get an 8? It was
explained that beginning July 1, 2010, programs that do not have a practice to evaluate
are eligible for a commendable (8) rating based on the rating definitions.
Donnie Read asked whether, if a program had an escape and all procedures were performed
correctly, only a score of 5 would be given. There will be some consideration given to
this for next fiscal year.
4. Frank Alarcon would like to revisit the QA Process among Residential Programs in
regards to program size. There seems to be significant bias against small programs.
Frank performed a snapshot in September 2010, of different residential program sizes
and has data to prove his point. Small programs have less of a chance to receive a
commendable score. He indicated Terri Buckley tried to work with a small group to
make changes for overcoming these obstacles but was pulled away from the group to
perform other duties. Sample sizes seem to be a contributing factor with the QA
Process. His findings were as follows:
25 or smaller – 4% received commendable or better
26–50 – 40% received commendable or better
51–75 – 51% received commendable or better
100 or more – 80% received commendable or better
The requirements regarding timeliness and forms exist for good reason. Larger programs
have the ability to acquire staff to ensure compliance. Due to budget restraints,
smaller programs cannot acquire additional administrative staff.
5. Assistant Secretary Darryl Olson would like to add basic standards for gender-specific programs.
The workgroup concluded at 3:35 PM.

2011 QA Process Improvement Meetings will be planned after the first of the year.
An e-mail will be sent out to all members with details for the next meeting.