Developing Quality Management Activities from the Ground Up


   Elizabeth Graves Love, MPH
   Houston EMA
Outline
   Houston EMA at a Glance
   The CPCDMS
   Outcomes Evaluation
   Clinical Chart Review
   Client Satisfaction Measurement
   Resources
   Conclusions and Questions
I. The Houston EMA at a Glance
The Houston EMA
   Six-county area in southeast Texas, covering 5,921 square miles

   General population of 4,290,277

   Estimated number of diagnosed PLWH/A
    is 20,045
Houston EMA
   Grantee/CEO: Harris County Judge
   Administrative Agency: Harris County Health Department, HIV Services Section (HIV Services)
   Ryan White Planning Council (RWPC)
The Houston EMA
   FY 2003 Title I allocation is $20,526,823

   HIV Services administers 67 service
    contracts with 27 local providers

   Over 7,000 PLWH/A access Title I
    services each year
II. The Centralized Patient Care
Data Management System
The CPCDMS
   The CPCDMS is a real-time, de-identified,
    client-level database application

   The system was implemented in June 2000

   To date over 10,500 clients have been
    registered in the CPCDMS
The CPCDMS
   Records are created, accessed and updated
    by providers via DSL data linking using a
    unique 11-character client code
       No client-identifying information is collected


   Client records are stored at HIV Services
    on a database server in SQL format
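
The presentation does not describe how the 11-character client code is actually constructed, so the sketch below is only a hypothetical illustration of the general idea: identifying registration fields are reduced to a repeatable, de-identified code before any record leaves the provider. The function name and inputs are assumptions, not the CPCDMS algorithm.

# Hypothetical sketch only; the CPCDMS code-generation rules are not
# described in this presentation. It shows one way an 11-character,
# de-identified client code could be derived so that no names or other
# identifiers are ever stored centrally.
import hashlib

def build_client_code(first_name: str, last_name: str,
                      birth_date: str, gender: str) -> str:
    """Derive a repeatable 11-character code from registration fields.

    The raw identifying fields are hashed and then discarded; only the
    code travels to the central database.
    """
    raw = f"{first_name.strip().upper()}|{last_name.strip().upper()}|{birth_date}|{gender.upper()}"
    digest = hashlib.sha256(raw.encode("utf-8")).hexdigest().upper()
    return digest[:11]

print(build_client_code("Jane", "Doe", "1970-05-14", "F"))   # prints an 11-character code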
The CPCDMS
   Data collection occurs through one of three
    processes
       Client registration
       Service encounter information
       Medical updates

   Through these processes, the CPCDMS collects the data essential to QM activities
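
As a rough illustration of the three data streams named above, the record types might be modeled as below. The field names are drawn from the data elements listed later in this presentation (demographics, CD4 counts, viral loads, stage of illness, service utilization); they are assumptions, not the actual CPCDMS schema.

# Illustrative record types for the three CPCDMS collection processes.
# Field names are assumptions based on the data elements described in
# this presentation, not the actual CPCDMS schema.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Registration:
    client_code: str              # 11-character de-identified code
    registration_date: date
    demographics: dict = field(default_factory=dict)   # e.g. age group, race/ethnicity, gender

@dataclass
class ServiceEncounter:
    client_code: str
    service_category: str         # e.g. "Primary Care", "Case Management"
    provider_id: str
    encounter_date: date
    units_of_service: float

@dataclass
class MedicalUpdate:
    client_code: str
    update_date: date
    cd4_count: int
    viral_load: int
    stage_of_illness: str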
The CPCDMS
   Users schedule reports using Crystal
    Reports software
       Providers use reports to generate backup
        billing documentation and manage programs

       HIV Services uses reports to obtain
        unduplicated data across all providers, service
        categories and/or grant codes
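
Because every record carries the same de-identified client code, an unduplicated count across providers and service categories reduces to counting distinct codes. The EMA produces these figures with Crystal Reports against the SQL database; the Python sketch below only illustrates the idea, with assumed field names.

# Minimal sketch of unduplicated client counts across providers and
# service categories, keyed on the shared de-identified client code.
from collections import defaultdict

def unduplicated_counts(encounters):
    """encounters: iterable of (client_code, provider_id, service_category)."""
    by_category = defaultdict(set)
    overall = set()
    for client_code, provider_id, service_category in encounters:
        by_category[service_category].add(client_code)
        overall.add(client_code)
    return {cat: len(codes) for cat, codes in by_category.items()}, len(overall)

encounters = [
    ("CODE0000001", "provA", "Primary Care"),
    ("CODE0000001", "provB", "Case Management"),   # same client at two providers
    ("CODE0000002", "provA", "Primary Care"),
]
per_category, total_clients = unduplicated_counts(encounters)
print(per_category)     # {'Primary Care': 2, 'Case Management': 1}
print(total_clients)    # 2 unique clients, even though there are 3 encounters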
The CPCDMS
   31 local Ryan White-funded providers are
    online and using the CPCDMS

       This includes all providers funded by Titles I,
        II, III and IV in a 10-county area
III. Outcomes Evaluation
Background
   HRSA began emphasizing the importance
    of evaluating CARE Act programs in the
     late 1990s

   The Houston EMA began discussing
    options in FY 1999
Roles and Responsibilities
   The RWPC requested that HIV Services
    develop and implement a comprehensive,
    ongoing evaluation program

   The RWPC determined that its role would
    be one of general process oversight
Getting Ready
   In early FY 2000 HIV Services hired an
    FTE Project Coordinator to manage this
    and other quality-related initiatives

       Job description required a graduate degree
        and documented evaluation experience
Getting Ready
   In summer 2000 HIV Services completed
    necessary background work
       Reviewing HRSA materials and existing
        evaluation models
       Setting project goals and timeline
       Surveying the level of awareness among
        providers and RWPC members
       Conducting a resource inventory
Getting Ready
   Project Goals included:
       Developing appropriate outcomes and
        indicators for each funded service
       Involving all stakeholders
       Minimizing the pain of data collection for
        providers and clients
       Providing accessible, useful data to the
        RWPC and providers on a regular basis
Getting Ready
   In fall 2000, HIV Services conducted an
    orientation meeting for providers, RWPC
    members and consumers

   HIV Services then facilitated work groups
    to select outcomes and indicators for 27
    Title I service categories
Selecting the Outcomes
   Each group worked through the United
    Way’s logic model, which provides steps
    for choosing appropriate outcomes

   For each selected outcome the group chose
    appropriate indicators and data collection
    methods
Selecting the Outcomes
   Example – Primary Medical Care
       Outcome – Slowing/prevention of disease
        progression
       Indicator – 75% of clients will improve or
        maintain CD4 counts and viral loads over
        time
       Data Collection Method – CPCDMS
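
The indicator above amounts to a paired comparison of each client's earliest and most recent lab values during the period. The sketch below shows one way it could be computed from CPCDMS medical updates; the strict "did not fall / did not rise" interpretation of "improve or maintain" and the field layout are assumptions, not the EMA's published specification.

# Illustrative calculation of the Primary Medical Care indicator: the
# percent of clients who improve or maintain CD4 count and viral load
# between their first and most recent medical updates in the period.
def cd4_vl_indicator(labs_by_client, target=0.75):
    """labs_by_client: client_code -> list of (date, cd4, viral_load),
    assumed sorted by date."""
    qualifying = 0
    total = 0
    for labs in labs_by_client.values():
        if len(labs) < 2:
            continue                         # need at least two measurements
        _, first_cd4, first_vl = labs[0]
        _, last_cd4, last_vl = labs[-1]
        total += 1
        # "Improve or maintain": CD4 did not fall and viral load did not rise.
        if last_cd4 >= first_cd4 and last_vl <= first_vl:
            qualifying += 1
    pct = qualifying / total if total else 0.0
    return pct, pct >= target

labs = {
    "CODE0000001": [("2002-03-01", 350, 40000), ("2002-12-01", 420, 5000)],
    "CODE0000002": [("2002-04-15", 500, 200),   ("2002-11-20", 450, 800)],
}
print(cd4_vl_indicator(labs))   # (0.5, False) for this two-client toy data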
Selecting the Outcomes
   Example – Rehabilitation
       Outcome – Improved ability to perform
        activities of daily living (ADL)
       Indicator – Change over time in the percent
        of clients who report an improvement in the
        ability to perform ADL after completing
        rehabilitation therapy
       Data Collection Method – Client survey
Selecting the Outcomes
   Example – Outreach
       Outcome – Entrance into the system of care
       Indicator – By the end of the fiscal year, 50%
        of clients will enter Ryan White primary care
       Data Collection Method - CPCDMS
Selecting the Outcomes
   Once the work groups reached consensus,
    the RWPC reviewed and approved the
    outcome measures

   The outcome measures are reviewed and
    revised each fiscal year
Background Work
   During the RWPC approval process, HIV
    Services prepared the following:
       Data collection tools and analysis reports
       Policies and contract language describing
        requirements for providers
       Training for providers
Data Collection
   Through registrations, service encounters and
    medical updates, the CPCDMS collects the
    following data used in outcomes analysis:
       Demographics
       CD4 counts, viral loads and stage of illness
       Opportunistic infections and co-morbidities
       Health and support service utilization
Data Collection
   Through special screens created for certain
    service categories, the CPCDMS collects the
    following data used in outcomes analysis:
       Provider assessment of client progress
       Health data not collected in primary care
       Number of hospitalizations and ER visits
Data Collection
   In general, the CPCDMS cannot provide
    information about
       Quality of life
       Cost-effectiveness
       Knowledge, attitudes and practices


   Client surveys collect this information
Data Collection
   Client Surveys
       HIV Services developed and piloted the pre-
        and post-test surveys

        Virtually all surveys are less than one page long; most are four questions or fewer

       No demographic information is collected
Data Collection
   Survey Administration
        In FY 01, survey administration and data entry were manual

        Since FY 02, survey administration and data entry have been automated through the CPCDMS
Provider Requirements
   Providers are contractually obligated to
    participate in evaluation activities

   Reimbursements may be withheld if a
    provider is not in compliance
Implementation
   Prior to the beginning of FY 2001,
    providers received instructions and training
    on evaluation activities

   Data collection began March 1, 2001
Data Analysis and Reporting
   Providers must submit outcomes data to
    HIV Services each quarter

   Data is stored in SQL format and analyzed
    using Crystal Reports

   Each provider and the RWPC receive results on a quarterly basis
Using Outcomes Data
   Providers use outcomes data to report to
    their boards, complete RFPs and for
    internal quality improvement

   The RWPC uses outcomes data in all
    planning processes
Using Outcomes Data - Example
   Primary Care Outcome 1.1 – Slowing or
    prevention of disease progression
       Indicator - 75% of clients will decrease or
        maintain their viral load over time
   In FY02 79% of Title I primary care clients
    decreased or maintained their viral load
   The RWPC increased the allocation for
    primary care by 10% for FY04
Using Outcomes Data - Example
   Household Items Outcome 3.1 – Improved or
    stabilized living conditions
       Indicator - Change in the percent of clients with
        improved or stabilized living conditions due to
        receiving furniture or household items
   FY01 and 02 data showed that this program
    had no impact on client living conditions
   The RWPC did not fund this service for FY04
Successes
   From conception to implementation, project
    development took just six months

   The project has support and participation from all
    key stakeholders

   The resulting data has enhanced RWPC decision-
    making as well as our Title I grant application
Challenges
   At first providers were wary about the
    possibility of extra work

   RWPC members require ongoing education
    about understanding and using outcomes
IV. Clinical Chart Review
Background
   In April 2001 HRSA issued its guidance on
    quality management
       One goal is to ensure that medical services
        are consistent with treatment guidelines


   The EMA determined that clinical chart
    review could best accomplish this goal
Roles and Responsibilities
   Following HRSA guidance, HIV Services
    assumed project oversight

   The RWPC QA Committee maintains an
    advisory role
Getting Ready
   In FY 2001 HIV Services hired an FTE
    Program Development Coordinator to
    oversee clinical chart review

       Job description required a graduate degree
        along with documented experience in
        QA/utilization review
Getting Ready
   During winter 2001 HIV Services
    completed all necessary background work
       Reviewing PHS Guidelines and HRSA’s
        Primary Care Assessment Tool
       Reviewing tools and methodologies from
        other EMAs
       Determining provider expectations
Scope of Work
   With this information, HIV Services determined the scope of the project
       Each health-related service would undergo an
        annual review of client records
       A qualified contractor would perform the
        chart reviews
       HIV Services would analyze and report
        findings
Scope of Work
   Participating service categories include:
       Primary Care
       Case Management
       Oral Health Care
       Vision Care
       Professional Counseling
       Substance Abuse Treatment
       Rehabilitation
       Hospice Care
       Home Health Care
       Drug Reimbursement
Contractor
   HIV Services contracted with a masters-
    level RN to help develop the tools and to
    conduct the reviews

   Reimbursement is on a per-chart basis
Tool Development
   For each service category a set of core
    questions was developed
       Example – What percentage of primary care
        clients receive the recommended number of
        CD4, viral load and CBC tests each year?


   These questions drove tool development
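
A core question like the one above translates directly into a tally of lab tests per client per year. The sketch below illustrates that tally; the recommended test frequencies shown are placeholders, since the EMA's actual standards are not stated in this presentation.

# Illustrative tally for the core question: what percent of primary care
# clients received at least the recommended number of CD4, viral load and
# CBC tests during the year? The thresholds are placeholder assumptions.
RECOMMENDED_PER_YEAR = {"CD4": 2, "VL": 2, "CBC": 2}

def pct_meeting_lab_standard(tests_by_client):
    """tests_by_client: client_code -> {test name: count during the year}."""
    if not tests_by_client:
        return 0.0
    meeting = sum(
        1 for counts in tests_by_client.values()
        if all(counts.get(test, 0) >= minimum
               for test, minimum in RECOMMENDED_PER_YEAR.items())
    )
    return 100.0 * meeting / len(tests_by_client)

tests = {
    "CODE0000001": {"CD4": 3, "VL": 3, "CBC": 2},
    "CODE0000002": {"CD4": 1, "VL": 2, "CBC": 2},   # one CD4 count short
}
print(pct_meeting_lab_standard(tests))   # 50.0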
Tool Development
   Primary care tool borrows heavily from
    HRSA’s Primary Care Assessment Tool
       30 data elements


   Case management tool follows EMA
    standards of care for case management
       15 data elements
Implementation
   Providers received instructions and training
    on chart review activities
       Provider obligations
       Sample generation
       Review schedule
       Reporting
Provider Requirements
   Providers are contractually obligated to
    participate in chart review activities

   Providers must accommodate the review
       Provide a work space for the contractor
       Have charts pulled and ready for review
Sample Generation
   Desired sample characteristics
       10% of the caseload for each service

       Reflective of the population served

       Randomly selected
Sample Generation
   To generate the sample, a CPCDMS report
    randomly selects 10% of the clients seen
    during the time under review, mirroring the
    demographic make-up of all clients

   HIV Services provides the sample to the
    provider immediately prior to the review so
    charts may be pulled
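
The sample described above behaves like a proportional stratified random sample: clients are drawn at random within demographic groups so the 10% sample mirrors the caseload. The sketch below illustrates that idea; the grouping key and the "at least one chart per stratum" rule are assumptions, since the CPCDMS report's exact logic is not shown here.

# Illustrative proportional stratified sample: draw roughly 10% of clients
# at random within each demographic group so the sample mirrors the caseload.
import math
import random

def stratified_sample(clients, fraction=0.10,
                      key=lambda c: (c["race"], c["gender"]), seed=None):
    rng = random.Random(seed)
    strata = {}
    for client in clients:
        strata.setdefault(key(client), []).append(client)
    sample = []
    for members in strata.values():
        n = max(1, math.ceil(len(members) * fraction))   # at least one per stratum
        sample.extend(rng.sample(members, min(n, len(members))))
    return sample

clients = ([{"client_code": f"A{i:010d}", "race": "Black", "gender": "M"} for i in range(50)] +
           [{"client_code": f"B{i:010d}", "race": "Hispanic", "gender": "F"} for i in range(30)] +
           [{"client_code": f"C{i:010d}", "race": "White", "gender": "M"} for i in range(20)])
print(len(stratified_sample(clients, seed=1)))   # 10 charts drawn from 100 clients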
Implementation
   During FY02 charts for four Primary Care
    sites and eight Case Management sites
    were reviewed
       400 primary care charts
       235 case management charts

   Oral health and vision care have been
    added in FY03
Analysis and Reporting
   The contractor provides raw data to HIV
    Services in MS Access format for analysis

   HIV Services forwards preliminary results to each provider for comment

   Final results are disseminated to providers
    and the RWPC
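
Once the contractor's raw data arrives, the analysis largely reduces to compliance rates per data element per provider. The EMA does this against the MS Access extract with Crystal Reports; the Python sketch below only illustrates the aggregation step, with assumed field names.

# Minimal sketch: roll raw chart-review answers up into per-provider
# compliance rates for each data element.
from collections import defaultdict

def compliance_rates(reviews):
    """reviews: iterable of dicts with "provider", "element" and "met" keys."""
    tally = defaultdict(lambda: [0, 0])      # (provider, element) -> [met, reviewed]
    for row in reviews:
        bucket = tally[(row["provider"], row["element"])]
        bucket[0] += 1 if row["met"] else 0
        bucket[1] += 1
    return {key: round(100.0 * met / reviewed, 1)
            for key, (met, reviewed) in tally.items()}

reviews = [
    {"provider": "Clinic A", "element": "Adherence education documented", "met": True},
    {"provider": "Clinic A", "element": "Adherence education documented", "met": False},
    {"provider": "Clinic B", "element": "Adherence education documented", "met": True},
]
print(compliance_rates(reviews))
# {('Clinic A', 'Adherence education documented'): 50.0,
#  ('Clinic B', 'Adherence education documented'): 100.0}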
Using the Data - Example
   Providers are using chart review data for
    internal quality improvement
       Example – One clinic’s results showed that
        very few TB+ clients received confirmatory
        chest x-rays, which were performed off-site
       The clinic purchased the necessary equipment
        to perform chest x-rays on-site
Using the Data - Example
   The RWPC is using chart review data
    during their decision-making processes
       Primary Care chart review data showed that
        just 29% of clients on ART received adequate
        medication adherence education
       The RWPC strengthened the Primary Care
        service definition for FY 2004, mandating
        med ed and specifying who may provide it
Using the Data - Example
   HIV Services is using chart review data to
    strengthen contract language and
    documentation requirements
       Case Management chart review showed the
        quality of client assessment tools varied
        among providers
       HIV Services has developed a standardized
        assessment tool, required in FY 2004
Successes
   Most providers consider chart review to be
    a free service, saving money and staff time

   The RWPC quickly embraced the value of
    chart review data

   After just one year, the data has resulted in
    significant changes in service delivery
Challenges
   Tool development for services other than
    Primary Care has been challenging

   Some providers were concerned that results
    might be used in a punitive manner
V. Client Satisfaction
Background
   Prior to FY 2002 HIV Services required
    that all providers measure satisfaction

   Methodologies and tools varied

   HRSA’s QM guidance in April 2001 led to
    a reconsideration of client satisfaction
Background
   HIV Services decided to centralize the
    measurement of client satisfaction to
    ensure consistent and reliable data

   As with clinical chart review HIV Services
    assumed project oversight
       The RWPC QA and Affected Community
        Committees provide input and feedback
Background
   During FY 2001 HIV Services conducted
    all necessary background work
       Collecting and reviewing providers’ current
        methodologies and tools
       Reviewing methodologies from other EMAs
       Developing methodology and timeline
       Developing survey instruments
Scope of Project
   Methodology employs a survey with
    questions that address the service, the
    provider and the Title I system overall

   On an annual basis, a 10% convenience sample is surveyed for each service
Survey Development
   HIV Services developed a core set of
    questions as well as questions relevant to
    each service category
       Each service category has a unique survey


   The surveys were piloted at agency sites
Survey Development
   Providers and RWPC members assisted
    with survey development
       Many survey questions were borrowed from
        providers’ previous survey tools
       RWPC Affected Community Committee
        members provided consumer insight
Survey Administration
   Each provider must survey 10% of their
    clients during a six-week period set by HIV
    Services

   The same methodology used to generate
    outcomes surveys through the CPCDMS is
    used to generate client satisfaction surveys
Survey Administration
   HIV Services provides each agency with a
    locked box in which clients deposit
    completed surveys

   This ensures that providers never see
    completed surveys, thus encouraging
    clients to provide honest answers
Provider Requirements
   Providers are contractually obligated to
    participate in client satisfaction activities

   Providers with successful methods for
    measuring satisfaction already in place may
    be exempt from participation
Implementation
   In FY 2002, 1,061 surveys were completed
       The sample mirrored demographic
        characteristics of the entire Title I client
        population


   In FY 2003, 1,750 surveys were completed
Data Analysis and Reporting
   Survey forms are scanned at HIV Services and
    the data is stored in a SQL database that is linked
    to other CPCDMS data

   Crystal Reports is used to generate analysis
    reports

   Each provider and the RWPC receive results each year
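
Because the scanned responses sit in the same SQL environment as the CPCDMS, they can be linked back to service data through the de-identified client code. The sketch below shows that linkage in Python rather than SQL or Crystal Reports; the field names and the 1-to-5 score scale are assumptions, not the actual survey or schema.

# Sketch: average satisfaction score per service category, restricted to
# clients with a matching CPCDMS service encounter in that category.
from collections import defaultdict

def mean_scores(responses, encounters):
    """responses: list of (client_code, service_category, score)
    encounters: set of (client_code, service_category) from the CPCDMS."""
    totals = defaultdict(lambda: [0.0, 0])
    for client_code, category, score in responses:
        if (client_code, category) in encounters:    # link via the client code
            totals[category][0] += score
            totals[category][1] += 1
    return {cat: round(s / n, 2) for cat, (s, n) in totals.items() if n}

responses = [("CODE0000001", "Drug Reimbursement", 3),
             ("CODE0000002", "Drug Reimbursement", 4)]
encounters = {("CODE0000001", "Drug Reimbursement"),
              ("CODE0000002", "Drug Reimbursement")}
print(mean_scores(responses, encounters))   # {'Drug Reimbursement': 3.5}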
Using the Data - Example
   The RWPC uses the results when setting
    service definitions
       Drug Reimbursement clients indicated they
        were not receiving adequate information from
        pharmacy staff about side effects, drug
        interactions, diet and dosage
       RWPC strengthened the service definition to
        mandate specific education requirements
Successes
   The standardized methodology provides
    the EMA with data from the provider,
    service category and Title I perspectives

   Centralizing satisfaction measurement
    benefits providers and the RWPC
Challenges
   Initially providers were concerned that
    clients would feel “over-surveyed”

       The RWPC Affected Community Committee
        helped alleviate these concerns, and in fact
        most clients have welcomed the opportunity
        to provide feedback
VI. Resources
Staff Resources
   HIV Services has 2.5 FTE assigned to
    evaluation and QM activities

   A masters-level RN contractor provides
    chart review services

   An IT consultant helps build CPCDMS
    survey modules and analysis reports
Financial Resources
   Overall FY 2004 QM budget is $434,760, approximately 2% of the total allocation

        Salaries for two FTEs
       $150,000 for chart review contractor
       $100,000 for CPCDMS consultant
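
As a rough arithmetic check (assuming the FY 2003 Title I allocation of $20,526,823 cited earlier as the denominator): the two contracts account for $150,000 + $100,000 = $250,000, leaving about $184,760 for the two FTE salaries, and $434,760 / $20,526,823 ≈ 2.1%, consistent with the roughly 2% figure above.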
VII. Conclusions
Conclusions
   Centralizing QM activities at the Grantee level
    results in standardized methodologies, project
    continuity and consistent data
   Buy-in from stakeholders is essential
   Automating processes whenever possible eases
    the burden on all stakeholders
   Regular data reporting keeps stakeholders
    interested and involved
   Borrowing methods and tools is a lifesaver 
For more information…
Elizabeth Graves Love, MPH
Harris County Public Health and
   Environmental Services Department
HIV Services Section
713-439-6041
elove@harriscountyhealth.com
www.harriscountyhealth.com/hivservices

								