Patient Safety Issues
Where Does the Lab Professional Fit In?

Mary Ann McLane, PhD, CLS(NCA)
Region II Director
Objectives
At the conclusion of this seminar, the participant will be able to:
•  Describe the components of the Institute of Medicine's 1999 "To Err Is Human" report that relate to the clinical lab.
•  Compare and contrast the programs offered by JCAHO's "Speak Up" initiative.
•  List at least five examples of errors involving patient safety in the pre-analytical and post-analytical phases.
Unsafe acts are like mosquitoes…
You can try to swat them one at a time, but
  there will always be others to take their
  place. The only effective remedy is to
  drain the swamps in which they breed. In
  the case of errors and violations, the
  "swamps" are equipment designs that
  promote operator error, bad
  communications, high workloads,
  budgetary and commercial pressures…
Unsafe acts are like mosquitoes…
…procedures that necessitate their violation in
 order to get the job done, inadequate
 organization, missing barriers, and
 safeguards . . . the list is potentially long but
 all of these latent factors are, in theory,
 detectable and correctable before a mishap
 occurs.
                                     James Reason,
                                     quoted in To Err Is Human
Americans harmed by medical error
•  Two studies of large samples of hospital admissions
      - New York, using 1984 data
      - Colorado and Utah, using 1992 data
      - Adverse events (injuries caused by medical management) occurred in 3.7 and 2.9 percent of admissions, respectively
      - Adverse events attributable to errors (i.e., preventable adverse events) were 58 percent in New York and 53 percent in Colorado and Utah
•  Extrapolated to the over 33.6 million admissions to U.S. hospitals in 1997:
      - 44,000 to 98,000 Americans die in hospitals each year as a result of medical errors
      - exceeds the number attributable to the 8th-leading cause of death
      - exceeds the deaths attributable to motor vehicle accidents (43,458), breast cancer (42,297), or AIDS (16,516)
Total national costs
   lost income, lost household production,
    disability, health care costs
       $37.6 billion to $50 billion for adverse
        events
       $17 billion to $29 billion for preventable
        adverse events
            slightly higher than the direct and indirect costs
             of caring for people with HIV and AIDS.
Lives lost
•  More than 6,000 Americans die from workplace injuries every year
•  In 1993, medication errors were estimated to have accounted for about 7,000 deaths
      - one out of 131 outpatient deaths
      - one out of 854 inpatient deaths
•  Medication-related errors occur frequently in hospitals; not all result in actual harm, but those that do are costly
      - 2% of admissions at two large hospitals involved a preventable adverse drug event
            - average increased hospital costs of $4,700 per admission
            - about $2.8 million annually for a 700-bed teaching hospital (a worked check follows this slide)
      - preventable medication-related errors: about $2 billion for the nation as a whole
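A back-of-the-envelope check on the $2.8 million figure (a sketch in Python; the roughly 30,000 annual admissions assumed for a 700-bed teaching hospital is an assumption for illustration, not a number from the cited study):

    # Sketch: how a 2% preventable-ADE rate and $4,700 per event add up for one hospital.
    annual_admissions = 30_000        # assumed volume for a 700-bed teaching hospital
    preventable_ade_rate = 0.02       # 2% of admissions (figure cited above)
    added_cost_per_ade = 4_700        # extra hospital cost per preventable ADE, in dollars

    annual_cost = annual_admissions * preventable_ade_rate * added_cost_per_ade
    print(f"${annual_cost:,.0f}")     # ~$2,800,000 per year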
Not just hospital patients
   In 1998: ~2.5 billion prescriptions were
    dispensed by U.S. pharmacies at a cost
    of about $92 billion.
       errors in
            prescribing medications
            dispensing by pharmacists
            unintentional nonadherence on the part of the
             patient.
Definitions
   Adverse event
       injury caused by medical management
        rather than the underlying condition of the
        patient.
   Preventable adverse event
       adverse event attributable to error
Definitions
   Error
       the failure of a planned action to be
        completed as intended (i.e., error of
        execution)
       the use of a wrong plan to achieve an aim
        (i.e., error of planning)
Definitions
   Negligent adverse event
       the care provided failed to meet the
        standard of care reasonably expected of an
        average physician qualified to take care of
        the patient

    Discussion point: expected of an “average physician” only?
Why focus on medication-related error?
   One of the most common types of error
   Substantial numbers of individuals are
    affected
   Accounts for a sizable increase in health
    care costs
Why focus on medication-related error?
   Easy to identify an adequate sample of
    patients who experience adverse drug events
   The drug prescribing process provides good
    documentation of medical decisions, residing
    in automated, easily accessible databases
       Case of Comfort and Caring, Inc
   Deaths attributable to medication errors are
    recorded on death certificates.
    Important note!

   “There are probably other areas of
    health care delivery that have been
    studied to a lesser degree but may offer
    equal or greater opportunity for
    improvement in safety.”
   That is us!!
      What the literature shows

1.   How frequently do errors occur?
2.   What factors contribute to errors?
3.   What are the costs of errors?
4.   Are public perceptions of safety in health
      care consistent with the evidence?
Harvard Medical
Practice Study
   >30,000 randomly selected discharges
   51 randomly selected hospitals in New
    York State in 1984
        Adverse events, manifested by prolonged
         hospitalization, disability at the time of
         discharge, or both = 3.7%
       Preventable adverse events = 58%
       Negligence = 27.6%
Harvard Medical
Practice Study
   13.6% of adverse events resulted in death
   2.6% caused permanently disabling injuries

   Type of adverse event
        drug complications = 19%
        wound infections = 14%
        technical complications = 13%
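A rough check connecting these Harvard figures to the 44,000 to 98,000 estimate cited earlier (a minimal sketch in Python; it assumes the New York rates apply uniformly to the 33.6 million 1997 admissions, which reproduces the order of magnitude of the upper bound, while the lower bound comes from the Colorado/Utah rates):

    # Sketch: order-of-magnitude check of the upper-bound death estimate.
    admissions_1997 = 33_600_000       # U.S. hospital admissions, 1997
    adverse_event_rate = 0.037         # 3.7% of admissions (New York study)
    fatal_fraction = 0.136             # 13.6% of adverse events resulted in death
    preventable_fraction = 0.58        # 58% of adverse events judged preventable

    adverse_events = admissions_1997 * adverse_event_rate   # ~1.24 million
    deaths = adverse_events * fatal_fraction                 # ~169,000
    preventable_deaths = deaths * preventable_fraction       # ~98,000
    print(round(preventable_deaths))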
First instinct?
   Blame someone!
    However…
       due most often to the
        convergence of multiple
        contributing factors
       blaming an individual does
        not change these factors
        and the same error is likely
        to recur
             Case of Charles Thompson, death row inmate from Texas
What would work better?
   Preventing errors and improving safety for
    patients requires a systems approach
      - to modify the conditions that contribute to errors
      - one that recognizes that people working in health care
        are among the most educated and dedicated workforce in any industry
What would work better?
   The problem is not bad people
   The problem is that the system needs to
    be made safer.
Hindsight bias
   things that were not seen or
    understood at the time of the accident
    seem obvious in retrospect
       misleads a reviewer into simplifying the
        causes of an accident
       highlighting a single element as the cause
       overlooking multiple contributing factors
Hindsight bias
   things that were not seen or
    understood at the time of the accident
    seem obvious in retrospect
       information about an accident is spread
        over many participants
       no one may have complete information
       easy to arrive at a simple solution or to
        blame an individual, but difficult to
        determine what really went wrong.
More definitions
   Slips
       action conducted is not what was intended
       observable


   Mistakes
       the planned action is wrong
More definitions
   Slips
       physician chooses an appropriate medication,
        writes 10 mg when the intention was to write 1
        mg
   Mistakes
       selecting the wrong drug because the diagnosis is
        wrong
   Important not to equate slip with "minor."
    Patients can die from slips as well as
    mistakes.
Lab definitions?
   Slips (action conducted is not what was intended)
      e.g., a physician chooses an appropriate medication but
       writes 10 mg when the intention was to write 1 mg
   Mistakes (the planned action is wrong)
Safety = absence of errors?
   More!
   Multiple dimensions
       an outlook: health care is complex and risky and
        solutions are found in the broader systems
        context;
       a set of processes: identify, evaluate, and
        minimize hazards and continuously improve
       an outcome: manifested by fewer medical errors
        and minimized risk or hazard
Safety definition
   Freedom from accidental injury
       from the patient's perspective, the primary
        safety goal is to prevent accidental injuries
            Safe environment = low risk of accidents
                 reduce defects in the process or departures from the
                  way things should have been done
                 establish operational systems and processes that
                  increase the reliability of patient care.
Active vs. latent error
   Active errors
       occur at the level of the frontline operator
       their effects are felt almost immediately
   Latent errors
       removed from the direct control of the
        operator
       poor design, incorrect installation, faulty
        maintenance, bad management decisions,
        and poorly structured organizations
Active vs. latent error
   Active errors
       the pilot crashed the plane


   Latent errors
       a previously undiscovered design
        malfunction caused the plane to roll
        unexpectedly in a way the pilot could not
        control and the plane crashed
Active vs. latent error
   Latent error
       greatest threat to safety in a complex
        system
       often unrecognized
       have the capacity to result in multiple
        types of active errors.
       Challenger accident traced contributing
        events back nine years
       Three Mile Island accident, latent errors
        were traced back two years
Active vs. latent error
   Latent error
       difficult for the people working in the
        system to notice
            errors may be hidden
                 in the design of routine processes in computer
                  programs
                 in the structure or management of the organization
            people become accustomed to design defects
             and learn to work around them, so they are
             often not recognized
Active vs. latent error
   Latent error
       "normalization of deviance"
            small changes in behavior became the norm
            additional deviations became acceptable
            the potential for errors is created
                 signals are overlooked or misinterpreted
                 signals accumulate without being noticed
Active vs. latent lab error
   Active errors


   Latent errors
First instinct?
   focus on the active errors by punishing
    individuals (e.g., firing or suing them)
   retraining or other responses aimed at
    preventing recurrence of the active
    error
       punitive response may be appropriate in
        some cases (e.g., deliberate malfeasance)
       it is not an effective way to prevent
        recurrence
First instinct?
   Large system failures
       latent failures coming together in
        unexpected ways
       appear to be unique in retrospect
   Same mix of factors is unlikely to occur
    again
       efforts to prevent specific active errors are
        not likely to make the system any safer
Focus on active errors
   lets the latent failures remain in the
    system
   their accumulation actually makes the
    system more prone to future failure
Focus on latent errors
   Discovering and fixing latent failures, and
    decreasing their duration, are likely to have a
    greater effect on building safer systems than
    efforts to minimize active errors at the point
    at which they occur
High reliability theory
   accidents can be prevented through
    good organizational design and
    management
       an organizational commitment to safety
       high levels of redundancy in personnel and
        safety measures
       strong organizational culture for
        continuous learning and willingness to
        change
Correct performance and error
   "two sides of the same coin”
Complexity and tight-coupling
   Systems that are more complex and
    tightly coupled are more prone to
    accidents and have to be made more
    reliable
        complex and tightly coupled systems can
         "spring nasty surprises."
   Guess what type of system healthcare is?!
Two cases of success
   Aviation
   Occupational health
       growing awareness of safety concerns and
        the need to improve performance
       comprehensive strategies
            creation of a national focal point for leadership
            development of a knowledge base
            dissemination of information throughout the
             industry
Two cases of success
   Aviation
   Occupational health
       designated government agency with
        regulatory responsibility for safety
       carefully constructed research agenda
       substantial resources devoted to these
        initiatives
Third case of success?
   Healthcare
       no cohesive effort to improve safety in
        health care
       resources devoted to enhancing and
        disseminating the knowledge base are
        wholly inadequate
        “health care is not likely to make significant
         safety improvements without a more
         comprehensive, coordinated approach.”
Center for Patient Safety
   provide leadership for safety
    improvements throughout the industry
   establish goals and track progress in
    achieving results
   expand the knowledge base for
    improving safety in health care
   provide visibility to safety
    concerns
Role of professionals
   Become active leaders in encouraging and
    demanding improvements in patient safety
   Set standards; convene and communicate with
    members about safety
   Incorporate attention to patient safety into
    training programs
   Collaborate across disciplines
   Contribute to creating a culture of safety. As
    patient advocates, health care professionals
    owe their patients nothing less.
Center for Patient Safety
should…
   4. Define feasible prototype systems
    (best practices) and tools for safety in
    key processes, including both clinical
    and managerial support systems for…
       management of diagnostic tests, screening,
        and information…
Improve Access to Accurate,
    Timely Information


   Information about the patient, medications,
    and other therapies should be available at the
    point of patient care, whether it is routinely
    or rarely used. Examples of ways to make such
    information available include the following:
Improve Access to Accurate,
    Timely Information
• Have a pharmacist available on nursing units
  and on rounds.
  (why just a pharmacist? Commercial minute for the
  professional DLM doctorate…)
• Use computerized lab data that alert clinicians
  to abnormal lab values (a minimal sketch of such
  a flag follows this slide).
• Place lab reports and medication records at
  the patient's bedside.
• Place protocols in the patient's chart.
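The "alert clinicians to abnormal lab values" bullet above is, at bottom, a simple rule applied to each result; a minimal sketch in Python, with hypothetical analyte names and purely illustrative limits (not clinical recommendations):

    # Sketch: flag lab results outside reference or critical limits.
    RANGES = {
        # analyte: (ref_low, ref_high, critical_low, critical_high) -- illustrative only
        "potassium_mmol_L": (3.5, 5.1, 2.8, 6.2),
        "glucose_mg_dL": (70, 99, 40, 450),
    }

    def flag_result(analyte, value):
        ref_low, ref_high, crit_low, crit_high = RANGES[analyte]
        if value < crit_low or value > crit_high:
            return "CRITICAL: call the clinician and document read-back"
        if value < ref_low or value > ref_high:
            return "ABNORMAL: highlight on the report"
        return "within reference range"

    print(flag_result("potassium_mmol_L", 6.8))   # -> CRITICAL: ...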
Improve Access to Accurate,
    Timely Information
• Color-code wristbands to alert staff to allergies.
• Track errors and near misses and report them
  regularly.
• Accelerate laboratory turnaround time.
…also noted the importance of involving patients
  in their own care… (commercial about the ASCLS
  consumer webpage)
Joint Commission on Accreditation of
Healthcare Organizations
   Speak Up: Help Prevent Errors In Your Care
    brochures and poster, available for:
      - Speak Up Poster
      - Ambulatory Care
      - Behavioral Health Care
      - Health Care Networks
      - Home Care
      - Hospitals (English)
      - Hospitals (Spanish)
      - Laboratory Services
      - Long Term Care

http://www.jcaho.org/general+public/gp+speak+up/speak+up_bro.htm
630-792-5800, option 5
So what’s happened since 1999?
   2001
      - Congress: $50 million for safety research
      - IOM: Crossing the Quality Chasm
   2004
      - Congress named the Agency for Healthcare
        Research and Quality
         - Center for Quality Improvement and Patient Safety
            - education, training, dissemination, setting standards
   Health and Human Services
         Agency for Healthcare Research and Quality
        Quality & Patient Safety
        Health Information Technology
         Electronic health records — innovation — privacy — international standards — data sources —
         clinical vocabulary
        National Quality Measures Clearinghouse™
         Evaluate health care quality — online database — process — outcome — access — patient
         experience
        CAHPS®—Consumer Assessment of Health Plans
         Consumer feedback — survey and report tools — fact sheet — impact
        Measuring Healthcare Quality
         Studies and projects — standardized methods — performance measures
        Medical Errors & Patient Safety
         Scope of problem — reducing errors — research program — patient tips
        WebM&M: Morbidity & Mortality Rounds
         Patient safety forum— learning modules — analysis of medical errors
        Quality Indicators
         Hospital quality measures — prevention — inpatient — patient safety
        Quality Information & Improvement
         Employer experience — consumer information — case studies — glossary
        TalkingQuality
         Communicating with consumers — health care report cards
2005 JAMA (Lucian Leape, Donald Berwick)

   Computerized prescribing: errors down 80%
   Including pharmacists on rounds: preventable adverse events down 78%
   Standardizing medication practices: adverse events down 60%
Am J Clin Pathol
   Volume 120, 18-26, 2003
   Classifying laboratory incident reports to
    identify problems that jeopardize patient
    safety
   129 incidents
       95% potential adverse events
       73% preventable
             71% preanalytical, 18% analytical, 11% postanalytical
            30% involved cognitive error (incorrect choices caused
             by insufficient knowledge)
            73% involved noncognitive error (lapses in expected
             automatic behavior)
ADVANCE for MLP
   11/7/05
   "Quashing Errors"
   Streamlining… lab professionals getting
    involved in the training of nurses…
   Cited a 1997 Clin Chem paper (Plebani et al.)
      46% of lab errors = preanalytical phase
         68.2% of these = specimen collection
         Note: we usually haven’t a clue whether a specimen
          has been drawn correctly unless it’s in the wrong tube…
Comment on the Clin Chem paper
   1998, Volume 44: 1066-67, Witte et al.
   Analyzed 219,353 clin chem results and
    found 98 errors
       447 ppm
       Anesthesia errors = 2.5 ppm
       Aviation errors = 0.18 ppm
       We have a ways to go!!
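The 447 ppm figure is simply the error count re-expressed per million results; a quick check in Python:

    # Sketch: error rate as defects per million results.
    results_reviewed = 219_353
    errors_found = 98
    errors_per_million = errors_found / results_reviewed * 1_000_000
    print(round(errors_per_million))   # ~447 ppm (vs. ~2.5 ppm anesthesia, ~0.18 ppm aviation)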
And then there are the blood glucose meters…

   11/9/05
       Glucose readings done using stix having glucose
        dehydrogenase pyrroloquinolinequinone
        (GDH-PQQ) as the method
            Falsely increases glucose levels in patients receiving
             parenteral products containing maltose, galactose, d-
             xylose
            Peritoneal dialysis
            Immune globulin
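One way a lab might operationalize this warning is an interference check before a meter result is reported; a hedged sketch in Python, with hypothetical field names and the slide's (non-exhaustive) list of interfering substances:

    # Sketch: hold GDH-PQQ glucose meter results when the patient is receiving
    # parenteral products containing maltose, galactose, or d-xylose
    # (e.g., certain peritoneal dialysis solutions or immune globulins).
    INTERFERING_SUBSTANCES = {"maltose", "galactose", "d-xylose"}

    def glucose_result_ok(meter_method, patient_substances):
        """Return False when a GDH-PQQ reading may be falsely elevated."""
        if meter_method != "GDH-PQQ":
            return True
        return not (set(patient_substances) & INTERFERING_SUBSTANCES)

    # Example: patient on a maltose-containing immune globulin, tested on a GDH-PQQ meter
    if not glucose_result_ok("GDH-PQQ", ["maltose", "heparin"]):
        print("Hold result: possible false elevation; confirm with a laboratory glucose method")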
Our turn!