Attacking the Government's "Junk Science"

Robert Epstein, AFPD, Eastern District of Pennsylvania
Why Do We Care About Experts?
"About one quarter of the citizens who had served on juries which were presented with scientific evidence believed that had such evidence been absent, they would have changed their verdicts -- from guilty to not guilty."

Joseph L. Peterson et al., The Use and Effects of Forensic Science in the Adjudication of Felony Cases, 32 J. Forensic Sci. 1730, 1748 (1987) (emphasis added).
What Is "Junk Science"?
"Science" Defined

Knowledge or a system of knowledge covering general truths or the operation of general laws as obtained through and tested through the scientific method.

Merriam-Webster
The "Scientific Method"

A scientific method consists of the collection of data through observation and experimentation and the formulation and testing of hypotheses. Scientific researchers propose hypotheses . . . and design experimental studies to test these hypotheses.

Wikipedia
"Junk Science" Defined

"Junk Science" is a field that has not tested its claims.
What Does This Have to Do with the Law?
    The Daubert Factors
•   Testing
•   Error Rates
•   Standards
•   Publication and Peer Review
•   General Acceptance by the
    Relevant Scientific Community
Three Fields of "Junk Science"

• Fingerprints

• Handwriting

• Toolmarks/Firearms
What These Fields Have in Common: Individualization

1. Fingerprints – Identify – Individualize a crime-scene print to a print from the defendant to the exclusion of all other fingerprints in the world.
2. Handwriting – Identify – Individualize handwriting, like a bank robbery note, to handwriting exemplars of the defendant to the exclusion of all other persons in the world.
3. Toolmarks/firearms – Identify – Individualize a mark left by a tool as having been made by a particular tool associated with the defendant to the exclusion of all other tools in the world.
Many "Junk Science" Fields Are Based on the Same Untested Premise: "Uniqueness"

Everything in the world is unique. Every fingerprint, everyone's handwriting, every tool, every gun.
Junk Scientists Make the Same Claim:

Because everything in the world is unique, we can make an identification to the exclusion of every other object in the world -- every finger with respect to fingerprints, every person with respect to handwriting, and every tool with respect to toolmarks.
         Our Goal:
Expose each of these fields for
the junk science that it actually
is.
• Pre-trial Motion to Preclude
• Convince the jury that they
should have a reasonable doubt
regarding these fields.
   Is it Possible? Yes!

* Bullet lead comparison

* Voice spectrograph
How Are We Going to Do It?

  Learn the fields that
    we’re fighting.
1) David Faigman et al., Modern Scientific Evidence: The Law and Science of Expert Testimony (West 2008)

2) Robert Epstein, Fingerprints Meet Daubert: The Myth of Fingerprint Science Is Revealed, 75 S. Cal. L. Rev. 605 (2002)

3) Adina Schwartz, A Systemic Challenge to the Reliability and Admissibility of Firearms and Toolmark Identification, 6 Colum. Sci. & Tech. L. Rev. 2

4) Michael Risinger et al., Exorcism of Ignorance as a Proxy for Rational Knowledge: The Lessons of Handwriting Identification "Expertise", 137 U. Pa. L. Rev. 731 (1989)
Challenging Fingerprint
   “Junk Science”
Inked Fingerprint
Latent Print
Government's Chart
Teasing out the Points
How Do We Fight It? Hire Experts.

Experts in the field: fingerprint examiners, forensic document examiners, toolmark examiners

• Counter experts
  – Fingerprints – Ralph Haber - humanfactorsconsultants.com
  – Handwriting – Mark Denbeaux - denbeama@shu.edu; Michael Saks - michael.saks@asu.edu
  – Toolmarks – Adina Schwartz - aschwartz@jjay.cuny.edu
How Do We Fight It? Get Discovery!

• Get the demonstrative evidence for you and your experts
• Get the manuals of the crime lab
• Get the government expert's bench notes

New Hampshire v. Langill, 05-S-1129 (Sup. Ct. N.H. Apr. 2, 2007)
Inked Fingerprint
No Probabilities

Different people can have a number of matching ridge characteristics, and there has been no probability testing to tell us the probability of that occurring.
Inked Print / Latent Print
There Are No Standards for Comparing Prints

Any unbiased, intelligent assessment of fingerprint identification practices today reveals that there are, in reality, no standards.

David A. Stoney, Measurement of Fingerprint Individuality, in Advances in Fingerprint Technology (Henry C. Lee & R.E. Gaensslen eds., 2d ed. 2001).
Examiners Do Not Know How Much They Have to See to Declare a Match

No Agreed-Upon Standard:
1. FBI: No standard
2. Local U.S. crime labs: 8-12
3. France and Italy: 16
4. Brazil and Argentina: 30
Even Fingerprint Experts Recognize the Subjectivity of Their Opinions

"[The] opinion as to whether there is sufficient uniqueness of detail present in the friction ridge print to eliminate everyone else in the world as a possible donor . . . is very subjective."

David Ashbaugh, The Premises of Friction Ridge Identification, 44 J. Forensic Identification 499 (1994).
"Standards" Is a Daubert Factor

Without standards, we have nothing to hold the expert to.
How Are We Going to Challenge These Fields?

• Learn the fields
• Hire experts
• Request discovery
• Hold the fields to the Daubert factors
  – Standards
Challenging Handwriting "Junk Science"

Bruno Hauptmann
Handwriting: Two Principles

1. No two people write exactly the same way (inter-writer differences)
2. No person writes exactly the same way twice (natural variation)
 Can forensic document
 examiners distinguish between
 inter-writer differences and
 natural variation?
Different People Can Have Similar Handwriting

The writing of different people can have significant similarity, and we do not know what the probability is of this occurring.
Similar Handwriting

John Harris, How Much Do People Write Alike: A Study of Signatures, 48 J. Crim. L., Criminology & Police Sci. 47 (1958)
No Standards

Handwriting "experts" have no standards for comparing handwriting, and there is no standard for declaring an identification.
No Standards

The technique of comparing known writings with questioned documents appears to be entirely subjective and entirely lacking in controlling standards.

United States v. Saelee, 162 F. Supp. 2d 1097, 1104 (D. Alaska 2001).
Handwriting Comparison: Cross on "Standards"

Court: [D]o you know or are you aware of any quantitative, by which I mean numerical, standards which would form a document examiner's conclusion that a given feature in a questioned document is . . . not by the author of the known samples? . . . Is there a way of measuring? . . . Is there some numerical standard that document examiners use?

Ms. Kelly: I would say as a general rule, no, your Honor; there is no numerical measurement of that kind of slant.
Challenging Toolmarks "Junk Science"
What Is a "Toolmark"?

A toolmark is simply a mark left by a tool on a surface.

Firearms identification is a subspecies of toolmark identification dealing with the toolmarks that bullets and cartridge cases acquire by being fired. The gun, in effect, is the tool; the ammunition is the surface upon which the tool has left its mark.
Toolmarks: Two Types

Striated toolmarks: Patterns of scratches or striae produced by the parallel motion of tools against objects.

Impression toolmarks: Produced on objects by the perpendicular, pressurized impact of tools.
Three Types of Characteristics

Class characteristics: A characteristic shared by all tools of a particular type made by a particular manufacturer.

Subclass characteristics: Microscopic characteristics that are shared by tools that come from the same batch.

Individual characteristics: Result from random imperfections or irregularities on tool surfaces produced by the manufacturing process and/or subsequent use, corrosion, or damage.
Toolmarks Change Over Time

The marks that a tool will make will change as the tool is used, as a result of wear and/or damage and corrosion. What this means, for example, is that there will be significant dissimilarity between bullets fired from the same gun.

Alfred Biasotti & John Murdock, Criteria for Identification, 16(4) Ass'n Firearms & Tool Mark Examiners J. 16, 17 (only 21-38% of the striae on pairs of bullets fired from the same revolver matched).
Bullets Fired from the Same Barrel
Similarity of Different Toolmarks

Different tools can leave marks that have significant similarity, and we don't know what the probability is of that occurring.
Recognized by Experts in the Field

For the first time, there is access to hundreds of computerized images of projectiles fired from similarly rifled firearms . . .

When using a comparison microscope . . . it is difficult to eliminate comparisons even though we know they are from different firearms.

Joseph Masson, Confidence Level Variations in Firearms, 29(1) AFTE Journal (Winter 1997)
Q. [I]n your article, you discuss the fact that through the advent of computerized ballistic identification systems, such as the one you have at ATF, you were able to find bullets that had been fired by different guns that had . . . "an exceptional number of similarities", correct?
A. That's correct, sir . . .
Q. All right. So much similarity that they were difficult to eliminate as being fired from the same gun?
A. That's correct, sir.
Q. Okay. Now, for bolt cutters and other types of tools, there are no computerized databases that exist that you can do that kind of work on, is that correct?
A. That's correct.
Q. Sir, there is no identification standard in toolmark analysis as to the minimum number of matching striations that an examiner needs to see before declaring an identification, correct?
A. No, there is none.
How Are We Going to Do It?

• Learn the fields
• Hire experts
• Request discovery
• Hold the fields to the Daubert factors
  – Standards
• Read the literature of the field, especially anything that the government "expert" has written
No Toolmark Standards

No standards for comparing toolmark impressions and no agreed-upon identification standard for declaring a match
Bullets Fired from the Same Barrel
Subjectivity Recognized in the Field

AFTE: Theory of identification as it relates to toolmarks.

1. The theory of identification as it pertains to the comparison of toolmarks enables opinions of common origin to be made when the unique surface contours of two toolmarks are in "sufficient agreement."
2. No definition of "sufficient."
3. Currently the interpretation of individualization/identification is subjective in nature, founded on scientific principles and based on the examiner's training and experience.
Conflict in the Field Over Standards

Some examiners have recognized the need for an objective standard and have adopted their own

• CMS – 6 consecutive matching striae in one group, 3 consecutive matching striae in two groups.

Steven G. Bunch, Consecutive Matching Striation Criteria: A General Critique, 45(5) J. Forensic Sci. 955, 962 (2000).
Testing

Have these fields been tested?

NO
Fingerprint Solicitation for Validation Studies

1) Basic research to determine the scientific validity of individuality in friction ridge examination based on measurement of features, quantification, and statistical analysis.

• [T]he theoretical basis for . . . individuality has had limited study and needs additional work to demonstrate the statistical basis for identifications. It is expected that proposals would address the relative importance of different minutiae to establish individuality, as well as the statistical significance of groups of minutiae.
Fingerprint Solicitation for Validation Studies

2) Procedures for comparing friction ridge impressions that are standardized and validated.

• Procedures must be tested statistically in order to demonstrate that following the stated procedures allows analysts to produce correct results with acceptable error rates. This has not yet been done.
Forensic Document Examination Solicitation for Validation Studies

"The Field Needs":
• First, basic research to determine the scientific validity of individuality in handwriting based on measurement of features, quantification, and statistical analysis
• Second, procedures for comparing handwriting that are standardized and validated.
Toolmarks – Lack of Testing Judicially Recognized

There is no reason why [the premises of firearm identification] cannot be tested under the Daubert-Kumho standards -- using sound research methods yielding meaningful data on error rates. The problem is that they have never been tested in the field in general, or in this case in particular.

United States v. Green, 405 F. Supp. 2d 104, 118-19 (D. Mass. 2005)
Toolmarks – Lack of Testing Recognized by the National Academy of Sciences

National Research Council on Firearms Identification

The validity of the fundamental assumptions of uniqueness and reproducibility of firearm-related toolmarks has not yet been fully demonstrated.

Additional general research on the uniqueness and reproducibility of firearm-related toolmarks would have to be done if the basic premises of firearms identification are to be put on a more solid scientific footing.

***

Fully assessing the assumptions underlying firearms identification would require careful attention to statistical experimental design issues, as well as intensive work on the underlying physics, engineering, and metallurgy of firearms, but is essential to the long-term viability of this type of forensic evidence.
How Are We Going to Do It?

• Learn the fields
• Hire experts
• Request discovery
• Read the literature
• Hold the fields to the Daubert factors
  – Standards
  – Testing
Error Rates

No testing – no error rates
1995 FORENSIC TESTING PROGRAM: LATENT PRINTS EXAMINATION

48 false identifications made by 34 examiners
22% of the examiners made false identifications
Erroneous identifications occurred on all 7 latent prints that were provided
One of the two elimination latents was misidentified 29 times

Only 44% of the participants correctly identified the five latent prints that were supposed to be identified and correctly noted the two elimination latent prints that were not to be identified.
"Shock to Disbelief"

Reaction to the results of the CTS 1995 Latent Print Proficiency Test within the forensic science community has ranged from shock to disbelief. Errors of this magnitude within a discipline singularly admired and respected for its touted absolute certainty as an identification process have produced chilling and mind-numbing realities. Thirty-four participants, an incredible 22% of those involved, substituted presumed but false certainty for truth. By any measure, this represents a profile of practice that is unacceptable and thus demands positive action by the entire community.

David L. Grieve, Possession of Truth, 46 J. Forensic Identification 521, 524 (1996).
HANDWRITING PROFICIENCY TESTS 1976-1987

Forensic document examiners were correct 36% of the time, incorrect 42%, and inconclusive 22%.

Risinger, Denbeaux & Saks, Exorcism of Ignorance as a Proxy for Rational Knowledge, 137 U. Pa. L. Rev. 737 (1989).
TOOLMARK PROFICIENCY TESTS 1980-1991

• 74% of determinations made by tool examiners were correct, 26% incorrect
• Toolmark examiners made 30 misidentifications and 41 missed identifications
• Firearm examiners were correct 88% of the time, and wrong 12%
• Firearm examiners made 12 misidentifications and 17 missed identifications
• Results understate day-to-day lab error rates because the testing was declared rather than blind, and labs spent much more time on it than on actual casework

Joseph Peterson & Penelope N. Markham, Crime Lab Proficiency Testing Results, 1978-1991, 40 J. Forensic Sci. 1009, 1110, 1019, 1024 (1995).
How Are We Going to Do It?

• Learn the field
  – Primary sources
  – Literature of the field
• Hire experts
  – Technicians
  – Counter experts
• Get discovery
  – Demonstrative evidence
  – Bench notes
  – Lab manuals
  – Proficiency test file
• Hold the field to the Daubert factors
  – Standards
  – Testing
  – Error rates
PUBLICATION AND PEER REVIEW

• Purpose of publication and peer review
• Internal non-blind verification does not ensure reliability
How Are We Going to Do It?

• Learn the field
  – Primary sources
  – Literature of the field
• Hire experts
  – Technicians
  – Counter experts
• Get discovery
  – Bench notes
  – Lab manuals
  – Proficiency test file
• Hold the field to the Daubert factors
  – Standards
  – Testing
  – Error rates
  – Publication and peer review
GENERAL ACCEPTANCE

• Relevant scientific community
• Must look beyond the practitioners of the field itself
• State of Maryland v. Bryan Rose, K06-0545 (Cir. Ct. Balt. Co. 2008) ("general acceptance of latent print identification by its practitioners does not constitute general acceptance by the 'scientific community' . . ."); United States v. Saelee, 162 F. Supp. 2d 1097 (D. Alaska 2001) ("Finally, the evidence does indicate that there is general acceptance of the theories and techniques involved in the field of handwriting analysis among the closed universe of forensic document examiners. This proves nothing.")
• Government has not and will not be able to produce anyone beyond law enforcement technicians
How Are We Going to Do It?

• Learn the field
  – Primary sources
  – Literature of the field
• Hire experts
  – Technicians
  – Counter experts
• Get discovery
  – Bench notes
  – Lab manuals
  – Proficiency test file
• Hold the field to the Daubert factors
  – Standards
  – Testing
  – Error rates
  – Publication and peer review
  – General acceptance
 Expose the Lack of
Qualifications of the
Government “Expert”
Latent Print Training

The harsh reality is that latent print training as a structured, organized course of study is scarce. Traditionally, fingerprint training has centered around a type of apprenticeship, tutelage, or on-the-job training, in its best form, and essentially a type of self-study, in its worst. Many training programs are of the "look and learn" variety, and aside from some basic classroom instruction in pattern interpretation and classification methods, are often impromptu sessions dictated more by the schedule and duties of the trainer than the needs of the student. Such apprenticeship is most often expressed in terms of duration, not in specific goals and objectives, and often ends with a subjective assessment that the trainee is ready.

David L. Grieve, The Identification Process: The Quest for Quality, 40 J. of Forensic Identification 109, 110-111 (1990)
How Are We Going to Do It?

• Learn the field
  – Primary sources
  – Literature of the field
• Hire experts
  – Technicians
  – Counter experts
• Get Discovery
  – Bench notes
  – Lab manuals
  – Proficiency test file
• Hold the field to the Daubert factors
  – Standards
  – Testing
  – Error rates
  – Publication and peer review
  – General acceptance
• Expose the lack of training and qualifications of the government's experts
Expose the Logical Fallacy of the "Uniqueness" Premise

"[T]he crux of the matter is not the individuality of the friction skin ridges but the ability of the examiner to recognize sufficient information for the disclosure of identity from a small distorted latent fingerprint fragment that may reveal only limited information in terms of quantity or quality."

Christophe Champod & Ian W. Evett, A Probabilistic Approach to Fingerprint Evidence, 51(2) J. Forensic Identification 101, 115 (2001).
         How Are We Going to Do It?
•   Learn the field
     – Primary sources
     – Literature of the field
•   Hire experts
     – Technicians
     – Counter experts
•   Get Discovery
     – Bench notes
     – Lab manuals
     – Proficiency test file
•   Hold the field to the Daubert factors
     –   Standards
     –   Testing
     –   Error rates
     –   Publication and peer review
     –   General acceptance
•   Expose the lack of training and qualifications of the government’s experts
•   Expose the logical fallacy of the uniqueness premise
One Last Thing to Do

Expose the government's expert for the "salesman" that he is

“A fingerprint expert is a
salesperson selling the identification
       j y
to the jury.”

       J                      E. Phillops,
Robert J. Hazen and Clarence E Phillops The Expert
Fingerprint Witness, in Advances in Fingerprint Technology
(CRC)
This categorical requirement of absolute certainty has no particular scientific principle but has evolved from a principle shaped more from allegiance to dogma than a foundation in science. . . . Whatever this may be, it is not science.

David Grieve, Possession of Truth, 46 J. of Forensic Identification 521, 527-28 (1996) (Ex. 2).
National Research Council on Firearms

Conclusions drawn in firearms identification should not be made to imply the presence of a firm statistical basis where none has been demonstrated. Specifically, . . . examiners tend to cast their assessments in bold absolutes, commonly asserting that a match can be made 'to the exclusion of all other firearms in the world.' Such comments cloak an inherently subjective assessment of a match with an extreme probability statement that has no firm grounding and unrealistically implies an error rate of zero.
Can It Work? It Already Has

Handwriting - United States v. Hines, 55 F. Supp. 2d 62 (D. Mass. 1999); United States v. Santillan, 1999 WL 1201765 (N.D. Cal. 1999); United States v. Rutherford, 104 F. Supp. 2d 1190 (D. Neb. 2000); United States v. Brown, No. CR-184ABC (C.D. Cal. Dec. 1, 1999); United States v. Fuji, 152 F. Supp. 2d 989 (N.D. Ill. 2000); United States v. Saelee, 162 F. Supp. 2d 1097 (D. Alaska 2001)

Fingerprints - State of Maryland v. Bryan Rose, K06-0545 (Cir. Ct. Balt. Co. 2008); New Hampshire v. Langill, 05-S-1129 (Sup. Ct. N.H. Apr. 2, 2007)

Toolmarks - Ramirez v. State, 810 So. 2d 836 (Fla. 2001); United States v. Green, 405 F. Supp. 2d 104 (D. Mass. 2005)
How Are We Going to Do It?

• Learn the field
  – Primary sources
  – Literature of the field
• Hire experts
  – Technicians
  – Counter experts
• Get Discovery
  – Bench notes
  – Lab manuals
  – Proficiency test file
• Hold the field to the Daubert factors
  – Standards
  – Testing
  – Error rates
  – Publication and peer review
  – General acceptance
• Expose the lack of training and qualifications of the government's experts
• Expose the logical fallacy of the uniqueness premise
• Expose the government's expert for the used car salesman that he is
• Tailor our attack to the specific opinion that's being offered

				