DEA: Law and Science Unto Itself

In an empty, muted family court, with armed guards at its doors, D.C. Superior Court Judge J. William Ryan released a discovery order revealing that the DEA’s analysts are producing false marijuana test reports that result in wrongful convictions.[1] By critiquing DEA chemist Heather Hartshorn’s reports and testimony through the prism of the 2009 National Academy of Sciences (NAS) report on forensic tests,[2] Ryan showed that her marijuana report mirrored the NAS’s example of a totally deficient report. The NAS example read: “Results: The green-brown plant material in item 1 was identified as marijuana.”[3] Hartshorn’s report read: “Exhibit 1 contains a measurable amount of marijuana.”[4]

A number of state courts have “held that the [prosecution] should provide more than the bare test results and reports to the defendant in discovery under similar [expert notice] rules.”[5] For instance, the Court of Appeals of North Carolina has ruled that a defendant charged with selling heroin was entitled to the state laboratory analyst’s “laboratory protocols, incidences of false positive test results, quality control and quality assurance, and proficiency tests.”[6]

The Supreme Court in Jackson v. Virginia ruled that reports such as Hartshorn’s, based on non-specific screening tests, are not worth the paper they’re printed on because they do not provide proof beyond a reasonable doubt of the presence of marijuana in a seized substance.[7] Hartshorn herself admitted she used non-specific screening tests, yet testified erroneously that they positively confirmed the presence of marijuana.[8]

Ryan also disclosed that Hartshorn’s report lacked adequate details and data to allow a review of her work by an independent defense analyst to see whether she used valid, reliable tests and applied them correctly.[9] This was a significant failing, as the NAS report concluded that many forensic tests “are not based on a body of knowledge that recognizes the underlying limitations of the scientific principles and methodologies for problem solving and discovery” (Hartshorn claimed there were no limitations) “[and] are not informed by scientific knowledge, or are not developed within the culture of science.”

According to Dr. Vedoster Ingram, a 29-year veteran of the DEA, this was typical of the DEA. “As reports are normally presented, an official report of analysis is introduced into the court records for litigation without significant explanation.”[10] Reviewable data for Hartshorn’s tests should have included microphotographs of the suspected marijuana sample, highlighting the relevant morphological characteristics; photographs of the Duquenois-Levine (D-L) color chemical test results, including side-by-side contemporaneous images of the suspected marijuana and an actual marijuana standard for proper comparison; and photographs or photocopies of the Thin Layer Chromatography (TLC) plate with the measured values and observed colors recorded contemporaneously with the testing.

The NAS report said that such reports were unacceptable and should lead to dismissals of charges.[11] In fact, much of Hartshorn’s report was indecipherable, riddled with abbreviations known only to herself. She dismissed this concern by stating: “It’s not our policy to keep [reviewable data]; it’s not needed.”[12]

Reviewability and reproducibility are at the heart of verification and the scientific method. Regarding the Supreme Court’s ruling in Daubert v. Merrell Dow Pharmaceuticals, Inc., the Ninth Circuit declared on remand: “Something doesn’t become ‘scientific knowledge’ just because it’s uttered by a scientist nor can an expert’s self-serving assertions that his conclusions were ‘derived by the scientific method’ be deemed conclusive, else the Supreme Court’s opinion could have ended with footnote 2. As we read the Supreme Court’s teaching in Daubert, therefore, though we are largely untrained in science and certainly no match for any of the witnesses whose testimony we are reviewing, it is our responsibility to determine whether those experts’ proposed testimony amounts to ‘scientific knowledge,’ constitutes ‘good science,’ and was ‘derived by the scientific method.’”[13]

Judge Kozinski’s Ninth Circuit opinion noted further that a gatekeeping court must decide in part whether “… scientists have derived their findings through the scientific method or whether their testimony is based on scientifically valid principles….” (Daubert, 43 F.3d at 1316). In its gatekeeping role, the court should view reliability as follows: “this means that the expert’s bald assurance of validity is not enough. Rather, the party presenting the expert must show that the expert’s findings are based on sound science, and this will require some objective, independent validation of the expert’s methodology.”[14] In other words, review and reproduction of test findings.

The Court of Appeals of Maryland has ruled that: “Access to laboratory information generally is significant for another reason. The validity of testing procedures and principles is assessed in the scientific community by publishing the data in peer review journals …. [P]ublication of a laboratory’s work product and data used in [scientific] analysis, as well as independent replication and validation studies, are essential prerequisites to reliability.”[15] Replication and validation of Hartshorn’s findings were impossible since she presented no supporting data.

For independent reviewability, replication, and validation, lab reports should contain sufficient information to evaluate case notes and interpret the data, as well as procedures, standards, blanks, observations, and test results. Supporting documentation should include charts, graphs, and spectra generated during an analysis.[16] Since Hartshorn provided none of these details, her reports could not be verified and proved nothing, least of all that the suspected sample was marijuana.

The DEA founded and presently chairs the Scientific Working Group on the Analysis of Seized Drugs (SWGDRUG), which provides minimum standards for scientifically sound laboratory and testing procedures. According to SWGDRUG:

Laboratories shall have documented policies establishing protocols for technical and administrative review.

Laboratories shall have and follow documented analytical procedures.

Laboratories shall have in place protocols for the sampling of evidence.

Laboratories shall monitor the analytical processes using appropriate controls and traceable standards.

Laboratories shall have and follow documented guidelines for the acceptance and interpretation of data.

Analytical procedures shall be validated in compliance with Section 11.

When analysts determine the identity of a drug in a sample, they shall ensure that the result relates to the right submission. This is best established by the use of at least two appropriate techniques based on different principles and two independent samplings.

Method validation is required to demonstrate that methods are suitable for their intended purpose. For qualitative analysis (identifying drugs), the parameters that need to be checked are selectivity, limit of detection, and reproducibility.

Minimum acceptability criteria should be described along with the means for demonstrating compliance.

Validation documentation is required. Laboratories adopting methods validated elsewhere should verify their methods and establish their own limits of detection and reproducibility.

Documentation shall contain sufficient information to allow a peer to evaluate case notes and interpret the data.

Analytical documentation should include charts, graphs, and spectra generated during analysis.

Laboratories shall perform proficiency testing in order to verify the laboratory’s performance.[17]

Hartshorn was asked whether she followed DEA protocols or at least the guidelines of SWGDRUG. “[T]hey aren’t laws, and so, as of right now, that is not our policy,”[18] she casually responded. In other words, the DEA does not follow the standards body that it founded and chairs. Even worse, the “DEA does not have such guidance set forth in one particular document type or ‘protocol’ that would provide instruction on how one is to test cocaine or marijuana. . . There are no mandatory methods, and the forensic chemists are afforded considerable discretion in determining which testing methods and instruments to use.”[19] So testified Hartshorn’s lab director, James Malone, who also acknowledged that the DEA has no protocol or standard methodology and does not validate its drug tests, calibrate its testing instrumentation immediately before testing, or run contemporaneous scientific controls to prevent and detect contamination.[20]

Judge: For marijuana in this case, for example, there is no calibration? 

James Malone: There is not. . .  So we’re not running a positive control on the Duquenois-Levine (marijuana test) on a daily basis.

Prosecutor: Now with regard to standard methodologies, DEA has a standard methodology on how to do examinations?

JM: No, we don’t.

P: So for qualitative analysis, the actual identification of a drug, you don’t have such (validation) studies, as you understood her (defense expert) to mean, correct?

JM: Correct. . . Identification – (validation) studies related to identification are not generally – there are no requirements for that. (SWGDRUG: “Method validation is required to demonstrate that methods are suitable for their intended purpose.”[21])

Compare the SWGDRUG Recommendations at Part IV.A.6.1.1 (“Laboratories shall have and follow documented analytical procedures”); id. at Part IV.A.6.1.6 (“Analytical procedures shall be validated in compliance with Part IV B Validation”); id. at Part IV.B.IA (“All methods shall be validated or verified to demonstrate that they will perform in the normal operational environment when used by individuals expected to utilize the methods on casework”); and id. at Part IV.B.1.5 (“The entire validation/verification process shall be documented and the documentation shall be retained. Documentation shall include … personnel involved, dates, observations from the process, analytical data, a statement of conclusions and/or recommendations, authorization approval signature”).

In short, the DEA is not engaged in scientific testing; it is a conviction machine running on voodoo science, as one observer put it. It also means that the DEA’s labs are, in fact, unaccredited, because they received their accreditation on the representation that they follow strict protocols and SOPs, determine error rates and test limitations, validate their tests, and run positive and negative controls.

What really set off Ryan, however, was Hartshorn’s testimony that the DEA’s marijuana tests, as well as her own testing, are infallible. She claimed a zero percent (0%) error rate with the tests and her testing.[22] “Ridiculous on its face,”[23] said Ryan. “Ms. Hartshorn makes a bold statement in her testimony in which she asserted that the three tests performed in these cases are infallible in their combined ability to conclusively identify marijuana,” wrote defense expert Heather Harris. “She was unable to offer any scientific studies to confirm this assertion, which is a scientific impossibility.”[24] The NAS report concluded that “no forensic method has been rigorously shown to have the capacity to consistently, and with a high degree of certainty, demonstrate a connection between evidence and a specific individual or source.”[25]

Infallibility claims fly in the face of the fact that uncertainty enters testing in many ways, and every stage of the evidence’s life is susceptible to error. Contamination or misidentification can occur during the collection of the evidence. Analytical methods have practical and technical limitations. Reference standards and controls may fail quality control checks. Laboratory analysts who oversee the entire analytical process may make mistakes. Transcription errors can occur. In short, contrary to Hartshorn’s testimony, there is a panoply of errors that can occur. The three tests she used were a microscopic examination, a presumptive color test known as the Duquenois-Levine (D-L) test, and Thin Layer Chromatography (TLC). None of these tests provides a specific identification to the exclusion of all other possible substances, and each has an associated degree of uncertainty or error rate.

With the microscopic exam, DEA analysts look for so-called cystolith hairs, which occur on marijuana plants. But many other plant species unrelated to marijuana have cystolith hairs, so a false positive (error) is possible with this examination. Moreover, according to the NAS report, the microscopic exam can only be done properly by a qualified botanist.[26] The DEA does not employ botanists.

George Nakamura, who is not a botanist, established the use of the microscopic exam as a marijuana test. He examined 600 plants and found 80 with cystolith hairs. He then subjected the 80 “similar” plants to the D-L test, and only marijuana passed the entire test.[27] However, there was an elementary scientific flaw in Nakamura’s procedure, for which reason his report should not have been published, let alone adopted as a protocol. His plant population sample was woefully inadequate: there are 200,000 to 500,000 flowering plant species he did not examine, and there are at least 24 species of plants with cystolith hairs. Nakamura himself admitted that there were some 30,000 plants which he did not examine.
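To put the sampling problem in perspective, here is a minimal back-of-the-envelope sketch in Python. The figures (600 plants examined, 80 cystolith-bearing plants, 200,000 to 500,000 flowering plant species) are the ones given above; the “rule of three” bound is a standard statistical approximation offered purely as an illustration of how little a sample this small can prove, not as part of Nakamura’s or any court’s analysis.

```python
# Back-of-the-envelope look at Nakamura's sampling, using figures from the text.
# The "rule of three" (an upper ~95% bound of 3/n when zero failures are observed)
# is a standard approximation, used here only for illustration.

examined = 600            # plants Nakamura examined
cystolith_positive = 80   # cystolith-bearing plants he carried forward to the D-L test
species_low, species_high = 200_000, 500_000  # flowering plant species (per the 1974 study)

coverage_low = examined / species_high
coverage_high = examined / species_low
print(f"Fraction of flowering-plant species examined: "
      f"{coverage_low:.3%} to {coverage_high:.3%}")   # roughly 0.1% to 0.3%

# Even if none of the 80 cystolith-bearing plants gave a false D-L positive,
# a sample that small cannot support a claim of zero error:
rule_of_three_bound = 3 / cystolith_positive
print(f"Upper ~95% bound on the false-positive rate consistent with 0/80: "
      f"{rule_of_three_bound:.1%}")                    # about 3.8%, not zero
```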

Nakamura also claimed that the D-L test was confirmatory of, and specific to, marijuana, i.e., that it identified marijuana to the exclusion of all other substances and did not render false positives. In fact, with the D-L test, false positives are expected based on the analytical mechanism of color tests. Color tests are screening tests that look at molecular groups rather than the specific molecule as a whole. Many unrelated molecules share common molecular groups, so any substance containing the target molecular group will give a positive response. In other words, the D-L test solely identifies the group of chemicals to which marijuana belongs, and there are other chemicals in that group which could give a positive D-L response, i.e., a false positive. Moreover, Nakamura himself reported that there were 25 substances that had been shown to cause false positives with the D-L test. So his claim was contrary to chemical facts and scientific demonstrations and, again, should not have been published.

The D-L test is actually a combination of two individual tests. With the Duquenois test, a petroleum ether or chloroform extract of the plant material is added to an ethanolic solution of vanillin and acetaldehyde, followed by addition of concentrated hydrochloric acid. Marijuana gives a deep blue-violet color. With the Levine modification, the blue-violet test mixture obtained in the Duquenois test is shaken with chloroform; with marijuana, the blue-violet color transfers into the chloroform layer. However, at least 50 legal substances have been shown to give the same color reactions.

As early as 1938, the French pharmacist Pierre Duquenois, who developed the Duquenois test, found that it was not specific and gave false positives.[28] Yet he reported that the test was specific.[29] Even while claiming specificity, he worked to modify the original test into the D-L test to eliminate false positives, which, as noted above, was impossible given the nature of color tests. As he should have known in advance, the D-L test was no better and rendered false positives. Still, he reported that the D-L test was specific. Duquenois’s lie was repeated in 1972 by John Thornton and George Nakamura, who falsely claimed that the D-L test was specific and, in conjunction with a microscopic exam, was a confirmatory, identification test.[30] Their study is still the protocol for marijuana identification in crime labs throughout the country, even though its claims were false and were rebutted by Fullerton and Kurzman and by Whitehurst.

With regard to TLC, its ability to identify a substance, which in this case is not marijuana but rather its active ingredient THC, is limited by the number of distinguishable responses possible. TLC is a method of separation, not of identification. “It is prone to confusion because of the appearance of unrecognized peaks or spots on a chromatograph, particularly when an analyst is dealing with a wide variety of biological samples from a number of sources.”[31] The TLC test as generally performed for marijuana evidence has 100 distinct measurable values and 2 to 3 distinguishable colors. This allows for the distinct identification of at most 300 compounds without taking into account the possibility of compounds that will behave the same as the target molecule, THC.  In other words, a positive TLC test could indicate any one of some 300 compounds in addition to THC.
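The arithmetic behind that estimate can be sketched as follows. The figures for distinguishable measured values (about 100) and distinguishable colors (2 to 3) are those given in the text; the candidate-compound count is a purely hypothetical placeholder used only to show how quickly the plate’s resolving power is exhausted.

```python
# Rough resolving power of a TLC plate as described in the text:
# distinguishable outcomes = (distinct measured Rf positions) x (distinct spot colors).
rf_positions = 100   # distinguishable measured values (per the text)
colors = 3           # distinguishable colors (2 to 3 per the text)

distinguishable_outcomes = rf_positions * colors
print(f"Distinguishable TLC outcomes: {distinguishable_outcomes}")  # at most ~300

# With only ~300 possible outcomes, a "positive" result names a class of compounds,
# not a unique substance: every compound sharing THC's position and color lands in
# the same bin.  The candidate pool below is hypothetical, for illustration only.
candidate_compounds = 100_000
avg_per_outcome = candidate_compounds / distinguishable_outcomes
print(f"Average compounds mapping to each outcome (hypothetical pool): {avg_per_outcome:.0f}")
```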

When these three tests are performed in sequence, the uncertainty of the final result is the sum of the uncertainties attributable to each test. In this case, where each of the tests can produce errors, the uncertainty can be great. Moreover, a main concern with this sequence of testing is that the D-L and TLC tests produce results that are heavily dependent on the analyst’s subjective interpretation of the colors produced. What is dark blue to one analyst is purple to another. At a minimum, a standard reference material (a sample of known marijuana) should be tested along with the evidence sample as a comparison. The DEA does not do this. In addition, without the proper determination of the variability of positive results, the final identification is still simply the analyst’s subjective opinion.
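As a simple illustration of how per-test error accumulates across a sequence, consider the sketch below. The individual error rates are hypothetical placeholders (the text assigns no figure to any single test); the point is only that combined uncertainty grows with each added step, and that for small rates the additive estimate described above nearly matches the probability that at least one step goes wrong.

```python
# Illustration of error accumulation across a sequence of tests.
# The per-test error rates below are hypothetical placeholders, not measured values.
error_rates = {
    "microscopic exam": 0.05,
    "Duquenois-Levine": 0.05,
    "TLC": 0.05,
}

# Additive approximation, as described in the text: total uncertainty ~ sum of parts.
additive = sum(error_rates.values())

# Probability that at least one test in the sequence goes wrong, assuming
# (optimistically) that the errors are independent of one another.
p_no_error = 1.0
for p in error_rates.values():
    p_no_error *= (1.0 - p)
at_least_one_error = 1.0 - p_no_error

print(f"Additive estimate of combined uncertainty: {additive:.1%}")
print(f"Chance of at least one erroneous step (independence assumed): {at_least_one_error:.1%}")
# With subjective, non-independent steps (shared color interpretation, confirmation
# bias), the real figure can be worse than either estimate.
```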

Confirmation bias is also a concern with this sequence of testing. This is the tendency of an analyst to interpret analytical information in a way that confirms his/her preconceptions about an item of evidence as well as the results of the previous test. In a sequence of testing that relies entirely upon an analyst’s interpretation of test results, this is a likely source of error.

Hartshorn admitted that, separately, each of the tests is a screening test that renders false positives, i.e., errors. But miraculously, when they are conducted in concert, they become error-free, as does the analyst. In direct contradiction of Hartshorn’s claims of infallibility was a study done at her own lab, which found false positives and a very high 20% error rate.[32] And every independent scientific study has found an error rate and false positives with these tests. For instance, a comprehensive series of studies in 1974, involving no fewer than 14 scientists and two attorneys, concluded, in part, as follows.[33]

The probability of error in using screening tests for forensic identification is particularly great with marijuana because:

1. Screening tests are not specific;

2. Many common plants are confused with marijuana by “users” and law officers alike;

3. Inexpertly collected plant samples are not necessarily homogeneous, i.e., only a single plant; and

4. The flowering plants include some 200,000 – 500,000 species besides marijuana.

As many as 20% (an Army study found 30%) of the samples presumed to be marijuana and submitted to forensic laboratories have been found in recent years not to be Cannabis. “If BNDD (Bureau of Narcotics and Dangerous Drugs, predecessor to DEA) files are any indication, many. . . marijuana users are getting ‘high’ on parsley, alfalfa, or some other weed.” Common plants which have been confused with marijuana include tobacco, catnip, parsley, oregano, tea, and other substances, sometimes laced with various chemicals.

Inexpertly collected plant samples commonly contain some extraneous plant materials – a weed grabbed by mistake, a plant which looked like the others, etc. The forensic analyst then needs to be concerned with one plant passing one screening test, and a contaminant passing another. (Furthermore, it’s not possible to determine if ground-up plant samples are from the same species. To avoid a misidentification, the analyst should assume the sample is adulterated or contaminated.) Because of this factor, and the common presence of added chemicals, the specificity of marijuana screening tests, even when used in combination, is no greater than the specificity of the most specific single test.
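That last point, that a battery of screening tests run on a possibly mixed sample is no more specific than its most specific member, can be made concrete with a small simulation. The composition figures and per-test response rates below are hypothetical, invented only to illustrate the mechanism the 1974 study describes: with a heterogeneous sample, one fragment can pass the microscopic exam while a different fragment passes the D-L or TLC test.

```python
import random

# Hypothetical illustration of the heterogeneous-sample problem described above.
# A "sample" is a bag of plant fragments; each fragment may respond to a different test.
random.seed(0)

def fake_fragment():
    """Return a non-marijuana fragment with hypothetical per-test responses."""
    return {
        "cystolith_hairs": random.random() < 0.3,  # e.g. a nettle-like species
        "dl_positive":     random.random() < 0.3,  # e.g. a D-L cross-reactive plant
        "tlc_positive":    random.random() < 0.3,  # a co-migrating spot
    }

def battery_passes(sample):
    """The battery 'confirms marijuana' if each test is satisfied by SOME fragment,
    which is how a mixed sample defeats the combination of tests."""
    return (any(f["cystolith_hairs"] for f in sample)
            and any(f["dl_positive"] for f in sample)
            and any(f["tlc_positive"] for f in sample))

trials = 10_000
false_ids = sum(
    battery_passes([fake_fragment() for _ in range(5)])  # 5 non-marijuana fragments
    for _ in range(trials)
)
print(f"Hypothetical false identifications: {false_ids / trials:.1%}")
# With a single homogeneous fragment, all three tests would have to be fooled by the
# same plant; with a mixed sample, different fragments can satisfy different tests,
# so combining the tests adds little or no specificity.
```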

Because Hartshorn’s testimony raised serious concerns about her qualifications and “integrity,” as well as a “flaw” in her testing, Ryan ordered the prosecution to produce all information verifying that the three tests in combination were infallible.[34] What Ryan apparently did not realize was that Hartshorn was repeating unsupported infallibility claims made by DEA lab directors since at least 1999. For instance, on April 9, 1999, Joseph P. Bono, director of the DEA’s Mid-Atlantic Laboratory, submitted a sworn affidavit to the courts stating that all DEA analyses and tests are “incapable of producing a false positive. . . In other words, even if the test results are inaccurate, the results will not indicate the presence of a controlled substance when none is present in the unknown sample. Even if the instruments used in the testing are not properly calibrated, if no controlled substance is present in the exhibit, then no controlled substance will be identified . . . even when an instrument is not functioning properly, it will not identify cocaine, or any other controlled substance, as being present in a sample, unless that controlled substance is actually present.”[35]

Bono’s successor at this lab, Richard Fox, was more specific in his sworn affidavit, which stated, in part, that:

“There is no other plant material that will give a positive result for all three tests. . . Neither the analyst in this case, nor any other DEA analyst, has ever misidentified marijuana. . . As such, the uncertainty measurement associated with the conclusions reached by the analyst resulting in the identification of marijuana is zero.”[36]

Fox’s successor, James Malone, who is also Hartshorn’s supervisor and has signed off on her reports, testified, in part, as follows in another marijuana case in D.C.:

Prosecutor: To your knowledge, while you’ve been at the lab, has the laboratory ever misidentified a controlled substance?

James Malone: No.

P: And when you say – what are you basing that on?

JM: On my knowledge of the operations of the laboratory. We have not misidentified anything.

P: Are you aware of anything which shows that a mis-calibrated system or chromatographer in this case, but any system that was not calibrated correctly would create a false positive for cocaine or a controlled substance?

JM: No.

P: Have you ever seen it in the lab? 

JM: Have I ever seen what?

P: A false positive from a mis-calibrated system.

JM: No.

Judge: But Mr. Chawla’s position was, can it ever – can a mis-calibrated machine ever give a false positive?

JM:  No. A mis-calibrated machine isn’t going to give you a positive cocaine if there’s not cocaine.

Judge: Why not?

JM: It just wouldn’t. . .

P: More specifically, if the reagent isn’t working, is it going to show that the substance isn’t marijuana? In other words, if the reagent isn’t working, what’s the result of the Duquenois-Levine going to be?

JM: It’s going to be negative.

P: Would I get a positive out of a Duquenois-Levine test? If I used a reagent that wasn’t working anymore and tried to run a Duquenois-Levine with that reagent, what would happen?

JM: You wouldn’t get a false positive, no.

P: With regard to identification techniques, is there any – do you have any reason to believe that a mis-calibrated or non-calibrated device would result in a false positive?

JM: No, I don’t.[37]

Malone’s testimony makes clear that he is basing his infallibility claim on ipse dixit, as were Bono and Fox, who never presented data to support their unheard-of assertions.

Decades before their infallibility claims, several high courts including the U.S. Supreme Court found that the tests did not prove the presence of marijuana beyond a reasonable doubt. The Supreme Court of Wisconsin ruled in 1973 that: “An expert opinion that the substance is probably marijuana (based on a microscopic examination, D-L test and TLC) is not sufficient to meet the burden of proving the identity of the substance beyond a reasonable doubt. . . If this were a possession case, the tests would be insufficient. . . It is quite true that the tests used by Mr. Michael Rehburg, a chemist and witness for the prosecution, were not specific for marijuana. . . . He admitted, . . .  these tests were not specific for marijuana.”[38]

In 1979, a trial judge in North Carolina found that the D-L test was “not specific for marijuana” and had “no scientific acceptance as a reliable and accurate means of identifying the controlled substance marijuana.”[39] This finding was upheld by the North Carolina Court of Appeals as well as the North Carolina Supreme Court which found that: “The determination that the test used was not scientifically acceptable because it was not specific for marijuana was amply supported by the facts. . . The trial court’s ruling that the results of the tests conducted on green vegetable matter by using the Duquenois-Levine color test in the Sirchie drug kit were inadmissible in evidence was supported by the court’s findings that the test is not scientifically accepted, reliable or accurate and that the test is not specific for marijuana because it reportedly also gives a positive reaction for some brands of coffee and aspirin. . . . The conclusion to exclude the test results is amply supported by these findings of fact . . . and the test results were properly suppressed . . .”[40]

Also in 1979, the U.S. Supreme Court in Jackson v. Virginia ruled that nonspecific tests could not be the basis for advancing a prosecution or a conviction because they do not provide proof beyond a reasonable doubt.[41]

Ultimately, Judge Ryan concluded that “such claims of infallibility belie one of the most basic tenets of science: that some degree of error is inherent in every scientific test, process, or analysis. . .While explaining that each of these tests used alone is presumptive, as distinct from confirmatory, Ms. Hartshorn nonetheless maintained their infallibility when used in concert. With the designation that these tests are merely presumptive, the DEA chemist acknowledged that there is some degree of inherent error calculable with respect to each of these tests when they are performed in isolation. That there is some distinct and additional degree of error calculable with respect to this analyst’s performance of each test is also without question.”[42]

It is clear from Judge Ryan’s remarks that he would have denied admission of the test results, as well as Hartshorn’s testimony, at trial, and this would have resulted in a withdrawal of the charges. He did not do so because defense counsel did not request it. Since Kurzman’s study and others, as well as applicable court decisions such as Jackson, Daubert, and Kumho Tire, predated this case, defense counsel should have requested an evidentiary hearing to challenge the tests and the sufficiency of the evidence. Their failure to do so amounted to ineffective assistance of counsel.

This is exactly what U.S. District Court Judge Nancy Gertner concluded in a similar case wherein the defense counsel did not request an evidentiary hearing to challenge the forensic evidence. This is seen if one simply substitutes “marijuana” in her following remarks. “Under the ‘prevailing professional norms,’ reasonably competent counsel should have moved for a Daubert/Kumho Tire hearing before trial on all the expert testimony — a) on the [marijuana] laboratory analysis based on the investigator’s failure to use a comparison or control sample and not test beyond the generic finding of [“Exhibit 1 contains a measurable amount of marijuana”]; b) on the [marijuana] evidence, highlighting problems with proficiency testing and emphasizing the limited scope of the testimony; and, c) on the expert cause-and-origin testimony, when the expert’s proposed testimony was scientifically flawed. If counsel had requested such a hearing, there is more than a ‘reasonable probability’ that it would have been granted, that the laboratory analysis and the [marijuana] evidence would have been excluded, or severely limited, at the very least. . . .  As the Court held in Daubert, some testimony may be so problematic that the usual trial techniques are just not enough to prevent a jury from giving it far more credence than it deserves. See Daubert, 509 U.S. at 596-97. The testimony should not reach the jury at all. (This was absolutely true with Judge Ryan’s case.) Here, the scientific literature cast doubt on the significance of the [marijuana tests] and even raised concerns about . . . “proficiency” testing, concerns counsel never raised. . . just what the law and literature caution against”[43]

It is instructive to compare the two cases in detail because, like Judge Ryan, Judge Gertner critiqued the government’s evidence and experts through the prism of the NAS report. Gertner pointed out that it was significant that by 2006 a number of articles in legal journals, and a number of cases, had cast a critical eye on the scientific reliability of arson evidence, methodologies, and techniques. Because of this, competent counsel should have been aware that defendants had been convicted and sentenced on the basis of flawed arson evidence and should have taken appropriate steps to litigate the issues using all the tools available, including challenging the tests and requesting an evidentiary hearing.

The same was even more true of marijuana evidence by the time of Judge Ryan’s case.  The marijuana tests had been scientifically established as unreliable and inaccurate, and previous court decisions had excluded admission of the marijuana test results as evidence.

Gertner found that there was ineffective counsel because the defense attorneys did not move for a Daubert hearing prior to trial on any expert issue. They did not seek exclusion of any of the proposed expert testimony or move for its limitation. They did not argue that the expert testimony failed to meet the minimal threshold for reliability of scientific evidence and should not have been admitted at all. They did not alert the Court to the ways in which the government’s investigation undermined their very ability to present a defense.

The same was true with the case of Judge Ryan who called Hartshorn’s testimony “[R]idiculous on its face” and lacking in “integrity.”

In addition, Gertner argued that it was crucial to try to exclude expert testimony before trial because “a certain patina attaches to an expert’s testimony unlike any other witness; this is ‘science,’ a professional’s judgment, the jury may think, and give more credence to the testimony than it may deserve. United States v. Hines, 55 F. Supp. 2d 62, 64 (D. Mass. 1999); see also Michigan Millers Mut. Ins. Corp. v. Benfield, 140 F.3d 915, 920 (11th Cir. 1998) (‘The use of ‘science’ to explain how something occurred has the potential to carry great weight with a jury, explaining both why counsel might seek to couch an expert witness’s testimony in terms of science, as well as why the trial judge plays an important role as the gate-keeper in monitoring the evidentiary reliability of such testimony.’).”[44]

This again was even more true in Judge Ryan’s case because DEA chemists were poised to testify that the marijuana tests, as well as their own testing, were infallible, and that no DEA analyst had ever misidentified marijuana. In fact, defense counsel in Judge Ryan’s case had been involved in a previous marijuana case wherein DEA analysts had claimed infallibility under oath. All the more reason why they should have sought to exclude the evidence.

For its part, the DEA was ethically and scientifically bound to suspend Hartshorn and Malone and investigate all their previous marijuana cases. In fact, Hartshorn and Malone were subsequently both witnesses in another discovery hearing in the same courtroom, opposed by the same defense counsel, who again had not requested an evidentiary hearing to challenge the same marijuana tests. This hearing was presided over by Judge Florence Y. Pan, who had read Judge Ryan’s order.[45] Heather Harris, who was highly praised by Judge Ryan, was the defense expert in this hearing as well. With no justification, Pan found Hartshorn and Malone to be credible, as opposed to Harris, even though Malone claimed infallibility without any proof. “On my knowledge of the operations of the laboratory,” he said, “We have not misidentified anything.” He also said that “a mis-calibrated instrument would never cause a false positive result.” Asked why by Pan, he replied: “It just wouldn’t.”[46]

As we saw, Malone further testified that the DEA has no protocols or standard operating procedures and does not validate its tests or run sufficient numbers of controls.[47] He added that the Analysis of Drugs Manual and the Analytical Sufficiency Document are “the closest thing the DEA has to standard operating procedures for the chemists.”[48] Again, no problem for Pan, even though Malone said these documents were “DEA proprietary,”[49] and SWGDRUG and scientific practice require protocols, test validations, and controls. Malone claimed there were published studies validating the tests, but this is not true.

Harris disagreed with Malone on all counts. No problem for Pan, who decreed that: “To the extent the testimony of the witnesses conflicts, however, the Court credits the testimony of Mr. Malone. . . the Court found the testimony of Mr. Malone to be extremely credible and persuasive [and was] impressed by Mr. Malone’s candor, expertise, and professional demeanor. . . His testimony was very clear and logical, and the Court found him to be forthright.”[50] Pan did not mention the lack of scientific data or explanations for Malone’s testimony or that it showed the DEA was at odds with SWGDRUG requirements and scientific practice.

In short, there were more than enough scientific studies and favorable case law before Judge Pan’s case, not to mention Judge Ryan’s order, to justify requesting an evidentiary hearing in an attempt to deny admission of the test results as evidence. Defense counsel also had a highly qualified expert to confirm that the tests results did not provide proof beyond a reasonable doubt, and that the DEA’s proffered evidence and testimony were false.

As Judge Gertner observed: “If the lawyers do not tee up the issue, the evidence will be introduced without objection.”[51] This is exactly what happens in nearly all marijuana cases. Defense attorneys do not challenge the tests or the sufficiency of the evidence. In 2010, 853,839 people were arrested on marijuana charges, and you can count on one hand the number of defense attorneys who challenged the tests or even the subjective opinions of arresting police officers.

This failure on the part of defense attorneys is particularly irresponsible because claims of infallibility can work to the advantage of a defendant: they undermine the admissibility of marijuana test results and the credibility of a prosecutor’s expert witnesses. Before a trial, a defense attorney can request an evidentiary hearing wherein he or she can examine the qualifications of the prosecution’s forensic analysts, the laboratory, and the nature and manner of the testing procedures used in identifying the drug the defendant was charged with possessing or selling. If this examination reveals deficiencies or inadequacies, the attorney can challenge the sufficiency of the prosecution’s evidence and seek a dismissal. Dr. Bruce Stein et al. have reported that: “Based on our survey, such a challenge would be warranted in many cases.”[52]

The possibilities of such challenges were seen in a 2010 case in Michigan. Defense attorney Michael Nichols obtained a pre-trial evidentiary hearing and cross-examined Michigan State Police lab analyst Jerome Waldron, who testified that in more than 6,000 cases he had never encountered a false positive, and that the marijuana tests had an error rate of zero. Nichols then entered a motion to exclude Waldron’s testimony from trial, as well as use of the test results as evidence, citing scientific articles, prior court decisions, and Waldron’s lack of credibility. Even before the judge rendered his decision, the prosecutor withdrew the charges.[53]

If lab conditions or procedures do not conform to scientific guidelines and principles or to court rulings such as Daubert, the defense attorney can move to exclude the test results, as well as testimony from the analyst, at trial. Below is a list of discovery requests that have led to pre-trial dismissals of marijuana charges because they revealed deficiencies in the lab.

1. Evidence collection forms or logs (description of evidence, packaging, identification of specimens, identification of individuals collecting samples, sample collection procedures).

2. Chain-of-custody records (field-to-lab transfers, and all transfers of evidence and associated analytical samples within the laboratory).

3. Laboratory receiving records (records documenting the date, time and condition of receipt of the evidence in question; laboratory-assigned identifiers; storage location).

4. Laboratory procedures for subsampling (collection of analytical aliquots) and contamination control.

5. Copies of technical procedures in effect at the time the subject testing was performed (often termed Standard Operating Procedures, or SOPs) for each procedure used during sample screening and confirmation, including sample preparation, sample analysis, data reporting, and instrument operation.

6. Copies of the two bracketing controlled substance proficiency results for each analyst and technician responsible for preparation or analysis of subject specimens, including raw data and reported results, target values and acceptance ranges, performance scores, and all related correspondence.

7. Copies of traceability documentation for standards and reference materials used during analysis, including unique identifications, origins, dates of preparation and use, composition and concentration of prepared materials, certifications or traceability records from suppliers, assigned shelf lives and storage conditions.

8. Sample preparation records, including dates and conditions of preparation, responsible analyst, procedural reference, purity, concentration and origins of solvents, reagents, and control materials prepared and used, samples processed concurrently, extract volume.

9. Copies of bench notes, log books, and any other records pertaining to case samples or instruments; records documenting observations, notations, or measurements regarding case testing.

10. Instrument run log with identification of all standards, reference materials, sample blanks, rinses, and controls analyzed during the day/shift with subject samples (as appropriate: run sequence, origins, times of analysis and aborted run sequences).

11. Record of instrument operating conditions and criteria for variables, including as appropriate: Gas chromatograph column, instrument file identification, tuning criteria, instrument performance check (e.g., ion abundance criteria), initial calibration, continuing calibration checks, calibration verification.

12. Record of instrument maintenance status and activities for instruments used in subject testing, documenting routine and as-needed maintenance activities in the weeks surrounding subject testing.

13. Raw data for the complete measurement sequence (opening and closing quality control included) that includes the subject samples.  For GC-MS analysis, this would include: areas and retention times, injection volumes, dilution factors, chromatograms and mass spectra.  As prepared and as determined values for all quality control samples.

14. A description of the library used for spectral matches for the purpose of qualitative identification of controlled substances, including source(s) and number of reference spectra.

15. Copy of records documenting computation of illicit drug laboratory’s theoretical production yield, including the basis for the computation, and the algorithm used, as appropriate.

16. Procedure(s) for operation and calibration checks of analytical balances used to weigh controlled substances

17. Results of calibration checks and documentation of mass traceability for gravimetric determinations.

18. Results of contamination control surveys for trace level analytes relevant to test methods at the time of analysis, including sampling design and analytical procedures.

19. Records and results of internal reviews of subject data.

20.  Method validation records documenting the laboratory’s performance characteristics for qualitative identification and quantitative determinations of the controlled substance, to include data documenting specificity, accuracy, precision, linearity, and method detection limits.

21. Copy of the laboratory’s Quality Manual in effect at the time the subject samples were tested as well as the laboratory’s most recent Quality Manual (however named; the document that describes the laboratory’s quality objects and policies).

22. Copy of the laboratory’s ASCLD-LAB application for accreditation, and most recent Annual Accreditation Review Report, as appropriate.

23. Statement of qualifications of each analyst and/or technician responsible for processing case samples to include all names, locations and jurisdictions of cases in which these personnel testified concerning the same substances found in the present case.

24. Copy of the laboratory’s ASCLD-LAB on-site inspection report, as appropriate, as well as any reports of on-site inspections by any other testing laboratory audit organization.

25. Copy of internal audit reports generated during the period subject samples were tested.

26.  List of capital instrumentation in the laboratory at the time subject testing was performed, including manufacturer, model number, and major accessories.

27. Production throughput data for the drug testing section: numbers of tests performed per month or per year, and the number of Full Time Equivalent personnel in the drug testing section of the laboratory.[54]

Marijuana field tests also have specific requirements that are seldom observed by the police. For instance, the field tests used by police officers have expiration dates because the chemicals and reagents in the tubes deteriorate over time and as a result of heat or cold. Before going to a hearing or trial, a defense attorney can find out exactly what brand of field test kit was used with his/her client. This can be done through a public records request and sometimes by simply asking the prosecutor. The defense attorney can then purchase the exact same kit online. In court, the defense attorney can show the judge that the test has an expiration date after which the test would be inaccurate. If the police officer did not check the expiration date before using the test, then the test results should be assumed to be invalid. Under the law, any tests or equipment that are not in good working order produce results that are inadmissible as evidence. If the police officer cannot attest to the expiration date or whether the test was used after its expiration date, the drug charges should be dismissed. Some search warrants are based on positive kit results and may be ruled invalid if the police officer did not know the expiration date of the kit. This should also result in a dismissal of charges.

Even if the field test has not expired, the test does not prove the presence of marijuana in a seized substance because it is a presumptive or screening test only. Information accompanying the kits indicates this fact. For instance, the carton containing one commonly used NIK field test states that it is: “A specially formulated reagent system for the presumptive identification of Marijuana.” In other words, the company itself is saying that the test does not prove the presence of marijuana. It is further stated that: “The results of a single test may or may not yield a valid result. . . There is no existing chemical reagent test, adaptable to field use that will continually eliminate the occurrence of an occasional invalid test results [sic]. A complete forensic laboratory would be required to qualitatively identify an unknown suspect substance.”[55] A defense attorney can show this to a judge or jury and explain what it means. Therefore, if the only evidence is positive results from a field test, the charges should be dismissed or the defendant acquitted.

Recently, defense attorneys in Colorado did challenge the DEA’s test results and blocked their admission as evidence, including results from Gas Chromatography/Mass Spectrometry (GC/MS) analysis, the gold standard of drug testing. U.S. District Court Judge Marcia S. Krieger of Colorado ruled on April 21, 2011 that, based on DEA information and the testimony of DEA chemist Anthea Chan, the prosecution failed to show the existence of reliable, accurate testing being reliably applied that proved the presence of amphetamines. She therefore denied admission of the test results as evidence at trial.[56]

The hearing, known as a Rule 702 (of the Federal Rules of Evidence) hearing, provided a rare glimpse into the inner workings of a DEA lab. It was meant to determine whether the DEA’s testing conformed to Rule 702 requirements for scientifically sound testing, requirements that are all but identical to those of Daubert. Krieger’s first task was to determine whether Chan had tested correctly according to DEA protocols and SOPs. Chan testified that she followed no protocols or SOPs and, in fact, was not aware of any.[57] These facts alone, said Krieger, were enough to deny admission of the test results as evidence because it was impossible to determine whether Chan reliably applied reliable tests.

Krieger did not, however, rule at this point because she wanted her ruling to encompass defense expert Janine Arvizu’s findings. Arvizu attempted to reconstruct the practices, protocols, and results relevant to Chan’s qualitative and quantitative test conclusions and to determine whether they adhered to quality requirements and universally accepted standards designed to ensure the quality and reliability of tests, specifically what are known as ISO 17025 standards. However, as was the case in Washington, only a very limited amount of discoverable laboratory material was made available, making it impossible to determine or evaluate the laboratory’s technical requirements or quality controls during the subject testing.

“That’s exactly the position the Court finds itself in,” noted Judge Krieger, “because it does not have evidence as to the protocol that was used, the reliability of the protocol compared to other labs, or whether Ms. Chan complied with the protocol in a reliable fashion.”[58]

Arvizu was, however, able to determine that Chan’s testing in particular was unreliable and inaccurate. Chan first used the Marquis chemical color test as a screening test, and the suspected substance turned orange/brown, suggesting it was amphetamines. But the test was unreliable and meaningless because she did not use a color chart with which to compare her results. As she herself testified: “I believe it’s the same as you saying something is blue and me saying it’s light blue. It’s subjective.”[59] Subjective tests are unreliable by definition.

Her next test was a GC/MS analysis. Chan first ran a “blank,” or negative control, to check for contamination. The test consisted of putting the suspected amphetamines into a solution and then placing this solution onto the machine. But she first put the solution alone onto the machine, without the sample, to see whether it would register positive. It did, meaning the machine was contaminated.[60] As Arvizu testified: “When quality control samples fail, the run should be terminated and the failure should be investigated and corrective action taken before unknown sample are tested.”[61] Inexplicably, Chan continued the testing with the contaminated machine.

Actually, even before beginning her test, Chan should have run a positive control by placing a known quantity of amphetamines, known as a standard, on the machine to calibrate it and see whether it was working properly. DEA analysts are required under ASCLD/LAB and ISO 17025 guidelines to run standards immediately before testing. Chan said she was not familiar with these guidelines and was not required to do so. Chan’s superior, Shana Irby, who approved her testing, also testified that running contemporaneous standards is not required and that it suffices if the machine was checked ten months prior.[62] She claimed to have never seen any protocol requiring the running of contemporaneous standards, and that “as soon as I walk up to an instrument, I know – I generally know if it’s working or not.”[63] She also claimed it was not necessary to check beforehand whether the standard had degraded because “[M]ethamphetamine to my knowledge does not degrade.”[64] This is false, and these standards come with an expiration date beyond which they are not usable.

DEA labs are accredited by the American Society of Crime Laboratory Directors/Laboratory Accreditation Board (ASCLD/LAB) under the international criteria detailed in ISO/IEC 17025:2005 and the 2006 ASCLD/LAB International Supplemental Requirements. Accreditation certifies that the management and technical operations of the laboratory comply with the program requirements, including any corrective action required during audits. (Details regarding the accreditation program may be obtained from www.ascld-lab.org.) In other words, DEA labs are accredited on the basis that they subscribe to ISO/IEC 17025 and the ASCLD/LAB International Supplemental Requirements. Arvizu said the DEA adheres to neither and is, therefore, de facto unaccredited.[65] Judge Krieger stated that: “Her testimony in this regard is unrebutted.”[66]

Ultimately, Judge Krieger concluded that: “The record does not reflect the DEA’s protocol for the performance of any of this testing. It does not reflect evidence that the protocol was accepted, is treated as generally reliable, or it complies with scientific standards applied in other laboratories. And the record does not reflect that Ms. Chan, even though she carefully explained what she did — that her actions complied with an established protocol; in other words, that she reliably applied a reliable methodology.”[67]  For these reasons, Krieger denied admission of the evidence.

Following the series of studies showing that marijuana tests did not provide proof beyond a reasonable doubt, Dr. Marc Kurzman, who was also an attorney, challenged the tests in court and produced a “flurry of acquittals” as well as day-before-trial dismissals. “In fact, seven such ‘day before trial dismissals’ (out of eight scheduled trials) were achieved by Dr. Kurzman in the weeks preceding completion of this paper.”[68]

With Judge Ryan’s order, Judge Krieger’s ruling, and the NAS report, defense attorneys now have an even stronger case for challenging the tests and the sufficiency of evidence, especially since judges are now obliged to thoroughly screen all forensic evidence and forensic expert testimony before admitting it. As the New Jersey Superior Court noted: “Science moves inexorably forward and hypotheses or methodologies once considered sacrosanct are modified or discarded. The judicial system, with its search for the closest approximation to the ‘truth,’ must accommodate this ever-changing scientific landscape.”[69] The Supreme Court made the same point in Daubert when it noted that “scientific conclusions are subject to perpetual revision.”[70]

“It is not only unnecessary for the courts to accept conclusory drug identifications based on nonspecific tests,” wrote Professor Edward Imwinkelried in 1984, “it is also unwise for them to do so. The essence of the scientific method is formulating hypotheses and conducting experiments to verify or disprove the hypotheses. A proposition does not become a scientific fact merely because someone with impressive academic credentials asserts it is a fact. Testimony should not be treated as an expert, scientific opinion without a truly scientific basis, such as experimentation. Conclusory drug identification testimony is antithetical and offensive to the scientific tradition, and courts should not allow ipse dixit to masquerade as scientific testimony.

“. . . It would eviscerate the Jackson standard to sustain a conclusory drug identification in the teeth of the judicially noticeable fact that every test used to identify the substance is nonspecific. Even more importantly, sustaining such drug identifications places a judicial imprimatur on testimony that cannot justifiably be labeled scientific. The rejection of such identifications is necessitated not only by due process but also by the simple demands of intellectual honesty. After Jackson, sustaining conclusory, nonspecific drug identification evidence is both bad science and bad law.”[71]

For 25 years, Imwinkelried’s words have been ignored by trial courts, and bad science and bad law have prevailed. For instance, in 2006, U.S. District Court Judge William Alsup declared that: “Despite the many hundreds of thousands of drug convictions in the criminal justice system in America, there has not been a single documented false-positive identification of marijuana or cocaine when the methods used by the SFPD Crime Lab are applied by trained, competent analysts.”[72] Based on this erroneous opinion, Alsup admitted the test results from nonspecific tests as evidence. In 2009, the NAS report confirmed that the fundamental questions of whether a particular forensic discipline is founded on a reliable scientific methodology that gives it the capacity to accurately analyze evidence and report findings, and of the extent to which its practitioners rely on human interpretation that could be tainted by error, the threat of bias, or the absence of sound operational procedures and robust performance standards, have not been “satisfactorily dealt with in judicial decisions pertaining to the admissibility” of evidence.[73] Bad science and bad law.

Because of this situation, the report noted that judges needed to begin examining in every case prior to trial: “(1) the extent to which a particular forensic discipline is founded on a reliable scientific methodology that gives it the capacity to accurately analyze evidence and report findings and (2) the extent to which practitioners in a particular forensic discipline rely on human interpretation that could be tainted by error, the threat of bias, or the absence of sound operational procedures and robust performance standards.”[74]

The Supreme Court immediately embraced the NAS report and, in Melendez-Diaz v. Massachusetts, directed that drug tests and analysts had to be thoroughly screened before trial.[75] In this case, the prosecutor had introduced written certificates by state laboratory analysts claiming that material seized by police was cocaine. The crime lab analysts were not called to testify. During the arguments before the Supreme Court, the state had urged that laboratory analysts should not be made to testify because forensic science evidence is the product “of neutral, scientific testing.” The Court, citing the NAS report, went out of its way to reject this claim, noting that “[s]erious deficiencies have been found in the forensic evidence used in criminal trials.”[76] The Court then pointed out, by way of example, that: “The affidavits submitted by the analysts contained only the bare-bones statement that ‘[t]he substance was found to contain: Cocaine.’ At the time of trial, [the defendant] did not know what tests the analysts performed, whether those tests were routine, and whether interpreting their results required the exercise of judgment or the use of skills that the analysts may not have possessed.”[77]

The Court again cited the NAS report that: “The forensic science system, encompassing both research and practice, has serious problems that can only be addressed by a national commitment to overhaul the current structure that supports the forensic science community in this country.”[78]

So while the Court ruled that the defendant had the right to cross-examine the analysts, it emphasized that “[c]onfrontation is one means of assuring accurate forensic analysis.”[79] In short, while cross-examination provides a minimal constitutional safeguard that helps check the accuracy and reliability of drug test results offered in criminal trials, it is far from adequate. There has to be pre-trial examination of tests and government analysts.

Although the Supreme Court embraced the NAS report, trial judges in marijuana cases have not only ignored it but have ruled contrary to its findings, to the Supreme Court’s decisions in Jackson, Daubert, and Kumho Tire, and to state supreme court decisions. This amounts to admitting invalid, unreliable test results as evidence, along with false testimony such as infallibility claims. Bad law and bad science continue. This was seen with Judge Pan, who accepted that the DEA’s analysts and marijuana tests are infallible because she was “impressed by Mr. Malone’s candor, expertise, and professional demeanor.”[80] None of these characteristics, of course, verified his infallibility claims. Even Judge Marcia Krieger, who originally denied the admission of DEA drug test results as evidence, reversed herself without justification and later admitted them as evidence.[81] In a sense, Judge Ryan was even worse. He embraced the NAS report and condemned the DEA’s infallibility claims as impossible but took no action. And his order is sealed, allowing it to be ignored by the DEA and prosecutors. Ryan and Judge Zoe Bush have also denied three requests from the author to review the case file containing Ryan’s order and Heather Hartshorn’s infallibility testimony, despite the author’s signing a legally binding statement that he would not reveal the name of the juvenile defendant, a name he does not have.

At the same time, the NAS report, Judge J. William Ryan’s order, Judge Marcia Krieger’s original ruling, and the Supreme Court’s decisions have had no effect on the DEA or prosecutors. If anything, they have made them more intransigent and dishonest. The DEA continues to use the same flawed tests, and U.S. Attorneys have stated that the NAS report need not be considered by judges assessing the admissibility of forensic evidence. One brief, for instance, asserted that: “[T]he NRC Forensic Science Report does not support the conclusion that fingerprint evidence is inadmissible under the Frye calculus. In fact, the Honorable Harry T. Edwards, Co-Chair for the NRC Forensic Science Report, has stated on the public record that the report is not intended to affect the admissibility of any forensic evidence.”[82] (Edwards has responded that “[T]his is a blatant misstatement of the truth.”[83]) In Judge Florence Pan’s case, noted above, the prosecution and the DEA proceeded on the basis that DEA analysts and marijuana tests are infallible right after being told by Judge Ryan that such a claim was ridiculous on its face. Another ongoing marijuana case in D.C. is proceeding on the same basis.

Defense attorneys have also ignored the NAS report, as well as the DEA’s infallibility claims, and do not request evidentiary hearings to challenge the tests. In 2010, there were 853,839 marijuana arrests in the U.S.; only one defense attorney challenged the tests and requested an evidentiary hearing. Attorneys in the D.C. Public Defender Service are fully aware of the NAS report and Judge Ryan’s order and have known about DEA infallibility claims since at least 1999. They have never requested an evidentiary hearing to challenge marijuana tests.

The American Academy of Forensic Sciences also needs to open a full-scale investigation of marijuana testing, particularly since its recent president, Joseph Bono, led the NAS committee to conclude erroneously that: “The analysis of controlled substances is a mature forensic science discipline and one of the areas with a strong scientific underpinning. The analytical methods used have been adopted from classical analytical chemistry, and there is broad agreement nationwide about best practices. . . Controlled substances are analyzed by well-accepted standard schemes or protocols. . . The chemical foundations for the analysis of controlled substances are sound, and there exists an adequate understanding of the uncertainties and potential errors. . . [and] experienced forensic chemists and good forensic laboratories understand which tests (or combinations of tests) provide adequate reliability.”[84] Experience and goodness have no bearing on reliability; a marijuana test is either reliable or it is not. Bono was the only witness who testified before the committee about marijuana tests. The report also noted that “an exception”[85] has been made with marijuana tests regarding the need to provide proof beyond a reasonable doubt of the presence of marijuana in a seized substance. This translates into the denial of the constitutional rights to due process and a fair trial.

Although she was not referring specifically to marijuana test results, Judge Nancy Gertner ruled in the above-noted case that: “In the past, the admissibility of this kind of evidence was effectively presumed, largely because of its pedigree – the fact that it had been admitted for decades. As such, counsel rarely challenged it, and if it were challenged, it was rarely excluded or limited. The NAS report suggests a different calculus — that admissibility of such evidence ought not to be presumed; that it has to be carefully examined in each case, and tested in the light of the NAS concerns, the concerns of Daubert/Kumho case law, and Rule 702 of the Federal Rules of Evidence.”[86]

Professor Jennifer L. Mnookin agreed:

“Science deals in probabilities, not certainty. The only forensic science that makes regular use of formal probabilities is DNA profiling, in which experts testify to the probability of a match. None of the rest of the traditional pattern-identification sciences – such as fingerprinting, ballistics, fiber and handwriting analysis – currently has the necessary statistical foundation to establish accurate probabilities. (Ed note: Neither does the analysis of controlled substances.) Yet, instead of acknowledging their imperfect knowledge, fingerprint experts, for example, routinely testify that they can identify a specific person’s prints to the exclusion of all other people in the world with 100% certainty. . . .

The courts have almost entirely turned a deaf ear to these [problems], essentially giving forensic science and its practices a free pass, simply because they’ve been part of the judicial system for so long. Meanwhile, scandals continue to come to light across the nation involving error and even fraud in labs.

The findings in the National Academy of Sciences report should spur judges to require higher standards. At a bare minimum, judges should immediately prohibit experts from testifying to impossibilities such as ‘an error rate of zero’ or asserting that they are capable of making 100% certain identifications. . . .”[87]
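Mnookin’s point about probabilities can be made concrete with a short worked example. The figures below are purely hypothetical illustrations, not measured error rates for any DEA test or laboratory; the point is only that a claim of 100% certainty is equivalent to asserting a false-positive rate of exactly zero, which no finite validation study can establish. If a screening test has sensitivity $s$, false-positive rate $f$, and a fraction $p$ of submitted samples actually contain marijuana, then by Bayes’ theorem the probability that a sample reported “positive” really contains marijuana is

\[
P(\text{marijuana} \mid \text{positive}) \;=\; \frac{s\,p}{s\,p + f\,(1-p)}.
\]

With illustrative values $s = 0.99$, $f = 0.02$, and $p = 0.90$,

\[
P(\text{marijuana} \mid \text{positive}) \;=\; \frac{0.99 \times 0.90}{0.99 \times 0.90 + 0.02 \times 0.10} \;=\; \frac{0.891}{0.893} \;\approx\; 0.998,
\]

that is, roughly 99.8 percent rather than 100 percent, and the figure falls further as the false-positive rate rises or the proportion of genuinely marijuana-containing samples drops. Certainty claims, in other words, cannot be reconciled with any nonzero error rate.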

In 1975, Fullerton and Kurzman wrote: “During 1974, over 500,000 people will be arrested for possession of marijuana. Assuming all of them were lawfully arrested and/or searched, and that they had the requisite intent and knowledge — can they be acquitted? The answer, with the present statutory definitions of “marijuana” (as Cannabis sativa L) and with the analytical tests employed by most forensic/analytical laboratories (microscopic examination, Duquenois-Levine color test, thin-layer or gas chromatography) is unequivocally — yes!”[88]

This is even more true today, which is why defense attorneys need to begin challenging marijuana tests and analysts’ testimony and requesting evidentiary hearings.

John Kelly is an independent research scientist and a former “Research Scientist I” for the State of Illinois. He is the author (with Phillip Wearne) of Tainting Evidence, which was nominated for a Pulitzer Prize, and of the critically acclaimed False Positives Equal False Justice, a scientific/legal investigatory report about marijuana testing, and was a contributing author to the award-winning Into the Buzzsaw. His latest book is Beat Marijuana Charges Without Plea Bargaining.

Notes.


[1] In Re O.W., D.C. Superior Court/Family Court, 09-DEL 1997, April 2, 2010

[2] Committee on Identifying the Needs of the Forensic Science Community, National Research Council of the National Academies of Science, Strengthening Forensic Science in the United States: A Path Forward, (2009)

[3] Id., pp. 21-22

[4] DEA Forensic Chemist Worksheet, Form DEA-86, File Number 025-381, Exhibit No. 1, Lab No. 10136252, March 31, 2010. (Copies available from author: kjohn39679@aol.com.)

[5] State v. Dunn, 571 S.E.2d 650, 655 (N.C. Ct. App. 2002)

[6] Id.

[7] Jackson v. Virginia, 443 U.S. 307 (1979).

[8] Id., See supra note 1

[9] Id.

[10] Vedoster Ingram, Law Enforcement and Society Can Benefit From Greater Transparency in Controlled Drug Analyses, undated (Available @ www.vingramenterprisesltd.com)

[11] Id., See supra note 2

[12] Id., See supra note 1

[13] USA v. Frazier, 387 F. 3d 1244 (9th Cir. 2004)

[14] Id.

[15] Cole v. Maryland, 835 A.2d 600, 609-10 (Md. 2003).

[17] Id.

[18] Id., See supra note 1

[19] Declaration of James V. Malone, Lab Director/DEA Mid-Atlantic Laboratory, July 7, 2010, Washington, D.C., p. 3

[20] USA v. Darryl M. Williams and Kevin Ross, D.C. Super. Ct. Crim. Nos. 2010 CMD 3630 and 2010 CMD 3631, Sept. 17, 2010, Tr. pp. 59, 60, 72.

[21] Id., See supra note 13. The Court of Appeals of Maryland has held that a defendant is entitled to disclosure of a lab’s standard operating procedures and analyst proficiency records where the government intends to rely on drug testing.

The Court reasoned:

Standard operating procedures are an important part of expert testimony because, like habit evidence, … they tend to prove that the conduct of the expert on a particular occasion was in conformity with the written standard operating procedures. Should an expert testify that she or he followed the procedures in a given case, then the defense would understand how the tests were performed. If the testimony, however, revealed that the standard operating procedures were not followed, that might be exculpatory evidence which, when brought out in cross-examination, could make a meaningful difference to a fact-finder. See Commonwealth v. Brosnick, 530 Pa. 158, 607 A.2d 725 (1992) (evidence discovered after trial that the State apparently had not complied with its own regulations regarding alcohol testing resulted in a new trial). (Cole v. Maryland, 835 A.2d 600, 609-10 (Md. 2003).)

[22] Id., See supra note 1

[23] US v. Kevin Ross, August 19, 2010 (Pan, J.), Sup. Ct. D.C., Kevin Ross’s Reply in Support of Motion to Compel Narcotics Related Discovery, p. 17.

[24] Affidavit, Heather L. Harris, Washington, DC, July 29, 2010, p. 4

[25] Id., See supra note 2, p. 7

[26] Id., See supra note 2. SWGDRUG specifically advises that no one but a trained botanist should use microscopy to identify marijuana. SWGDRUG Recommendations, at Part IIIB.3.3 (“Botanists may identify cannabis and other botanical material utilizing morphological characteristics (category B) alone provided sufficient botanical features appropriate for identification are observed. Such examinations shall be made only by analysts competent in botanical identifications. In this context botanical competence applies to those examiners recognized as professional botanists or those assessed to be competent by such.”).

[27] G. R. Nakamura, “Forensic Aspects of Cystolith Hairs of Cannabis and Other Plants,” J. of the Association of Official Analytical Chemists, 52, (1969); J. L. Thornton and G. R. Nakamura, “The Identification of Marijuana,” J. Forensic Science Society, 12, (1972).

[28] Pierre Duquenois and Hassan Negm Moustaha, “Identification and Assay of Cannabis Indicia,” J. Egyptian Medical Assoc., 21, 21 (1938).

[29] Id.

[30] J. L. Thornton and G. R. Nakamura, “The Identification of Marijuana,” J. Forensic Science Society, 12, (1972).

[31] R. F. Skinner, E. J. Gallaher, J. B. Knight and E. J. Boneli, “The Gas Chromatograph/Mass Spectrophotometer as a New and Important Tool in Forensic Toxicology,” 11 J. Forensic Sciences 428 (166)

[32] Hughes, Robert and Warner, Victor J Jr. A Study of “False Positives” in the Chemical Identification of Marihuana – Drug Enforcement Administration Laboratory Notes, “Microgram,” Vol. IX, No. 7 (July, 1976)

[33] Dwight S. Fullerton and Marc G. Kurzman, The Identification and Misidentification of Marijuana, 3 “Contemp. Drug Probs.,” 1974, and Marc G. Kurzman and Dwight Fullerton with contributions by Michael O. McGuire, Winning Strategies for Defense of Marijuana Cases: Chemical and Biological Issues, “J. of Chem. Def.,” Vol. 1:487, 1975

[34] Id., See supra note 1

[35] Declaration of Joseph P. Bono, Lab Director/DEA Mid-Atlantic Laboratory, April 9, 1999, Washington, D.C., pp. 3-4.

[36] Affidavit of Richard Fox, Lab Director/DEA Mid-Atlantic Laboratory, June 7, 2006, Washington, D.C., p. 1

[37] Id. See supra note 20, Tr. pp. 56, 78, 79, 83, 85

[38] State v. Wind, 60 Wis. 2d 267 (Supreme Court of Wisconsin, 1973) 208 N.W. 2d 357. See also City of Eagan v. Mittlesdorf, Dakota County Court, Minnesota (1974) file f/4-1-8556; State v. Byers, (Municipal Court, Minneapolis, Minnesota, 1974) case #1371173; and State of Missouri v. Richard Gilmore, Missouri 3d Circuit District Court, October 9, 1974

[39] State of North Carolina v. C. Richard Tate, 300 180 N.C. S.E. 2d (1980) 

[40] Id.

[41] Id., See supra note 7

[42] Id., See supra note 1 (“Normal scientists do not deny error. Quite the contrary: They have been obsessed with the measurement and reporting of error at least since the Greek astronomers. Today …. [t]hey not only measure and report error, scientists study error, explore its sources, and work to manage it.” Michael J. Saks and David L. Faigman, Failed Forensics: How Forensic Science Lost Its Way and How it Might Yet Find It, 4 “Ann. Rev. L. Soc. Sci.,” 149-171, 158 (2008).)

[43] USA v. James Hebshie, Memorandum and Order Re: Motion to Vacate Conviction, Criminal No. 02cr10185-NG, (D. Mass. 2010).

[44] Id.

[45] Id., See supra, note 20. In an extraordinary “Editor’s Note” accompanying Judge Pan’s Order as published in the Daily Washington Law Reporter (DWLR), the editor took issue with Pan’s ruling vis-à-vis the NAS report. “In this case, it can be argued the Court’s ruling that proficiency tests are ‘not particularly relevant’ is a bit short-sighted. Simply being told that a given technician has never failed a proficiency test is not useful unless one knows what kind of ‘proficiency’ is involved.” (DWLR, vol. 138, No. 210, Oct. 28, 2010, p.2249)

[46] Id., Tr. pp. 56, 78, 79, 83, 85

[47] Id., See supra, note 20

[48] Id.

[49] Id.

[50] Id.

[51] Nancy Gertner, Commentary on the Need for a Research Culture in the Forensic Sciences, 58 “UCLA Law Review,” 789 (2011)

[52] Bruce Stein, Ronald H. Laessig and Andris Indriksons, An Evaluation of Drug Testing Procedures Used by Forensic Laboratories and the Qualifications of their Analysts, “Wis. L. Rev.,” Vol. 727, No. 3, 1973, p. 785.

[53]  The People of the State of Michigan v Collin Markzon Eidelson,  Defendant’s Motion to Exclude Testimony about Chemical Testing; to Strike the Warrant and to Quash,  Case No. 09 2386 FY, 2010.

[54] Frederic Whitehurst, Forensic Crime Labs: Scrutinizing Results, Audits & Accreditation – Part, “Champion,”  April 2004.

[55]  NIK Public Safety System of Narcotics Identification, Armor Forensics 13386 International Parkway Jacksonville, FL 32218 AH-NIK-1004, p. 5, 2004

[56] USA vs. Sergio Abraham Beltran et al., 10-CR-00567-MSK, Rule 702 Hearing, CO., April 21, 2011.

[57] Id., pp. 153-154

[58] Id., p. 156.

[59] Id., p. 43

[60] Id., p. 63.

[61] Id., p. 107

[62] Id., See supra note 56, May 27, 2011, pp. 13, 25, 101

[63] Id., pp. 31, 85

[64] Id., p. 108

[65] Id., See supra note 62, pp. 78, 85

[66] Id., p. 155.

[67] Id., pp. 153-154

[68] Dwight S. Fullerton and Marc G. Kurzman, The Identification and Misidentification of Marijuana, 3 “Contemp. Drug Probs.,” p. 41, 1974

[69] State v. Behn, 868 A.2d 329, 343 (N.J. Super. Ct. App. Div. 2005).

[70] Daubert, 509 U.S., p. 597, 1993.

[71] E.J. Imwinkelried, Jackson v. Virginia: Reopening the Pandora’s Box of the Legal Sufficiency of Drug Identification Evidence, “Kentucky Law Journal,” 76 (1), pp. 11-12, 1984.

[72]  USA v. Edgar Diaz et al, U.S. District Court for the Northern District of California, No. CR05-0167 WHA, Order Denying in Part and Granting in Part Motions to Exclude Drug Identification Expert Testimony, p. 2, 2006

[73] Id., See supra note 2

[74] Id.

[75] Melendez-Diaz v. Massachusetts, 557 U.S. 305 (2009)

[76] Id.

[77] Id.

[78] Id.

[79] Id.

[80] Id., See supra note 20

[81] Id., See supra note 1, Order Re Opinion Testimony of Shana Irby, Sept. 16, 2011

[82]  Government’s Opposition to Defendant’s Motion to Exclude Expert Testimony Concerning Latent Fingerprint Evidence at 3, United States of America v. Titus Faison, No. 2008-CF2-16636 (D.C. Super. Ct. Feb. 19, 2010).

[83] H.T. Edwards, The NAS Report on Forensic Science – What it Means for the Bench and Bar, Presentation at the Superior Court of the District of Columbia, Conference on The Role of the Court in an Age of Developing Science & Technology, Washington, D.C., May 6, 2010

[84] Id., See supra note 73

[85] Id.

[86] Id., See supra note 43

[87] Jennifer L. Mnookin, Op-Ed., Clueless ‘science,’ L.A. Times, Feb. 19, 2009, at A21

[88] Id., See supra note 68