In issuing its approval of Dow’s request to spray sulfuryl fluoride on food, the US EPA dismissed many peer-reviewed studies which FAN had submitted. Among these studies was Phyllis Mullenix’s 1995 paper on the neurotoxicity of sodium fluoride in rats. See page 16 at: www.fluorideaction.org/pesticides/sf.nov.18.2003.epa.docket.pdf
For those who don’t already know, Phyllis Mullenix is the former Chair of Toxicology at the Forsyth Dental Center in Boston, Massachusetts. Forsyth is one of the most prestigious dental research institutes in the US, if not the world. Mullenix was hired by Forsyth to study the neurotoxic effects of chemicals commonly used in dentistry.
As part of her work, Mullenix studied the impacts of fluoride. To her surprise she found that fluoride crossed the blood-brain barrier, accumulated in the brain, and impacted the behavior of rats in a manner consistent with a neurotoxic agent.
Mullenix’s findings spurred a great deal of controversy, and caused considerable hardship for her career. Upon learning that her findings were going to be published in the journal Neurotoxicology and Teratology, Mullenix was fired from her position at the Forsyth Dental Center. While she later received an out-of-court settlement from Forsyth, Mullenix has yet to receive any further government funding to continue her research – despite multiple requests.
Thus, while the Government (e.g. the National Institute of Dental and Craniofacial Research) criticized her study because of the high doses used, Mullenix was never allowed the opportunity to study the impacts of lower doses – which she very much wanted to do.
Fortunately, however, while Mullenix herself has not been able to continue her research on fluoride’s neurotoxicity, other scientists in other countries (particularly China) have been able to do so without recrimination.
Ok, now for Mullenix’s response to the EPA. As noted above, the EPA criticized her work as part of their recent assessment of Dow’s request to spray sulfuryl fluoride on a wide range of foods.
Mullenix’s response provides an excellent history of the research which led up to her study, and it takes particular exception to EPA’s critique of the computer pattern recognition system which she and her colleagues used to measure the rats’ behavior. Here it is:
Phyllis Mullenix’s response to EPA, March 21, 2004:
There is no scientific basis for the following EPA comment: “There have been no systematic studies comparing the Mullenix method for measuring neurobehavioral effects with the standard neurotoxicology battery, which has undergone extensive and international validation studies. There is no published record of validation of the Mullenix method.”
EPA blatantly distorts and ignores published, peer-reviewed neurobehavioral literature. Even more surprising is that it ignores literature that it paid for and helped to develop.
EPA needs reminding as to the origin of “the Mullenix method”. The computer pattern recognition methodology used in my ’95 NaF paper has a long pedigree. It started with work done by Kernan & Higby (Energy and Mineral Resources Research Institute, Iowa State University, Ames, Iowa), Hopper, Cunningham & Loyd (Veterinary Diagnostic Laboratory, Iowa State University), and L. Reiter (Neurotoxicology Division, Health Effects Research Laboratory, EPA; EPA Contract 68-02-2288); see Kernan et al., 1980 (Pattern recognition of behavioral events in the nonhuman primate. Computer Technology, C2-12) and Kernan et al., 1981 (Computer study of the behavioral effects of pharmacologic and toxicologic agents. Pharmaceutical Technology, 61-68, June).
I was a consultant on these original, EPA-funded projects, and my contribution related to my extensive experience with behavioral observational techniques using time-lapse photography, a technique developed by Stata Norton at KU (a personal friend of John Duell, who was also on my thesis committee when I got my Ph.D.). I was the “human observer” of behavior that the physicists used to “train” a computer to recognize behavioral acts. The 1981 EPA-sponsored paper described the advantages of the new computer system as:
- faster classification of behavioral observations
- subjective human errors eliminated from behavioral observations and classifications
- permanent record of data for faster analysis
- capacity for measures not possible with conventional methods
- improved accuracy
- reduced interlaboratory variation in data
In the same paper they compared operant methodologies (used by Gary Whitford recently) and direct-observation methodologies (a computer pattern recognition system like the one I used). The following quote summarizes their comparison: “Since EPA is concerned ultimately with how chronic, low-level exposure to toxic substances affects behavior, the relative sensitivity of the direct observation and operant methodologies is extremely important. A significant question is: which of the two methods can detect behavioral changes at lower levels of exposure to a given substance? Research beyond the scope of the original EPA contract would be needed to determine minimum concentrations that can be detected with the two methods. Direct observation, however, is as sensitive as operant procedures and in some cases is significantly more sensitive. At an exposure of 0.11 mg/kg of d-amphetamine, PROBE [pattern recognition of behavioral events] indicated dramatic changes in the location of the primate within the cage during the observation period, whereas delayed-response performance [operant method] for the same dose changed relatively little.”
In short, EPA admitted over 20 years ago that direct observational methods were more sensitive than operant methodologies. Moreover, validation of direct observational methods is found throughout the medical literature, including validation of “the Mullenix method.”
The original computer pattern recognition system (PROBE) was replaced with a new system (RAPID, the Rodent Activity Pattern Identification Device, which is the one I used) because public sentiment turned away from the use of monkeys for large neurotoxic studies. The rat system was designed again by Kernan, together with me at Forsyth. First, we published how the computer identifies behavioral acts (Kernan, Mullenix & Hopper. Pattern recognition of rat behavior. Pharmacol. Biochem. Behav. 27:559-564, 1987). Second, we published how the computer quantifies behavioral data (Kernan, Mullenix, Kent, Hopper, and Cressie. Analysis of the time distribution and time sequence of behavioral acts. Internat. J. Neurosci. 43:35-51, 1988). Third, we expanded the data analysis techniques (Mullenix & Kernan. Extension of the analysis of the time structure of behavioral acts. Internat. J. Neurosci. 44:251-262, 1989). Fourth, we demonstrated how the computer system generates dose-response data (Mullenix, Kernan, Tassinari, and Schunior. J. Am. Coll. Toxicol. 8:185-197, 1989). Fifth, we demonstrated that the computer system could analyze the time structure of behavior and produce a more stable and reproducible measure that boosts sensitivity beyond that of other behavioral methods (Kernan, Mullenix & Hopper. Time structure analysis of behavioral acts using a computer pattern recognition system. Pharmacol. Biochem. Behav. 34:863-869, 1989; Mullenix, Evolution of motor activity tests into a screening reality. Toxicol. Industrial Health 5:203-219, 1989; and Kernan & Mullenix, Stability and reproducibility of the analysis of time structure in spontaneous motor activity of male rats. Pharmacol. Biochem. Behav. 39:747-754, 1991).
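The two kinds of measures described above, the time distribution and the time sequence of behavioral acts, can be illustrated with a short sketch. The event log and act names below are hypothetical and for illustration only; the published papers define the actual measures and their computation.

```python
from collections import Counter, defaultdict

# Hypothetical event log: (start_time_in_seconds, act) pairs, as a
# pattern recognition system might emit them. Act names are illustrative.
events = [
    (0.0, "sit"), (2.5, "groom"), (4.0, "walk"),
    (7.0, "sit"), (9.5, "rear"), (11.0, "sit"),
]
session_end = 13.0  # end of the observation session

# Time distribution: total time spent in each act across the session.
durations = defaultdict(float)
starts = [t for t, _ in events] + [session_end]
for (t, act), t_next in zip(events, starts[1:]):
    durations[act] += t_next - t

# Time sequence: counts of transitions from one act to the next.
transitions = Counter(
    (a, b) for (_, a), (_, b) in zip(events, events[1:])
)
```

Comparing these per-act distributions and transition counts between treated and control animals is the general idea behind a time-structure analysis; the real system works from automatically classified video frames rather than a hand-written event list.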
The work by Kernan and myself at Forsyth with the computer pattern recognition system was taken up and investigated by the Iowa State Veterinary Labs to check or “confirm” every step and procedure developed at Forsyth. The Iowa Lab built their own RAPID system and confirmed our findings of its reliability, reproducibility, and reduction of data bias and error (Hopper, Kernan & Wright, Computer pattern recognition: An automated method for evaluating motor activity and testing for neurotoxicity. Neurotoxicol. Teratology 12:419-428, 1990). The Iowa Lab went further and demonstrated that the RAPID system was more sensitive than other behavioral methods under nocturnal conditions (Hopper, Kernan & Bowes, Reproducibility of time structure in motor activity of rats under nocturnal conditions. Pharmacol. Biochem. Behav. 42:245-250, 1992). In addition, Iowa tested it to see if it would detect the hypoactivity induced by the well-known neurotoxin triethyltin (Kernan, Hopper & Bowes, Computer pattern recognition: Spontaneous motor activity studies of rats following acute exposure to triethyltin. J. Am. Coll. Toxicol. 10:705-718, 1991).
The following EPA comment also has no merit: “…the numerous T-tests performed by these authors can lead to significance of results based on chance alone.” From the outset of developing the RAPID system, statisticians were consulted both at Forsyth and at Iowa State. In fact, the RS statistic (which we used in the ’95 fluoride paper) was developed and validated by members of the Department of Statistics at Iowa State University (Kernan & Meeker, A statistical test to assess changes in spontaneous behavior of rats observed with a computer pattern recognition system. J. Biopharmaceutical Statistics 2:115-135, 1992). As acknowledged in that paper, “The Monte Carlo simulations used in this study were very extensive. Approximately 9,000 analyses of each of the regular acts and the combined acts had to be done, and for each of these analyses every time distribution and time sequence analyzed had to undergo the 1000 repeated simulations for the bootstrap in order to estimate the standard deviation at each time point to assess whether that distribution or sequence was “changed.” This required about 1400 h of CPU time on various DECstations, either 3100s or 5000s. This extensive effort could not have been done without the facilities of the Iowa State University Project Vincent distributed computer network.” The data for this massive study were provided by me (at Forsyth) and Dr. Hopper (at Iowa), and the data from the two different laboratories were subjected to an in-depth probe and comparative evaluation by the statisticians. EPA’s simplistic dismissal of this impressive undertaking reflects a blatant disregard for scientific advances.
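The bootstrap procedure mentioned in the quote above can be sketched in a few lines: resample the observed data with replacement many times, recompute the statistic of interest on each resample, and take the spread of those recomputed values as an estimate of the statistic’s standard deviation. The data and function below are hypothetical illustrations, not the actual RS computation, whose details are given in Kernan & Meeker (1992).

```python
import numpy as np

def bootstrap_sd(samples, statistic, n_boot=1000, rng=None):
    """Estimate the standard deviation of `statistic` over `samples`
    by resampling with replacement (the bootstrap)."""
    rng = rng or np.random.default_rng(0)
    n = len(samples)
    estimates = [
        statistic(rng.choice(samples, size=n, replace=True))
        for _ in range(n_boot)
    ]
    return float(np.std(estimates))

# Hypothetical data: durations (seconds) of one behavioral act.
rng = np.random.default_rng(42)
durations = rng.exponential(scale=2.0, size=200)

sd_of_mean = bootstrap_sd(durations, np.mean)

# Sanity check: for the sample mean, the bootstrap estimate should be
# close to the analytic standard error, sd(durations) / sqrt(n).
analytic_se = durations.std(ddof=1) / np.sqrt(len(durations))
```

The appeal of the bootstrap is exactly what the quoted passage implies: it yields a standard-deviation estimate at each time point without assuming a particular distribution for the behavioral data, at the cost of heavy computation.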
EPA made another unjustifiable comment concerning the Mullenix et al. ’95 fluoride study: “Finally, there is no scientific basis to imply that motor changes are surrogate of cognitive deficits…” This comment is similar to one made by Procter and Gamble scientists (Neurotoxicology and Teratology 17:685-686). My reply to that criticism still stands (Neurotoxicology and Teratology 17:687-688, 1995). The scientific link between motor changes and cognitive deficits has been recognized for decades by many scientists and clinicians (Mullenix, The computer pattern recognition system for study of spontaneous behavior of rats. A diagnostic tool for damage in the central nervous system. In Motor Activity and Movement Disorders, Humana Press, 1995). We demonstrated that the RAPID system could detect changes in behavior that conventional operant methodology missed, i.e., the changes induced by agents that clinically are well known to cause cognitive impairment such as a lower IQ and impaired memory and attention (Mullenix et al. An animal model to study toxicity of CNS therapy for childhood acute lymphoblastic leukemia. Effects on behavior. Cancer Res. 50:6461-6465, 1990 and Mullenix et al. Interactions of steroid, methotrexate and radiation determine neurotoxicity in an animal model to study therapy for childhood leukemia. Pediatr. Res. 35:171-178, 1994). Furthermore, based on findings using the RAPID system, we detected the role of steroids in neurotoxicity and predicted their relative impact on cognitive function. Our prediction was confirmed in clinical studies of children given steroid therapy for leukemia (Waber et al. Cognitive sequelae in children treated for acute lymphoblastic leukemia with dexamethasone or prednisone. J. Pediatr. Hemat. Oncol. 22:206-213, 2000). Operant methodology was not sensitive enough for studies of cognitive impairment associated with treatments for childhood leukemia.
Given that other, more recent studies support the findings of the Mullenix et al. ’95 fluoride study, one would expect EPA to be more scientific in its evaluation of our findings. It is clear, however, that something other than science is forcing EPA’s head into the sand.
March 21, 2004