Outside scientists, eager to perform independent evaluations of the government's findings, complain the information released contains far too many unknown variables.
By Brad Jacobson, Raw Story
Posted on October 8, 2010
National Oceanic and Atmospheric Administration (NOAA) and FDA officials maintain they've provided results of ongoing Gulf seafood safety tests with the utmost transparency. But outside scientists, eager to perform independent evaluations of the government's findings, complain the information released contains far too many unknown variables that preclude peer review.
In recent interviews, FDA and NOAA officials told Raw Story that they've been completely transparent in sharing ongoing Gulf seafood testing data, protocol and methodologies.
"Whenever we reopened [waters], we'd post the data that we used and the FDA certified it as good enough to reopen," said NOAA spokeswoman Christine Patrick. "So that's all publicly available and it has been since we started reopening."
"There's nothing we are withholding," echoed FDA spokeswoman Meghan Scott.
Yet in wide-ranging interviews with Raw Story, multiple independent scientists studying the effects of the Gulf oil spill not only called the government's claims of sufficient transparency wholly misleading, but also provided several key examples of how the withheld information precludes independent evaluation and leaves a raft of critical questions unanswered.
Raw Story's investigation also found that federal officials continue to publicly claim (as they did in our interviews as well) that Gulf states follow the agreed-upon protocol set by NOAA and FDA for the reopening of previously closed waters.
But scientists in close discussions with these agencies informed Raw Story that the Gulf states are actually making their decisions for reopening waters on a case-by-case basis with no consistent, established criteria -- making the basis for state reopenings an even greater unknown for independent scientists.
Released data insufficient for independent evaluation
"We're a little worried that these samples so far may not be as thorough as they might need to be and there could be areas that are missed," said Gina Solomon, a doctor and public health expert in the department of medicine at the University of California at San Francisco.
"That's the fundamental concern," added Solomon, a co-author of the recent peer-reviewed Journal of the American Medical Association (JAMA) study on Gulf seafood safety.
Timothy Fitzgerald, a marine scientist with the Environmental Defense Fund who testified last week to the National Oil Spill Commission, said, "Given the work that we do and the level of resolution we usually rely on, if they're going to provide technical detail I would very much like them to actually provide it in as raw a form as possible."
"What they've done in a lot of instances is [provide] kind of first or second order binning or summarizing or distilling, which makes a lot of the data unusable or unavailable," Fitzgerald continued.
"It's not that it doesn't exist," he said. "It's just that it hasn't been provided in a way that scientists could really make a lot of use out of it."
Miriam Rotkin-Ellman, a staff scientist at the Natural Resources Defense Council (NRDC) and a contributor to the JAMA study, said there is no clear description of the scientific method used to select sampling locations, to determine how many samples are taken, or to ensure that the areas being reopened are free of oil.
"All of this remains information that we have asked various different agencies for, and this includes NOAA and the FDA, and they have not provided it," Ellman said. "It's not part of the materials that are on their website and it remains this very big blank to the transparency of how these safety considerations are being made."
Solomon and Rotkin-Ellman said NOAA officials have described a more specific sampling plan for federal waters by phone but have not provided them with a copy despite repeated requests.
"So the question is, why?" Solomon said. "It seems strange that it hasn't been made available for peer review."
Solomon, Rotkin-Ellman and other scientists interviewed by Raw Story noted that while much information is lacking from the sampling plans and data released for the reopening of federal waters, even less is known about the determinations behind the reopening of state waters.
For every reopening of federal waters for commercial and recreational fishing, NOAA and FDA follow an agreed-upon protocol for the testing of seafood and water samples.
Federal officials publicly claim that the Gulf states also follow "the same" agreed-upon protocol set by NOAA and FDA; before reopening any previously closed state waters, state officials must submit a sampling plan to the FDA for certification.
But Raw Story has found that no consistent protocol governs the reopening of state waters. According to Solomon and Rotkin-Ellman, who have had several discussions with federal agencies regarding state reopenings, each Gulf state makes its decisions for reopening waters on a case-by-case basis, with no uniform criteria or protocol.
They asserted to Raw Story that when they requested a specific sampling plan for the reopening of state waters, a senior FDA official, Donald Kraemer, was "really quite clear" that none existed.
Solomon said Kraemer told them "that there are no criteria, that there is no sampling plan, that there is no clear protocol, that there is no minimum number of samples required, that there's no specific buffer zone between oil contaminated areas and reopened areas that the agency's requiring."
Solomon lamented, "All of that was surprising and somewhat disturbing."
When Raw Story emailed Kraemer directly and requested that he respond to these statements attributed to him, he wrote back, "Sorry…you need to come in through the press office."
Yet after forwarding its request to his designated press office liaison, Michael Herndon, Raw Story received no response.
Gaping holes in data may impact integrity of testing
FDA and NOAA officials repeatedly assured Raw Story that they've taken a "conservative" approach to testing Gulf seafood.
At the close of an elaborate presentation to Raw Story on seafood safety procedures, NOAA toxicologist John Stein averred, "We're confident that we have a well-designed seafood safety program and comprehensive testing."
But independent scientists said there is no way to confirm this, and they provided specific examples of what is missing from the data and how those gaps can compromise the integrity of the test results.
Rotkin-Ellman, the NRDC staff scientist, noted that there is no information showing that the water below the surface is adequately tested for the presence of oil before areas are reopened.
"We know that dispersants made the oil move from the surface, so it's less likely to be a surface slick and much more likely to be below the surface, a sub-surface plume," she pointed out.
Carys Mitchelmore, an aquatic toxicologist and associate professor at the University of Maryland, agreed.
"Most of this oil is below the surface. So I would be more concerned with the stuff you don't see," Mitchelmore said.
Indeed, Raw Story confirmed that the language in the protocol is vague regarding the testing of waters for the presence of oil before reopening waters to commercial and recreational fishing. Under "Specific Re-opening Criteria" in the protocol, it states:
"Evaluation of oil movement -- Confirmation that the closure area is free of sheen on the surface by visual observation and/or aerial reconnaissance, or the presence of oil in the water column through visual observation or water testing."
Chris Pincetich, a toxicologist and marine biologist at the Sea Turtle Restoration Project, criticized the "very ambiguous" nature of this passage.
"That doesn't define anything. It doesn't give you any detection limits," Pincetich said with a note of exasperation.
He continued, "In one case, they may say, ‘My satellite shows a cloud. Oh, what about aerial? Well, no one saw anything. What about water sample? We didn't get a water sample either.' Well, they may be able to say, ‘Well, our aerial survey didn't see any oil. Let's reopen it.' That's really ambiguous."
Specificity on buffer zones -- areas between closed oiled waters and waters open for commercial and recreational fishing -- is also conspicuously absent from the protocol and is a point of contention for outside scientists.
The only mention of buffer zones in the protocol states:
"Fishery closure areas also include areas that NOAA projects will have surface oil and a precautionary buffer zone around known contaminated waters to account for uncertainty."When Raw Story pointed out to the FDA that there's insufficient information for other scientists to ascertain how buffer zones are being determined, the response was less than forthcoming.
"Are they proposing something different that they think would be a better [method]?" agency spokeswoman Scott replied.
After Raw Story reiterated that outside scientists are merely trying to pinpoint how such determinations are being made, Scott said, "By hundreds and hundreds of scientists."
Raw Story then noted that while it may be based on findings by "hundreds and hundreds of scientists," independent scientists are saying that these kinds of determinations are not clear enough in the materials being released online.
"Uh, sure," Scott said. "If that's what they're saying, I agree that's what they're saying."
Additionally, scientists pointed out that knowing the contaminant levels of fish and shellfish inside the closed areas would help them assess how safe it is to fish outside the buffer zones.
But to date, no information has been released on contaminant levels of marine life within closed areas.
"I would like to see the data from a polluted site, from a closed site," Mitchelmore said. "Just for me as a scientist, I would like to see the flip side. I'd like to see what these organisms look like when they've taken them from a closed area."
Raw Story found a related passage in the protocol that alarmed scientists.
"After confirming through subsequent evaluation that oil did not enter an area," it states, "the area may be re-opened without subjecting seafood samples to evaluation under this protocol. This protocol is an added layer of protection being applied to seafood only in areas known to have been contaminated."
Deeming seafood uncontaminated based solely on water sampling, scientists said, fails to adequately account for the movement of both oil and marine life.
"There's a defined spatial and temporal context here -- oil moves," explained Mitchelmore. "With these undersea [oil] plumes, you're not going to see sheens on the surface. Sediment resuspends. Organisms move from one area to another."
"So how representative is each sample in a huge big area on a temporal and a spatial scale?" she continued. "What if there's something in the sediment or a plume you're not seeing? Waters are not well mixed."
Bottom-feeders such as shrimp, which also migrate daily from the seabed to the surface, are at an even greater risk for undetected contamination in waters where seafood is not being tested, scientists said.
"The ecology of shrimp is that they stay on the bottom all day long and at night they come up and that's when they fish for them," said Pincetich. "So any oil in the sediment, any oil in the bottom of the ocean, is interacting with the shrimp on a daily basis."
"So now we're talking about long-term chronic exposure of these shrimp to the oil," he added. "And that's a concern, that's really a concern for public safety."
Many scientists also expressed frustration over the lack of clarity in the released sensory and chemical analyses of the seafood, which they say may overstate both the quality and the quantity of the testing.
Scientists explained that while the data might show that ten different fish have been tested, those ten fish are often ground up into a single composite sample, and one test is run on that sample.
In other words, ten fish may have been "tested," but only as one sample, which, scientists said, can in some cases inflate the reported number of seafood tested.
"The takeaway number from that is one and not ten," said Fitzgerald of the Environmental Defense Fund. "But that's why you may see numbers like 40,000 fish have been tested. Well, not really -- 40,000 fish may have been collected."
Mitchelmore pointed out that government scientists utilize such sample sizes in analyses to reduce variability, because even within the same species tested there's a great deal of individual variability -- from size, age and sex to "whether one of them just fed on a big plump oil-laden oyster."
She said this methodology would be sound, but only if such analyses of the same species are repeated multiple times in a given area before and after waters are reopened, increasing the sample sizes to compensate for dilution.
"If you think of the scenario where you've got nine little things that haven't accumulated [contaminants] but one big one that has accumulated a lot, then that one gets diluted out by those other nine," Mitchelmore explained. "So your levels appear lower."
"Increased sample sizes is key," she added.
But based on the data released, it's not clear this is occurring.
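To make the dilution effect the scientists describe concrete, here is a minimal sketch in Python using purely hypothetical numbers (the ten-fish composite, the contaminant levels and the 35 ppm threshold are illustrative assumptions, not values taken from the government's released data):

```python
# Illustration of how compositing can mask a contaminated individual.
# All numbers here are hypothetical; they are not drawn from NOAA or FDA data.

# Contaminant concentration (ppm) in ten individual fish: nine clean,
# one that has accumulated a heavy load.
individual_levels = [1, 1, 1, 1, 1, 1, 1, 1, 1, 100]

# Grinding the ten fish into one composite sample averages their tissue,
# so a single chemical analysis sees only the mean concentration.
composite_level = sum(individual_levels) / len(individual_levels)

print(f"Fish collected:               {len(individual_levels)}")
print("Analyses actually run:        1")
print(f"Most contaminated individual: {max(individual_levels)} ppm")
print(f"Composite sample reads:       {composite_level} ppm")

# Against a hypothetical level of concern of 35 ppm, the composite
# (10.9 ppm) passes even though one fish is almost three times over it.
LEVEL_OF_CONCERN = 35
print("Composite passes:", composite_level < LEVEL_OF_CONCERN)
```

Larger or repeated composites from the same area, as Mitchelmore describes, would shrink the influence of any single outlier; with one small composite, the outlier simply disappears into the average.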
Fitzgerald also noted that the numbers are "not nearly as convincing" once you get into a finer level of detail.
When he analyzed how many sensory tests (or "sniff tests") versus chemical tests have been performed, he found that the number of samples actually sent to a lab to be chemically analyzed is far lower.
"If you count up every single animal that was collected to be tested either by the sniff or the chemical, it's probably in the tens of thousands," Fitzgerald said. "But then if you actually look at how many tests they've done, whether it's sniff or chemical, it's much lower than that."
Catastrophic spill and sluggish response demand greater transparency
Every researcher interviewed by Raw Story -- from marine biologists and toxicologists to public health experts and oceanographers -- agreed that the massive size of the spill, the unprecedented use of dispersants and the early ineptitude in responding to the disaster put the onus on federal and state officials to maximize the transparency of their seafood safety testing.
"Part of food security is the sort of peace of mind that people have about eating it," said Ian MacDonald, an oceanography professor at Florida State University and an expert in measuring oil spills. "People are deeply anxious about the security of all aspects of the Gulf following this spill -- whether it's seafood safety or the dispersants or food chain effects and all that. That's the legacy of this."
Testifying before the National Oil Spill Commission last week, MacDonald said that, contrary to Obama administration claims that the majority of oil is "gone," he estimates that over fifty percent of the oil remains in the Gulf in "a highly durable material that resists further dissipation," with much of it "now buried in marine and coastal sediments."
"Everybody's credibility has been damaged by all this," MacDonald continued in his interview with Raw Story. "Many changes of course that NOAA took. The great concern about EPA and the licensing of dispersant use. The fact of the way it was handled has undermined public confidence."
"So I think it overwhelms the sort of standard approach towards reassuring the public in terms of food safety," he added.
Pincetich, the marine biologist and toxicologist from the Sea Turtle Restoration Project, cautioned, "They really need to throw out their old rulebook. This is a massive, toxic situation with millions of people's health at risk."
"And if they're not using the best technology in the most transparent, comprehensive ways," he continued, "then they're really falling short of being public servants and protecting our health and well being with their decisions."
"They have a long, long way to go to restore public trust," MacDonald said.