The anthrax incidents in the fall of 2001 served as a wake-up call to the nation’s emergency-response community. The onslaught of suspicious powder calls, following so closely on the 9/11 terrorist attacks, revealed how unprepared the United States was at that time to handle any sort of suspected biological agent. A few agencies had their act together back then, but for the most part, the nation’s fire departments were collectively learning on the job, in real time, how to deal with anthrax. It was a baptism by fire; or, more accurately, perhaps, a baptism by talcum powder, sheetrock dust, sugar, flour, and cornstarch. The upside was that first responders learned a lot of valuable lessons, including the fact that the field-level biological-detection technology then in place was woefully inadequate.
At the height of the anthrax hysteria, the White House Office of Science and Technology Policy (OSTP) issued a formal statement recommending against the use of handheld immunoassay (HHA) devices for detecting biological agents. It was an untimely statement, considering that police and fire agencies were swamped with suspicious powder calls and many were already using HHAs. Needless to say, that statement, and a similar one from the Department of Health and Human Services, created quite a stir.
What to do? The government was saying not to rely on the technology available, but there wasn’t anything else available, and something, almost anything, was needed. In response, the biological-detector marketplace exploded. Vendors of all kinds rushed products to market in an effort to capture a piece of the biological-detection pie and, although many of them would not say it publicly, the fact is that some of the detectors were sketchy at best. There were, and still are, some poorly made and poorly performing biological detectors. Until recently, there really was no way to verify that a particular biological detector would perform as promised. Worse yet, there were no repercussions for vendors selling devices that did not live up to their claims.
Testing, Validating, Verifying

Fast forward to the present day and the new world of biological detection. Recently, a coalition of U.S. government agencies funded by the Department of Homeland Security (the CIA, the FBI, the U.S. Postal Service, and the Food and Drug Administration) joined forces with the vendor market and AOAC International (formerly the Association of Official Analytical Chemists) to develop reasonable performance standards for HHAs. A task force formed by the coalition looked specifically at anthrax detection and spent 16 months testing an array of detectors. In the end, only one detector, the RAMP System built by the Response Biomedical Corporation of Vancouver, Canada, met the testing criteria for anthrax detection. The RAMP Anthrax Test is now laboratory-tested and has been approved by AOAC in accordance with the organization’s Performance Tested Methods and Official Methods of Analysis. That is quite an accomplishment, considering the demanding nature of the process and the prestige of the AOAC. The process was so rigorous, in fact, that an AOAC representative said the organization had never participated in such a comprehensive test. That is a considerable compliment from a world-renowned nonprofit scientific organization with a 120-year history of evaluating analytical methodology.
Joanne Stephenson, vice president of business development for the Response Biomedical Corporation, commented on the new performance standards as follows: “The whole process was really driven by the first-responder community. After the anthrax attacks, a lot of products came out and not all of them were good. Unfortunately, the first responders had no way of knowing whether or not a particular detector could do what it said it could do.”
Stephenson accurately summarized a long-standing deficiency in the field of biological detection: obtaining, and validating, reliable information about the performance of a detector. “It’s all about being an informed buyer,” she said. “Independent validation by a reputable third party goes a long way to provide the end user with peace of mind.”
Anyone needing additional assurance will take comfort from the fact that the new performance standards are backed by plenty of independent evaluation and subsequent validation. The AOAC organized 12 laboratories across the United States to assess the performance of the RAMP System and other candidate detectors.
“This is only the beginning,” Stephenson said. “Performance standards for other biological agents are in the works.” The performance standards will have a significant impact on the marketplace, she also said. “I think we’ll see a shake-up across the board in biological detection. Primarily, I can see future grant moneys tied to approved products. I also see buyers having more concrete and reliable information at their disposal. If you have faith in the performance standards, then you should have faith in the machines that gain approval.”
Performance standards are important, but there are other factors to consider when purchasing a biological detector. Among them is adaptability: Buyers should consider how quickly the technology could be adapted to address an emerging threat. “That’s absolutely critical,” Stephenson said. “You want technology that is versatile. If the threats change, the machine should be able to adapt.”
The RAMP System is a good example of adaptable technology. The system’s small hand-held reader provides results in 15 minutes and is simple to use: A swab sample is placed in a cartridge and inserted into the reader, and the machine does the rest.
For anthrax, 4,000 spores is the threshold for a positive result; that amount of anthrax, Stephenson said, would fit on the head of a pin. Test cartridges for ricin, smallpox, and botulinum toxin are also available for the RAMP System. All the user has to do is match the suspected agent with the appropriate test cartridge.
Bill Radvak, president and CEO of Response Biomedical, said in a recent news release focused on the importance of the new performance standards that the company “applauds” the Department of Homeland Security “for commissioning a definitive evaluation and facilitating the introduction of prescribed performance standards to enable first responders to make informed purchasing decisions.”
Radvak is obviously correct about the importance of making informed purchasing decisions. Never before have the nation’s fire services been in such need of credible information about specialized equipment. New products are coming to market at a head-spinning pace, and it has become increasingly difficult to separate the wheat from the chaff.
The trend toward government-driven validation is a positive step forward that will help government agencies at the local, state, and national levels spend taxpayers’ precious dollars on proven technology.
For more information about the verification process, visit EPA Verifications.
Rob Schnepp is division chief of special operations (ret.) for Alameda County (CA) Fire Department. His incident response career spans 30 years as a special operations fire chief, incident commander, consultant, and published author. He commanded numerous large-scale emergencies for the Alameda County (CA) Fire Department, protecting 500 square miles and two national laboratories in the East Bay of the San Francisco Bay Area. He twice planned and directed Red Command at Urban Shield, the largest Homeland Security exercise in the United States. He served on the curriculum development team and instructed Special Operations Program Management at the U.S. Fire Administration’s National Fire Academy. He is the author of “Hazardous Materials: Awareness and Operations.” He has developed risk assessment, incident management, and incident command training for Fortune 500 companies, foreign governments, and U.S. national laboratories.