A Comparison of Methods for Toxin Exposure Measurement: Personal Exposure vs. Colorimetric Tubes
Several incidents in the past decade have drawn increased attention to the measurement of toxin exposure levels, both in the public sphere and among security and emergency response agencies. The ability to measure the presence of a variety of airborne and other toxic substances in specific environments both rapidly and accurately can be of vital importance in emergency situations, and the measurement of exposure levels for certain individuals -- especially first responders and other emergency personnel -- is also key to the effective yet careful management of events involving the release of toxic substances. This has led to increasing research into toxicity measurement and an investigation of best practices for rapidly and accurately measuring exposure.
This paper presents a brief overview of recent literature concerning, and in some cases directly comparing, two common methods of toxin exposure measurement: direct personal measurements and measurements obtained through the use of colorimetric tubes. Both methodologies have proven successful in a variety of applications, and the degree to which they provide accurate and reliable results in these varying situations is assessed in this review. Through the collation and examination of this literature, it is hoped that a better and more comprehensive understanding of current methods and practices for measuring toxin exposure and presence can be established for all relevant agencies and personnel.
Colorimetric tubes have been an area of ongoing innovation and refinement in toxin measurement technologies and methods, and a great deal of research has been published concerning the improved capabilities, sensitivities, and reliability of several specific testing methods developed from this basic measurement methodology (Feng et al. 2010; Medina-Vera et al. 2010). In one particular innovation, an array of colorimetric sensors was demonstrated to have an error rate below 0.7 percent in the detection of twenty common toxic chemicals used in industrial settings and potentially encountered in accidents or emergency events (Feng et al. 2010). This suggests a high degree of promise for colorimetric measurement methods generally.
Not all studies have been this positive in their findings concerning the efficacy and reliability of colorimetric tube tests, however. In certain settings, a number of complicating factors can lead to both false positives and false negatives with colorimetric tube measurement methods, and this can have potentially fatal consequences for the individuals exposed (Hughes et al. 2007). Though the technologies involved in the design and construction of testing devices and the overall methodologies have improved in the few years since this research was conducted and published, it is recent enough to give serious pause in the assessment of colorimetric tubes. The devices examined in this research were thought to be highly reliable and effective as well; practical application often yields different results than purely academic study (Hughes et al. 2007).
It is possible, however, that these concerns apply more to specific flawed designs than to the technology of colorimetric tubes as a whole. One study directly compared the relative accuracies of different types of exposure testing while also evaluating a computer model developed to predict exposure levels at various geographical points and moments in time given certain release rates; it found that the two measurement systems were comparable, while the model fell short (Zhu et al. 2009). Actual tests were run, and though slight differences were noted among the various real measurement devices utilized, these differences did not prove to be significant (Zhu et al. 2009). The model was a close predictor as well, but the differences between predicted and measured exposure levels were statistically significant, suggesting that the testing methods are comparable while current understandings of spread and exposure remain inadequate (Zhu et al. 2009).
A finding of comparable accuracy and efficiency across both types of exposure measurement could prove highly beneficial to many of the entities involved. Another recent study offers the rather unsurprising news that community health needs are best met when a wide array of tools is at a community's disposal, and if those tools are comparable in accuracy and reliability then they are likely to serve the needs of most communities in the vast majority of cases (Medina-Vera et al. 2010). This is not to say that there are never tangible differences between testing methods, but for community health planning most reliable measurement methods are acceptable as alternatives (Medina-Vera et al. 2010).
There is definitely an issue, however, when "good enough" becomes acceptable on a large scale, and long-term studies have attempted to determine the actual differences that exist between various measurement methods in a controlled and somewhat more meaningful manner. In a long-term experimental study that specifically examined passive versus active collection and exposure measurement methodologies -- the latter of which would be comparable in some ways to personal exposure measurements -- a statistically significant difference was found in the exposure levels recorded by the two devices (Dodson et al. 2011). Active collection demonstrated higher levels of exposure to volatile organic compounds, and though this study did not determine which measurement represents a truly more accurate indicator of likely human exposure in the same situation, the more sensitive instrument and collection/measurement methodology would be preferable for most personnel (Dodson et al. 2011).
For personnel at truly long-term risk of exposure to certain specific industrial toxins, there are even simpler, more accurate, and more personal tests that can be conducted. A study that assessed the use of hair samples as a means of biometrically measuring exposure levels found this method to be very comparable to other standard environmental and personal exposure monitors (Rodrigues et al. 2008). Though this method is obviously impractical for short-term and emergency response situations, the incorporation of minimally invasive biometric tests could lead to more accurate and efficient testing procedures in industrial settings than have yet been imagined on a large scale (Rodrigues et al. 2010).
A search of certain public records can also yield highly relevant and interesting information, if the proper sources are consulted. A recent patent application demonstrates another new methodology for the collection and measurement of toxins, one that approaches the problem from an angle nearly diametrically opposed to biometric analysis: it uses a chemical fixative to absorb and trap toxic elements in a sludge that can then be mechanically removed from the device (Walker et al. 2006). In addition to its potential for toxin detection, the device is also designed to reduce or even eliminate toxins in certain applications, but it is not yet clear whether this technology provides a viable alternative to other methods of measuring personal exposure levels through immediate environmental collection (Walker et al. 2006).
Direct environmental analysis, such as would be appropriate for certain applications of colorimetric tube testing, is often the only viable means of obtaining necessary data, especially in non-emergency situations. Determining the efficacy of testing tools in such applications has generally been of reduced importance and interest to researchers, it would seem, though certain limited studies carried out in this regard have yielded promising results (Hewitt & Gandy 2009). This study also found that the reporting and substantiating of results was often a more difficult and ultimately more important issue than the testing itself, though reliable measurements can be of great benefit in this regard as well (Hewitt & Gandy 2009). This again shows encouraging promise for colorimetric tube testing methods.
Summary and Conclusion
The research in the area of toxic exposure testing is highly varied, covering a wide range of specific disciplines and applications and including the utilization and examination of a wide array of different measurement methodologies. Current findings suggest that most methods of toxin exposure level measurement still in use are fairly comparable, though newer and more sensitive technologies are being developed that will likely replace many testing instruments and methods over the next decade. These new methodologies, however, encompass both personal exposure measurement tactics and colorimetric tube and similar measurement methodologies, and neither one appears to be strongly favored for use in practical situations. In this regard, the current review failed to provide a clear recommendation for one technology or methodology over another, but in its analysis of the various measurement techniques available it found some comfort in the ever-increasing reliability and accuracy of those methods that are currently available and employed.
That being said, there does seem to be growing interest in the academic and practical literature regarding the use of colorimetric and similar testing methodologies. It is too early to determine whether these will actually present a benefit in comparison to the use of direct personal exposure measurements, which are also becoming more efficient…