Whole-genome sequencing (WGS) has proven to be very effective at identifying where resident Listeria strains may be found in a food facility. To date, however, it has proven equally effective at ensuring that only a few food companies want to use the technology.
For the most part, scientists, such as molecular biologists, and regulators, such as FDA, view whole-genome sequencing as an evolving state of the art. These methods have repeatedly helped regulators identify resident strains in a food facility before the strains can cause significant public health issues. If the strains end up on manufacturing equipment and/or in food, whole-genome sequencing is very useful for establishing the association between the original source, retail distribution, and public consumption of that food.
Whole-genome sequencing is, in essence, an upgrade from precursor technologies such as pulsed-field gel electrophoresis (PFGE), multilocus sequence typing (MLST), and multiple-locus variable-number tandem-repeat analysis (MLVA).
As has been demonstrated in recent years, the science of identifying genes, bacteria, and food-related organisms keeps progressing, and always will. As many have stated in published articles and speeches, WGS technology has probably outpaced the food industry’s ability to keep up with it at the moment. Surprisingly, though, industry has been reluctant to embrace the technology, in large part out of fear that its products will be linked to human illness, with regulatory enforcement actions potentially following.
Grabbing any new technology and putting it to work in a manufacturing plant is rarely straightforward, is difficult to mandate, and must be done with careful consideration. This is because new technologies come with baggage, including significantly higher initial costs (compared with past and current technologies), longer initial lead times, and higher alpha- and beta-risks (i.e., the chances of false-positive and false-negative results, respectively).
These are all good reasons why the food industry in general has not embraced WGS as a routine tool, despite the fact that FDA will sequence any positive it finds during a “swab-a-thon.” In fact, the food industry has barely embraced WGS even as a problem-solving tool.
One of the best uses of WGS in a food production plant is increasing the odds of finding the root source of a resident Listeria strain. For example, positive WGS findings (i.e., matching gene sequences with an extremely high probability) can help a company identify a regularly incoming raw material as the source of a recurring Zone 3 contamination (e.g., on pallets).
FDA takes this same approach in attempting to link a specific Listeria strain sequence found in a clinical isolate or food product with an identical strain found in a food plant.
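To make the matching concept concrete, strain comparison of this kind is often summarized as counting single-nucleotide differences between aligned genomes, with isolates below some small distance treated as closely related. The sketch below is purely illustrative and is not FDA’s actual pipeline; the function names, toy sequences, and the 20-SNP threshold are assumptions for demonstration only.

```python
# Illustrative only: WGS strain matching reduced to counting single-nucleotide
# differences (SNPs) between two aligned core-genome sequences. Real pipelines
# involve assembly, alignment, and quality filtering; the threshold below is a
# hypothetical placeholder, not a regulatory standard.

def snp_distance(seq_a: str, seq_b: str) -> int:
    """Count positions that differ between two pre-aligned sequences."""
    if len(seq_a) != len(seq_b):
        raise ValueError("sequences must be aligned to the same length")
    return sum(1 for a, b in zip(seq_a, seq_b) if a != b)

def likely_same_strain(seq_a: str, seq_b: str, threshold: int = 20) -> bool:
    """Treat isolates within a small SNP distance as probable matches."""
    return snp_distance(seq_a, seq_b) <= threshold

# Toy comparison: an environmental swab isolate vs. an isolate from an
# incoming raw material (sequences shortened to a dozen bases for clarity).
env_isolate = "ACGTACGTACGT"
raw_material_isolate = "ACGTACCTACGT"  # differs at one position

print(snp_distance(env_isolate, raw_material_isolate))        # 1
print(likely_same_strain(env_isolate, raw_material_isolate))  # True
```

A match of this kind would support, though not by itself prove, the hypothesis that the raw material is seeding the recurring contamination.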
Using WGS in this manner takes a lot of time (and money), but it is at the moment a best practice for identifying Listeria strains that have resided in given locations over time. This allows discovery of “hot spots” and certainly helps in identifying external sources (e.g., a supplier issue). Hence, regular WGS testing of strategic samples around a food facility can pay huge dividends in identifying, and then eradicating, resident Listeria strains.
By contrast, WGS is a poor tool for identifying transient strains of Listeria, or for situations in which results are needed quickly in order to respond to an issue or verify a corrective action. This is primarily because, by the time the WGS results come back, the transient organism is long gone. And although the cost of WGS testing has dropped by orders of magnitude over the years (as happens with new technologies that become mainstream; consider wide-screen TVs), simply churning out WGS samples to try to locate Listeria in a food facility can become extraordinarily expensive.
Other techniques are far quicker and can be equally effective in the end. For example, there is still value in using some of the older techniques, such as PFGE and MLVA. Even easier and cheaper, although less informative, is the use of well-proven standard microbiological techniques to assay for Listeria (e.g., FDA Bacteriological Analytical Manual, Chapter 10). These standard micro tests have widespread proven use in environmental monitoring programs to “seek out and destroy” pathogens in a food facility, regardless of whether those organisms are transient or resident. WGS can then be used sparingly as a means of continued surveillance, verifying whether or not strains are taking up residency.
The “time and cost” value equation, however, is still a reason why food companies do not routinely use WGS. Note that over the past two years, turnaround time and costs have been significantly reduced (e.g., a five-business-day turnaround for Listeria WGS runs about $500). However, either this value equation is still not good enough, or companies are citing tests that are “too long and too costly” as a shield for not wanting to enter FDA’s territory.
Clearly, FDA sees WGS as a savior, the reasoning being that if use of the technology prevents even one illness, then its use is warranted. And, since the technology is available, an informed public would most likely want FDA to use it. So, as time progresses, public and political pressure may push WGS forward whether food safety professionals want it to or not.
Having WGS in its toolkit also allows FDA to trumpet what it is doing: WGS is a well-defined technology that the public can understand (especially a public that watches crime shows featuring DNA testing, or that uses DNA sequencing to trace ancestry). FDA will continue to use these tools to find organisms, such as Listeria monocytogenes, that can cause significant adverse health consequences or death in the small populations that are highly susceptible to the organism’s effects.
This is also why FDA continues to maintain a “zero tolerance” policy for Listeria monocytogenes. This is not the case worldwide (e.g., Canada and Australia), but it is the position FDA has taken. Granted, the rationale for this is to protect the public health, but it also allows FDA to conduct deeper investigations into a food company’s practices and, when resident pathogens are detected and linked to human illness, initiate the enforcement actions the public would expect it to take.
Setting aside whether or not FDA should be spending so much time enforcing a zero-tolerance principle, one would surmise that if FDA allowed some tolerance for Listeria, food companies might begin using the technology much more widely. But, as everyone knows, they have not.
FDA has issued a Draft Guidance for Industry: Control of Listeria monocytogenes in Ready-To-Eat Foods. Although still in draft form, the document essentially offers a “three strikes and you’re out” approach to the detection of Listeria species in a food production plant. In other words, a new identification of Listeria on, for example, a non-food-contact surface is not the end of the world. Intensified sampling and/or cleaning might be mandated, but production does not necessarily need to halt.
Many people have praised this approach, and for good reason: The odds that an environmental sample is positive for Listeria monocytogenes are generally low, as are the odds of a consumer being exposed to a sufficient number of organisms to become compromised (since the vast majority of people eating the food are healthy). Moreover, many products require further preparation and heating by consumers prior to consumption. Thus, even when Listeria may be present, the risk to consumers is generally low.
If FDA finds Listeria in a food facility, or in its review of a company’s environmental monitoring records, FDA oftentimes responds aggressively. Sometimes, some would argue, too aggressively. It is likely that food companies would test more frequently for Listeria in their facilities, and work harder to find it, if FDA did not take such a critical view of positive results under zero tolerance.
This is a major reason why the approach of other countries might be better in the long run. Canada, Australia, and the European Union (via its laboratory Guidance Document) all use an allowable limit of <100 CFU/g for Listeria monocytogenes.
While the focus of this article is FDA, it is important to note that for dual-jurisdiction plants, USDA quietly watches, and sometimes follows the lead of, FDA. USDA conducts its own environmental sampling for enforcement purposes, but for the most part the agency seems to keep an eye on what action FDA is pursuing. USDA often asks companies for their FDA data with regard to Listeria, even if the company is not legally obliged to share the data.
It is also important to note that this article is not advocating a cavalier attitude toward Listeria. A company cannot get a “hit” or two and then assume some action will let it avoid the “third strike.” That is indeed a road to perdition, not only for the company but for public health. Rather, the company needs a scientifically justified corrective action plan, along with a very active environmental monitoring program for the organism.
That said, the regulatory risk becomes even worse if a company has WGS data, and doubly worse should the company know that the sequence of its Listeria sample matches a sequence in FDA’s GenomeTrakr database. This would mean the company knows of a linkage, one that may or may not implicate it. Should FDA be told, or should the company solve the problem and move on?
Let’s say the company generated its data (and linked an environmental sample to a retail food or, worse, to a clinical sample from a patient hospitalized in its immediate vicinity) under a protected status, e.g., attorney-client privilege. The company would seem justified in not sharing the data with FDA. But what if a public health problem arises down the road, and FDA uncovers those same data after the fact?
Almost no company wants to be in either pressure cooker: holding data that could escape into the public domain, or holding data that could be discovered later. The result? Few companies want to use WGS under these circumstances, no matter how helpful the data might be to public health protection.
Of course, this is the same reason food companies have their microbiology labs test only for Listeria spp. and not directly for Listeria monocytogenes (Lm). As soon as the Lm notation appears, “zero tolerance” comes to mind. And once that “Lm” designation is in the corporate files, it can be devastating for the company, even if the company is making every appropriate effort to eradicate the bacteria from the premises.
This clash of paradigms cuts both ways: FDA wants to encourage use of WGS to help minimize food safety risks to the public, but through its policy execution, it may be driving the industry away from a proven scientific tool that could achieve that very end.
My call to action, then, is for FDA to provide a better path for companies to use WGS testing without having to face the consequences of an initial positive result. Perhaps human isolates could be stripped from a custom public database that food companies could access. If such a resource were made available, it seems likely that a significant number of additional companies would begin using WGS to solve contamination challenges in their facilities.
If nothing changes in the near term, the use of WGS will likely remain low. That would be a shame, not just from the perspective of trying to banish Listeria from food facilities, but also because of the opportunity cost of spending so much time being outmaneuvered by these elusive bacteria.
Source: https://www.foodqualityandsafety.com