Data Integrity Matters | Why is Data Integrity a Hot Topic Now?


A bitter pill to swallow

Despite the uproar in the 1980s over the generic drug scandal (corruption in the approval process for new generic drugs), and again in the 1990s over the Barr decision (laboratories found to be testing into compliance), the pharmaceutical industry appeared shocked when, in 2005, the now-defunct New Jersey generic drug manufacturer Able Laboratories was found to be releasing sub- or super-potent products based on falsified tests.

Its Quality Unit was found to have allowed poor-quality products to be released based on laboratory tests that apparently demonstrated a pass, while the electronic systems used to create the data contained other versions of those same tests that clearly indicated a failure to meet specifications.

The company had already demonstrated that it had quality issues, and the FDA was well aware of them: Able Laboratories had operated under a Consent Decree from 1992 to 2002, a period during which the company was subject to increased FDA oversight.

To satisfy the regulator and have the Consent Decree lifted, the company underwent very regular inspections. Even after the decree was lifted, Able was subject to 11 additional plant inspections as it gained approvals for 25 new generic drugs in three years. It really looked like Able Laboratories was back in business.

Yet in 2005, patients were advised to seek alternative medications and an entire product line was recalled.

Soon afterwards, the FDA published its Form 483. It showed that during Able Laboratories’ very last inspection (reportedly prompted by an internal whistleblower), investigators uncovered massive evidence of laboratory record falsification – falsification that had permitted the continued supply of adulterated product to the public, despite the FDA’s close supervision. By one estimate, tens of thousands of doses had been released that did not meet their own specifications.

These were not the actions of one or two individuals but a concerted effort by both analysts and supervisors to falsify the data delivered to the Quality Unit. And it had equally fooled the FDA, such that Able Laboratories had its Consent Decree lifted. (In 2007, four managers pleaded guilty to fraud.)

The Quality Unit was highly criticized for “lacking the authority” to fully investigate erroneous data such as false printouts from lab instruments that were submitted and filed in the company’s official documentation. But note: the lab was able to achieve all this because the Quality Unit depended on the paper printouts from electronic systems – printouts provided by the laboratory only after the electronic data was significantly and routinely doctored.

“Like footprints in the snow, electronic data systems improve traceability.”

What the FDA did – which Able Labs’ Quality Unit never did – was examine the electronic records and their associated audit trails, where the evidence, the “electronic footprints” of the laboratory staff’s actions, was plain to see. The Quality Unit not only lacked authority; more significantly, it lacked the required digital recordkeeping knowledge.

That was 12 years ago, so why is all this suddenly a hot topic today? It seems that the tools to catch such fraud were not always in place (despite 21 CFR Part 11 now being 20 years old), and where they were, reviewers and regulators alike lacked the skills to investigate the electronic data forensically.

Learning how to interrogate original electronic records, and breaking reliance on paper printouts, takes time and effort. Many, many more companies will be caught out by skilled auditors if they continue to blindly trust static printed reports of highly selected results.

 

Read more articles in Heather Longden’s blog series, Data Integrity Matters.

More resources: