When modern GMPs were first developed in the 1970s, few peripheral industries benefited more than the manufacturers of blue pens.
Our younger readers may already be lost at this point, so for their benefit I’ll explain: most early photocopiers could only reproduce in black and white. That meant a photocopy and an original document printed in black ink could be very difficult to tell apart, but a document signed in blue ink always made it clear to the auditor which copy was the original. This simple method was so reliable that some organizations began to require it. Life Hacks were a lot harder back then.
In the digital age, the very concept of an “original” anything can be difficult to define. We can now have virtual signatures on ephemeral documents made out of nothing more than ones and zeroes and properly placed points of light. Digital copying is pervasive, easy, and in many cases unavoidable. The true original iteration of a digital file is difficult to pin down, and as software becomes more and more sophisticated, the job may be getting even trickier.
Just the same, “original” documentation remains fundamental to data integrity, and the ability to identify the original version of a piece of information remains as critical as ever. In the GxP context, the risks to patient safety and product quality must be evaluated and managed carefully. The software used to create original records must therefore be tightly controlled. It must not only support the security, audit trail, and electronic signature requirements of regulations like 21 CFR Part 11 and Annex 11, but also enable configurations that support record integrity.
Consider this: a lab system’s workflow ends with a printed PDF file. Any user with a handful of relatively common and innocuous computer skills could take a screenshot of that PDF, modify it using any number of readily available programs, and save it as a new PDF without leaving any traces that the average person could detect.
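The forgery may be invisible to the eye, but it is not invisible to cryptography. As an illustrative sketch (the byte strings below are stand-ins, not real PDF content), here is how a system that records a checksum of each original file can expose an edited copy, even when only a single character has changed:

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    """Return the SHA-256 digest of the given bytes as hex."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical stand-ins for the original PDF and a visually
# identical copy with one altered result.
original = b"%PDF-1.7 ... final assay result: 98.2% ..."
tampered = b"%PDF-1.7 ... final assay result: 99.2% ..."

print(sha256_hex(original))
print(sha256_hex(tampered))
# The two digests differ completely even though only one byte changed,
# so a checksum recorded at creation time exposes the edited copy.
```

The catch, of course, is that the checksum of the original has to be captured and protected at the moment the record is created, which is exactly the job of the system’s audit trail.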
In examples like this, the audit trail becomes the blue pen of the modern age. While it may be possible to prevent screenshots or otherwise control files more precisely on a local PC, the nature of cloud delivery makes this impractical at scale. A clear and secure audit trail, subject to persistent, consistent, and truly periodic review, is indispensable. All meaningful data integrity guidance sets the expectation that audit trails must be reviewed to confirm the originality of critical records.
Modern lab systems – especially those that are SaaS-delivered – must make provisions for the reality that audit trail collection and archiving are essential parts of the forensic chain of custody and are key to establishing a record’s originality. The most obvious emerging technology for maintaining record fidelity is blockchain, and while the promise is, well, promising, it may be some time before we see functional commercialization for laboratories.
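The core idea behind a blockchain-style audit trail can be sketched in a few lines: each entry embeds the hash of the entry before it, so retroactively editing any record breaks every link that follows. This is a minimal illustration of the hash-chain concept only, not a description of any real product’s implementation:

```python
import hashlib
import json

def entry_hash(entry: dict) -> str:
    # Serialize with stable key ordering so the digest is reproducible.
    return hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()

def append_entry(chain: list, action: str) -> None:
    """Append an audit entry that commits to the previous entry's hash."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    entry = {"action": action, "prev": prev}
    entry["hash"] = entry_hash({"action": action, "prev": prev})
    chain.append(entry)

def verify(chain: list) -> bool:
    """Walk the chain and confirm every link and digest is intact."""
    prev = "0" * 64
    for entry in chain:
        if entry["prev"] != prev:
            return False
        if entry["hash"] != entry_hash({"action": entry["action"], "prev": entry["prev"]}):
            return False
        prev = entry["hash"]
    return True

trail = []
append_entry(trail, "sample 42: result recorded")
append_entry(trail, "sample 42: report signed")
assert verify(trail)

# Tampering with any earlier entry breaks the chain from that point on.
trail[0]["action"] = "sample 42: result modified"
assert not verify(trail)
```

In a real deployment the hard problems are operational rather than algorithmic – who holds the chain, how it is archived, and how reviewers actually consume it – which is why the training and auditing discussed below still matter.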
As with all of the Data Integrity principles, the originality of records can only be assured by having a well-trained team that is cognizant of the application and necessity of ALCOA, and a robust data integrity auditing program to consistently challenge its effectiveness.
By early September we’ll have the Data Integrity Blog Series all wrapped up with a real no-brainer, the final “A” in ALCOA: accurate. How hard can it really be to just make sure your information is correct? Probably harder than you think. Join us soon for the exciting conclusion.